How to register a custom plugin with the TensorRT optimizer tool

Hi, I have implemented a custom plugin for the Tile operator, but when I try to generate the TensorRT model (my model is in ONNX format), I get the following error:

[01-06-2023 23:59:21] (Unnamed Layer* 30) [Slice]: the start input tensors must be specified if any dynamic tensor is specified.
[01-06-2023 23:59:21] ModelImporter.cpp:140: No importer registered for op: Tile. Attempting to import as plugin.
While parsing node number 16 [Tile]:
ERROR: builtin_op_importers.cpp:2193 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found"
[01-06-2023 23:59:21] Releasing Driveworks SDK Context
Error: DW_FILE_INVALID: DNNGenerator: Unable to parse ONNX model.

The .so file is in the same folder as the tensorrt_optimization tool,
and the plugin.json looks like this:
{
    "libdnn_tile_plugin.so" : ["Tile"]
}

What is wrong?

Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.6 and DriveWorks 4.0
DRIVE OS Linux 5.2.0
DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version
other

Target Operating System
Linux
QNX
other

Hardware Platform
NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)
other

SDK Manager Version
1.9.1.10844
other

Host Machine Version
native Ubuntu 18.04
other

I tried
/usr/src/tensorrt/bin/trtexec --explicitBatch --onnx=pfe_baseline32000.onnx --saveEngine=pfe_baseline32000.trt

and it shows the same output:

[06/03/2023-10:09:37] [E] [TRT] (Unnamed Layer* 30) [Slice]: the start input tensors must be specified if any dynamic tensor is specified.
[06/03/2023-10:09:37] [W] [TRT] ModelImporter.cpp:140: No importer registered for op: Tile. Attempting to import as plugin.
[06/03/2023-10:09:37] [I] [TRT] builtin_op_importers.cpp:2191: Searching for plugin: Tile, plugin_version: 1, plugin_namespace: 
While parsing node number 16 [Tile]:
ERROR: builtin_op_importers.cpp:2193 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found"
[06/03/2023-10:09:37] [E] Failed to parse onnx file
[06/03/2023-10:09:37] [E] Parsing model failed
terminate called after throwing an instance of 'std::runtime_error'
  what():  Failed to create object
Aborted (core dumped)

I think this is expected, though.

Please help me understand how to import the custom plugin in DriveWorks, thank you.

Hi, any update on this? I need help ASAP, thank you!
I just followed what is written here:
https://docs.nvidia.com/drive/driveworks-4.0/dwx_dnn_plugins.html#model_generation
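
For reference, this is roughly how I invoke the model generation step from those docs (a sketch of my command line; the tool and option names tensorRT_optimization, --modelType, --onnxFile, --pluginConfig and --out are recalled from the DriveWorks 4.0 TensorRT Optimizer Tool page, so please double-check them against the tool's --help output):

./tensorRT_optimization --modelType=onnx --onnxFile=pfe_baseline32000.onnx --pluginConfig=plugin.json --out=pfe_baseline32000.bin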

Dear @user142014,

Did you make any changes in builtin_op_importers.cpp?
Could you check whether the plugin implementation in Part 2: Extending NVIDIA TensorRT with custom layers using CUDA | NVIDIA On-Demand helps?
Source: GitHub - onnx/onnx-tensorrt at webinar/s3pool

No, I didn't make any changes to builtin_op_importers.cpp.
I just followed the DriveWorks documentation. I'm just wondering: there is a slightly different implementation for a DriveWorks custom plugin and a TensorRT plugin, right?
I was able to create the plugin and load it in TensorRT, but in DriveWorks it doesn't work.

I know that in the DriveWorks documentation for building a custom plugin, like the poolPlugin DNN example, there is no creator class as in plain TensorRT. Is the creation part of a DriveWorks custom plugin handled by this function?

dwStatus _dwDNNPlugin_create(_dwDNNPluginHandle_t* handle)
{
    std::unique_ptr<PoolPlugin> plugin(new PoolPlugin());
    *handle = reinterpret_cast<_dwDNNPluginHandle_t>(plugin.release());
    return DW_SUCCESS;
}
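
As a sanity check on my side (my own debugging idea, not something from the docs), I at least want to verify that the built .so really exports that _dwDNNPlugin_create entry point before the optimization tool tries to load it. A minimal sketch, assuming the library name from my plugin.json:

// check_plugin_symbol.cpp: confirm that the plugin library exports _dwDNNPlugin_create
// Build with: g++ check_plugin_symbol.cpp -ldl -o check_plugin_symbol
#include <dlfcn.h>
#include <cstdio>

int main()
{
    // Library name taken from my plugin.json; adjust the path if the .so lives elsewhere
    void* lib = dlopen("./libdnn_tile_plugin.so", RTLD_LAZY);
    if (!lib)
    {
        std::printf("dlopen failed: %s\n", dlerror());
        return 1;
    }

    void* sym = dlsym(lib, "_dwDNNPlugin_create");
    std::printf("_dwDNNPlugin_create %s\n", sym != nullptr ? "found" : "NOT found");

    dlclose(lib);
    return 0;
}

The same check can be done without writing code via nm -D libdnn_tile_plugin.so | grep _dwDNNPlugin_create.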

Any idea?

I've watched the video, but it is for TensorRT itself, not DriveWorks 4.0.
The TensorRT docs are fine for me, I can follow along. But for DriveWorks 4.0, I have a problem.

Hi, any update?
Here I attach the ONNX file I wanted to optimize with TensorRT:
pfe_baseline32000.onnx (20.7 KB)

I've tried converting the opset version to 10, similar to this issue, but it did not solve the problem.

Dear @user142014,
Could you please file an NVBug for this issue with all the details? Please log in to NVIDIA DRIVE - Autonomous Vehicle Development Platforms | NVIDIA Developer with your credentials, then go to MyAccount -> MyBugs -> Submit a new bug to file the bug, and share the bug ID so we can follow up via a separate bug.

Dear @user142014,
I see you are able to run the model in DNNPlugin: INT32 is not yet supported AND Assertion failed: d.nbDims >= 1 ../rtSafe/safeHelpers.cpp:91 Aborting.
Could you share how you fixed the issue below in the parser?

I just applied onnx-simplifier to the model (so that the shapes are defined) and set the plugin path in plugin.json to the full path from root.
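
Concretely, the steps that worked for me looked like this (a sketch; the simplified model name and the plugin path are placeholders from my setup):

python3 -m onnxsim pfe_baseline32000.onnx pfe_baseline32000_sim.onnx

and plugin.json now lists the library with its full path instead of just the file name:

{
    "/full/path/to/libdnn_tile_plugin.so" : ["Tile"]
}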
