Custom plugin supporting int8 I/O fails type check


I’m trying to create a custom plugin that takes one int8 tensor and one int32 tensor as input, and outputs one int8 tensor and one int32 tensor (without calibration).

However, when I try to build the engine, inside supportsFormatCombination the input and output tensors that are supposed to be int8 are reported as float32. I’m not sure what to do here.
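For context, the int8/int32 type check described above would normally live in an IPluginV2DynamicExt plugin's supportsFormatCombination. Below is a minimal sketch of that logic only; the DataType enum and PluginTensorDesc struct here are simplified stand-ins for the real nvinfer1 types so the snippet compiles on its own, and the assumed I/O layout (pos 0 = int8 input, pos 1 = int32 input, pos 2 = int8 output, pos 3 = int32 output, weights omitted) is an illustration, not the actual plugin's layout:

```cpp
// Simplified stand-ins for nvinfer1::DataType and nvinfer1::PluginTensorDesc,
// so this sketch is self-contained (the real types live in NvInferRuntime.h).
enum class DataType { kFLOAT, kINT8, kINT32 };
struct PluginTensorDesc { DataType type; };

// Assumed layout: pos 0 = int8 input, pos 1 = int32 input,
//                 pos 2 = int8 output, pos 3 = int32 output.
bool supportsFormatCombination(int pos, const PluginTensorDesc* inOut,
                               int nbInputs, int nbOutputs) {
    (void)nbInputs; (void)nbOutputs;
    switch (pos) {
        case 0: return inOut[0].type == DataType::kINT8;   // int8 input
        case 1: return inOut[1].type == DataType::kINT32;  // int32 input
        case 2: return inOut[2].type == DataType::kINT8;   // int8 output
        case 3: return inOut[3].type == DataType::kINT32;  // int32 output
        default: return false;
    }
}
```

Note that returning true here only declares what the plugin accepts: if the surrounding network never produces int8 tensors (no Q/DQ nodes and, without calibration, no dynamic ranges set), the builder will typically only propose float32 combinations at those positions, which may be why float32 shows up in the check.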

Here is a link to my onnx model: spconv_relu_int8_modified.onnx - Google Drive. Can anyone please check?


TensorRT Version:
GPU Type: 3090
Nvidia Driver Version: 515.105.01
CUDA Version: 11.2
CUDNN Version:
Operating System + Version: ubuntu18.04
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered


Please use the latest TensorRT version, 8.6.1.
We tried reproducing the above error, but we hit a different error instead:

[05/17/2023-11:20:27] [V] [TRT] Searching for input: spconv0.channel_scale
[05/17/2023-11:20:27] [V] [TRT] Plugin0 [Plugin] inputs: [input.0 -> (-1, 4)[INT8]], [input.1 -> (1, 1)[INT32]], [spconv0.weight -> (16, 3, 3, 3, 4)[INT8]], [spconv0.scaled_bias -> (16)[FLOAT]], [spconv0.channel_scale -> (16)[FLOAT]],
[05/17/2023-11:20:27] [I] [TRT] No importer registered for op: Plugin. Attempting to import as plugin.
[05/17/2023-11:20:27] [I] [TRT] Searching for plugin: Plugin, plugin_version: 1, plugin_namespace:
[05/17/2023-11:20:27] [V] [TRT] Local registry did not find Plugin creator. Will try parent registry if enabled.
[05/17/2023-11:20:27] [V] [TRT] Global registry did not find Plugin creator. Will try parent registry if enabled.
[05/17/2023-11:20:27] [E] [TRT] 3: getPluginCreator could not find plugin: Plugin version: 1
[05/17/2023-11:20:27] [E] [TRT] ModelImporter.cpp:771: While parsing node number 0 [Plugin -> "output.0"]:
[05/17/2023-11:20:27] [E] [TRT] ModelImporter.cpp:772: --- Begin node ---
[05/17/2023-11:20:27] [E] [TRT] ModelImporter.cpp:773: input: "input.0"
input: "input.1"
input: "spconv0.weight"
input: "spconv0.scaled_bias"
input: "spconv0.channel_scale"
output: "output.0"
output: "output.1"
name: "Plugin0"
op_type: "Plugin"
attribute {
  name: "info"
  s: "{\n    \"ndim\": 3,\n    \"in_channels\": 4,\n    \"out_channels\": 16,\n    \"kernel_size\": [\n        3,\n        3,\n        3\n    ],\n    \"stride\": [\n        1,\n        1,\n        1\n    ],\n    \"dilation\": [\n        1,\n        1,\n        1\n    ],\n    \"padding\": [\n        1,\n        1,\n        1\n    ],\n    \"conv1x1\": false,\n    \"transposed\": false,\n    \"inverse\": false,\n    \"output_padding\": [\n        0,\n        0,\n        0\n    ],\n    \"groups\": 1,\n    \"subm\": true,\n    \"indice_key\": \"subm1\",\n    \"algo\": \"ConvAlgo.MaskImplicitGemm\",\n    \"output_shape\": [\n        151894,\n        16\n    ],\n    \"num_multiply\": 1,\n    \"input_tensor_id\": \"input.1\",\n    \"output_tensor_id\": \"0\",\n    \"output_scale\": 0.07346702367067337,\n    \"act_type\": \"Relu\",\n    \"static_num_act_in\": 210000,\n    \"static_num_act_out\": 210000\n}"
  type: STRING
attribute {
  name: "plugin_name"
  s: "SparseConvReLU"
  type: STRING

[05/17/2023-11:20:27] [E] [TRT] ModelImporter.cpp:774: --- End node ---
[05/17/2023-11:20:27] [E] [TRT] ModelImporter.cpp:777: ERROR: builtin_op_importers.cpp:5402 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[05/17/2023-11:20:27] [E] Failed to parse onnx file
[05/17/2023-11:20:27] [I] Finished parsing network model. Parse time: 0.00689514
[05/17/2023-11:20:27] [E] Parsing model failed

The error states that the importer could not find a plugin named “Plugin” with version 1. Could you please make sure the plugin name and functionality are correctly implemented and configured?
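For reference, the ONNX parser resolves a plugin by an exact string match on the node's op_type (here "Plugin") and on plugin_version ("1"), so a creator registered under a different name — for example the "SparseConvReLU" string that only appears in the node's attributes — will not be found. The following self-contained sketch mimics that lookup; MockCreator and the array-based registry are simplified stand-ins for nvinfer1::IPluginCreator and the global plugin registry:

```cpp
#include <cstring>

// Simplified stand-in for nvinfer1::IPluginCreator: the parser compares
// getPluginName() and getPluginVersion() against the ONNX node.
struct MockCreator {
    const char* name;     // must match the ONNX node's op_type ("Plugin")
    const char* version;  // must match plugin_version in the log ("1")
};

// Mirrors the lookup that importFallbackPluginImporter performs:
// exact string match on both name and version, nullptr on miss.
const MockCreator* getPluginCreator(const MockCreator* registry, int n,
                                    const char* name, const char* version) {
    for (int i = 0; i < n; ++i) {
        if (std::strcmp(registry[i].name, name) == 0 &&
            std::strcmp(registry[i].version, version) == 0) {
            return &registry[i];
        }
    }
    return nullptr;  // -> "getPluginCreator could not find plugin"
}
```

So a creator whose getPluginName() returns "SparseConvReLU" would reproduce exactly the "could not find plugin: Plugin version: 1" failure in the log above.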

Please refer to the links below on custom plugin implementation and samples:

While the IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively, we recommend that you write new plugins or refactor existing ones to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.
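In TensorRT, a plugin creator is normally made visible to the parser by static registration via REGISTER_TENSORRT_PLUGIN(MyCreator), which adds the creator to the global registry at library load time, before any parsing happens. The sketch below imitates only that static-registration mechanism with a mock registry and macro (the real macro and registry live in the TensorRT headers); the name "Plugin" is used because that is the op_type the parser searches for in the log above:

```cpp
#include <string>
#include <vector>

// Mock of TensorRT's global plugin registry: a function-local static so it is
// constructed on first use, even during static initialization.
std::vector<std::string>& registry() {
    static std::vector<std::string> r;
    return r;
}

// Mock registrar: its constructor runs at load time, like the static creator
// instance that REGISTER_TENSORRT_PLUGIN generates.
struct Registrar {
    explicit Registrar(const char* name) { registry().push_back(name); }
};

// Simplified analogue of REGISTER_TENSORRT_PLUGIN(name).
#define REGISTER_MOCK_PLUGIN(name) static Registrar registrar_##name(#name)

REGISTER_MOCK_PLUGIN(Plugin);  // registration happens before main() runs
```

The point of the pattern is that nothing has to be called explicitly: merely linking the plugin library populates the registry, so the parser's getPluginCreator lookup can succeed.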