
Description

While building my ONNX model with the TensorRT ONNX parser, the build fails on a Mod node with the error below. How can I fix this error?

[07/03/2024-19:13:53] [W] [TRT] onnx2trt_utils.cpp:395: One or more weights outside the range of INT32 was clamped
[07/03/2024-19:14:02] [E] [TRT] ModelImporter.cpp:773: While parsing node number 3300 [Mod -> "/hm_decoder/Mod_output_0"]:
[07/03/2024-19:14:02] [E] [TRT] ModelImporter.cpp:774: --- Begin node ---
[07/03/2024-19:14:02] [E] [TRT] ModelImporter.cpp:775: input: "/hm_decoder/TopK_output_1"
input: "/hm_decoder/Mul_1_output_0"
output: "/hm_decoder/Mod_output_0"
name: "/hm_decoder/Mod"
op_type: "Mod"
attribute {
  name: "fmod"
  i: 0
  type: INT
}
 
[07/03/2024-19:14:02] [E] [TRT] ModelImporter.cpp:776: --- End node ---
[07/03/2024-19:14:02] [E] [TRT] ModelImporter.cpp:778: ERROR: builtin_op_importers.cpp:4890 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"

TensorRT 8.4 and CUDA 11.4 on a Jetson Xavier NX, Ubuntu 22.04, JetPack 5.0.2.

Environment

TensorRT Version: 8.4
GPU Type: Jetson Xavier NX
Nvidia Driver Version:
CUDA Version: 11.4
CUDNN Version:
Operating System + Version: Ubuntu 22.04 + JetPack 5.0.2
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):



I made a mistake: my Ubuntu version is 20.04. Somehow the forum is not allowing me to edit my post.

Hi @mona.jalal,
The TensorRT 8.4 ONNX parser does not support this Mod node, so you need to implement a custom plugin for it.
Please refer to the following doc
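As an alternative to writing a plugin, this particular failure can often be avoided before export. The error means the parser found no built-in importer for integer `Mod` (attribute `fmod: 0`) and fell back to a plugin lookup, which failed. Since ONNX `Mod` with `fmod=0` follows floor-modulo semantics, the identity `a mod b == a - floor(a / b) * b` is exact, so the node can be expressed with Sub, floor-Div, and Mul, which the parser does support. The sketch below (not from the original thread; the model rewrite shown in comments is a hypothetical adaptation) demonstrates the decomposition:

```python
# Workaround sketch for TensorRT 8.4's missing integer Mod importer:
# express the modulo with Sub / floor-Div / Mul instead of a Mod node.
#
# ONNX Mod with fmod=0 uses floor-modulo (Python % semantics), so
#     a mod b == a - floor(a / b) * b
# holds exactly, including for negative operands.

def floor_mod(a: int, b: int) -> int:
    """Decomposed modulo using only sub, mul, and floor division."""
    return a - (a // b) * b  # Python's // is floor division

# In a PyTorch model, the equivalent rewrite before ONNX export would be
# (hypothetical, adapt to your decoder code):
#     out = x - torch.div(x, y, rounding_mode="floor") * y
# instead of:
#     out = x % y
# so the exporter emits Sub/Div/Mul nodes rather than an unsupported Mod.

for a, b in [(7, 3), (-7, 3), (7, -3), (123, 10)]:
    assert floor_mod(a, b) == a % b
```

If the Mod node cannot be removed at the source (for example, the ONNX file is all you have), tools such as onnx-graphsurgeon can apply the same Sub/Div/Mul substitution directly to the graph; otherwise a custom plugin, as suggested above, is the remaining option.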