Inference error while using TensorRT engine on Jetson Nano

I am getting the following error while running inference on a TensorRT engine.
The engine file is for the object detection model ‘ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8’.

[TensorRT] ERROR: 2: [pluginV2DynamicExtRunner.cpp::execute::115] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed.)


TensorRT version:
onnx version: 1.10.2
Jetpack: 4.6

Relevant Files

I have uploaded my ONNX and TRT engine files for the model, along with the script I am using for inference.
I followed the code given in the link to convert the ONNX file to a TRT engine.
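In case it helps narrow things down, the conversion step is typically done with `trtexec`; the invocation below is illustrative only (paths and flags are assumptions, not the exact command from the linked code):

```shell
# Illustrative only: convert the ONNX model to a TensorRT engine with trtexec,
# which ships with TensorRT (on JetPack it is usually at /usr/src/tensorrt/bin).
/usr/src/tensorrt/bin/trtexec \
    --onnx=model.onnx \
    --saveEngine=model.trt \
    --verbose   # verbose build logs help identify which plugin/layer fails
```

Running with `--verbose` during both build and inference usually prints the failing layer name alongside the Error Code 2 assertion.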

model.onnx (10.4 MB) (5.2 KB)
model.trt (12.4 MB)
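For reproducing the problem, note that input shape/dtype mismatches are a common cause of plugin assertion failures. A minimal sketch of the preprocessing this kind of SSD MobileNet 320x320 pipeline needs (the shapes and dtype here are assumptions about the TF2 SavedModel signature, not taken from the uploaded script):

```python
import numpy as np

def preprocess(image: np.ndarray) -> np.ndarray:
    """Batch a single 320x320 RGB image for the detector.

    The TF2 SSD MobileNet FPNLite 320x320 SavedModel expects a uint8
    NHWC tensor of shape (1, 320, 320, 3). The ONNX/TensorRT export may
    instead want float32 or NCHW, so check the engine's input binding
    shape and dtype before copying data to the device.
    """
    if image.shape != (320, 320, 3):
        raise ValueError(f"expected (320, 320, 3), got {image.shape}")
    return np.expand_dims(image.astype(np.uint8), axis=0)

dummy = np.zeros((320, 320, 3), dtype=np.uint8)
batch = preprocess(dummy)
print(batch.shape)  # (1, 320, 320, 3)
```

If the engine's binding reports a different shape or dtype than what the script feeds it, that mismatch is worth ruling out before suspecting the plugin itself.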

Please have a look.
Thank you.

This looks like a Jetson issue. Please refer to the samples below in case they are useful.

For any further assistance, we recommend raising it on the respective platform via the link below.



I’m also having the same issue. Is there any useful solution for this?

You can follow up on this question; I opened a new issue there.

Good luck