Error making inference with TensorRT

Hello,
I'm getting the following error while running inference with a TensorRT version of a YOLOv7 model:

[TensorRT] ERROR: 2: [pluginV2DynamicExtRunner.cpp::execute::115] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed.)
I'm using a Jetson TX2 with the following specs:

  • JetPack 4.6
  • TensorRT 8.0.1.6

I'm running the following script:

predict_trt.py (4.4 KB)
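For context, loading such an engine follows the usual TensorRT pattern, roughly like the sketch below (a generic sketch, not the exact contents of predict_trt.py; the engine file name is a placeholder). One detail that matters for engines exported with the NMS plugin: the plugin library has to be registered before deserialization.

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

    # Register TensorRT's built-in plugins (including the NMS plugin that
    # the YOLOv7 export bakes into the engine) before deserializing.
    trt.init_libnvinfer_plugins(TRT_LOGGER, "")

    runtime = trt.Runtime(TRT_LOGGER)
    with open("yolov7.trt", "rb") as f:
        engine = runtime.deserialize_cuda_engine(f.read())
    context = engine.create_execution_context()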

If I run the same script on a Jetson Nano with the following specs, it runs perfectly:

  • JetPack UNKNOWN (I flashed 4.6; I don't know why it reports this)
  • TensorRT 8.2.1.8

I think the TensorRT version is causing the trouble, because I had to change the precision from fp16 to fp32 after getting this error:

File "export.py", line 260, in create_engine
    with self.builder.build_serialized_network(self.network, self.config) as engine, open(engine_path, "wb") as f:
AttributeError: __enter__
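For anyone hitting the same thing: in the TensorRT Python API, build_serialized_network() returns None when the build fails, and None does not support the with statement, which is exactly what "AttributeError: __enter__" means here. A safer version of that step, sketched with illustrative names (not the repo's actual code):

    import tensorrt as trt

    def create_engine(builder, network, config, engine_path):
        # Request FP16 where supported; layers that cannot run in FP16
        # fall back to FP32 automatically.
        config.set_flag(trt.BuilderFlag.FP16)

        serialized = builder.build_serialized_network(network, config)
        if serialized is None:
            # The build failed; the real cause is in the builder's log output.
            raise RuntimeError("Engine build failed")

        with open(engine_path, "wb") as f:
            f.write(serialized)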

P.S.: I'm creating the .trt file from a .onnx using this repo: GitHub - Linaom1214/TensorRT-For-YOLO-Series: tensorrt for yolo series (YOLOv8, YOLOv7, YOLOv6....), nms plugin support. It is based on this YOLOv7 notebook: Google Colab
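For reference, the ONNX-to-engine conversion in that repo boils down to roughly the following (a minimal sketch assuming the explicit-batch ONNX path; the file name and workspace size are placeholders):

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # Parse the exported ONNX model into the TensorRT network definition.
    with open("yolov7.onnx", "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse the ONNX model")

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 28  # 256 MB; adjust for the TX2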

I hope you can help me. Thanks.

Hi,

Do you mean your model works on TensorRT 8.2?
If so, could you upgrade the TX2 to the latest software?

JetPack 4.6.3 ships TensorRT 8.2.1.
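If it helps, a quick way to confirm which TensorRT build is actually installed on each board:

    import tensorrt
    print(tensorrt.__version__)  # e.g. 8.0.1.6 on JetPack 4.6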

Thanks.

Hi, thanks for your answer. And yes, my model works with TensorRT 8.2 on my Jetson Nano.
The thing is that I'm working with the TX2 remotely, so right now I'm not able to reflash the board.
Is there any way to update JetPack or TensorRT without flashing?

Thanks.

No.
