Hello,
I'm getting the following error while running inference with the TensorRT version of a YOLOv7 model:
[TensorRT] ERROR: 2: [pluginV2DynamicExtRunner.cpp::execute::115] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed.)
I'm using a Jetson TX2 with the following specs:
- Jetpack 4.6
- TensorRT 8.0.1.6
I'm running this script:
predict_trt.py (4.4 KB)
If I run the same script on a Jetson Nano with the following specs, it runs perfectly:
- Jetpack UNKNOWN (I flashed JetPack 4.6, so I'm not sure why it shows this)
- TensorRT 8.2.1.8
I think the TensorRT version is causing the trouble, because I had to change the precision from fp16 to fp32 after getting this error:
File "export.py", line 260, in create_engine
  with self.builder.build_serialized_network(self.network, self.config) as engine, open(engine_path, "wb") as f:
AttributeError: __enter__
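For what it's worth, in the TensorRT 8.x Python API `builder.build_serialized_network()` returns `None` when the engine build fails, and on Python 3.6 (the version JetPack 4.6 ships) using that `None` in a `with` statement raises exactly `AttributeError: __enter__`. A minimal sketch of the failure mode and a guard; `build_serialized_network_stub` is a hypothetical stand-in, not the repo's actual code:

```python
# Sketch of why "AttributeError: __enter__" shows up when an engine build fails.
# In the TensorRT 8.x Python API, builder.build_serialized_network(network, config)
# returns None on failure; build_serialized_network_stub below is a hypothetical
# stand-in so the pattern can be shown without TensorRT installed.

def build_serialized_network_stub(ok):
    """Stand-in for builder.build_serialized_network(): None means the build failed."""
    return b"serialized-engine-bytes" if ok else None

serialized = build_serialized_network_stub(ok=False)

# Guard the return value instead of using it directly in a `with` statement,
# so a failed build gives a clear message rather than AttributeError: __enter__.
if serialized is None:
    print("Engine build failed - check the TensorRT logger output (precision, plugins).")
else:
    with open("model.trt", "wb") as f:
        f.write(serialized)
```

So the `AttributeError` is probably masking the real build failure; the TensorRT logger output printed just before it should say why the fp16 build failed on the TX2.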
PS: I'm creating the .trt file from a .onnx using this repo: GitHub - Linaom1214/TensorRT-For-YOLO-Series (TensorRT for the YOLO series (YOLOv8, YOLOv7, YOLOv6, ...), with NMS plugin support), based on this YOLOv7 notebook: Google Colab
I hope you can help me. Thanks!