Libnvinfer plugin is not working on TX2 NX for Yolo V4


I am using JetPack 4.6. I tried both the prebuilt plugin from this repo (deepstream_tao_apps/TRT-OSS/Jetson/TRT8.0 at master · NVIDIA-AI-IOT/deepstream_tao_apps · GitHub) and building the plugin myself. Both give me the following error:

[04/25/2022-16:52:55] [E] [TRT] 2: [pluginV2Runner.cpp::execute::267] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed.)
&&&& FAILED TensorRT.sample_yolo [TensorRT v8001] # …/bin/yolov4 --fp16

JetPack 4.6 comes with:
TensorRT 8.0.1
cuDNN 8.2.1
CUDA 10.2

I could build and successfully use the plugin on a Nano with the same JetPack version, but not on the TX2 NX.

Need help.

I cannot reproduce the error. The lib works on our NX board.

Which BSP have you used? I am testing it on a CTI Photon carrier board with both JP 4.6 and JP 4.6.1. I am now guessing that it is a BSP issue. Any ideas how I can validate that?
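
One way to validate which BSP/L4T release is actually on the device is to read /etc/nv_tegra_release (the standard L4T release file). Below is a small sketch that parses that release line into an L4T version number; the helper name is my own, and the sample line follows the usual L4T format:

```python
import re

def parse_l4t_version(release_line: str) -> str:
    """Extract the L4T version (e.g. '32.6.1') from the first line of
    /etc/nv_tegra_release. Raises ValueError if the line does not match."""
    # Typical line: "# R32 (release), REVISION: 6.1, GCID: ..., BOARD: t186ref, ..."
    m = re.search(r"# R(\d+).*?REVISION: ([\d.]+)", release_line)
    if not m:
        raise ValueError("unrecognized nv_tegra_release line: " + release_line)
    return f"{m.group(1)}.{m.group(2)}"

if __name__ == "__main__":
    # On a Jetson you would read the real file instead:
    # with open("/etc/nv_tegra_release") as f:
    #     print(parse_l4t_version(f.readline()))
    sample = ("# R32 (release), REVISION: 6.1, GCID: 27863751, "
              "BOARD: t186ref, EABI: aarch64, DATE: Mon Jul 26 19:36:31 UTC 2021")
    print(parse_l4t_version(sample))  # → 32.6.1
```

JetPack 4.6 corresponds to L4T R32.6.1 and JetPack 4.6.1 to R32.7.1, so comparing this version between the Nano (where the plugin works) and the TX2 NX should show whether the two BSPs actually match.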

I tried to flash NVIDIA’s BSP with SDK Manager, but it is failing.

SDK Manager is the official way to install Jetson images. Please refer to JetPack SDK 5.0 Developer Preview | NVIDIA Developer

SDK Manager is not able to flash the TX2 NX. I am not sure if that is because of the CTI Photon board used as the carrier for the TX2 NX. May I ask which board/platform you used for your TX2 NX module?

So please ask your board vendor for the correct image.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.