I successfully trained an SSD model using the Transfer Learning Toolkit (TLT) and got an .etlt model. I want to integrate and run this SSD .etlt model on a Jetson Nano device using DeepStream. Running the SSD model requires TensorRT OSS and a custom bounding-box parser.
But I am not able to build and install TensorRT OSS on the Jetson Nano. CMake fails with this error: Could NOT find TENSORRT (missing: TENSORRT_LIBRARY). Please help with this. Thanks.
My JetPack SDK version is 4.5.1
My DeepStream version is 5.1
I have tagged and moved this topic to the Jetson Nano category. The Jetson team is great and has tons of experience with DeepStream on Nano, and the Jetson community will also jump in and help.
TensorRT OSS is the source of the plugin library (libnvinfer_plugin.so).
You will need to install the TensorRT core library (libnvinfer.so) from JetPack first.
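If it helps, here is a quick way to confirm that the JetPack-installed TensorRT is actually in place before building; the paths assume the default JetPack layout on Jetson:

```bash
# List the TensorRT packages installed by JetPack
dpkg -l | grep -i tensorrt

# Confirm the core and plugin libraries are present in the default JetPack location
ls -l /usr/lib/aarch64-linux-gnu/libnvinfer.so* /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so*
```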
But I have tried everything from scratch, from flashing the JetPack SDK onto the Jetson Nano to installing TensorRT OSS. Fortunately, I successfully completed all the steps for installing TensorRT OSS, as shown below.
Please use the release/7.1 branch of TensorRT OSS to be compatible with JetPack 4.5.1, which ships TensorRT 7.1.3.
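For reference, a rough sketch of the build steps on the Nano is below. The flag names are assumptions based on the conventions of the TensorRT OSS 7.1 branch, so please double-check them against that branch's README. The key point for the "Could NOT find TENSORRT (missing: TENSORRT_LIBRARY)" error is pointing TRT_LIB_DIR at the TensorRT libraries that JetPack installed:

```bash
# Clone TensorRT OSS and check out the branch matching JetPack 4.5.1 (TensorRT 7.1.x)
git clone -b release/7.1 https://github.com/NVIDIA/TensorRT.git
cd TensorRT
git submodule update --init --recursive

# Configure the build. CMake reports "Could NOT find TENSORRT (missing: TENSORRT_LIBRARY)"
# when it cannot find libnvinfer, so TRT_LIB_DIR must point at JetPack's TensorRT libraries.
# GPU_ARCHS=53 targets the Jetson Nano's Maxwell GPU.
# Note: TensorRT OSS needs CMake >= 3.13, newer than the 3.10 shipped with JetPack 4.5.1.
mkdir -p build && cd build
cmake .. \
  -DGPU_ARCHS=53 \
  -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu/ \
  -DTRT_BIN_DIR=`pwd`/out

# Build only the plugin library; the result lands in build/out/
make nvinfer_plugin -j$(nproc)
```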
After that, please make sure you have replaced libnvinfer_plugin.so with the newly built one, as mentioned below:
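As a minimal sketch of that replacement step, assuming the default JetPack library path and the TensorRT 7.1.3 version suffix that JetPack 4.5.1 installs (verify the actual filenames on your system first):

```bash
# Filenames below are assumptions for JetPack 4.5.1 (TensorRT 7.1.3); check yours with:
#   ls /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so*   # plugin installed by JetPack
#   ls build/out/libnvinfer_plugin.so*                    # plugin built from TensorRT OSS
INSTALLED=/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3
BUILT=build/out/libnvinfer_plugin.so.7.1.3

sudo cp "${INSTALLED}" "${HOME}/$(basename "${INSTALLED}").bak"  # back up the stock plugin
sudo cp "${BUILT}" "${INSTALLED}"                                # overwrite it with the OSS build
sudo ldconfig                                                    # refresh the dynamic linker cache
```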