Could NOT find TENSORRT (missing: TENSORRT_LIBRARY)

I successfully trained an SSD model using the TLT kit and got a .etlt model. I want to integrate and run this SSD .etlt model in DeepStream on a Jetson Nano device. Running the SSD model requires TensorRT OSS and a custom bounding-box parser.
However, I am not able to install TensorRT OSS on the Jetson Nano; the build fails with the error Could NOT find TENSORRT (missing: TENSORRT_LIBRARY). Please help with this. Thanks

My JetPack SDK version is 4.5.1
My DeepStream version is 5.1

I have tagged and moved this topic to the Jetson Nano category. The Jetson team is great and has tons of experience with DeepStream on Nano, and the Jetson community will also jump in and help.

Hi,

Have you installed the TensorRT library?

TensorRT OSS is the source of the plugin library (libnvinfer_plugin.so).
You will need to install the TensorRT core library (libnvinfer.so) from the JetPack first.
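One quick way to confirm the core library is present is to look for it in the standard JetPack install location. This is a minimal diagnostic sketch, assuming the default JetPack library path on Jetson (`/usr/lib/aarch64-linux-gnu`):

```shell
# Check that the TensorRT core runtime shipped with JetPack is present.
# The path below is the standard JetPack location on Jetson devices
# (assumption: JetPack 4.5.x layout).
if ls /usr/lib/aarch64-linux-gnu/libnvinfer.so* >/dev/null 2>&1; then
  trt_status="installed"
else
  trt_status="missing"
fi
echo "TensorRT core library: ${trt_status}"
```

If this reports "missing", the TensorRT OSS build has nothing to link against, which would explain the `TENSORRT_LIBRARY` CMake error.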

Thanks.

I have flashed the JetPack SDK onto the Jetson Nano, and libnvinfer.so is present, so I think it is installed.

Hi,

Would you mind checking whether the following command helps?

$ cmake .. -DTENSORRT_ROOT=/usr/src/tensorrt
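If the `TENSORRT_ROOT` hint alone does not resolve `TENSORRT_LIBRARY`, the deepstream_tao_apps TRT-OSS README configures the Jetson build with explicit library paths. A sketch of that configure step, guarded so it only runs inside a TensorRT OSS checkout (the flag values are assumptions for a Nano, whose GPU architecture is 53):

```shell
# Sketch of configuring the TensorRT OSS build on a Jetson Nano,
# following the deepstream_tao_apps TRT-OSS README. Run from
# TensorRT/build inside the OSS checkout.
if command -v cmake >/dev/null 2>&1 && [ -f ../CMakeLists.txt ]; then
  cmake .. -DGPU_ARCHS=53 \
           -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu/ \
           -DTRT_BIN_DIR="$(pwd)/out" \
           -DTENSORRT_ROOT=/usr/src/tensorrt \
           -DCMAKE_C_COMPILER=/usr/bin/gcc
  configured="yes"
else
  # Not inside a TensorRT OSS build directory; nothing to configure.
  configured="no"
  echo "Run this from TensorRT/build on the Jetson."
fi
```

`-DTRT_LIB_DIR` points CMake directly at the directory containing libnvinfer.so, which is usually what the `TENSORRT_LIBRARY` error is complaining about.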

Thanks.

Hi @AastaLLL . Thank you for the reply.
I ran the command $ cmake .. -DTENSORRT_ROOT=/usr/src/tensorrt and it showed the result in the attached screenshot.

But then I tried everything from scratch, from flashing the JetPack SDK onto the Jetson Nano to installing TensorRT OSS. Fortunately, I successfully completed all the steps for installing TensorRT OSS.


This generated libnvinfer_plugin.so.7.2.2.

Can I conclude from this that TensorRT OSS is successfully installed on the Jetson Nano?

If yes, could you please guide me through the next steps to run the trained SSD model (.etlt) in DeepStream on the Jetson Nano?

Thanks.

@AastaLLL Any Update?

Hi,

Please use the release/7.1 branch to be compatible with JetPack 4.5.1.
After that, please make sure you have replaced libnvinfer_plugin.so as described below:

https://github.com/NVIDIA-AI-IOT/deepstream_tao_apps/tree/release/tao3.0/TRT-OSS/Jetson/TRT7.1#3-replace-libnvinfer_pluginso
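The replacement step from that README boils down to backing up the stock plugin and copying in the OSS-built one. A sketch, guarded so it only acts on a real Jetson; the version number 7.1.3 and the `build/out` source path are assumptions based on the TensorRT 7.1 release that ships with JetPack 4.5.1 and should be matched to the files actually on your device:

```shell
# Sketch of swapping in the OSS-built libnvinfer_plugin, following the
# linked README. Assumptions: JetPack 4.5.1 ships TensorRT 7.1.3, and
# the OSS build placed its output in build/out.
TRT_LIB=/usr/lib/aarch64-linux-gnu
if [ -f "$TRT_LIB/libnvinfer_plugin.so.7.1.3" ]; then
  # Back up the original plugin before overwriting it.
  sudo mv "$TRT_LIB/libnvinfer_plugin.so.7.1.3" \
          "$HOME/libnvinfer_plugin.so.7.1.3.bak"
  sudo cp build/out/libnvinfer_plugin.so.7.1.3 \
          "$TRT_LIB/libnvinfer_plugin.so.7.1.3"
  sudo ldconfig   # refresh the linker cache so DeepStream picks it up
  replaced="yes"
else
  # Not a Jetson with TensorRT 7.1 installed; do nothing.
  replaced="no"
fi
echo "plugin replaced: $replaced"
```

Keeping the backup makes it easy to restore the stock plugin if anything misbehaves.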

Then you can deploy the .etlt model with the sample below:
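The sample link was not captured in this thread. As a rough sketch of what such a deployment involves, a DeepStream nvinfer configuration for a TAO/TLT SSD .etlt model typically looks like the following. All file names, the parser function name, and the parser library path here are taken from the deepstream_tao_apps repository and are illustrative assumptions; check them against the actual sample and your own training setup:

```ini
[property]
# Exported TAO/TLT model and the key used when exporting it
tlt-encoded-model=ssd_resnet18.etlt
tlt-model-key=<your TLT/TAO export key>
labelfile-path=ssd_labels.txt
# SSD input/output tensor names and dims (example values for a 300x300 SSD)
uff-input-dims=3;300;300;0
uff-input-blob-name=Input
output-blob-names=NMS
network-type=0
num-detected-classes=4
# Custom bounding-box parser built from deepstream_tao_apps
parse-bbox-func-name=NvDsInferParseCustomNMSTLT
custom-lib-path=../../post_processor/libnvds_infercustomparser_tao.so
```

The `parse-bbox-func-name` and `custom-lib-path` entries are what wire the custom SSD bounding-box parser (built from the deepstream_tao_apps post_processor sources) into the pipeline.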

Thanks.

@AastaLLL I have replaced libnvinfer_plugin.so. What I was asking is how to create the parser for SSD and deploy the .etlt model.

@AastaLLL I am still waiting for your response.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.