Customize TensorRT plugin in DeepStream-6.3

• Hardware Platform: Jetson NX
• DeepStream Version: 6.3
• JetPack Version: 5.1.2
• TensorRT Version: 8.5.2.2

Hi, I am using DeepStream-6.3 to build an AI application, based on the official NVIDIA image nvcr.io/nvidia/deepstream-l4t:6.3-samples. My problem is that I want to customize a TensorRT plugin and use it in the DeepStream container. Previously I used DeepStream-6.0.1 where, according to my research, nvidia-container-runtime mounts the entire CUDA library, TensorRT, …, from the host into the container.
With DeepStream-6.3, however, the container no longer mounts these libraries, so cuda_runtime_api.h and the necessary libraries are missing. Is there a DeepStream-6.3 image that ships the full set of libraries, like the previous versions where they were mounted from the host, or do I have to install them myself in the container?


There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Where did you get this docker image?

If you work on Jetson, please use the nvcr.io/nvidia/deepstream:6.3-triton-multiarch.
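As a sketch of how that image is typically launched on Jetson, the flags and the source-tree mount path below are assumptions for illustration, not part of the thread:

```shell
# Pull the multiarch DeepStream 6.3 image suggested above.
docker pull nvcr.io/nvidia/deepstream:6.3-triton-multiarch

# Run it with the NVIDIA runtime so the GPU is exposed, mounting a
# hypothetical plugin source tree so it can be built inside the container.
docker run -it --rm --runtime nvidia --network host \
    -v /path/to/my_plugin:/workspace/my_plugin \
    nvcr.io/nvidia/deepstream:6.3-triton-multiarch /bin/bash
```

With `--runtime nvidia`, the container gets GPU device access; whether host libraries are also mounted depends on the JetPack/NVIDIA Container Toolkit version and its CSV mount configuration.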

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.