How can I install a specific version of TensorRT in an L4T container?

I use JetPack 4.6.3 with TensorRT 8.2.1. I have a container in which I installed PyTorch, but I cannot install the TensorRT 8.2.1 Python API using pip. How can I do it? Thanks.

Hi,

We are moving this post to the Jetson-related forum to get better help.

Thank you.

Hi,

Please try installing our prebuilt PyTorch, which works on Jetson.
You will need to install the package that matches your JetPack version.

https://docs.nvidia.com/deeplearning/frameworks/install-pytorch-jetson-platform/index.html

Thanks.

Thanks. I was able to install PyTorch directly on the Jetson (not in a container). Is there an easy way to install the TensorRT Python API in a Docker container (in the case where the container does not include the TensorRT Python API from the start)?

I ask because I want to run the same DeepStream and TensorRT versions on my PC as on the Jetson. For example, JetPack 4.6.3 on Jetson ships with TensorRT 8.2.1.9 and DeepStream 6.0, but I could not find an x86 Docker container with DeepStream 6.0 and TensorRT 8.2.1.9.
When I look at the layers of the DeepStream container on NVIDIA NGC, it is difficult to determine the TensorRT version inside, because the layers are not fully displayed.

Hi,

Do you need the exact same TensorRT/DeepStream versions between x86 and Jetson?
You can install the TensorRT Python package with the following command:

$ sudo apt install python3-libnvinfer*
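Once installed, you can confirm that the version in the container matches the Jetson. As a minimal sketch, here is a prefix-style version comparison (the helper name is illustrative, not part of TensorRT):

```python
def version_matches(installed: str, expected: str) -> bool:
    """Return True when `expected` is a dotted-version prefix of
    `installed`, e.g. expected "8.2.1" matches installed "8.2.1.9"."""
    inst = installed.split(".")
    exp = expected.split(".")
    return inst[:len(exp)] == exp

# The thread targets TensorRT 8.2.1.9 (JetPack 4.6.3):
print(version_matches("8.2.1.9", "8.2.1"))   # True
print(version_matches("8.4.0.11", "8.2.1"))  # False
```

On a device where the package is installed, `import tensorrt; print(tensorrt.__version__)` gives the installed string to compare against.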

Thanks.

Thanks. That covers Python, but I also want the TensorRT runtime that DeepStream uses. Is there any way to check which TensorRT version DeepStream is running?

Hi,

You can run the trtexec binary inside the container.
The binary will display the TensorRT version.
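If you want to script this check, the version can be parsed out of the banner line trtexec prints at startup. A hedged sketch; it assumes the tag format `[TensorRT vXXXX]` with one digit per version component (major.minor.patch.build), which holds for TensorRT 8.x builds such as v8219 = 8.2.1.9:

```python
import re

def trt_version_from_banner(line: str) -> str:
    """Parse the version tag from a trtexec banner line, e.g.
    "[TensorRT v8219]" -> "8.2.1.9". Assumes one digit per
    version component, as in TensorRT 8.x builds."""
    m = re.search(r"\[TensorRT v(\d+)\]", line)
    if not m:
        raise ValueError("no TensorRT version tag in line")
    return ".".join(m.group(1))

# Example banner as printed by trtexec at startup:
banner = "&&&& RUNNING TensorRT.trtexec [TensorRT v8219] # trtexec --help"
print(trt_version_from_banner(banner))  # 8.2.1.9
```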

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.