Criminals (TensorRT 7 container) wanted

I am looking for a Docker container that:

  • runs on the Orin devkit
  • contains libnvinfer-dev (TensorRT) 7.x
  • contains libtorch

I have tried nvcr.io/nvidia/l4t-pytorch:r35.1.0-pth1.13-py3 and nvcr.io/nvidia/l4t-pytorch:r34.1.1-pth1.11-py3, but both ship libnvinfer-dev (TensorRT) 8.x.
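For reference, one way to check which TensorRT version a given L4T container ships is to query dpkg inside it (the container tags are the ones mentioned above; the version string below is an illustrative example, not taken from this thread):

```shell
# Query the libnvinfer-dev version inside an L4T container (run on the Orin):
#   docker run --rm nvcr.io/nvidia/l4t-pytorch:r35.1.0-pth1.13-py3 \
#     dpkg-query -W -f='${Version}\n' libnvinfer-dev
#
# Extracting the major version from a typical dpkg version string:
version="8.4.1-1+cuda11.4"   # example output from the query above
major="${version%%.*}"       # strip everything after the first dot
echo "TensorRT major version: $major"
```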

Thank you very much!!

Hi,

Sorry, Orin support starts from JetPack 5.x, which includes TensorRT 8.x.
Could you share the reason why you need the older TensorRT version?

Thanks.

Hi, Thank you for the quick answer.

We need TensorRT 7 because the software framework we build on only supports TensorRT 7. The APIs of TensorRT 7 and 8 seem to differ significantly, though I don't know exactly how much.

So I thought the easier approach for us would be to downgrade TensorRT from 8 to 7 so that our software compiles without changes.

I have a new question: can I install TensorRT 7 on Ubuntu 18.04 on the Orin?
What about CUDA? Can I install CUDA 11.2 on that same Ubuntu 18.04 on the Orin?
In that case, do I still use JetPack, or must I not use it?
Is JetPack mandatory for Jetson devices?

Or is our only option to rewrite our code for TensorRT 8 to meet Orin's requirements?

Thanks

Hi,

The root cause is that there are dependencies between the GPU driver and the CUDA libraries.
On Jetson, the GPU driver is integrated into L4T.

Since Orin support starts from the r35 branch (Ubuntu 20.04), you will need to use CUDA 11.4 and TensorRT 8.x.
Thanks.
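Because the L4T release pins the CUDA/TensorRT versions, it can help to check which release a device is flashed with. On a Jetson this is recorded in /etc/nv_tegra_release; the sketch below parses the major release number from an example first line (the exact format can vary across releases):

```shell
# On a Jetson, read the flashed L4T release (which fixes the CUDA/TensorRT
# versions) with:
#   cat /etc/nv_tegra_release
# A typical first line looks like the example below:
release_line='# R35 (release), REVISION: 1.0'
major="$(printf '%s\n' "$release_line" | sed -n 's/^# R\([0-9]*\).*/\1/p')"
echo "L4T major release: R$major"
```

Anything reporting R35 or later implies TensorRT 8.x, per the reply above.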


It turned out that the criminal was innocent, because it never existed at all…
Thanks!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.