Run TLT model on CUDA 11 and TensorRT 7.2

I created a TensorRT engine of the PeopleNet model using the tlt-converter command within this container (CUDA 10 and TensorRT 7.0).

Is there any way I can load the engine in Python running on this container (CUDA 11 and TensorRT 7.2)? My Python code works fine on the container that runs CUDA 10 and TensorRT 7.0; however, it fails on the CUDA 11 / TensorRT 7.2 container.
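For context, deserializing the engine in Python looks roughly like this (a minimal sketch assuming the standard `tensorrt` bindings on a GPU machine; the file path is illustrative). Note that serialized TensorRT engines are tied to the TensorRT version (and GPU) they were built with, so an engine built under TensorRT 7.0 is not expected to load under 7.2:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(engine_path):
    """Deserialize a .engine file produced by tlt-converter."""
    runtime = trt.Runtime(TRT_LOGGER)
    with open(engine_path, "rb") as f:
        # Returns None (or raises) if the engine was serialized with
        # an incompatible TensorRT version.
        return runtime.deserialize_cuda_engine(f.read())

engine = load_engine("peoplenet.engine")  # illustrative path
```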

My ultimate goal is to run the TLT models inside a container, but I don’t know whether there is a way to export the TLT models using the same CUDA and TensorRT versions as the TensorRT container.

Hi @mfoglio,
I believe the TLT team will be able to assist better here.
Hence, kindly raise this concern in the respective forum.