Is there a TLT docker compatible with the Jetson Nano 2GB?

I’m building a Faster R-CNN model in a TLT docker, then generating an engine file with tlt-converter on my computer. I transfer that engine to my Jetson Nano 2GB, but I get error messages when I try to run DeepStream with that engine.

After some research, the error seems to indicate that the problem comes from a mismatch in TensorRT and CUDA (and cuDNN) versions between the TLT docker (TensorRT 7.2, CUDA 11.1) and the Jetson Nano (TensorRT 7.1, CUDA 10.2).

Is there a docker available with the correct TensorRT/CUDA versions to be compatible with the Jetson Nano, or should I make a custom one?

If you want to run inference on your Nano, please copy the .etlt file onto the Nano, then use tlt-converter (the Jetson version) to generate the TRT engine on the Nano itself. TensorRT engines are not portable across TensorRT/CUDA versions or GPU architectures, so the engine must be built on the device where it will run.

For the tlt-converter build matching your Nano (TensorRT 7.1, CUDA 10.2), see Overview — Transfer Learning Toolkit 3.0 documentation
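As a rough sketch, running the Jetson build of tlt-converter on the Nano looks like the following. The key, input dimensions, output node name, and file names below are placeholders you must adjust for your own model (FasterRCNN models exported from TLT commonly use an `NMS` output node, but check your export log):

```shell
# Run on the Jetson Nano itself, using the Jetson build of tlt-converter.
# Placeholders (assumptions -- adjust for your model):
#   $KEY           -- the encryption key used when exporting the .etlt from TLT
#   3,384,1248     -- input dims (C,H,W) matching your training spec
#   NMS            -- FasterRCNN output node name (verify against your export log)
#   faster_rcnn.etlt / frcnn_fp16.engine -- example file names
./tlt-converter \
  -k "$KEY" \
  -d 3,384,1248 \
  -o NMS \
  -t fp16 \
  -m 1 \
  -e frcnn_fp16.engine \
  faster_rcnn.etlt
```

FP16 and a max batch size of 1 are chosen here to keep memory use low on the 2GB Nano. Point your DeepStream config at the resulting engine file.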