Using Docker to generate the TensorRT build container

Jetson Nano 4GB Developer Kit

JetPack 4.5.1 / L4T 32.5.1
TensorRT Version:
CUDA Version: 10.2.89
cuDNN Version:
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6.9

I'm preparing the Jetson for deepstream5.1/sources/objectDetector_Yolo.

I followed the instructions to download the YOLO config and weights files:
$ ./

Next, at GitHub - NVIDIA/TensorRT at v7.0.0,

I am executing the following step:

and I get the following error:

What do I do?


Are you using the yolo-tlt sample (for a TLT model)?
If not, you don't need to compile the OSS plugin.


Yes, I plan to use the yolo-tlt sample. Could you please tell me how to fix this issue?

Hi @rajkumarsaswamy

This error says that /home/rajkumar/docker/ubuntu.Dockerfile cannot be found.

If you want to run that command, you need to place your ubuntu.Dockerfile in /home/rajkumar/docker/.
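As a sketch (the paths here are assumptions taken from the error message), you can recreate the layout the build expects and verify the file is in place before rerunning; the snippet uses a temp directory purely for illustration:

```shell
# Hypothetical layout: the build looks for docker/ubuntu.Dockerfile
# relative to the directory it is invoked from.
workdir=$(mktemp -d)
mkdir -p "$workdir/docker"
# In practice you would copy the file from your TensorRT OSS checkout, e.g.:
#   cp TensorRT/docker/ubuntu.Dockerfile "$workdir/docker/"
printf 'FROM ubuntu:18.04\n' > "$workdir/docker/ubuntu.Dockerfile"  # placeholder content
ls -l "$workdir/docker/ubuntu.Dockerfile"
```

Once `ls` shows the Dockerfile at the expected path, the original build command should get past this error.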


Hi @rajkumarsaswamy

Are you running this container on a Jetson Nano?

ubuntu.Dockerfile builds its container from

FROM nvidia/cuda:10.2-cudnn7-devel-ubuntu18.04

The nvidia/cuda container is for x86 only. For Jetson, you should base your container on l4t-base, which already provides CUDA/cuDNN/TensorRT when you run it with --runtime nvidia. If you need CUDA while building the container, set Docker's default runtime to nvidia and reboot:
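A minimal sketch of that change, assuming the stock JetPack Docker setup: on the device this content belongs in /etc/docker/daemon.json (edited with sudo, followed by a Docker restart or reboot); the snippet writes the file locally only to show the expected contents.

```shell
# Illustrative only: on the Jetson, write this to /etc/docker/daemon.json
# (sudo required), then restart Docker or reboot for it to take effect.
cat > daemon.json <<'EOF'
{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}
EOF
```

With `"default-runtime": "nvidia"`, the `RUN` steps of `docker build` also execute under the NVIDIA runtime, so the host's CUDA/cuDNN/TensorRT libraries are visible at build time.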
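To show what basing the container on l4t-base looks like, here is a hedged sketch; the `r32.5.0` tag and the package list are assumptions (match the tag to your L4T release — L4T 32.5.1 devices can use the r32.5.0 image):

```shell
# Sketch: a Jetson-side Dockerfile based on l4t-base instead of nvidia/cuda.
cat > jetson.Dockerfile <<'EOF'
FROM nvcr.io/nvidia/l4t-base:r32.5.0
# CUDA/cuDNN/TensorRT are mounted in from the host by the nvidia runtime,
# so they are only present when Docker's default runtime is nvidia.
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential cmake git \
    && rm -rf /var/lib/apt/lists/*
EOF
```

You would then build it on the device with something like `docker build -f jetson.Dockerfile -t my-trt-build .` after the default-runtime change above all else takes effect.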