NVIDIA L4T TensorRT image for Jetson Orin Nano

Hi NVIDIA Developer

Currently, I use a virtual environment on my Jetson Orin Nano 8 GB to run many computer vision models. Here is my setup:

  • Jetson Orin Nano Dev 8 GB
  • Jetpack: 5.1.2 (Installed by NVIDIA SDK Manager Method)
  • TensorRT: 8.5.2.2
  • CUDA: 11.4.315

Other information is attached in this image:

Now I would like to move from the virtual environment to Docker image and container deployments. My questions are:

  1. Can I use the already-installed Docker, or do I need a fresh Docker install on my device? I ask because when I run the command docker --version in the terminal, it shows Docker version 24.0.5, build 24.0.5-0ubuntu1~20.04.1.

Reference: How To Install and Use Docker on Ubuntu 20.04 | DigitalOcean

  2. After finishing the Docker installation, which NVIDIA L4T TensorRT image should I pull, l4t-tensorrt:r8.5.2.2-devel or l4t-tensorrt:r8.5.2-runtime, to match both my TensorRT and CUDA versions?

Reference: NVIDIA L4T TensorRT | NVIDIA NGC

Thanks

Hi,

The installed Docker should work.
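In case it helps, a quick sanity check might look like this (assuming the stock Ubuntu Docker package and a standard JetPack install; hello-world is just a generic smoke-test image):

```shell
# Check the Docker version installed by the Ubuntu package
docker --version

# Confirm the "nvidia" runtime is registered with Docker
# (JetPack's container packages normally set this up)
docker info --format '{{json .Runtimes}}'

# Generic smoke test: run a minimal container
sudo docker run --rm hello-world
```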

l4t-tensorrt:r8.5.2-runtime is for runtime only, which means your application is already compiled and only needs to be executed in the environment.
l4t-tensorrt:r8.5.2.2-devel contains the headers and development libraries needed for building, but the container size is larger.
Please choose one based on your use case.
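As a sketch, pulling and starting either image from NGC could look like this (the tags come from the question above; the --runtime nvidia flag assumes the NVIDIA runtime set up by JetPack):

```shell
# Runtime image: enough to execute an already-built application
sudo docker pull nvcr.io/nvidia/l4t-tensorrt:r8.5.2-runtime

# Devel image: adds headers and development libraries (larger)
sudo docker pull nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-devel

# Start a container with GPU access through the NVIDIA runtime
sudo docker run -it --rm --runtime nvidia \
    nvcr.io/nvidia/l4t-tensorrt:r8.5.2-runtime
```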

Thanks.

Thanks for your answers.

I have one more question about the NVIDIA Container Toolkit (NCT). After flashing JetPack 5.1.2 to my device, do I need to install NCT myself, or can I use the preinstalled NCT? And which version of NCT should I install?

Reference: Installing the NVIDIA Container Toolkit — NVIDIA Container Toolkit 1.14.4 documentation

Hi,

The default nvidia-container-toolkit should work.
Below is our package which is installed with JetPack 5.1.2:

$ sudo apt show nvidia-container-toolkit
Package: nvidia-container-toolkit
Version: 1.11.0~rc.1-1
Priority: optional
Section: utils
Maintainer: NVIDIA CORPORATION <cudatools@nvidia.com>
Installed-Size: 9,586 kB
Depends: libnvidia-container-tools (>= 1.10.0-1), libnvidia-container-tools (<< 2.0.0), libseccomp2
Breaks: nvidia-container-runtime (<= 3.5.0-1), nvidia-container-runtime-hook
Replaces: nvidia-container-runtime (<= 3.5.0-1), nvidia-container-runtime-hook
Homepage: https://github.com/NVIDIA/nvidia-container-toolkit
Download-Size: 1,827 kB
APT-Manual-Installed: no
APT-Sources: https://repo.download.nvidia.com/jetson/common r35.4/main arm64 Packages
Description: NVIDIA Container toolkit
 Provides tools and utilities to enable GPU support in containers.
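To double-check that this preinstalled toolkit is wired into Docker, one possible check (assuming the default JetPack configuration, which registers the nvidia runtime in /etc/docker/daemon.json) is:

```shell
# JetPack normally registers the "nvidia" runtime in Docker's daemon config
grep -A 3 '"nvidia"' /etc/docker/daemon.json

# The runtime should also show up in docker info
sudo docker info 2>/dev/null | grep -i nvidia
```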

Thanks.

Thanks for your answers.

I have one more question about nvcc. Why can't I find nvcc in l4t-tensorrt:r8.5.2-runtime?

Hi,

You will need to use the dev container for the compiler and headers.
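For example, a quick way to verify nvcc inside the devel image might be the following (the /usr/local/cuda path is the usual CUDA location in L4T containers, but treat it as an assumption):

```shell
# nvcc ships with the devel image, not the runtime image
sudo docker run --rm --runtime nvidia \
    nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-devel \
    /usr/local/cuda/bin/nvcc --version
```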

Thanks.

May I ask you something ?

Can I use a Windows 10 host machine (no NVIDIA GPU) to build a Docker image based on l4t-tensorrt:r8.5.2.2-devel and bring the built image to run on the Jetson Orin Nano?

Hi,

We don’t officially support l4t-based images on a desktop environment, especially Windows.
Moreover, since TensorRT optimizes based on the hardware resources, the engine file needs to be generated on the target device directly.
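As an illustration, building the engine directly on the Jetson could be sketched like this (model.onnx is a placeholder file name, and the trtexec path is the usual location in TensorRT containers, so treat both as assumptions):

```shell
# Build the TensorRT engine on the Orin Nano itself so it is
# optimized for this device's hardware
sudo docker run --rm --runtime nvidia \
    -v "$PWD":/workspace -w /workspace \
    nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-devel \
    /usr/src/tensorrt/bin/trtexec --onnx=model.onnx --saveEngine=model.engine
```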

Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.