Host files for container - nvidia-container-runtime

Hi,

I’m trying to find the minimal set of files that need to be mapped into the container from the host operating system so that all of the NVIDIA base images run successfully with the nvidia-container-toolkit and Docker. This is with JetPack 6 (L4T r36.3).

In the flashing setup I see there is a file named:
Linux_for_Tegra/rootfs/etc/nvidia-container-runtime/host-files-for-container.d/drivers.csv which contains a list of files, directories and symlinks.
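
For reference, each entry in that csv is a type, path pair. A few illustrative lines (the paths below are reused from the missing-file list further down and are only examples, not the actual contents of the file):

lib, /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvarguscamerasrc.so
sym, /usr/lib/aarch64-linux-gnu/nvidia/libnvdla_compiler.so
dir, /usr/lib/aarch64-linux-gnu/gstreamer-1.0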

Most of the files listed in drivers.csv do exist in the Ubuntu rootfs; however, some are missing (a sketch of the check is included after the list). The list of missing files is:

usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvarguscamerasrc.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvcompositor.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvdrmvideosink.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnveglglessink.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnveglstreamsrc.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvegltransform.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvipcpipeline.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvivafilter.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvjpeg.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvtee.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvv4l2camerasrc.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvidconv.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvideo4linux2.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvideosink.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvideosinks.so - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/nvgstcapture-1.0_README.txt - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/nvgstipctestapp-1.0_README.txt - file not found
usr/lib/aarch64-linux-gnu/gstreamer-1.0/nvgstplayer-1.0_README.txt - file not found
usr/lib/aarch64-linux-gnu/libgstnvegl-1.0.so.0 - file not found
usr/lib/aarch64-linux-gnu/libgstnvexifmeta.so - file not found
usr/lib/aarch64-linux-gnu/libgstnvivameta.so - file not found
usr/lib/aarch64-linux-gnu/libnvsample_cudaprocess.so - file not found
usr/lib/aarch64-linux-gnu/nvidia/libgstnvcustomhelper.so.1.0.0 - file not found
usr/lib/aarch64-linux-gnu/nvidia/libgstnvdsseimeta.so.1.0.0 - file not found
usr/lib/aarch64-linux-gnu/nvidia/libnveglstreamproducer.so - file not found
usr/lib/aarch64-linux-gnu/nvidia/libgstnvcustomhelper.so - file not found
usr/lib/aarch64-linux-gnu/nvidia/libgstnvdsseimeta.so - file not found
usr/lib/aarch64-linux-gnu/nvidia/libnvdla_compiler.so - file not found
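
For anyone who wants to reproduce the check, a rough sketch run against the unpacked BSP directory (the awk field splitting and the rootfs location are assumptions about the csv layout; adjust as needed):

cd Linux_for_Tegra
csv=rootfs/etc/nvidia-container-runtime/host-files-for-container.d/drivers.csv
awk -F', *' '$1 == "lib" || $1 == "sym" || $1 == "dir" {print $2}' "$csv" |
while read -r p; do
  # report entries for which nothing exists in the rootfs (a dangling symlink still counts as present)
  [ -e "rootfs$p" ] || [ -L "rootfs$p" ] || echo "${p#/} - file not found"
done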

My questions are:

  1. What exact packages do I need to install in the rootfs, after flashing the device, to bring in the missing files above?
  2. Do the entries in drivers.csv include all inter-library dependencies? In other words, are there any libraries that are not listed in the csv but are required on the host at runtime by the ones that are listed, and which would therefore be missing inside the container because they are never bind-mounted?
  3. I’m trying to run the following image as per the Specialized Configurations with Docker page of the NVIDIA Container Toolkit 1.16.0 documentation:
sudo docker run --rm --gpus all nvidia/cuda nvidia-smi
Unable to find image 'nvidia/cuda:latest' locally
docker: Error response from daemon: manifest for nvidia/cuda:latest not found: manifest unknown: manifest unknown.
See 'docker run --help'.

CUDA | NVIDIA NGC mentions that the latest tag has been deprecated, so I tried:

$ sudo docker run --rm --gpus all nvidia/cuda:12.6.0-base-ubuntu22.04 nvidia-smi

but am getting:

docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: invoking the NVIDIA Container Runtime Hook directly (e.g. specifying the docker --gpus flag) is not supported. Please use the NVIDIA Container Runtime (e.g. specify the --runtime=nvidia flag) instead.: unknown.

Note that --gpus all does work if I use the ubuntu base image together with --runtime=nvidia:

$ sudo docker run --rm --runtime=nvidia --gpus all ubuntu nvidia-smi
Mon Aug 26 14:14:36 2024       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 540.3.0                Driver Version: N/A          CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|

Thanks

Hi,

1. You can find these files at the link below:
https://repo.download.nvidia.com/jetson/#Jetpack%206.0

2. For JetPack 6.0, we suppose yes.

3. On Jetson, the GPU should be enabled with --runtime=nvidia instead of --gpus all, as in the example below.
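
For example, reusing the image tag from the post above (any other CUDA image tag works the same way):

sudo docker run --rm --runtime=nvidia nvidia/cuda:12.6.0-base-ubuntu22.04 nvidia-smi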

Thanks.

Thank you,

I found the files in the repo you shared. In case it’s useful for someone else, they are provided by the nvidia-l4t-gstreamer and nvidia-l4t-dla-compiler packages.
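
A sketch of pulling them into the rootfs after flashing, assuming the stock NVIDIA Jetson apt sources (repo.download.nvidia.com/jetson) for r36.3 are already configured on the device:

sudo apt update
sudo apt install nvidia-l4t-gstreamer nvidia-l4t-dla-compiler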
