Testing CUDA access using container toolkit does not work

We are trying to run accelerated ML code inside a Docker container on a Jetson Orin Nano 8GB Dev Kit.
We followed the installation tutorial at https://docs.nvidia.com/datacenter/cloud-native/container-toolkit but were not able to run the sample workload described there.

The nvidia-smi command is available neither inside the ubuntu container that gets pulled nor on the JetPack OS. I tried calling tegrastats instead, which works on the host but unfortunately not inside the container.

Can somebody advise us on which container we have to pull to get CUDA access working?

Thank you very much.

Hi

Which JetPack version do you use?

You will need to upgrade to JetPack 6 to get nvidia-smi.
Running with --runtime nvidia is also required for GPU access inside the container.
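As a side note: if `docker run` complains that the nvidia runtime is unknown, it usually still has to be registered with Docker. On a typical install this is done with `sudo nvidia-ctk runtime configure --runtime=docker` followed by a restart of the Docker daemon, which adds an entry like the following to /etc/docker/daemon.json (a sketch of the expected result, not a file to copy verbatim):

```json
{
  "runtimes": {
    "nvidia": {
      "path": "nvidia-container-runtime",
      "runtimeArgs": []
    }
  }
}
```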

Thanks

We are based on JetPack 6, so this shouldn’t be an issue.

The call I used is directly from your tutorial:

docker run --rm --runtime=nvidia --gpus all ubuntu nvidia-smi

But why does the JetPack version matter for that? From my understanding, nvidia-smi should be provided inside the Docker guest and should not depend on executables from the host?

Hi,

The previous JetPack version doesn’t support nvidia-smi.
Are you able to get the nvidia-smi info after using JetPack 6?

Thanks.

As already said, our system is based on JetPack 6 but is somehow missing nvidia-smi. I have to check with our Linux developers why it is not available. Either way, it’s not clear to me why I need to have nvidia-smi on my host system in order to access it inside my Docker container. Can you please elaborate on why this is needed?

Or do we need to run a special version of an Ubuntu container to get nvidia-smi?
As I have seen, there is a container simply called “ubuntu” in your NVIDIA NGC catalog. Could it be that your official JetPack Docker resources link to this instead of the official Ubuntu image on Docker Hub?

Hi,

nvidia-smi requires some hardware-related functions.
So you will need the corresponding driver on the host to allow nvidia-smi to access that information.

Our NGC contains containers for different platforms.
For Jetson, please check for images whose names or tags contain the l4t or iGPU keyword.

For example: l4t-cuda is the L4T base image with CUDA preinstalled.
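A minimal sketch of pulling and testing the l4t-cuda image on the Orin follows. The tag used here is only an illustration; the exact tag must match your JetPack/L4T release, so please verify it on the l4t-cuda page in NGC before running:

```shell
# Example tag only -- pick the tag that matches your JetPack/L4T version on NGC.
IMAGE=nvcr.io/nvidia/l4t-cuda:12.2.12-runtime

docker pull "$IMAGE"

# The nvidia runtime mounts the host driver into the container;
# listing the CUDA libraries is a quick check that CUDA is visible inside.
docker run --rm --runtime=nvidia "$IMAGE" ls /usr/local/cuda/lib64
```

This requires the Jetson hardware and the container toolkit configured as above, so it cannot be run elsewhere; if the `ls` succeeds, CUDA-accelerated workloads inside the container should work as well.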

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.