Nvml errors when attempting to use docker with Nvidia Container Toolkit

I am attempting to run some AI programs on my new desktop computer using docker. The computer is running Ubuntu Server V24.04.

I have installed all the necessary nvidia drivers, and I have also installed the nvidia-container-toolkit as described at:

https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html#installing-with-apt
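
For reference, the apt-based steps in that guide boil down to roughly the following (the repository-setup lines below are from memory, so treat them as a sketch and check the linked page for the current ones):

 # Add NVIDIA's apt repository and signing key
 curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | \
   sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
 curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
   sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
   sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

 # Install the toolkit, register the NVIDIA runtime with Docker, and restart the daemon
 sudo apt-get update
 sudo apt-get install -y nvidia-container-toolkit
 sudo nvidia-ctk runtime configure --runtime=docker
 sudo systemctl restart docker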

Unfortunately, I am having difficulty starting up my containers. They keep exiting with the following error:

 nvidia-container-cli: initialization error: nvml error: driver not loaded: unknown
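
Since the message says the driver is not loaded, these are the host-side checks (run outside of any container) that should show whether the kernel module is actually present:

 # Is the nvidia kernel module loaded?
 lsmod | grep nvidia

 # Does the driver respond on the host itself?
 nvidia-smi

 # If the module is missing, try loading it and look for errors
 sudo modprobe nvidia
 sudo dmesg | grep -i nvidia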

I have looked around the Internet and found that a number of people have run into this failure. Unfortunately, just about everyone who had the problem solved it by installing certain nvidia drivers (which I have already done) and installing the nvidia container toolkit (which I have also done). All of those solutions were for earlier versions of Ubuntu; none of them worked for V24.04.

Has anyone else seen this problem? If so, has anyone found a solution to it?

I am having the same problem. I installed Docker Engine on my WSL install of Ubuntu 24.04, then the Nvidia drivers, then the Nvidia Container Toolkit, following the instructions at CUDA Toolkit 12.6 Update 2 Downloads | NVIDIA Developer.
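
To rule out an install problem, these are the sanity checks that confirm the driver is visible inside WSL and that the NVIDIA runtime is registered with Docker (the "nvidia" runtime entry is written by the nvidia-ctk configure step):

 # Driver and toolkit visible inside WSL
 nvidia-smi
 nvidia-ctk --version

 # The configure step should have added an "nvidia" runtime to Docker
 docker info | grep -i runtimes
 cat /etc/docker/daemon.json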

Attempting to run a container using docker compose yields the “nvidia-container-cli: error parsing IMEX info: unsupported IMEX channel value: all: unknown” message. However, running containers with commands like “docker run -it --rm --gpus all ubuntu nvidia-smi” does return the expected nvidia-smi output.
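
For reference, a minimal compose file making the equivalent GPU request with the standard device-reservation syntax looks like this (the service name and image here are just placeholders, not my real project):

 # compose.yaml -- placeholder service requesting all GPUs
 services:
   test:
     image: ubuntu
     command: nvidia-smi
     deploy:
       resources:
         reservations:
           devices:
             - driver: nvidia
               count: all
               capabilities: [gpu]

It is started with “docker compose up”.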