After running apt upgrade on an Orin NX, the command jetson-containers run --name ollama $(autotag ollama) fails to start the container.

apt list --upgradable
Listing... Done
cuda-cccl-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.37-1]
cuda-crt-12-6/unknown 12.6.85-1 arm64 [upgradable from: 12.6.68-1]
cuda-cudart-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-cudart-dev-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-cuobjdump-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-cupti-12-6/unknown 12.6.80-1 arm64 [upgradable from: 12.6.68-1]
cuda-cupti-dev-12-6/unknown 12.6.80-1 arm64 [upgradable from: 12.6.68-1]
cuda-cuxxfilt-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-documentation-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-driver-dev-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-gdb-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvcc-12-6/unknown 12.6.85-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvdisasm-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvml-dev-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvprune-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvrtc-12-6/unknown 12.6.85-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvrtc-dev-12-6/unknown 12.6.85-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvtx-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-nvvm-12-6/unknown 12.6.85-1 arm64 [upgradable from: 12.6.68-1]
cuda-profiler-api-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-sanitizer-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
cuda-toolkit-12-6-config-common/unknown 12.6.77-1 all [upgradable from: 12.6.68-1]
cuda-toolkit-12-config-common/unknown 12.6.77-1 all [upgradable from: 12.6.68-1]
cuda-toolkit-config-common/unknown 12.6.77-1 all [upgradable from: 12.6.68-1]
libcublas-12-6/unknown 12.6.4.1-1 arm64 [upgradable from: 12.6.1.4-1]
libcublas-dev-12-6/unknown 12.6.4.1-1 arm64 [upgradable from: 12.6.1.4-1]
libcudla-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
libcudla-dev-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
libcudnn9-cuda-12/unknown 9.6.0.74-1 arm64 [upgradable from: 9.3.0.75-1]
libcudnn9-dev-cuda-12/unknown 9.6.0.74-1 arm64 [upgradable from: 9.3.0.75-1]
libcudnn9-samples/unknown 9.6.0.74-1 all [upgradable from: 9.3.0.75-1]
libcufft-12-6/unknown 11.3.0.4-1 arm64 [upgradable from: 11.2.6.59-1]
libcufft-dev-12-6/unknown 11.3.0.4-1 arm64 [upgradable from: 11.2.6.59-1]
libcurand-12-6/unknown 10.3.7.77-1 arm64 [upgradable from: 10.3.7.68-1]
libcurand-dev-12-6/unknown 10.3.7.77-1 arm64 [upgradable from: 10.3.7.68-1]
libcusolver-12-6/unknown 11.7.1.2-1 arm64 [upgradable from: 11.6.4.69-1]
libcusolver-dev-12-6/unknown 11.7.1.2-1 arm64 [upgradable from: 11.6.4.69-1]
libcusparse-12-6/unknown 12.5.4.2-1 arm64 [upgradable from: 12.5.3.3-1]
libcusparse-dev-12-6/unknown 12.5.4.2-1 arm64 [upgradable from: 12.5.3.3-1]
libnvfatbin-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
libnvfatbin-dev-12-6/unknown 12.6.77-1 arm64 [upgradable from: 12.6.68-1]
libnvidia-container-tools/unknown 1.17.3-1 arm64 [upgradable from: 1.14.2-1]
libnvidia-container1/unknown 1.17.3-1 arm64 [upgradable from: 1.14.2-1]
libnvinfer-bin/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-dispatch-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-dispatch10/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-headers-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-headers-plugin-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-lean-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-lean10/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-plugin-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-plugin10/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-samples/unknown 10.7.0.23-1+cuda12.6 all [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-vc-plugin-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer-vc-plugin10/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvinfer10/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvjitlink-12-6/unknown 12.6.85-1 arm64 [upgradable from: 12.6.68-1]
libnvjitlink-dev-12-6/unknown 12.6.85-1 arm64 [upgradable from: 12.6.68-1]
libnvonnxparsers-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
libnvonnxparsers10/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
nvidia-container-toolkit-base/unknown 1.17.3-1 arm64 [upgradable from: 1.14.2-1]
nvidia-container-toolkit/unknown 1.17.3-1 arm64 [upgradable from: 1.14.2-1]
python3-libnvinfer-dev/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
python3-libnvinfer-dispatch/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
python3-libnvinfer-lean/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
python3-libnvinfer/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
tensorrt-libs/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]
tensorrt/unknown 10.7.0.23-1+cuda12.6 arm64 [upgradable from: 10.3.0.30-1+cuda12.5]

jetson-containers run --name ollama $(autotag ollama)
Namespace(packages=['ollama'], prefer=['local', 'registry', 'build'], disable=[''], user='dustynv', output='/tmp/autotag', quiet=False, verbose=False)
-- L4T_VERSION=36.4.0 JETPACK_VERSION=6.1 CUDA_VERSION=12.6
-- Finding compatible container image for ['ollama']
dustynv/ollama:0.5.1-r36.4.0
V4L2_DEVICES: --device /dev/video0 --device /dev/video1 --device /dev/video2

  • docker run --runtime nvidia -it --rm --network host --shm-size=8g --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /home/suo/projects/test/jetson-containers/data:/data -v /etc/localtime:/etc/localtime:ro -v /etc/timezone:/etc/timezone:ro --device /dev/snd -e PULSE_SERVER=unix:/run/user/1000/pulse/native -v /run/user/1000/pulse:/run/user/1000/pulse --device /dev/bus/usb --device /dev/video0 --device /dev/video1 --device /dev/video2 --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-7 --device /dev/i2c-9 -v /run/jtop.sock:/run/jtop.sock --name ollama dustynv/ollama:0.5.1-r36.4.0
    docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running createContainer hook #1: exit status 1, stdout: , stderr: time="2024-12-26T23:06:01-08:00" level=info msg="Symlinking /var/lib/docker/overlay2/31978570add95b7f7d7d8565bbfc77f705a301737c4d835a2025f24934e01c88/merged/etc/vulkan/icd.d/nvidia_icd.json to /usr/lib/aarch64-linux-gnu/nvidia/nvidia_icd.json"
    time="2024-12-26T23:06:01-08:00" level=error msg="failed to create link [/usr/lib/aarch64-linux-gnu/nvidia/nvidia_icd.json /etc/vulkan/icd.d/nvidia_icd.json]: failed to create symlink: failed to remove existing file: remove /var/lib/docker/overlay2/31978570add95b7f7d7d8565bbfc77f705a301737c4d835a2025f24934e01c88/merged/etc/vulkan/icd.d/nvidia_icd.json: device or resource busy": unknown.
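For context: the failing createContainer hook is the container toolkit's CSV-mode hook, which on Jetson reads mount specs from /etc/nvidia-container-runtime/host-files-for-container.d/; a sym entry there asks it to create the nvidia_icd.json symlink that collides with a file already present in the image. The sketch below illustrates the CSV format and how to filter out that entry — the sample lines are illustrative, not copied from a device; on the Jetson you would grep the real *.csv files in that directory instead of the sample file:

```shell
# Illustrative sample of the Jetson CSV mount-spec format (assumption:
# the entries on your device will differ) -- each line tells the runtime
# to expose a device, library, or symlink inside the container.
printf '%s\n' \
  'dev, /dev/nvhost-ctrl' \
  'lib, /usr/lib/aarch64-linux-gnu/tegra/libnvdc.so' \
  'sym, /usr/lib/aarch64-linux-gnu/nvidia/nvidia_icd.json' \
  > l4t-sample.csv

# Show the specs with the entry for the failing symlink dropped:
grep -v 'nvidia_icd.json' l4t-sample.csv
```

Since the apt list above shows the container ran fine before the toolkit moved from 1.14.2-1 to 1.17.3-1, another pragmatic (untested here) workaround is rolling the four runtime packages back and holding them: sudo apt-get install --allow-downgrades nvidia-container-toolkit=1.14.2-1 nvidia-container-toolkit-base=1.14.2-1 libnvidia-container-tools=1.14.2-1 libnvidia-container1=1.14.2-1, then sudo apt-mark hold on the same packages; check apt-cache madison nvidia-container-toolkit for the versions your repo still carries.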

There has been no update from you for a while, so we assume this is no longer an issue.
Hence, we are closing this topic. If you need further support, please open a new one.
Thanks

Hi,
Could you set up a cleanly flashed Orin NX developer kit (JetPack 6.1) and try the same command:

$ jetson-containers run --name ollama $(autotag ollama)

We would like to know whether the command works on a clean system.
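After re-flashing, it is worth confirming that the device reports the L4T version that autotag matched against (36.4.0 in the log above); jetson-containers derives L4T_VERSION from /etc/nv_tegra_release. The sed sketch below parses a sample release line (assumed JetPack 6.x format); on the device, pipe cat /etc/nv_tegra_release into the same sed instead of the echo:

```shell
# Parse the L4T version (major.revision, e.g. 36.4.0) from an
# /etc/nv_tegra_release-style line. The sample line is illustrative;
# on the device use:  cat /etc/nv_tegra_release | sed -n '...'
echo '# R36 (release), REVISION: 4.0, GCID: ...' |
  sed -n 's/^# R\([0-9]*\) (release), REVISION: \([0-9.]*\),.*/\1.\2/p'
# prints 36.4.0
```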