Isaac ROS (JetPack 5.x / Humble) - container won't start error

After setting up a Jetson Orin through SDK Manager, I ran the container setup through isaac_ros_common/scripts. The build reports a 'successful install', but the container cannot start and gives the following errors:

---- Error Start
Successfully tagged isaac_ros_dev-aarch64:latest
Running isaac_ros_dev-aarch64-container
docker: Error response from daemon: failed to create shim: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running hook #0: error running hook: exit status 1, stdout: src: /etc/vulkan/icd.d/nvidia_icd.json, src_lnk: /usr/lib/aarch64-linux-gnu/tegra/nvidia_icd.json, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/etc/vulkan/icd.d/nvidia_icd.json, dst_lnk: /usr/lib/aarch64-linux-gnu/tegra/nvidia_icd.json
src: /usr/lib/aarch64-linux-gnu/libcuda.so, src_lnk: tegra/libcuda.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libcuda.so, dst_lnk: tegra/libcuda.so
src: /usr/lib/aarch64-linux-gnu/libnvcucompat.so, src_lnk: tegra/libnvcucompat.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libnvcucompat.so, dst_lnk: tegra/libnvcucompat.so
src: /usr/lib/aarch64-linux-gnu/libnvidia-nvvm.so, src_lnk: tegra/libnvidia-nvvm.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libnvidia-nvvm.so, dst_lnk: tegra/libnvidia-nvvm.so
src: /usr/lib/aarch64-linux-gnu/libv4l2.so.0.0.999999, src_lnk: tegra/libnvv4l2.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libv4l2.so.0.0.999999, dst_lnk: tegra/libnvv4l2.so
src: /usr/lib/aarch64-linux-gnu/libv4lconvert.so.0.0.999999, src_lnk: tegra/libnvv4lconvert.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libv4lconvert.so.0.0.999999, dst_lnk: tegra/libnvv4lconvert.so
src: /usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvargus.so, src_lnk: ../../../tegra/libv4l2_nvargus.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvargus.so, dst_lnk: ../../../tegra/libv4l2_nvargus.so
src: /usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvcuvidvideocodec.so, src_lnk: ../../../tegra/libv4l2_nvcuvidvideocodec.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvcuvidvideocodec.so, dst_lnk: ../../../tegra/libv4l2_nvcuvidvideocodec.so
src: /usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvvideocodec.so, src_lnk: ../../../tegra/libv4l2_nvvideocodec.so, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvvideocodec.so, dst_lnk: ../../../tegra/libv4l2_nvvideocodec.so
src: /usr/lib/aarch64-linux-gnu/libvulkan.so.1.3.203, src_lnk: tegra/libvulkan.so.1.3.203, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/libvulkan.so.1.3.203, dst_lnk: tegra/libvulkan.so.1.3.203
src: /usr/lib/aarch64-linux-gnu/tegra/libcuda.so, src_lnk: libcuda.so.1.1, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libcuda.so, dst_lnk: libcuda.so.1.1
src: /usr/lib/aarch64-linux-gnu/tegra/libgstnvdsseimeta.so, src_lnk: libgstnvdsseimeta.so.1.0.0, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libgstnvdsseimeta.so, dst_lnk: libgstnvdsseimeta.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurface.so, src_lnk: libnvbufsurface.so.1.0.0, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvbufsurface.so, dst_lnk: libnvbufsurface.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurftransform.so, src_lnk: libnvbufsurftransform.so.1.0.0, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvbufsurftransform.so, dst_lnk: libnvbufsurftransform.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvbuf_utils.so, src_lnk: libnvbuf_utils.so.1.0.0, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvbuf_utils.so, dst_lnk: libnvbuf_utils.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvdsbufferpool.so, src_lnk: libnvdsbufferpool.so.1.0.0, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvdsbufferpool.so, dst_lnk: libnvdsbufferpool.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so, src_lnk: libnvidia-nvvm.so.4, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so, dst_lnk: libnvidia-nvvm.so.4
src: /usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so.4, src_lnk: libnvidia-nvvm.so.4.0.0, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so.4, dst_lnk: libnvidia-nvvm.so.4.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvid_mapper.so, src_lnk: libnvid_mapper.so.1.0.0, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvid_mapper.so, dst_lnk: libnvid_mapper.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscibuf.so, src_lnk: libnvscibuf.so.1, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscibuf.so, dst_lnk: libnvscibuf.so.1
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscicommon.so, src_lnk: libnvscicommon.so.1, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscicommon.so, dst_lnk: libnvscicommon.so.1
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscistream.so, src_lnk: libnvscistream.so.1, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscistream.so, dst_lnk: libnvscistream.so.1
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscisync.so, src_lnk: libnvscisync.so.1, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscisync.so, dst_lnk: libnvscisync.so.1
src: /usr/share/glvnd/egl_vendor.d/10_nvidia.json, src_lnk: ../../../lib/aarch64-linux-gnu/tegra-egl/nvidia.json, dst: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/usr/share/glvnd/egl_vendor.d/10_nvidia.json, dst_lnk: ../../../lib/aarch64-linux-gnu/tegra-egl/nvidia.json
, stderr: nvidia-container-cli: mount error: file creation failed: /var/lib/docker/overlay2/dd9670e59decf3c452e027b835747ecc7a13a4fd0650923953ad2aeae62fe5c2/merged/dev/nvhost-as-gpu: invalid argument: unknown.
~/workspaces/isaac_ros_dev/src/isaac_ros_common

---- Error End
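For anyone else hitting this wall of output: the many src/dst lines are just informational stdout from the NVIDIA container runtime hook as it mirrors host Tegra libraries into the container; the actual failure is the single stderr line at the end (the mount of /dev/nvhost-as-gpu). A quick sketch for isolating that line from a saved copy of the log (the `log` variable below holds a shortened excerpt of the error above, just for illustration):

```shell
# The hook prints one "src/dst" line per mirrored library (stdout) and the
# real failure on stderr. Filter a saved copy of the log down to the cause.
# (excerpt shortened with "..." for illustration)
log='src: /usr/lib/aarch64-linux-gnu/libcuda.so, src_lnk: tegra/libcuda.so, dst: ..., dst_lnk: tegra/libcuda.so
, stderr: nvidia-container-cli: mount error: file creation failed: .../merged/dev/nvhost-as-gpu: invalid argument: unknown.'

# Print only the stderr portion, which names the failing device node.
printf '%s\n' "$log" | grep -o 'stderr: .*'
```

On this log it prints the `nvidia-container-cli: mount error` line, which is the part worth searching for.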

apt-get update also fails with a CUDA-related error:


Software Updater

The package system is broken

Check if you are using third party repositories. If so disable them, since they are a common source of problems.
Furthermore run the following command in a Terminal: apt-get install -f
Transaction failed: The package system is broken
The following packages have unmet dependencies:

cuda-libraries-dev-11-4: Depends: cuda-cudart-dev-11-4 (>= 11.4.243) but 11.4.243-1 is installed
Depends: cuda-cccl-11-4 (>= 11.4.222) but 11.4.222-1 is installed
Depends: cuda-profiler-api-11-4 (>= 11.4.239) but it is not installed
Depends: cuda-driver-dev-11-4 (>= 11.4.243) but 11.4.243-1 is installed
Depends: cuda-nvrtc-dev-11-4 (>= 11.4.239) but 11.4.239-1 is installed
Depends: libcublas-dev-11-4 (>= 11.6.6.23) but 11.6.6.23-1 is installed
Depends: libcudla-dev-11-4 (>= 11.4.239) but 11.4.239-1 is installed
Depends: libcusparse-dev-11-4 (>= 11.6.0.238) but 11.6.0.238-1 is installed
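Reading that report closely: the "but X is installed" lines name versions that are already present; the one clearly missing piece is cuda-profiler-api-11-4 ("it is not installed"). A sketch for filtering a saved copy of such a report down to the actionable packages (the `apt_report` variable holds an excerpt of the output above; the filtering itself is just an illustration, not an official apt feature):

```shell
# From apt's unmet-dependencies report, list only the packages apt flags
# as "not installed"; the version-comparison lines are usually noise.
apt_report='cuda-libraries-dev-11-4: Depends: cuda-cudart-dev-11-4 (>= 11.4.243) but 11.4.243-1 is installed
Depends: cuda-profiler-api-11-4 (>= 11.4.239) but it is not installed
Depends: cuda-driver-dev-11-4 (>= 11.4.243) but 11.4.243-1 is installed'

# Second whitespace-separated field of each "not installed" line is the
# package name.
printf '%s\n' "$apt_report" | grep 'not installed' | awk '{print $2}'
# → cuda-profiler-api-11-4
```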

I have tried the solutions that came up in a Google search and rebuilt the Docker container from scratch twice, but I get the same errors.

sudo apt --fix-broken install -o Dpkg::Options::="--force-overwrite"

Partial fix: apt update and upgrade now work, BUT the initial problem launching the container (first error) remains.

Thanks

I was able to install the updates:

sudo apt update
sudo apt dist-upgrade
sudo reboot
sudo apt install nvidia-jetpack

Still the same Docker error:
docker: Error response from daemon: failed to create shim: OCI runtime create failed: …

jetson@orin:~/workspaces/isaac_ros_dev/src/isaac_ros_common$ docker version
Client:
Version: 20.10.12
API version: 1.41
Go version: go1.16.2
Git commit: 20.10.12-0ubuntu2~20.04.1
Built: Wed Apr 6 02:16:12 2022
OS/Arch: linux/arm64
Context: default
Experimental: true

Server:
Engine:
Version: 20.10.12
API version: 1.41 (minimum version 1.12)
Go version: go1.16.2
Git commit: 20.10.12-0ubuntu2~20.04.1
Built: Thu Feb 10 15:03:35 2022
OS/Arch: linux/arm64
Experimental: false
containerd:
Version: 1.5.9-0ubuntu1~20.04.4
GitCommit:
nvidia:
Version: 1.1.0-0ubuntu1~20.04.1
GitCommit:
docker-init:
Version: 0.19.0
GitCommit:

Current State


jetson@orin:~/workspaces/isaac_ros_dev/src/isaac_ros_common$ ./scripts/run_dev.sh
isaac_ros_dev not specified, assuming /home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common
~/workspaces/isaac_ros_dev/src/isaac_ros_common ~/workspaces/isaac_ros_dev/src/isaac_ros_common
Building aarch64.humble.nav2.user base as image: isaac_ros_dev-aarch64 using key aarch64.humble.nav2.user
Using base image name not specified, using ''
Using docker context dir not specified, using Dockerfile directory
Resolved the following Dockerfiles for target image: aarch64.humble.nav2.user
/home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.user
/home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.nav2
/home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.humble
/home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.aarch64
Building /home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.aarch64 as image: aarch64-image with base:
Sending build context to Docker daemon 76.29kB
Step 1/37 : ARG BASE_IMAGE="nvcr.io/nvidia/l4t-base:r34.1"
Step 2/37 : FROM ${BASE_IMAGE}
—> 49ef3d02f844
Step 3/37 : ENV DEBIAN_FRONTEND=noninteractive
—> Using cache
—> 459bdf781b6f
Step 4/37 : ENV SHELL /bin/bash
—> Using cache
—> 427d7ca9bc80
Step 5/37 : SHELL ["/bin/bash", "-c"]
—> Using cache
—> c2662d020edf
Step 6/37 : RUN apt-get update && apt-get install -y build-essential cmake curl git lsb-release sudo tar unzip vim wget libgoogle-glog-dev software-properties-common && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 97de5ad6ad06
Step 7/37 : RUN update-alternatives --install /usr/bin/python python /usr/bin/python3 1
—> Using cache
—> 1e57d03a56e5
Step 8/37 : RUN apt-get update && apt-get install -y python3-flake8 python3-pip python3-pytest-cov python3-setuptools && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> c4ab86e899fc
Step 9/37 : RUN apt-get update && apt-get install -y libavformat-dev libjpeg-dev libopenjp2-7-dev libpng-dev libpq-dev libswscale-dev libtbb2 libtbb-dev libtiff-dev pkg-config yasm && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 0eaf686bb364
Step 10/37 : RUN apt-get update && apt-get install -y python3-distutils libboost-all-dev libboost-dev libpcl-dev libode-dev lcov python3-zmq libxaw7-dev libgraphicsmagick++1-dev graphicsmagick-libmagick-dev-compat libceres-dev libsuitesparse-dev libncurses5-dev libassimp-dev libyaml-cpp-dev libpcap-dev && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> ac6a89515c34
Step 11/37 : RUN apt-get update && apt-get install -y gfortran libatlas-base-dev python3-scipy && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> e677aa8b4d01
Step 12/37 : RUN python3 -m pip install -U Cython wheel
—> Using cache
—> 656f9f7df632
Step 13/37 : RUN python3 -m pip install -U scikit-learn
—> Using cache
—> ac7a28377c7b
Step 14/37 : RUN curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash && apt-get update && apt-get install -y git-lfs && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 255d4192ed4a
Step 15/37 : RUN wget -O - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | gpg --dearmor - | sudo tee /usr/share/keyrings/kitware-archive-keyring.gpg >/dev/null && echo 'deb [signed-by=/usr/share/keyrings/kitware-archive-keyring.gpg] https://apt.kitware.com/ubuntu/ bionic main' | sudo tee /etc/apt/sources.list.d/kitware.list >/dev/null && apt-get update && rm /usr/share/keyrings/kitware-archive-keyring.gpg && apt-get install -y kitware-archive-keyring && apt-get install -y cmake && cmake --version && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 175dd1aa2b65
Step 16/37 : RUN apt-get update && apt-get install -y tensorrt vpi2-dev && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> e3bc5c8583a4
Step 17/37 : COPY tao/tao-converter-aarch64-tensorrt8.4.zip /opt/nvidia/tao/tao-converter-aarch64-tensorrt8.4.zip
—> Using cache
—> bd5d37f4b6ac
Step 18/37 : RUN mkdir -p /opt/nvidia/tao && cd /opt/nvidia/tao && unzip -j tao-converter-aarch64-tensorrt8.4.zip -d /opt/nvidia/tao/jp5 && chmod 755 $(find /opt/nvidia/tao -name "tao-converter") && ln -sf $(find /opt/nvidia/tao -name "tao-converter") /opt/nvidia/tao/tao-converter && rm tao-converter-aarch64-tensorrt8.4.zip
—> Using cache
—> 03fb86a0ae7a
Step 19/37 : ENV PATH="${PATH}:/opt/nvidia/tao"
—> Using cache
—> f47a7ef9eb1d
Step 20/37 : ENV LD_LIBRARY_PATH="/opt/nvidia/vpi2/lib64:${LD_LIBRARY_PATH}"
—> Using cache
—> cc04ac97dca9
Step 21/37 : ENV LD_LIBRARY_PATH="/usr/lib/aarch64-linux-gnu/tegra:${LD_LIBRARY_PATH}"
—> Using cache
—> 8f2211fa09e0
Step 22/37 : ENV LD_LIBRARY_PATH="/usr/local/cuda-11.4/targets/aarch64-linux/lib:${LD_LIBRARY_PATH}"
—> Using cache
—> de0d4898ab64
Step 23/37 : ENV LD_LIBRARY_PATH="/usr/lib/aarch64-linux-gnu/tegra-egl:${LD_LIBRARY_PATH}"
—> Using cache
—> 9cc41151b0a9
Step 24/37 : ENV LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/usr/lib/aarch64-linux-gnu-host"
—> Using cache
—> 2ce8f5f7c7a5
Step 25/37 : ENV PATH="${PATH}:/usr/local/cuda/bin"
—> Using cache
—> 663044823865
Step 26/37 : ENV LD_PRELOAD="/usr/lib/aarch64-linux-gnu/libgomp.so.1"
—> Using cache
—> 10edf28ac81a
Step 27/37 : ENV RMW_IMPLEMENTATION=rmw_fastrtps_cpp
—> Using cache
—> 3b4962018988
Step 28/37 : RUN python3 -m pip install -U --extra-index-url https://download.pytorch.org/whl/cu113 torch torchvision torchaudio
—> Using cache
—> e9604bf3730a
Step 29/37 : RUN apt-get update && apt-get install -y --no-install-recommends autoconf automake libb64-dev libcurl4-openssl-dev libopenblas-dev libre2-dev libssl-dev libtool patchelf rapidjson-dev zlib1g-dev && rm -rf /var/lib/apt/lists/*
—> Using cache
—> e7e77a887060
Step 30/37 : RUN mkdir -p /opt/tritonserver && cd /opt/tritonserver && wget https://github.com/triton-inference-server/server/releases/download/v2.20.0/tritonserver2.20.0-jetpack5.0.tgz && tar -xzvf tritonserver2.20.0-jetpack5.0.tgz && rm tritonserver2.20.0-jetpack5.0.tgz
—> Using cache
—> 66caa9100eb1
Step 31/37 : ENV LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/opt/tritonserver/lib"
—> Using cache
—> 060d12aec900
Step 32/37 : RUN apt-add-repository ppa:mosquitto-dev/mosquitto-ppa && apt-get update && apt-get install -y mosquitto mosquitto-clients
—> Using cache
—> 671eac4a776d
Step 33/37 : RUN python3 -m pip install -U pymongo paho-mqtt
—> Using cache
—> 134222940893
Step 34/37 : RUN apt-get update && apt-get install -y libasio-dev libbullet-dev libtinyxml2-dev libcunit1-dev libopencv-dev python3-opencv && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 815481ff1594
Step 35/37 : RUN apt-get update && apt-get install -y --no-install-recommends libnpp-dev-11-4 && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 2e7d495dd080
Step 36/37 : RUN apt-get update && apt-get install -y --only-upgrade linux-libc-dev && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> b2434c945643
Step 37/37 : RUN python3 -m pip install protobuf==3.20.1
—> Using cache
—> fbbf478e27b4
[Warning] One or more build-args [USER_GID USER_UID USERNAME] were not consumed
Successfully built fbbf478e27b4
Successfully tagged aarch64-image:latest
Building /home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.humble as image: humble-image with base: aarch64-image
Sending build context to Docker daemon 76.29kB
Step 1/19 : ARG BASE_IMAGE
Step 2/19 : FROM ${BASE_IMAGE}
—> fbbf478e27b4
Step 3/19 : RUN locale-gen en_US en_US.UTF-8
—> Using cache
—> ffde1cbddce3
Step 4/19 : RUN update-locale LC_ALL=en_US.UTF-8 LANG=en_US.UTF-8
—> Using cache
—> 4ffb842b84d1
Step 5/19 : ENV LANG=en_US.UTF-8
—> Using cache
—> fd83acb4735e
Step 6/19 : ENV ROS_PYTHON_VERSION=3
—> Using cache
—> 6935e25e1962
Step 7/19 : RUN apt-get update && apt-get install -y curl gnupg lsb-release && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> fa458476db39
Step 8/19 : RUN curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg
—> Using cache
—> ae6475606671
Step 9/19 : RUN echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2-testing/ubuntu $(source /etc/os-release && echo $UBUNTU_CODENAME) main" | tee /etc/apt/sources.list.d/ros2.list > /dev/null
—> Using cache
—> b903c035e298
Step 10/19 : RUN apt-get update && apt-get install -y build-essential cmake git python3-colcon-common-extensions python3-flake8 python3-pip python3-pybind11 python3-pytest-cov python3-rosdep python3-rosinstall-generator python3-setuptools python3-vcstool wget && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> b639b9abd5fd
Step 11/19 : RUN python3 -m pip install -U flake8-blind-except flake8-builtins flake8-class-newline flake8-comprehensions flake8-deprecated flake8-docstrings flake8-import-order flake8-quotes pytest-repeat pytest-rerunfailures pytest setuptools
—> Using cache
—> 3737392b7469
Step 12/19 : ENV ROS_DISTRO=humble
—> Using cache
—> 772cdb49ea02
Step 13/19 : ENV ROS_ROOT=/opt/ros/${ROS_DISTRO}
—> Using cache
—> d2065c0cbb04
Step 14/19 : RUN mkdir -p ${ROS_ROOT}/src && cd ${ROS_ROOT} && rosinstall_generator --deps --rosdistro ${ROS_DISTRO} ros_base angles apriltag behaviortree_cpp_v3 bondcpp camera_calibration_parsers camera_info_manager compressed_image_transport compressed_depth_image_transport cv_bridge demo_nodes_cpp demo_nodes_py diagnostic_updater example_interfaces image_geometry image_pipeline image_transport image_transport_plugins launch_xml launch_yaml launch_testing launch_testing_ament_cmake nav2_msgs ompl resource_retriever rosbridge_suite rqt_image_view rviz2 sensor_msgs slam_toolbox v4l2_camera vision_opencv vision_msgs > ros2.${ROS_DISTRO}.ros_base.rosinstall && cat ros2.${ROS_DISTRO}.ros_base.rosinstall && vcs import src < ros2.${ROS_DISTRO}.ros_base.rosinstall && rm ${ROS_ROOT}/.rosinstall
—> Using cache
—> 742df26ae3ec
Step 15/19 : RUN cd ${ROS_ROOT} && apt-get update && rosdep init && rosdep update && rosdep install -y --ignore-src --from-paths src --rosdistro ${ROS_DISTRO} --skip-keys "fastcdr rti-connext-dds-6.0.1 rti-connext-dds-5.3.1 urdfdom_headers libopencv-dev libopencv-contrib-dev libopencv-imgproc-dev python-opencv python3-opencv" && rm -Rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> c8067f22fbb6
Step 16/19 : RUN cd ${ROS_ROOT} && colcon build --merge-install --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo --packages-up-to behaviortree_cpp_v3 && colcon build --merge-install --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo && rm -Rf src build log
—> Using cache
—> c74c805fb295
Step 17/19 : RUN echo "source /opt/ros/${ROS_DISTRO}/install/setup.bash ; export ROS_DISTRO=${ROS_DISTRO}" > /opt/ros/${ROS_DISTRO}/setup.bash
—> Using cache
—> 653e41da8901
Step 18/19 : ENV RMW_IMPLEMENTATION=rmw_fastrtps_cpp
—> Using cache
—> f5d49ad0fd34
Step 19/19 : RUN apt-get update && mkdir -p ${ROS_ROOT}/src && cd ${ROS_ROOT}/src && git clone https://github.com/osrf/negotiated && cd negotiated && git checkout master && cd .. && source ${ROS_ROOT}/setup.bash && cd ${ROS_ROOT} && rosdep install -y -r --ignore-src --from-paths src --rosdistro ${ROS_DISTRO} && colcon build --merge-install --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo --packages-up-to-regex negotiated* && rm -Rf src logs build && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 1907c5b37eb2
[Warning] One or more build-args [USERNAME USER_GID USER_UID] were not consumed
Successfully built 1907c5b37eb2
Successfully tagged humble-image:latest
Building /home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.nav2 as image: nav2-image with base: humble-image
Sending build context to Docker daemon 76.29kB
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
—> 1907c5b37eb2
Step 3/5 : RUN apt-get update && mkdir -p ${ROS_ROOT}/src && cd ${ROS_ROOT}/src && git clone https://github.com/ros-planning/navigation2.git && cd navigation2 && git checkout humble && cd .. && git clone https://github.com/BehaviorTree/BehaviorTree.CPP.git && cd BehaviorTree.CPP && git checkout master && cd .. && source ${ROS_ROOT}/setup.bash && cd ${ROS_ROOT} && rosdep install -y -r --ignore-src --from-paths src --rosdistro ${ROS_DISTRO} && colcon build --merge-install --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo --packages-up-to-regex nav2* --packages-ignore nav2_system_tests && rm -Rf src logs build && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 0b17ca340d46
Step 4/5 : RUN apt-get update && mkdir -p ${ROS_ROOT}/src && cd ${ROS_ROOT}/src && git clone https://github.com/ipa320/vda5050_msgs.git vda5050_root && cd vda5050_root && git checkout ros2 && cd .. && mv vda5050_root/vda5050_msgs/ vda5050_msgs && rm -rf vda5050_root && source ${ROS_ROOT}/setup.bash && cd ${ROS_ROOT} && colcon build --merge-install --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo --packages-up-to vda5050_msgs && rm -Rf src logs build && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 1d45226d1f88
Step 5/5 : RUN apt-get update && mkdir -p ${ROS_ROOT}/src && cd ${ROS_ROOT}/src && git clone https://github.com/RobotWebTools/rosbridge_suite.git && cd rosbridge_suite && git checkout ros2 && cd .. && source ${ROS_ROOT}/setup.bash && cd ${ROS_ROOT} && rosdep install -y -r --ignore-src --from-paths src --rosdistro ${ROS_DISTRO} && colcon build --merge-install --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo --packages-up-to rosbridge_library && rm -Rf src logs build && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 8d8b8a838d4d
[Warning] One or more build-args [USERNAME USER_GID USER_UID] were not consumed
Successfully built 8d8b8a838d4d
Successfully tagged nav2-image:latest
Building /home/jetson/workspaces/isaac_ros_dev/src/isaac_ros_common/scripts/../docker/Dockerfile.user as image: isaac_ros_dev-aarch64 with base: nav2-image
Sending build context to Docker daemon 76.29kB
Step 1/15 : ARG BASE_IMAGE
Step 2/15 : FROM ${BASE_IMAGE}
—> 8d8b8a838d4d
Step 3/15 : ARG USERNAME=admin
—> Using cache
—> df95c735ac2c
Step 4/15 : ARG USER_UID=1000
—> Using cache
—> 3f23d566ec37
Step 5/15 : ARG USER_GID=1000
—> Using cache
—> 9fe74622fbcf
Step 6/15 : RUN apt-get update && apt-get install -y sudo && rm -rf /var/lib/apt/lists/* && apt-get clean
—> Using cache
—> 4f3d5c23e5f1
Step 7/15 : RUN if [ $(getent group triton-server) ]; then groupmod --gid ${USER_GID} -n ${USERNAME} triton-server ; usermod -l ${USERNAME} -m -d /home/${USERNAME} triton-server ; mkdir -p /home/${USERNAME} ; sudo chown ${USERNAME}:${USERNAME} /home/${USERNAME} ; fi
—> Using cache
—> 24b714f63c1b
Step 8/15 : RUN if [ ! $(getent passwd ${USERNAME}) ]; then groupadd --gid ${USER_GID} ${USERNAME} ; useradd --uid ${USER_UID} --gid ${USER_GID} -m ${USERNAME} ; fi
—> Using cache
—> c441dbeba653
Step 9/15 : RUN echo ${USERNAME} ALL=(root) NOPASSWD:ALL > /etc/sudoers.d/${USERNAME} && chmod 0440 /etc/sudoers.d/${USERNAME} && adduser ${USERNAME} video && adduser ${USERNAME} sudo
—> Using cache
—> b47fdbcc21d4
Step 10/15 : RUN mkdir -p /usr/local/bin/scripts
—> Using cache
—> 800d9296c2ae
Step 11/15 : COPY scripts/entrypoint.sh /usr/local/bin/scripts/
—> Using cache
—> 2aad7a9e5d07
Step 12/15 : RUN chmod +x /usr/local/bin/scripts/*.sh
—> Using cache
—> c7aed54e2294
Step 13/15 : ENV USERNAME=${USERNAME}
—> Using cache
—> 938a0379a787
Step 14/15 : ENV USER_GID=${USER_GID}
—> Using cache
—> 364dde92eae5
Step 15/15 : ENV USER_UID=${USER_UID}
—> Using cache
—> 8127f64b861b
Successfully built 8127f64b861b
Successfully tagged isaac_ros_dev-aarch64:latest
Running isaac_ros_dev-aarch64-container
docker: Error response from daemon: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running hook #0: error running hook: exit status 1, stdout: src: /etc/vulkan/icd.d/nvidia_icd.json, src_lnk: /usr/lib/aarch64-linux-gnu/tegra/nvidia_icd.json, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/etc/vulkan/icd.d/nvidia_icd.json, dst_lnk: /usr/lib/aarch64-linux-gnu/tegra/nvidia_icd.json
src: /usr/lib/aarch64-linux-gnu/libcuda.so, src_lnk: tegra/libcuda.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libcuda.so, dst_lnk: tegra/libcuda.so
src: /usr/lib/aarch64-linux-gnu/libnvcucompat.so, src_lnk: tegra/libnvcucompat.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libnvcucompat.so, dst_lnk: tegra/libnvcucompat.so
src: /usr/lib/aarch64-linux-gnu/libnvidia-nvvm.so, src_lnk: tegra/libnvidia-nvvm.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libnvidia-nvvm.so, dst_lnk: tegra/libnvidia-nvvm.so
src: /usr/lib/aarch64-linux-gnu/libv4l2.so.0.0.999999, src_lnk: tegra/libnvv4l2.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libv4l2.so.0.0.999999, dst_lnk: tegra/libnvv4l2.so
src: /usr/lib/aarch64-linux-gnu/libv4lconvert.so.0.0.999999, src_lnk: tegra/libnvv4lconvert.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libv4lconvert.so.0.0.999999, dst_lnk: tegra/libnvv4lconvert.so
src: /usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvargus.so, src_lnk: ../../../tegra/libv4l2_nvargus.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvargus.so, dst_lnk: ../../../tegra/libv4l2_nvargus.so
src: /usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvcuvidvideocodec.so, src_lnk: ../../../tegra/libv4l2_nvcuvidvideocodec.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvcuvidvideocodec.so, dst_lnk: ../../../tegra/libv4l2_nvcuvidvideocodec.so
src: /usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvvideocodec.so, src_lnk: ../../../tegra/libv4l2_nvvideocodec.so, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libv4l/plugins/nv/libv4l2_nvvideocodec.so, dst_lnk: ../../../tegra/libv4l2_nvvideocodec.so
src: /usr/lib/aarch64-linux-gnu/libvulkan.so.1.3.203, src_lnk: tegra/libvulkan.so.1.3.203, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/libvulkan.so.1.3.203, dst_lnk: tegra/libvulkan.so.1.3.203
src: /usr/lib/aarch64-linux-gnu/tegra/libcuda.so, src_lnk: libcuda.so.1.1, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libcuda.so, dst_lnk: libcuda.so.1.1
src: /usr/lib/aarch64-linux-gnu/tegra/libgstnvdsseimeta.so, src_lnk: libgstnvdsseimeta.so.1.0.0, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libgstnvdsseimeta.so, dst_lnk: libgstnvdsseimeta.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurface.so, src_lnk: libnvbufsurface.so.1.0.0, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvbufsurface.so, dst_lnk: libnvbufsurface.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurftransform.so, src_lnk: libnvbufsurftransform.so.1.0.0, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvbufsurftransform.so, dst_lnk: libnvbufsurftransform.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvbuf_utils.so, src_lnk: libnvbuf_utils.so.1.0.0, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvbuf_utils.so, dst_lnk: libnvbuf_utils.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvdsbufferpool.so, src_lnk: libnvdsbufferpool.so.1.0.0, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvdsbufferpool.so, dst_lnk: libnvdsbufferpool.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so, src_lnk: libnvidia-nvvm.so.4, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so, dst_lnk: libnvidia-nvvm.so.4
src: /usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so.4, src_lnk: libnvidia-nvvm.so.4.0.0, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvidia-nvvm.so.4, dst_lnk: libnvidia-nvvm.so.4.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvid_mapper.so, src_lnk: libnvid_mapper.so.1.0.0, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvid_mapper.so, dst_lnk: libnvid_mapper.so.1.0.0
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscibuf.so, src_lnk: libnvscibuf.so.1, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscibuf.so, dst_lnk: libnvscibuf.so.1
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscicommon.so, src_lnk: libnvscicommon.so.1, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscicommon.so, dst_lnk: libnvscicommon.so.1
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscistream.so, src_lnk: libnvscistream.so.1, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscistream.so, dst_lnk: libnvscistream.so.1
src: /usr/lib/aarch64-linux-gnu/tegra/libnvscisync.so, src_lnk: libnvscisync.so.1, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/lib/aarch64-linux-gnu/tegra/libnvscisync.so, dst_lnk: libnvscisync.so.1
src: /usr/share/glvnd/egl_vendor.d/10_nvidia.json, src_lnk: ../../../lib/aarch64-linux-gnu/tegra-egl/nvidia.json, dst: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/usr/share/glvnd/egl_vendor.d/10_nvidia.json, dst_lnk: ../../../lib/aarch64-linux-gnu/tegra-egl/nvidia.json
, stderr: nvidia-container-cli: mount error: file creation failed: /var/lib/docker/overlay2/b7fd869160ce574be1c00df28d736956513855046afcd5f7efeb0b5e596fbdd4/merged/dev/nvhost-as-gpu: invalid argument: unknown.
~/workspaces/isaac_ros_dev/src/isaac_ros_common
jetson@orin:~/workspaces/isaac_ros_dev/src/isaac_ros_common$

FYI: JetPack on the Orin is running from a 500 GB WD Blue SN570 NVMe SSD (WDS500G3B0C, M.2 2280, PCIe Gen3 x4).

The host machine installed without issue; only the Jetson AGX Orin has the installation problems described above.

Solved by:
1. Uninstalling the previous versions of the NVIDIA container packages
2. Reinstalling Docker
3. Re-running the Isaac ROS development setup

Just careless mistakes, sorry!

The latest version of nvidia-container-toolkit (v1.10.0 or higher) is likely what was key to solving this. Thank you for working through this and journaling your adventure here to help others!
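For anyone verifying their setup, here is a sketch of checking an installed version against that 1.10.0 minimum using `sort -V`. The `installed` value below is hard-coded as a hypothetical example; on a real system it would come from something like `dpkg-query -W -f='${Version}' nvidia-container-toolkit`:

```shell
# Sketch: verify nvidia-container-toolkit meets the 1.10.0 minimum.
# "installed" is a hypothetical example value for illustration.
installed="1.10.0-1"
minimum="1.10.0"

# sort -V orders version strings; if the minimum sorts first (or ties),
# the installed version is new enough.
oldest="$(printf '%s\n' "$minimum" "$installed" | sort -V | head -n1)"
if [ "$oldest" = "$minimum" ]; then
  echo "nvidia-container-toolkit OK (>= $minimum)"
else
  echo "nvidia-container-toolkit too old; please upgrade"
fi
```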
