NVIDIA Jetson Nano: Vision Transformers & Computer Vision: Unable to run TAM on Jetson Nano Developer Kit

I am unable to run TAM. The container fails to build when I run:

jetson-containers run $(autotag tam)

TensorRT version: 10.4.0
-- Building container tam:r36.4.3-onnxruntime

DOCKER_BUILDKIT=0 docker build --network=host --tag tam:r36.4.3-onnxruntime \
--file /home/nataraj/aidata/jetson-containers/packages/ml/onnxruntime/Dockerfile \
--build-arg BASE_IMAGE=tam:r36.4.3-tensorrt \
--build-arg ONNXRUNTIME_VERSION="1.21.0" \
--build-arg ONNXRUNTIME_BRANCH="v1.21.0" \
--build-arg ONNXRUNTIME_FLAGS="--allow_running_as_root" \
/home/nataraj/aidata/jetson-containers/packages/ml/onnxruntime \
2>&1 | tee /home/nataraj/aidata/jetson-containers/logs/20250211_192940/build/tam_r36.4.3-onnxruntime.txt; exit ${PIPESTATUS[0]}

DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
            BuildKit is currently disabled; enable it by removing the DOCKER_BUILDKIT=0
            environment-variable.

Sending build context to Docker daemon  26.11kB
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
 ---> 03fe53934962
Step 3/5 : ARG ONNXRUNTIME_VERSION     ONNXRUNTIME_BRANCH     ONNXRUNTIME_FLAGS     FORCE_BUILD=off
 ---> Using cache
 ---> 7945bbfa6491
Step 4/5 : COPY install.sh build.sh /tmp/onnxruntime/
 ---> Using cache
 ---> b965393ae9c1
Step 5/5 : RUN /tmp/onnxruntime/install.sh || /tmp/onnxruntime/build.sh
 ---> Running in 99934d55a54a
+ '[' off == on ']'
+ tarpack install onnxruntime-gpu-1.21.0
+ COMMAND=install
+ PACKAGE=onnxruntime-gpu-1.21.0
+ : /usr/local
+ : /tmp/tarpack
+ : '--quiet --show-progress --progress=bar:force:noscroll'
+ mkdir -p /tmp/tarpack/uploads
+ '[' install == install ']'
+ cd /tmp/tarpack
+ wget --quiet --show-progress --progress=bar:force:noscroll https://apt.jetson-ai-lab.dev/jp6/cu126/onnxruntime-gpu-1.21.0.tar.gz
Building onnxruntime 1.21.0 (branch=v1.21.0, flags=--allow_running_as_root)
+ echo 'Building onnxruntime 1.21.0 (branch=v1.21.0, flags=--allow_running_as_root)'
+ pip3 uninstall -y onnxruntime
WARNING: Skipping onnxruntime as it is not installed.
+ git clone https://github.com/microsoft/onnxruntime /opt/onnxruntime
Cloning into '/opt/onnxruntime'...
Updating files: 100% (9106/9106), done.
+ cd /opt/onnxruntime
+ git checkout v1.21.0
error: pathspec 'v1.21.0' did not match any file(s) known to git
The command '/bin/sh -c /tmp/onnxruntime/install.sh || /tmp/onnxruntime/build.sh' returned a non-zero code: 1
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/nataraj/aidata/jetson-containers/jetson_containers/tag.py", line 58, in <module>
    image = find_container(args.packages[0], prefer_sources=args.prefer, disable_sources=args.disable, user=args.user, quiet=args.quiet)
  File "/home/nataraj/aidata/jetson-containers/jetson_containers/container.py", line 537, in find_container
    return build_container('', package) #, simulate=True)
  File "/home/nataraj/aidata/jetson-containers/jetson_containers/container.py", line 147, in build_container
    status = subprocess.run(cmd.replace(_NEWLINE_, ' '), executable='/bin/bash', shell=True, check=True)  
  File "/usr/lib/python3.10/subprocess.py", line 526, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command 'DOCKER_BUILDKIT=0 docker build --network=host --tag tam:r36.4.3-onnxruntime --file /home/nataraj/aidata/jetson-containers/packages/ml/onnxruntime/Dockerfile --build-arg BASE_IMAGE=tam:r36.4.3-tensorrt --build-arg ONNXRUNTIME_VERSION="1.21.0" --build-arg ONNXRUNTIME_BRANCH="v1.21.0" --build-arg ONNXRUNTIME_FLAGS="--allow_running_as_root" /home/nataraj/aidata/jetson-containers/packages/ml/onnxruntime 2>&1 | tee /home/nataraj/aidata/jetson-containers/logs/20250211_192940/build/tam_r36.4.3-onnxruntime.txt; exit ${PIPESTATUS[0]}' returned non-zero exit status 1.
-- Error:  return code 1
V4L2_DEVICES:  --device /dev/video0  --device /dev/video1 
+ docker run --runtime nvidia -it --rm --network host --shm-size=8g --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /home/nataraj/aidata/data:/data -v /etc/localtime:/etc/localtime:ro -v /etc/timezone:/etc/timezone:ro --device /dev/snd -e PULSE_SERVER=unix:/run/user/1000/pulse/native -v /run/user/1000/pulse:/run/user/1000/pulse --device /dev/bus/usb --device /dev/video0 --device /dev/video1 --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-7 --device /dev/i2c-9 --name jetson_container_20250211_193611
"docker run" requires at least 1 argument.
See 'docker run --help'.

Usage:  docker run [OPTIONS] IMAGE [COMMAND] [ARG...]

Create and run a new container from an image

I also tried building it directly with jetson-containers build tam, and it fails the same way:

(The build output is byte-for-byte identical to the log above: the same git checkout v1.21.0 pathspec error, followed by the same empty docker run invocation.)

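The trailing "docker run requires at least 1 argument" error is a side effect of the failed build rather than a second bug: `jetson-containers run $(autotag tam)` passes autotag's stdout to docker run as the image name, and when the build fails, autotag prints nothing, so docker run is invoked with no IMAGE argument at all. The pitfall in miniature (`autotag_sim` is a hypothetical stand-in for the real autotag):

```shell
# Simulate $(autotag tam) when the underlying build fails: the command
# substitution yields an empty string, so docker run would see no IMAGE.
autotag_sim() { echo "build failed" >&2; return 1; }
image=$(autotag_sim) || true
echo "docker run --runtime nvidia [options] ${image:-<no image argument>}"
```

This prints `docker run --runtime nvidia [options] <no image argument>` — exactly the shape of the truncated docker run line in the logs, which ends at `--name jetson_container_...` with nothing after it.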
I also got an error when trying to run a specific prebuilt image, dustynv/tam:r35.4.1:

jetson-containers run dustynv/tam:r35.4.1
V4L2_DEVICES:  --device /dev/video0  --device /dev/video1 
+ docker run --runtime nvidia -it --rm --network host --shm-size=8g --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /home/nataraj/aidata/data:/data -v /etc/localtime:/etc/localtime:ro -v /etc/timezone:/etc/timezone:ro --device /dev/snd -e PULSE_SERVER=unix:/run/user/1000/pulse/native -v /run/user/1000/pulse:/run/user/1000/pulse --device /dev/bus/usb --device /dev/video0 --device /dev/video1 --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-7 --device /dev/i2c-9 --name jetson_container_20250211_195513 dustynv/tam:r35.4.1
/usr/local/lib/python3.8/dist-packages/gradio_client/documentation.py:102: UserWarning: Could not get documentation group for <class 'gradio.mix.Parallel'>: No known documentation group for module 'gradio.mix'
  warnings.warn(f"Could not get documentation group for {cls}: {exc}")
/usr/local/lib/python3.8/dist-packages/gradio_client/documentation.py:102: UserWarning: Could not get documentation group for <class 'gradio.mix.Series'>: No known documentation group for module 'gradio.mix'
  warnings.warn(f"Could not get documentation group for {cls}: {exc}")
Traceback (most recent call last):
  File "app.py", line 4, in <module>
    import cv2
  File "/usr/local/lib/python3.8/dist-packages/cv2/__init__.py", line 96, in <module>
    bootstrap()
  File "/usr/local/lib/python3.8/dist-packages/cv2/__init__.py", line 86, in bootstrap
    import cv2
ImportError: libffi.so.8: cannot open shared object file: No such file or directory
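The `libffi.so.8` ImportError suggests the cv2 build inside that older r35 container was linked against a libffi the image does not ship: r35 (JetPack 5) containers are Ubuntu 20.04, which packages libffi7 (`libffi.so.7`), while `libffi.so.8` comes from Ubuntu 22.04's libffi8. A quick way to see which libffi runtimes the dynamic loader can actually find (assumes `ldconfig` is available, as on any Ubuntu/L4T image):

```shell
# List which libffi runtimes are visible to the dynamic loader; cv2 in
# this container wants libffi.so.8, which Ubuntu 20.04 does not provide.
for lib in libffi.so.7 libffi.so.8; do
    if ldconfig -p 2>/dev/null | grep -qF "$lib"; then
        echo "$lib: found"
    else
        echo "$lib: not found"
    fi
done
```

Commonly reported workarounds for this mismatch are installing a libffi8 build for 20.04 inside the container or rebuilding the offending wheel against libffi7; symlinking `libffi.so.7` to `libffi.so.8` sometimes appears to work but mixes incompatible ABI versions, so treat it as a last resort.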

Your previous issues were on the Orin Nano, but this one is filed under Jetson Nano.
Are you using a Jetson Nano or an Orin Nano devkit?

Jetson Orin Nano Developer Kit

Moving this topic over to the Orin Nano forum.

Hi,

DOCKER_BUILDKIT=0 docker build --network=host --tag tam:r36.4.3-onnxruntime \
...
--build-arg ONNXRUNTIME_VERSION="1.21.0" \
--build-arg ONNXRUNTIME_BRANCH="v1.21.0" \

We have reverted the ONNXRuntime version to 1.20.1.
Please make sure your jetson-containers checkout includes the change below.

Thanks.
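I have not seen the actual commit, but mapped onto the docker build invocation from the logs above, the 1.20.1 pin amounts to changing the two version build-args. The equivalent command is assembled and printed here rather than executed, since running it requires the same local jetson-containers checkout and Docker:

```shell
# Same docker build as in the log, with ONNXRUNTIME_VERSION/BRANCH
# pinned back to 1.20.1; printed for review instead of executed.
VER="1.20.1"
cat <<EOF
DOCKER_BUILDKIT=0 docker build --network=host --tag tam:r36.4.3-onnxruntime \\
  --file /home/nataraj/aidata/jetson-containers/packages/ml/onnxruntime/Dockerfile \\
  --build-arg BASE_IMAGE=tam:r36.4.3-tensorrt \\
  --build-arg ONNXRUNTIME_VERSION="$VER" \\
  --build-arg ONNXRUNTIME_BRANCH="v$VER" \\
  --build-arg ONNXRUNTIME_FLAGS="--allow_running_as_root" \\
  /home/nataraj/aidata/jetson-containers/packages/ml/onnxruntime
EOF
```

Note that the branch is derived as `v$VER`, matching the `1.21.0` / `v1.21.0` pairing in the original log, so pinning the version keeps both build-args consistent.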

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.