Advice on getting started with the Jetson Orin Nano

Hi there Jetson Community,

I am a newbie to the Jetson family, having just purchased a Jetson Orin Nano. This is my first time working with an arm64 system, and I need some clarification. I have gone through the setup and am working through the Hello AI World tutorial. I notice that there seem to be two main repos, jetson-inference and jetson-containers:

  1. I am trying to understand the relationship between these two repos and how it would influence my own future workflow. It seems that jetson-inference relies on a container built from jetson-containers; however, jetson-inference also offers the option to build from source. Why, then, does it rely on jetson-containers?

  2. jetson-containers also seems to contain more (and newer) content than jetson-inference, and it also has tutorials for inference. So I am quite confused about why jetson-inference is needed at all; any explanation, or a link to something explaining this relationship, would be helpful.

  3. In my own work, I use Docker images heavily. Is this also the preferred way to develop with Jetson devices?

  4. If I'm using Docker images for my workflow, I assume I should start with the ones from jetson-containers. Have these packages been specifically modified for Jetson devices?

  5. If I want to run code from another repo, such as https://github.com/Megvii-BaseDetection/YOLOX, will it run out of the box, or do I need to modify it somehow to work with the Jetson Orin Nano?

I know it's a lot of questions, and some may seem noob-ish, but any advice/clarification/assistance would be appreciated. Thanks in advance.

Hi,

jetson-inference is deprecated, as the repo owner has stepped down.

It’s recommended to try our new tutorial below:

The tutorial depends on the container from jetson-containers.
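
In case it helps, the usual jetson-containers workflow on device looks roughly like this (a sketch; l4t-pytorch is just an example package name, and autotag picks or builds an image matching your JetPack/L4T version):

git clone https://github.com/dusty-nv/jetson-containers
bash jetson-containers/install.sh

# run a package container; autotag resolves a compatible image tag
jetson-containers run $(autotag l4t-pytorch)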
For YOLO, you can check the tutorial below that uses the Ultralytics library:
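
As a rough sketch of what the Ultralytics route looks like on Jetson (the model file yolo11n.pt and the sample image here are placeholders; follow the tutorial for the exact, supported steps):

pip install ultralytics

# quick sanity check with a small pretrained model
yolo predict model=yolo11n.pt source='https://ultralytics.com/images/bus.jpg'

# optionally export to TensorRT for faster inference on the Orin Nano
yolo export model=yolo11n.pt format=engine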

Thanks.

Thanks for that clarification. However, jetson-inference is still referenced in jetson-containers, under jetson-containers/packages/cv/jetson-inference. When I try to build this container with the command jetson-containers run $(autotag jetson-inference), it fails with a NumPy version incompatibility:

[00:31:26] [25/25] Testing jetson-inference (jetson-inference:r36.4.tegra-aarch64-cu126-22.04-jetson-inference)
24 stages completed in 32m44s at 00:31:26
┌──────────────────────────────────────────────────────────────────────────────┐
│ > TESTING  jetson-inference:r36.4.tegra-aarch64-cu126-22.04-jetson-inference │
└──────────────────────────────────────────────────────────────────────────────┘

docker run -t --rm --network=host --privileged --runtime=nvidia \
  --volume /ssd/repos/jetson-containers/packages/cv/jetson-inference:/test \
  --volume /ssd/repos/jetson-containers/data:/data \
  jetson-inference:r36.4.tegra-aarch64-cu126-22.04-jetson-inference \
    /bin/bash -c '/bin/bash /test/test_utils.sh'


testing jetson-utils...
--2025-09-05 07:31:28--  https://raw.githubusercontent.com/dusty-nv/jetson-inference/master/data/images/granny_smith_1.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.108.133, 185.199.110.133, 185.199.111.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.108.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 547142 (534K) [image/jpeg]
Saving to: ‘/tmp/apple.jpg’

/tmp/apple.jpg      100%[===================>] 534.32K  3.36MB/s    in 0.2s    

2025-09-05 07:31:29 (3.36 MB/s) - ‘/tmp/apple.jpg’ saved [547142/547142]


A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.2.6 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/opt/jetson-inference/build/aarch64/bin/cuda-examples.py", line 26, in <module>
    from jetson_utils import (cudaAllocMapped, cudaConvertColor, cudaCrop,
  File "/usr/lib/python3.10/dist-packages/jetson_utils/__init__.py", line 4, in <module>
    from jetson_utils_python import *
AttributeError: _ARRAY_API not found
Traceback (most recent call last):
  File "/opt/jetson-inference/build/aarch64/bin/cuda-examples.py", line 26, in <module>
    from jetson_utils import (cudaAllocMapped, cudaConvertColor, cudaCrop,
  File "/usr/lib/python3.10/dist-packages/jetson_utils/__init__.py", line 4, in <module>
    from jetson_utils_python import *
SystemError: initialization of jetson_utils_python raised unreported exception
[00:31:29] ===================================================================================== 
[00:31:29] ===================================================================================== 
[00:31:29] `jetson-containers build` failed after 1969.2 seconds (32.8 minutes) 
[00:31:29] Error: Command 'docker run -t --rm --network=host --privileged --runtime=nvidia   --volume /ssd/repos/jetson-containers/packages/cv/jetson-inference:/test   --volume /ssd/repos/jetson-containers/data:/data   jetson-inference:r36.4.tegra-aarch64-cu126-22.04-jetson-inference     /bin/bash -c '/bin/bash /test/test_utils.sh' 2>&1 | tee /ssd/repos/jetson-containers/logs/20250904_235834/test/25-1_jetson-inference_r36.4.tegra-aarch64-cu126-22.04-jetson-inference_test_utils.sh.txt; exit ${PIPESTATUS[0]}' returned non-zero exit status 1. 
[00:31:29] ===================================================================================== 
[00:31:29] ===================================================================================== 
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/ssd/repos/jetson-containers/jetson_containers/tag.py", line 58, in <module>
    image = find_container(args.packages[0], prefer_sources=args.prefer, disable_sources=args.disable, user=args.user, quiet=args.quiet)
  File "/ssd/repos/jetson-containers/jetson_containers/container.py", line 668, in find_container
    return build_container('', package) #, simulate=True)
  File "/ssd/repos/jetson-containers/jetson_containers/container.py", line 246, in build_container
    test_container(container_name, pkg, simulate, build_idx=idx)
  File "/ssd/repos/jetson-containers/jetson_containers/container.py", line 456, in test_container
    status = subprocess.run(cmd.replace(_NEWLINE_, ' '), executable='/bin/bash', shell=True, check=True)
  File "/usr/lib/python3.10/subprocess.py", line 526, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command 'docker run -t --rm --network=host --privileged --runtime=nvidia   --volume /ssd/repos/jetson-containers/packages/cv/jetson-inference:/test   --volume /ssd/repos/jetson-containers/data:/data   jetson-inference:r36.4.tegra-aarch64-cu126-22.04-jetson-inference     /bin/bash -c '/bin/bash /test/test_utils.sh' 2>&1 | tee /ssd/repos/jetson-containers/logs/20250904_235834/test/25-1_jetson-inference_r36.4.tegra-aarch64-cu126-22.04-jetson-inference_test_utils.sh.txt; exit ${PIPESTATUS[0]}' returned non-zero exit status 1.
-- Error:  return code 1
V4L2_DEVICES:  --device /dev/video0  --device /dev/video1  --device /dev/video2  --device /dev/video3  --device /dev/video4  --device /dev/video5  --device /dev/video6  --device /dev/video7 
### DISPLAY environmental variable is already set: ":1"
localuser:root being added to access control list
### ARM64 architecture detected
### Jetson Detected
SYSTEM_ARCH=tegra-aarch64
+ docker run --runtime nvidia --env NVIDIA_DRIVER_CAPABILITIES=compute,utility,graphics -it --rm --network host --shm-size=8g --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /ssd/repos/jetson-containers/data:/data -v /etc/localtime:/etc/localtime:ro -v /etc/timezone:/etc/timezone:ro --device /dev/snd -e PULSE_SERVER=unix:/run/user/1000/pulse/native -v /run/user/1000/pulse:/run/user/1000/pulse --device /dev/bus/usb -e DISPLAY=:1 -v /tmp/.X11-unix/:/tmp/.X11-unix -v /tmp/.docker.xauth:/tmp/.docker.xauth -e XAUTHORITY=/tmp/.docker.xauth --device /dev/video0 --device /dev/video1 --device /dev/video2 --device /dev/video3 --device /dev/video4 --device /dev/video5 --device /dev/video6 --device /dev/video7 --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-7 --device /dev/i2c-9 --name jetson_container_20250905_003130
docker: 'docker run' requires at least 1 argument

Usage:  docker run [OPTIONS] IMAGE [COMMAND] [ARG...]

See 'docker run --help' for more information

Hi,

jetson-containers is the source for building the different Docker containers for the various Jetson use cases.

One of the containers is jetson-inference, which depends on JetPack 6.0.
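
If you still want to experiment with that container in the meantime, the failure above is the prebuilt jetson_utils bindings (compiled against NumPy 1.x) meeting NumPy 2.2.6 at runtime. A possible workaround, untested here and exactly what the warning itself suggests, is to pin NumPy below 2 inside the container:

# inside the jetson-inference container (untested; per the NumPy warning above)
pip install 'numpy<2'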

For the latest JetPack, we recommend Jetson AI Lab, which has up-to-date info and works better with recent releases:

Thanks.
