JetPack 6.0 Developer Preview - Release Announcement

We are very pleased to announce the release of JetPack 6.0 Developer Preview (DP). JetPack 6 comes with LTS Kernel 5.15 and an Ubuntu 22.04-based root file system. JetPack 6 also brings new capabilities to the NVIDIA Jetson platform for edge AI and robotics that were not possible before, including:

  • Flexibility to Bring Your Own Kernel: Our commitment to upstreaming Jetson changes to the upstream Linux kernel now makes it possible to run the latest Linux kernel on Jetson. We have provided a recipe for bringing your own kernel in our Jetson Linux documentation. Stay tuned for further enhancements that will make the recipe even simpler to use moving forward.
  • Greater Choice of Linux-Based Distros: Our upstreaming effort has also enabled other Linux-based distributions to offer their operating systems on Jetson. Though Jetson Linux is the out-of-the-box Ubuntu-based distro packaged in JetPack, a number of Linux-based distributions are available from our partners. Please check the JetPack 6 Developer Preview page for details.
  • Upgradable Compute Stack: We are targeting new capabilities that will provide the flexibility to upgrade the AI compute stack to the latest versions without upgrading the Jetson Linux BSP. Planned for release in March 2024, this will allow AI developers to move to the latest compute stack without needing to upgrade the whole JetPack.

Along with the above new features, JetPack 6 Developer Preview also packages CUDA 12.2, TensorRT 8.6.2, cuDNN 8.9.4, and VPI 3.0.

Visit the JetPack 6 Developer Preview page and Jetson Linux page to learn how to easily install JetPack 6.

Please note that OTA features are not supported in this release. Do not attempt an apt upgrade to the JetPack 6 Developer Preview release.

Please note that JetPack 6.0 DP is a developer preview release and is not intended for production. It is ready for you to start development on Jetson Orin with the JetPack 6 software stack. This release does not include security or OTA features. The production-quality release of JetPack 6.0 is targeted for March 2024 and will include security and OTA features along with a production-quality stack. Please read the JetPack 6.0 Developer Preview release notes carefully for known issues and other details.

JetPack 6 Components:

  • Jetson Linux 36.2
  • CUDA 12.2
  • TensorRT 8.6.2
  • cuDNN 8.9.4
  • VPI 3.0
  • Vulkan 1.3
  • Nsight Systems 2023.4
  • Nsight Graphics 2023.3

JetPack 6 Resources:


Jetson containers are live now:

L4T Base: NVIDIA L4T Base | NVIDIA NGC
CUDA (Runtime and Devel) : NVIDIA L4T CUDA | NVIDIA NGC
TensorRT (Runtime and Devel): https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-tensorrt
TensorFlow: TensorFlow | NVIDIA NGC (use the tag: 23.11-tf2-py3-igpu)
PyTorch: PyTorch | NVIDIA NGC (use the tag: 23.11-py3-igpu)
JetPack: NVIDIA L4T JetPack | NVIDIA NGC
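These images can be pulled directly with Docker. A quick sketch (the l4t-jetpack image and r36.2.0 tag correspond to this release, but check the NGC page for the current tag for your L4T version):

```shell
# Pull the JetPack container matching Jetson Linux 36.2 (JetPack 6.0 DP).
# The tag is illustrative; confirm the current one on NGC.
sudo docker pull nvcr.io/nvidia/l4t-jetpack:r36.2.0
```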

Will there be an update of the out-of-tree drivers so that kernel 6.0 and newer can be compiled before the final release in March?

@dusty_nv I received the JetPack 6 update from apt without any installation problems, including some kernel updates.
Have the out-of-tree modules been updated? I ran into a lot of errors trying to build kernel 6.6.

Build errors with the latest Linux kernels are common and we are continuously working to fix these errors. However, there will always be a lag between new Linux kernels being released and the updates to the NVIDIA drivers being rolled out. The good news is that we have been working to fix the errors you encountered with v6.6 (and later kernels) and the latest drivers are now available on public git repositories. I have responded to the above thread with more details.


Thank you so much.
Ubuntu 24.04 is coming with kernel 6.8.

So, thank you for your effort, I hope to update jetson in May with new canonical 24.04 version.
Now, I will try to compile 6.6 or 6.7.

Hi Suhash, I’m new to using the Jetson AGX Orin SDK. If we have several vision AI models to try out, do you recommend using Jetson containers or Ubuntu virtual environments to isolate the projects?

How to build a Jetson Container?

Thanks,
Feng

It depends, but Docker is the cleaner option.
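A minimal sketch of the Docker route on Jetson, assuming one of the NGC images listed above (the image tag and mount path are illustrative):

```shell
# Run a JetPack container with GPU access via the nvidia runtime.
# Mount a host directory so project files survive the container.
sudo docker run --runtime nvidia -it --rm \
    -v $HOME/models:/workspace/models \
    nvcr.io/nvidia/l4t-jetpack:r36.2.0
```

Each project can then live in its own container, which keeps CUDA/TensorRT dependencies isolated per project.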

Thanks! Did you mean that a Docker container is cleaner?

Are the virtual environments compatible with the Jetson GPU cores?

Thanks for the release! I’ve got Jetpack 6 working on my Jetson Orin Nano, but when I go to run one of the Docker examples, it doesn’t seem as though the docker daemon is running. Is this expected? Is there a recommended way to install Docker with GPU acceleration on Jetpack 6 yet?

@burningion the SD card image should already come with Docker installed and pre-configured with the nvidia runtime. Or if you flashed with SDK Manager, then SDK Manager will install it during the post-install steps after it installs CUDA Toolkit, cuDNN, etc.

If you run sudo docker info, do you see nvidia listed under the runtimes? Which container are you trying to run?

$ sudo docker info
Client: Docker Engine - Community
 Version:    24.0.7
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.11.2
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.21.0
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 13
  Running: 1
  Paused: 0
  Stopped: 12
 Images: 3084
 Server Version: 24.0.7
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 nvidia runc
 Default Runtime: nvidia
 Init Binary: docker-init
 containerd version: d8f198a4ed8892c764191ef7b3b06d8a2eeb5c7f
 runc version: v1.1.10-0-g18a0cb0
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 5.15.122-tegra
 Operating System: Ubuntu 22.04.3 LTS
 OSType: linux
 Architecture: aarch64
 CPUs: 12
 Total Memory: 61.37GiB
 Name: jao-60
 ID: ccf2066a-b97e-40a8-b4a1-b56147c3dadb
 Docker Root Dir: /mnt/NVME/docker/data/JetPack_6.0_DP_b32/data
 Debug Mode: false
 Username: dustynv
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false

Hey Dusty, thanks for the quick response. Here’s what I get:

Client: Docker Engine - Community
 Version:    25.0.2
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.12.1
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.24.5
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 0
  Running: 0
  Paused: 0
  Stopped: 0
 Images: 0
 Server Version: 25.0.2
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 nvidia runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: ae07eda36dd25f8a1b98dfbf587313b99c0190bb
 runc version: v1.1.12-0-g51d5e94
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 5.15.122-tegra
 Operating System: Ubuntu 22.04.3 LTS
 OSType: linux
 Architecture: aarch64
 CPUs: 6
 Total Memory: 7.442GiB
 Name: ubuntu
 ID: 92b6f3da-b2f2-43b7-a47d-9963a9f0242a
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false

I’m trying to build the NanoOWL container (GitHub - NVIDIA-AI-IOT/nanoowl: A project that optimizes OWL-ViT for real-time inference with NVIDIA TensorRT.). It seems I need a specific commit of PyTorch to run it from the container tag 23-01.

If it helps, I installed with the JetPack process on a Linux desktop machine, onto a Jetson-mounted NVMe SSD.

Hey Dusty, I think I found it: by default the Docker socket requires root permissions, and the default created user doesn’t have them. A simple sudo fixes it. Thanks for the quick :rubber_duck:!

OK, great @burningion. You can also add your user to the docker group so that you don’t need sudo:
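A minimal sketch of the usual commands (this is standard Docker setup, not Jetson-specific):

```shell
# Add the current user to the docker group, then refresh group
# membership without logging out (or simply log out and back in).
sudo usermod -aG docker $USER
newgrp docker
docker info   # should now work without sudo
```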

(there are other tidbits in there that you might find helpful in setting up your Jetson)
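One tidbit worth calling out: in the two docker info dumps above, one machine shows Default Runtime: nvidia and the other runc. Making nvidia the default (so --runtime nvidia is not needed on every run) is an /etc/docker/daemon.json setting; a sketch of what the SD card image and SDK Manager typically configure:

```json
{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}
```

After editing the file, restart the daemon with sudo systemctl restart docker.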

Also, I have prebuilt JetPack 6 container images for NanoOWL here:

Amazing, thanks so much for all the work and quick response Dusty.


I would like to install JetPack 6 on my AGX Orin, but I cannot get it into recovery mode. I hold the recovery (middle) button while I press and release the reset (right) button. The screen goes blank for a couple of minutes, and then shows a normal boot screen. I connected it to a PC with Ubuntu, but lsusb fails to detect the AGX Orin.

Is there something else I should be doing? Do I need to connect a jumper or anything like that? Is there supposed to be a screen display showing that it is in recovery mode?

Unplug it from the power supply.
Connect it via USB-C to the host computer.
Press and hold the middle button while plugging it back into the power supply.
Run the NVIDIA SDK Manager.
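To confirm the board actually entered recovery mode before launching SDK Manager, you can check from the host. NVIDIA's USB vendor ID is 0955, and a Jetson in recovery mode enumerates as an "APX" device (the exact product ID varies by module):

```shell
# Check whether a Jetson in recovery mode is visible to the host.
# Vendor ID 0955 is NVIDIA; in recovery mode the device appears
# as "NVIDIA Corp. APX" in the lsusb listing.
if lsusb | grep -qi "0955.*NVIDIA"; then
    echo "Jetson detected in recovery mode"
else
    echo "Not detected; re-check the button sequence and USB-C cable"
fi
```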

When will the JetPack 6.0 stable release be available?

Hi,
Is there a way to install JetPack 6 using a MacBook M1 as the host?
The SDK Manager fails to install in an Ubuntu VM, even with Rosetta 2.
Any help would be appreciated.
Best,