JetPack 6.0 Production Release - Announcement

We are pleased to announce the production release of JetPack 6.0. JetPack 6 packages Jetson Linux 36.3 with Linux kernel 5.15 and an Ubuntu 22.04-based root file system. The Jetson AI stack in JetPack 6 includes CUDA 12.2, cuDNN 8.9, TensorRT 8.6, and VPI 3.1. This release supports all NVIDIA Jetson Orin modules and developer kits.

Highlights of JetPack 6:

  • Production ready software stack
  • Preview of the upgradable compute stack, which lets you upgrade CUDA, TensorRT, cuDNN, DLA, and VPI without upgrading Jetson Linux. Refer to the JetPack documentation for running the compute stack from this production release on the JetPack 6 Developer Preview.
  • Pre-built kernel Debian package with PREEMPT_RT enabled.
  • Support for upgrading Jetson Orin Nano Developer Kit from JetPack 5 to JetPack 6 without needing a host machine.
  • Over-The-Air (OTA) Update
  • Security
    • Enabled support for security features, bringing parity with JetPack 5.
  • Power And Performance
    • Enabled support for power features, bringing parity with JetPack 5.
    • Power Estimator supported for JetPack 6.
  • Display
    • Support for Framebuffer console
    • Support for suspend/resume in Display Core Engine (DCE)
  • Multimedia
    • Support for H.264 Constrained_Baseline and Constrained_High profiles
    • Dynamic bit rate and dynamic frame rate support for the AV1 encoder
    • Dynamic Resolution Change (DRC) support for the H.264, H.265, and AV1 encoders
    • UYVY BT.709 and BT.2020 colorimetry support for video transform

JetPack 6 now includes Jetson Platform Services: a collection of ready-to-use services to accelerate AI application development on Jetson. Read more about it on the JetPack product page. These services will be made available on JetPack 6 soon (end of May) and can be easily installed via Debian package, or via SDK Manager once available.

Installation:

You can install JetPack 6 with any of the methods below:

  • SDK Manager: You can do a fresh install of JetPack 6 using SDK Manager. It supports installing JetPack 6 to eMMC (for Jetson AGX Orin) or to an NVMe SSD.
  • Debian Package: If you already have the JetPack 6 Developer Preview installed on Jetson AGX Orin Developer Kit or Jetson Orin Nano Developer Kit, you can upgrade to JetPack 6 using APT. Refer to the steps here.
  • SD Card: If you are using Jetson Orin Nano Developer Kit, you can download the SD Card image from JetPack 6 page and use Balena Etcher to prepare the SD Card with JetPack 6.
  • Manual Flashing: If you prefer the command line, you can flash the Jetson device from a Linux host by following the steps here. Once Jetson Linux is flashed, you can install the compute stack using SDK Manager (from the Linux host) or by running “sudo apt update” followed by “sudo apt install nvidia-jetpack” on the Jetson.
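As a concrete sketch of the last option, the on-device compute-stack installation boils down to the two APT commands quoted above, run on the Jetson after Jetson Linux has been flashed:

```shell
# Run on the Jetson itself, after Jetson Linux 36.3 has been flashed.
# The nvidia-jetpack metapackage (named in the text above) pulls in the
# JetPack compute stack: CUDA, cuDNN, TensorRT, VPI, and related packages.
sudo apt update
sudo apt install -y nvidia-jetpack
```

Note that nvidia-jetpack is a metapackage; depending on your needs, smaller subsets of the stack may also be installable individually via APT.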

For Jetson Orin Nano Developer Kit Users:

If you are currently using JetPack 5 on Jetson Orin Nano Developer Kit and want to upgrade to JetPack 6 without using a Linux host, please follow the instructions here.

JetPack 6.0 Components:

  • Jetson Linux 36.3
  • CUDA 12.2
  • TensorRT 8.6.2
  • cuDNN 8.9.4
  • VPI 3.1
  • Vulkan 1.3
  • Nsight Systems 2024.2
  • Nsight Graphics 2023.4

SDK Support:

  • DeepStream 7.0 (coming in May)
  • Isaac ROS 3.0 (coming in May)
  • Holoscan 2.0

Containers:

The rest of the containers will be made live soon.

JetPack 6 Resources:

Quick Reference for manual flashing commands:

  1. Jetson AGX Orin Industrial - eMMC used for rootfs (using Jetson AGX Orin Developer Kit carrier board):
sudo ./flash.sh jetson-agx-orin-devkit-industrial internal
  2. Jetson AGX Orin Industrial - NVMe used for rootfs (using Jetson AGX Orin Developer Kit carrier board):
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_external.xml --showlogs --network usb0 jetson-agx-orin-devkit-industrial internal
  3. Jetson AGX Orin Developer Kit - eMMC used for rootfs:
sudo ./flash.sh jetson-agx-orin-devkit internal
  4. Jetson AGX Orin Developer Kit - NVMe used for rootfs:
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_external.xml --showlogs --network usb0 jetson-agx-orin-devkit internal
  5. Jetson Orin NX 16 GB - NVMe used for rootfs (using Jetson Orin Nano Developer Kit carrier board):
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p '-c bootloader/generic/cfg/flash_t234_qspi.xml' --showlogs --network usb0 jetson-orin-nano-devkit internal
  6. Jetson Orin NX 8 GB - NVMe used for rootfs (using Jetson Orin Nano Developer Kit carrier board):
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p '-c bootloader/generic/cfg/flash_t234_qspi.xml' --showlogs --network usb0 jetson-orin-nano-devkit internal
  7. Jetson Orin Nano Developer Kit - NVMe used for rootfs:
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p '-c bootloader/generic/cfg/flash_t234_qspi.xml' --showlogs --network usb0 jetson-orin-nano-devkit internal
  8. Jetson Orin Nano 4 GB - NVMe used for rootfs (using Jetson Orin Nano Developer Kit carrier board):
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p '-c bootloader/generic/cfg/flash_t234_qspi.xml' --showlogs --network usb0 jetson-orin-nano-devkit internal
  9. Jetson Orin Nano Developer Kit - SD Card used for rootfs:
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device mmcblk0p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p '-c bootloader/generic/cfg/flash_t234_qspi.xml' --showlogs --network usb0 jetson-orin-nano-devkit internal
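A note on running the commands above: they are executed on the Linux host from the top level of the extracted Jetson Linux BSP (conventionally the Linux_for_Tegra directory), with the Jetson connected over USB and in Force Recovery mode. A minimal sketch, using one of the commands from the list above:

```shell
# Run on the Linux host, from the extracted Jetson Linux 36.3 BSP.
# "Linux_for_Tegra" is the conventional top-level BSP directory name;
# adjust the path to wherever you extracted the release.
cd Linux_for_Tegra

# Example (command 7 above): flash a Jetson Orin Nano Developer Kit
# with the rootfs on NVMe, device connected in Force Recovery mode.
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 \
  -c tools/kernel_flash/flash_l4t_t234_nvme.xml \
  -p '-c bootloader/generic/cfg/flash_t234_qspi.xml' \
  --showlogs --network usb0 jetson-orin-nano-devkit internal
```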

Jetson containers are now available on NGC:

Jetpack: NVIDIA L4T JetPack | NVIDIA NGC

CUDA: https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-cuda (CUDA versions did not change with the production release; the container with a tag starting with “12.2.12” can be used)

TensorRT: NVIDIA L4T TensorRT | NVIDIA NGC (TensorRT versions did not change with the production release; the container with a tag starting with “8.6.2” can be used)

TensorFlow: TensorFlow | NVIDIA NGC (use the tag starting with “24.04”)

PyTorch: PyTorch | NVIDIA NGC (use the tag starting with “24.04”)

Cross Compilation Container: JetPack Cross Compilation container | NVIDIA NGC

Jetson Linux Flash Container: Jetson Linux Flash container | NVIDIA NGC

Announcement: There will be no more releases of the L4T base container. As our containers evolved, the L4T base container became mostly just an Ubuntu root file system, so going forward, please use the CUDA or TensorRT container as your base container.
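For example, here is a minimal sketch of switching a base image from l4t-base to the CUDA container. The exact tag suffix used below is an assumption inferred from the “12.2.12” tag prefix mentioned above; check the NGC catalog for the actual published tags.

```shell
# Pull the L4T CUDA container from NGC to use as a base image.
# NOTE: the full tag "12.2.12-runtime" is an assumption based on the
# "12.2.12" prefix mentioned in this post; verify the tag on NGC.
docker pull nvcr.io/nvidia/l4t-cuda:12.2.12-runtime

# In a Dockerfile, replace a FROM line that used l4t-base, e.g.:
#   FROM nvcr.io/nvidia/l4t-cuda:12.2.12-runtime
# On the Jetson, run containers with the NVIDIA container runtime:
docker run -it --rm --runtime nvidia nvcr.io/nvidia/l4t-cuda:12.2.12-runtime
```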