JetPack 6.2 Brings Super Mode to NVIDIA Jetson Orin Nano and Jetson Orin NX Modules

We are pleased to announce the production release of JetPack 6.2. JetPack 6.2 packages Jetson Linux 36.4.3 with Linux kernel 5.15 and an Ubuntu 22.04-based root file system. JetPack 6.2 brings Super Mode to NVIDIA Jetson Orin Nano and Jetson Orin NX modules. With Super Mode, the Jetson Orin NX series achieves up to a 70% increase in AI TOPS, while the Jetson Orin Nano series delivers comparable AI TOPS improvements alongside a 50% boost in memory bandwidth. The improved performance delivers up to 2x higher generative AI inference performance on Jetson Orin modules. Learn more about the performance boosts with JetPack 6.2 in our latest blog.

Highlights of JetPack 6.2:

  • Support for new reference power modes for Jetson Orin Nano and Jetson Orin NX production modules delivering up to 2x generative AI performance (available with new flashing configuration)
    • NVIDIA Jetson Orin Nano 4GB: Now supports 10W, 25W and MAXN SUPER
    • NVIDIA Jetson Orin Nano 8GB: Now supports 15W, 25W and MAXN SUPER
    • NVIDIA Jetson Orin NX 8GB: Now supports 10W, 15W, 20W, 40W and MAXN
    • NVIDIA Jetson Orin NX 16GB: Now supports 10W, 15W, 25W, 40W and MAXN
  • CVE and minor bug fixes

NOTE 1: MAXN SUPER is an uncapped power mode that allows the highest number of cores and the highest clock frequencies for the CPU, GPU, DLA, PVA, and SoC engines. If the total module power exceeds the Thermal Design Power (TDP) budget in this mode, the module is throttled to lower frequencies, delivering lower performance while staying within the thermal budget. We strongly recommend building a custom power mode to find the right balance between power consumption (or thermal stability) and performance for your application.
NOTE 2: The Jetson Orin NX module should not be used in Super Mode with the Jetson Orin Nano Developer Kit, as the kit is not designed to handle the thermal requirements of this configuration.

Installing & enabling Super Mode on JetPack 6.2:
You can install JetPack 6.2 with any of the methods below:

  • SDK Manager: You can do a fresh install of JetPack 6.2 using SDK Manager.
    Note that SDK Manager supports the default Super Mode flashing configuration only for the Jetson Orin Nano Developer Kit.
  • SD Card: If you are using Jetson Orin Nano Developer Kit, follow the instructions below:
    • For Jetson Orin Nano Developer Kit currently running JetPack 6.x: You can download the SD Card image from the JetPack SDK page and use Balena Etcher to prepare the SD Card with JetPack 6.2. Follow the instructions provided in the Getting Started Guide.
    • For any other Jetson Orin Nano Developer Kit (a fresh unboxed unit or a unit currently running JetPack 5.x): The factory-installed firmware on the Jetson Orin Nano Developer Kit supports JetPack 5.x and requires an update to ensure compatibility with JetPack 6.x. Update the firmware by following the instructions in the Initial Setup Guide for Jetson Orin Nano Developer Kit before installing JetPack 6.2.
  • Debian Package: If you already have JetPack 6 installed on a Jetson AGX Orin Developer Kit or Jetson Orin Nano Developer Kit, you can upgrade to JetPack 6.2 using APT. Refer to the upgrade steps in the NVIDIA JetPack SDK developer guide.
    • Upgrading from JetPack 6.0/6.1 to JetPack 6.2 using this method will not enable Super Mode, as those source builds do not support Super Mode.
    • Upgrading from JetPack 6.1 Rev1, which supports Super Mode, to JetPack 6.2 will enable Super Mode on the Jetson Orin Nano Developer Kit.
  • Manual Flashing: If you prefer to install from the command line, you can flash the Jetson device from a Linux host by following the steps here. Once Jetson Linux is flashed, you can install the compute stack using SDK Manager (from the Linux host) or by running “sudo apt update” followed by “sudo apt install nvidia-jetpack” on the Jetson. The new power modes are available only with the new flashing configuration. Note that the default flashing configuration has not changed, so you must flash with the configuration listed below to enable the new power modes.
    jetson-orin-nano-devkit-super.conf
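
For reference, a hypothetical sketch of what the manual flash invocation would look like. Only the configuration name comes from this announcement; the Linux_for_Tegra directory, the flash.sh script, and the “internal” storage target are assumptions based on typical Jetson Linux releases. The flash lines are commented out because they require a device attached in recovery mode:

```shell
# Hedged sketch of flashing the Super configuration from a Linux host.
# Directory layout and storage target are assumptions, not from the post.
BOARD=jetson-orin-nano-devkit-super
# cd Linux_for_Tegra                    # extracted Jetson Linux release dir
# sudo ./flash.sh "${BOARD}" internal   # device must be in recovery mode
echo "flash config: ${BOARD}.conf"
```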

After flashing or updating to JetPack 6.2, run the following command to enable the newly available ‘Super’ power modes:

MAXN SUPER mode on Jetson Orin Nano Modules

sudo nvpmodel -m 2

MAXN SUPER mode on Jetson Orin NX Modules

sudo nvpmodel -m 0

You can also select MAXN SUPER and other power modes from the power mode menu in the top right corner of the GUI.
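
To confirm which mode is actually active after switching, nvpmodel can report the current power mode. A small sketch (the guard is only there so the snippet degrades gracefully when run off-device; nvpmodel exists only on the Jetson itself):

```shell
# Sketch: verify the active power mode after "sudo nvpmodel -m ...".
if command -v nvpmodel >/dev/null 2>&1; then
    MODE="$(sudo nvpmodel -q)"   # reports the current power mode name and ID
else
    MODE="nvpmodel not found (not running on a Jetson)"
fi
echo "$MODE"
```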

JetPack 6.2 Components:

SDK Support:

  • DeepStream 7.1
  • Isaac ROS 3.2 (coming in Jan ‘25)
  • Holoscan 2.9 (coming in Jan ‘25)

Containers: Containers will soon be available for JetPack 6.2. We will update this announcement as containers are made available.

JetPack 6 Resources:

Some issues encountered. First, JetPack 6.2 installed properly via the latest SD card image. The firmware updated correctly on its own after a few reboots. The MAXN power setting is available. All great. A summary of my system is here: system_info.txt (2.0 KB)

Issues encountered with Hello AI World. I attempted to build the project from source.

Problem #1: PyTorch and torchvision did not install properly. The error messages from Python are as follows:

sky@jet:~$ python3
Python 3.10.12 (main, Nov  6 2024, 20:22:13) [GCC 11.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.10/dist-packages/torch/__init__.py", line 235, in <module>
    from torch._C import *  # noqa: F403
ImportError: libcudnn.so.8: cannot open shared object file: No such file or directory
>>> import torchvision
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'torchvision'
>>> 
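
The ImportError above suggests a cuDNN version mismatch: the installed PyTorch wheel is linked against libcudnn.so.8, which the dynamic linker cannot find on this system. As a diagnostic sketch (not an official fix), you can check which cuDNN runtimes the linker can actually resolve:

```shell
# Diagnostic sketch: list the libcudnn versions registered with the linker.
# If libcudnn.so.8 is absent, the wheel and the system cuDNN are mismatched.
if command -v ldconfig >/dev/null 2>&1; then
    RESULT="$(ldconfig -p | grep -i libcudnn || echo 'no libcudnn registered with the linker')"
else
    RESULT="ldconfig not available on this system"
fi
echo "$RESULT"
```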

Problem #2: the image classification tools are not working properly. None of the network models based on Caffe are working. This is understood, because the latest version of TensorRT does not support Caffe models. I did try the “resnet18-tagging-voc” network, which is not Caffe but rather an ONNX model, but that fails too with a “Segmentation fault (core dumped)” error. Full Jetson output is here:
image_classification_errors.txt (950.3 KB)

Questions:
(1) Are pip wheel installers for PyTorch and torchvision available for JetPack 6.2 (36.4.3)?
(2) When will ONNX network versions be available for image classification and object detection?
(3) A segmentation fault is occurring with ONNX models. TensorRT 10.x is supposed to support ONNX. When will this be reconciled?

Hello AI World has been a potent learning tool for me ever since my first Jetson Nano, up through the Orin Nano devkit with JetPack 6.0. Now, with the reduced Orin Nano price and SparkFun-type people like myself buying them up (I own 4 Orin Nanos), I think it's critical that the Hello AI World project work properly, as it has in the recent past. Please prioritize this.


@shmaheshwari Hi,
Will JetPack 5.1.x support Super Mode?
e.g. 5.1.4

I am unable to install JetPack 6.2. I had 6.1 working on the Orin Nano, but when I tried to boot from a 6.2 SD card I only got a UEFI shell. I changed the boot order to use the SD card first, but it was no help. I also tried using SDK Manager from an Ubuntu box. It recognized my Orin Nano, updated itself, and offered to install JetPack 6.2, but all I got were download errors. Any suggestions?

I am also struggling to get PyTorch and torchvision up and running. Did you find a solution?


Step 2 in this link: Yolo11 support has instructions for installing PyTorch 2.5 and torchvision 0.20.0.

Not sure where the official Orin Nano devkit wheels reside, though.
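
Whichever wheel source turns out to be official, the install can be sanity-checked afterwards with a short import probe. A sketch (the versions printed and CUDA availability depend entirely on the wheel actually installed):

```shell
# Sketch: verify torch/torchvision after installing the wheels. Prints the
# versions and CUDA visibility, or the import error if the install failed.
if command -v python3 >/dev/null 2>&1; then
    CHECK="$(python3 - <<'PY'
try:
    import torch, torchvision
    print("torch", torch.__version__, "cuda available:", torch.cuda.is_available())
    print("torchvision", torchvision.__version__)
except Exception as exc:
    print("import failed:", exc)
PY
)"
else
    CHECK="python3 not found"
fi
echo "$CHECK"
```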