JetPack 6.2.1/Jetson Linux 36.4.4 is now live

We are pleased to announce the production release of JetPack 6.2.1. JetPack 6.2.1 packages Jetson Linux 36.4.4 with Linux Kernel 5.15 and an Ubuntu 22.04-based root file system. JetPack 6.2.1 is a minor upgrade over JetPack 6.2. This release supports all NVIDIA Jetson Orin modules and developer kits.

What’s new in JetPack 6.2.1

  • Added support for using Hardware Security Module (HSM) to sign boot images.
  • Resolved minor bugs, improving the success rate of flashing with SDK Manager.
  • Addressed compatibility issues between JetPack 6.2 and the latest Docker release (v28.0.x).

You can install JetPack 6.2.1 with any of the methods below:

  • SDK Manager: You can do a fresh install of JetPack 6.2.1 using SDK Manager.
  • Debian Package: If you already have JetPack 6 installed on a Jetson AGX Orin Developer Kit or Jetson Orin Nano Developer Kit, you can upgrade to JetPack 6.2.1 using APT. Refer to the steps here.
  • SD Card: If you are using the Jetson Orin Nano Developer Kit, you can download the SD card image from the JetPack 6.2.1 page and use Balena Etcher to prepare the SD card with JetPack 6.2.1.
  • Manual Flashing: If you prefer to install using the command line, you can flash a Jetson device from a Linux host by following the steps here. Once Jetson Linux is flashed, you can install the compute stack using SDK Manager (from the Linux host) or by running “sudo apt update” followed by “sudo apt install nvidia-jetpack” on the Jetson.
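For the manual-flashing route, the on-device part of the install reduces to the two commands quoted above, run on the Jetson itself after Jetson Linux has been flashed (a sketch of the standard APT workflow; it assumes the Jetson has network access and the NVIDIA APT repositories configured by the flash):

```shell
# Run these on the Jetson after flashing Jetson Linux 36.4.4.
# Refresh the package index so APT sees the JetPack 6.2.1 packages...
sudo apt update
# ...then pull in the full JetPack compute stack (CUDA, cuDNN, TensorRT, etc.)
sudo apt install nvidia-jetpack
```

The `nvidia-jetpack` metapackage pulls in the component packages, so a single install command brings in the whole compute stack.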

Note: JetPack 6.2.1 packages the same libraries, components and supports the same SDKs as JetPack 6.2.

JetPack 6.2.1 / Jetson Linux 36.4.4 resources
JetPack 6.2.1 SDK Page
Jetson Linux 36.4.4 Page
Jetson Linux 36.4.4 Release Notes
Jetson Linux 36.4.4 Developer Guide


This is great but it’s killed pytorch for me. Has anyone got a wheel to install please?
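One quick way to see what the upgrade actually broke is to check whether the installed PyTorch wheel still imports and still sees the GPU (a minimal diagnostic sketch, not a fix; `torch.cuda.is_available()` is standard PyTorch API, and the script also handles the case where no wheel is installed at all):

```shell
# Diagnostic only: report the installed torch version and whether its
# CUDA backend can see the Jetson's GPU. Prints a message either way.
python3 - <<'EOF'
try:
    import torch
    print("torch", torch.__version__, "CUDA available:", torch.cuda.is_available())
except ImportError:
    print("torch is not installed")
EOF
```

If this prints `CUDA available: False` after the upgrade, the wheel was likely built against a different CUDA/Jetson Linux version than 36.4.4 and needs to be replaced with a matching build.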


Just asking: does this mean we can finally install SDK Manager on Windows 10/11 machines, and then use it to set up our Jetson Orin Nano Super device now?

Have you fixed it? It drives me crazy and I don't know how to use PyTorch now.

No, I gave up for a bit. To be honest I am a hobbyist, not a dev, and whilst my Linux skills are OK, it has taken a long time to get some things working. I did get Open WebUI running with Ollama and GPU support by installing both in Docker. Before that I had Ollama running natively and could not get this to work. I have set up a Cloudflare tunnel and can now access an LLM running on my Jetson at home remotely. I just play a bit and try to move forward whenever I have time.