JetPack 4.6 Production Release with L4T 32.6.1

We are pleased to announce JetPack 4.6, a production release supporting Jetson AGX Xavier series, Jetson Xavier NX, Jetson TX2 series, Jetson TX1, and Jetson Nano.

JetPack 4.6 includes support for Triton Inference Server, new versions of CUDA, cuDNN, and TensorRT, and VPI 1.1 with support for new computer vision algorithms and Python bindings.

In addition to the l4t-base container image, new CUDA runtime and TensorRT runtime container images are released on NVIDIA NGC. These include the CUDA and TensorRT runtime components inside the container itself, as opposed to mounting those components from the host, and are built for containerizing AI applications for deployment. Note that the l4t-base container continues to support existing containerized applications that expect it to mount CUDA and TensorRT components from the host.

Highlights of JetPack 4.6 for Jetson Nano are:

  • Enhanced Jetson-IO tools to configure the camera header interface and dynamically add support for a camera using device tree overlays
  • Support for configuring the Raspberry Pi IMX219 or Raspberry Pi High Def IMX477 camera at run time using the Jetson-IO tool on the Jetson Nano 2GB, Jetson Nano, and Jetson Xavier NX developer kits
  • Support for Scalable Video Coding (SVC) H.264 encoding
  • Support for YUV444 8-bit and 10-bit encoding and decoding
  • Directly downloadable links to JetPack and L4T Debian packages for Jetson

Visit JetPack 4.6 and L4T 32.6.1 page for more details.

JetPack 4.6 components:

  • L4T R32.6.1
  • CUDA 10.2
  • cuDNN 8.2.1
  • TensorRT 8.0.1
  • VisionWorks 1.6
  • OpenCV 4.1.1
  • Vulkan 1.2
  • VPI 1.1
  • Nsight Systems 2021.2
  • Nsight Graphics 2021.2
  • Nsight Compute 2019.3

Install JetPack 4.6 using SDK Manager, using the SD card image (for the Jetson Nano 2GB Developer Kit, Jetson Nano Developer Kit, and Jetson Xavier NX Developer Kit), or upgrade via the Debian package management tool (refer to the instructions here).
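The Debian-package upgrade path boils down to pointing apt at the new L4T release and upgrading. A hedged sketch: the repo line below is typical of an L4T install (check `/etc/apt/sources.list.d/nvidia-l4t-apt-source.list` on your device), and the release names are assumptions; the edit is demonstrated on a local copy of the file.

```shell
# Local copy standing in for /etc/apt/sources.list.d/nvidia-l4t-apt-source.list;
# the repo URL and release names are assumptions -- verify against your device.
echo 'deb https://repo.download.nvidia.com/jetson/common r32.5 main' > nvidia-l4t-apt-source.list
sed -i 's/r32.5/r32.6/' nvidia-l4t-apt-source.list   # point apt at the new release
cat nvidia-l4t-apt-source.list
# On the device itself you would then run:
#   sudo apt update && sudo apt dist-upgrade
```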

Notes: (We will update as items below are published)

Refer to the Jetson Roadmap page for the Jetson hardware and software roadmap.

Please register for the following webinars:


Jetson Nano 2GB can't download JetPack 4.6; the other models work.

404 - Not Found

@winit_a I don't see any issue with the 2GB SD card image and am able to download it fine. Is this link not working for you? -

Thank you. The download succeeded.

A couple of things that we noticed:

  • When will you release the L4T 32.6.1 Docker containers on NVIDIA NGC?
  • Currently there is no TX2 NX user guide. Is JetPack 4.6 the production-grade version for TX2 NX?
  • After flashing the TX2 NX with JetPack 4.6, only 250 MB (out of 16 GB) of disk space is left, and plenty of unnecessary tools are installed, like Thunderbird (since it is an Ubuntu desktop image). Is there a way to build a slimmed-down OS with JetPack?

Thank you!



SVC encoding support is very interesting. Do any of the other codecs already support it in older JetPack releases, or is this the first implementation of SVC encoding on the Nano? I read it's supposed to be part of VP8, but it doesn't show up in the documentation I've seen so far.

Triton Inference Server release supporting JetPack 4.6 is live now!

Visit the triton-inference-server/server repository on GitHub.

JetPack 4.6 containers are live now on NGC!
As mentioned in the announcement, we have released two new containers this time: a CUDA runtime container and a TensorRT runtime container. These containers include the CUDA and TensorRT runtime components inside the container itself, as opposed to mounting those components from the host.

L4T base container: NVIDIA NGC
CUDA Runtime Container: NVIDIA NGC
TensorRT Runtime Container: NVIDIA NGC
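Since the runtime containers bundle the CUDA/TensorRT components inside the image, a containerized app can be built directly on top of them with no host mounts. A minimal Dockerfile sketch; the base image tag is an assumption, so look up the exact r32.6.1 tag on NGC:

```dockerfile
# Base image tag is an assumption -- check NGC for the exact l4t-tensorrt tag
FROM nvcr.io/nvidia/l4t-tensorrt:r8.0.1-runtime
# TensorRT runtime libs are already inside the image, so nothing is mounted from the host
COPY my_app /app/my_app
WORKDIR /app
CMD ["./my_app"]
```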

What is the reason for not supporting Jetson Nano in these two features?

  • Image based Over-The-Air update

  • Flashing with initrd

Both would be very valuable for production usage in the field.
My company is using 2000 Jetson Nano devices in production (inside police cars), and having to flash them manually in the field via USB while holding them in Force Recovery mode is a near-impossible task.

Hi, guys. I'm trying to launch the L4T base container and I'm getting this error in my project:
`fatal error: NvInfer.h: No such file or directory`
Using the previous version, I don't have this error, but I need to update some stuff, so I need to change it to r32.6.1.

@adrianokeufw Is the error coming when you launch the L4T base container, or when you are launching your containerized application with the L4T base image as your base image?

When I'm launching my containerized application with the L4T base image as the base image.

I’ve downloaded the source code from , but it looks like it’s corrupted:


bzip2: Data integrity error when decompressing.
        Input file = (stdin), output file = (stdout)

It is possible that the compressed file(s) have become corrupted.
You can use the -tvv option to test integrity of such files.

You can use the `bzip2recover' program to attempt to recover
data from undamaged sections of corrupted files.

tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now

The SHA256 checksum of the file is b86a23f26be6e91927c7af66537004cbff319ab9fc5fd8d284f9d4e90030b9df, and the size that I get after the download is 161774820 bytes.
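For reference, the integrity test bzip2 suggests can be run directly on the downloaded archive. A self-contained demo of the same commands on a freshly created archive (substitute the real tarball's filename):

```shell
# Create a small .tbz2, then verify it the same way you would verify the download
echo "sample" > file.txt
tar -cjf archive.tbz2 file.txt
bzip2 -tvv archive.tbz2 && echo "integrity OK"
sha256sum archive.tbz2          # compare against the published checksum
```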

@jkt1 I just tried downloading and extracting, and I don't see the issue. Can you check again?

@hlacik You can use the Debian-package-based OTA that we support for Jetson developer kits. You will need to customize some packages based on your carrier board and host the packages on your OTA server, but it is possible to update your Jetson Nano using Debian-package-based OTA. How to Install JetPack :: NVIDIA JetPack Documentation
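To self-host packages for that flow, the OTA server only needs to serve a Debian package index over HTTP. A minimal hand-written sketch; the package name and paths are hypothetical, and in practice dpkg-scanpackages would generate the index from your .deb files:

```shell
# Hypothetical package name/paths; dpkg-scanpackages normally builds this index
mkdir -p ota-repo
cat > ota-repo/Packages <<'EOF'
Package: my-jetson-app
Version: 1.0.0
Architecture: arm64
Filename: ./my-jetson-app_1.0.0_arm64.deb
EOF
gzip -kf ota-repo/Packages     # apt fetches Packages.gz
# Devices would then add to their apt sources:
#   deb [trusted=yes] http://<ota-server>/ota-repo ./
```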

Interesting – I've tried twice, from two different computers on two different networks. The sha256 hash (a checksum) is the same each time. On CentOS 7 it extracts fine, whereas on my NixOS machine I get a checksum error. Perhaps a newer tar and bzip2 are more strict?

@adrianokeufw I tried the L4T base container for 32.6.1, launched it using the command below, and then ran the TensorRT samples from within the container, and I did not come across any issue. Are you using the L4T base container for 32.6.1 on a Jetson device flashed with 32.6.1?

sudo docker run -it --rm --net=host --runtime nvidia -e DISPLAY=$DISPLAY -v /tmp/.X11-unix/:/tmp/.X11-unix


I am trying to install Triton Inference Server on a Jetson Nano with JetPack 4.6.

I found this release for my situation.

But I couldn't figure out how I should install it. There are 3 files for Jetson devices:
1. tritonserver2.12.0-jetpack4.6.tgz
2. v2.12.0/v2.12.0_ubuntu2004.clients.tar.gz
3. Source code (tar.gz)

Which one should I install first, and how? Then, how can I deploy my models to the inference server?

Would someone please write out what I should do step by step?
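For reference: the first file is the prebuilt server for Jetson, the second is the client libraries, and the third is the source (only needed if you build it yourself). After extracting the server tarball, Triton mainly needs a model repository directory; a sketch of the layout it expects, where the model name and backend are hypothetical:

```shell
# Hypothetical model; Triton expects <repo>/<model>/<version>/<model file>
mkdir -p model_repository/mymodel/1
touch model_repository/mymodel/1/model.plan        # e.g. a TensorRT engine
cat > model_repository/mymodel/config.pbtxt <<'EOF'
name: "mymodel"
platform: "tensorrt_plan"
max_batch_size: 1
EOF
# After extracting tritonserver2.12.0-jetpack4.6.tgz you would then run:
#   ./bin/tritonserver --model-repository=$(pwd)/model_repository
```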