We are pleased to announce JetPack 4.6, a production release supporting Jetson AGX Xavier series, Jetson Xavier NX, Jetson TX2 series, Jetson TX1, and Jetson Nano.
JetPack 4.6 includes support for Triton Inference Server; new versions of CUDA, cuDNN, and TensorRT; and VPI 1.1 with new computer vision algorithms and Python bindings.
In addition to the l4t-base container image, new CUDA runtime and TensorRT runtime container images are released on NVIDIA NGC. These include the CUDA and TensorRT runtime components inside the container itself, as opposed to mounting those components from the host, and are built for containerizing AI applications for deployment. Note that the l4t-base container continues to support existing containerized applications that expect it to mount CUDA and TensorRT components from the host.
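As a sketch of how the new runtime images are used: the tag below is an assumption for the JetPack 4.6 release, so confirm the exact l4t-tensorrt tag on NGC before pulling. The commands are guarded so they are a no-op on a non-Jetson machine.

```shell
# Sketch only: the l4t-tensorrt tag is assumed; check NGC for the
# JetPack 4.6 tag. Guarded to be a no-op off-device.
if [ -f /etc/nv_tegra_release ]; then
    sudo docker pull nvcr.io/nvidia/l4t-tensorrt:r8.0.1-runtime
    # TensorRT libraries live inside this image, so no host mounts are
    # needed, but --runtime nvidia is still required for GPU access.
    sudo docker run -it --rm --runtime nvidia \
        nvcr.io/nvidia/l4t-tensorrt:r8.0.1-runtime
else
    echo "not an L4T system; run this on a Jetson"
fi
```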
Highlights of JetPack 4.6 for Jetson Nano are:
Enhanced Jetson-IO tools to configure the camera header interface and dynamically add support for a camera using device tree overlays
Support for configuring the Raspberry Pi IMX219 or Raspberry Pi High Quality IMX477 camera at run time using the Jetson-IO tool on Jetson Nano 2GB, Jetson Nano, and Jetson Xavier NX developer kits
Support for Scalable Video Coding (SVC) H.264 encoding
Support for YUV444 8-bit and 10-bit encoding and decoding
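The Jetson-IO camera configuration above can be sketched as follows; the paths are as shipped in L4T r32.x, the helper names may vary by release, and the commands are guarded so they do nothing on a non-Jetson machine. A reboot is required for the overlay to apply.

```shell
# Sketch of configuring the camera header with the Jetson-IO tools
# (run on the developer kit itself; guarded to be a no-op elsewhere).
if [ -d /opt/nvidia/jetson-io ]; then
    # Interactive menu: choose the CSI connector configuration there.
    sudo /opt/nvidia/jetson-io/jetson-io.py
    # Or non-interactively: list the supported hardware modules first.
    sudo /opt/nvidia/jetson-io/config-by-hardware.py -l
else
    echo "Jetson-IO not found; run this on a Jetson developer kit"
fi
```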
Currently there is no TX2 NX user guide. Is JetPack 4.6 the production-grade version for TX2 NX?
After flashing TX2 NX with JetPack 4.6 there is 250MB (out of 16GB) of disk space left, and plenty of unnecessary tools are installed, like Thunderbird (since it is an Ubuntu desktop image). Is there a way to build a slimmed-down OS with JetPack?
The containers will be on NGC either today or tomorrow. So very soon!
TX2 NX user guide: do you mean developer kit user guide? TX2 NX is a production module that you can use with Jetson Xavier NX developer kit. So you can follow the Jetson Xavier NX Developer Kit User Guide here: https://developer.nvidia.com/jetson-xavier-nx-developer-kit-user-guide
Yes, JetPack 4.6 is the production-grade version for TX2 NX.
SVC encoding support is very interesting. Do any of the other codecs already support it in older Jetpacks, or is this the first implementation of SVC encoding on the Nano? I read it’s supposed to be part of VP8, but it doesn’t show up in the documentation I’ve seen so far.
JetPack 4.6 containers are live now on NGC!
As mentioned in the announcement, we have released two new containers this time: a CUDA runtime container and a TensorRT runtime container. These containers include the CUDA and TensorRT runtime components inside the container itself, as opposed to mounting those components from the host.
What is the reason for not supporting Jetson Nano in these two features?
Image based Over-The-Air update
Flashing with initrd
Both would be very valuable for production usage in the field.
My company is using 2000 Jetson Nano devices in production (inside police cars), and having to flash them manually in the field via USB while holding them in Force Recovery mode is a near-impossible task.
Hi, guys. I’m trying to launch the L4T base container and I’m getting this error on my project:
`fatal error: NvInfer.h: No such file or directory`
Using the nvcr.io/nvidia/l4t-base:r32.5.0 version, I don't get this error, but I need to update some things, so I need to change it to r32.6.1.
@AdrianoSantosPB is the error coming when you launch the L4T base container, or when you are launching your containerized application with the L4T base image as your base image?
...
Linux_for_Tegra/source/public/gst-nvarguscamera_src.tbz2
Linux_for_Tegra/source/public/gstjpeg_src.tbz2
bzip2: Data integrity error when decompressing.
Input file = (stdin), output file = (stdout)
It is possible that the compressed file(s) have become corrupted.
You can use the -tvv option to test integrity of such files.
You can use the `bzip2recover' program to attempt to recover
data from undamaged sections of corrupted files.
tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now
The SHA256 checksum of the file is b86a23f26be6e91927c7af66537004cbff319ab9fc5fd8d284f9d4e90030b9df, and the size that I get after the download is 161774820 bytes.
Interesting – I’ve tried twice, from two different computers using two different networks, and the SHA-256 hash is the same each time. On CentOS 7 it extracts fine, whereas on my NixOS machine I get a data integrity error. Perhaps newer tar and bzip2 releases are stricter?
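A quick way to check whether a downloaded .tbz2 is intact before blaming the extraction tools is to test the bzip2 stream and list the tar contents without extracting. Demonstrated below on a throwaway archive; substitute the real file name (e.g. the downloaded sources tarball) in practice.

```shell
# Build a small throwaway .tbz2 so the checks can be demonstrated.
echo "hello" > sample.txt
tar -cjf sample.tbz2 sample.txt

# 1) Record the checksum to compare against a second download.
sha256sum sample.tbz2

# 2) Test bzip2 stream integrity without extracting anything.
bzip2 -tvv sample.tbz2 && echo "bzip2 stream OK"

# 3) Dry-run the tar listing; a truncated download fails here.
tar -tjf sample.tbz2
```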
@AdrianoSantosPB I tried the l4t-base container for 32.6.1, launched it using the command below, and then ran the TensorRT samples from within the container, and I did not come across any issue. Are you using the l4t-base container for 32.6.1 on a Jetson device flashed with 32.6.1?
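The command itself did not survive in the post above; as a sketch (not the poster's actual command), a typical launch of the r32.6.1 base image looks like the following. The `--runtime nvidia` flag is what makes the NVIDIA container runtime mount the host's CUDA and TensorRT into l4t-base; without it, headers such as NvInfer.h are not found. Guarded so it is a no-op off-device.

```shell
# Typical l4t-base launch on a Jetson flashed with r32.6.1 (a sketch,
# not the poster's exact command). --runtime nvidia mounts the host's
# CUDA and TensorRT components into the container.
if [ -f /etc/nv_tegra_release ]; then
    sudo docker run -it --rm --net=host --runtime nvidia \
        nvcr.io/nvidia/l4t-base:r32.6.1
else
    echo "not an L4T system; run this on a Jetson"
fi
```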
I am trying to install Triton Inference Server on jetson nano, jetpack 4.6.
I found this release for my situation.
But I couldn't figure out how to install it. There are three files for the Jetson device:
1- tritonserver2.12.0-jetpack4.6.tgz
2- v2.12.0/v2.12.0_ubuntu2004.clients.tar.gz
3- Source code (tar.gz)
Which one should I install first, and how? Then, how can I deploy my models to the inference server?
Would someone please explain what I should do, step by step?
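A rough sequence for the question above, as a sketch: the model name `mymodel` and the paths are placeholders, not from the release notes, and a real `config.pbtxt` typically also declares inputs and outputs. The download/extract and launch steps are guarded so the sketch is a no-op when the tarball or binary is absent.

```shell
# 1) Unpack the Jetson release tarball (it contains bin/tritonserver
#    and its libraries). Skipped here if the file is not present.
[ -f tritonserver2.12.0-jetpack4.6.tgz ] && \
    mkdir -p tritonserver && \
    tar -xzf tritonserver2.12.0-jetpack4.6.tgz -C tritonserver

# 2) Lay out a model repository in the layout Triton expects:
#    <repo>/<model-name>/<version>/<model-file> plus a config.pbtxt.
mkdir -p model_repository/mymodel/1
cat > model_repository/mymodel/config.pbtxt <<'EOF'
name: "mymodel"
platform: "tensorrt_plan"
max_batch_size: 1
EOF
# Copy your serialized model into the version directory, e.g.:
# cp mymodel.plan model_repository/mymodel/1/model.plan

# 3) Start the server against the repository (runs only when the
#    extracted binary is actually present, i.e. on the Jetson).
[ -x tritonserver/bin/tritonserver ] && \
    tritonserver/bin/tritonserver --model-repository="$(pwd)/model_repository"

echo "model repository prepared at $(pwd)/model_repository"
```

The clients tarball (file 2) is only needed on the machine that sends inference requests; the source archive (file 3) is only for building from source.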