We are pleased to announce the JetPack 4.4 production release supporting Jetson AGX Xavier series, Jetson Xavier NX, Jetson TX2 series, Jetson TX1, and Jetson Nano. (JetPack 4.4 replaces JetPack 4.3 as the latest production release.)
Please read the important NOTE at the end of this announcement.
The JetPack 4.4 production release builds on top of the JetPack 4.4 Developer Preview and includes production versions of TensorRT 7.1 and cuDNN 8.0.
Along with JetPack 4.4, we are introducing a web-based Power Estimator tool to simplify the creation of custom nvpmodel power profiles for Jetson (a usage sketch follows the highlights list below).
Other highlights of this release:
Support for Dynamic Frequency Scaling (DFS) for Video Image Compositor (VIC) using actmon
SE (Security Engine) samples to demonstrate the hardware-backed authentication and encryption capabilities of Jetson TX2 series, Jetson AGX Xavier, and Jetson Xavier NX modules.
Utility to fuse multiple Jetson modules simultaneously
Option to specify APP partition size on the microSD card during initial configuration at first boot of Jetson Nano Developer Kit
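As a rough sketch of how a profile created with the Power Estimator might be applied on the device (the mode ID and config path below are illustrative assumptions; nvpmodel and jetson_clocks ship with L4T):

# Query the currently active power mode
sudo nvpmodel -q

# After adding the profile exported by the Power Estimator to the board's
# nvpmodel configuration (e.g. /etc/nvpmodel.conf), activate it by its mode ID.
# The ID "8" here is only a placeholder.
sudo nvpmodel -m 8

# Optionally lock clocks to the maximum allowed by the active profile
sudo jetson_clocks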
JetPack 4.4 components:
L4T R32.4.3
CUDA 10.2
cuDNN 8.0.0
TensorRT 7.1.3
VisionWorks 1.6
OpenCV 4.1
Vulkan 1.2
VPI 0.3 (Developer Preview)
Nsight Systems 2020.2
Nsight Graphics 2020.1
Nsight Compute 2019.3
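For reference, one way to confirm which component versions actually ended up on a flashed device (the package names below are the usual L4T/JetPack Debian packages; adjust the grep pattern to taste):

# Show the JetPack meta-package and the component packages it pulls in
apt show nvidia-jetpack

# List installed L4T, CUDA, cuDNN and TensorRT packages with their versions
dpkg -l | grep -E 'nvidia-l4t-core|cuda-toolkit|libcudnn|tensorrt'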
Existing installations of JetPack 4.4 Developer Preview and JetPack 4.3 can be upgraded in-place to the JetPack 4.4 production release without re-flashing the device. For more information about upgrading JetPack via the Debian package management tool, please refer to the JetPack documentation here.
Note: JetPack/L4T upgrades using the Debian package management tool are now fixed and re-enabled.
We had temporarily disabled upgrading JetPack or L4T with the Debian package management tool because of an issue in one of our Debian packages that could leave the device unable to boot properly. That issue is now fixed and the feature has been re-enabled. We are sorry for the inconvenience caused.
After running apt update this morning on my Xavier NX, I saw a host of upgrades available for the various nvidia-l4t-* packages and happily ran apt upgrade to get the latest versions. (I was upgrading from 32.4.2.) Upon power cycling, my install no longer boots. I’m troubleshooting now, but be advised there may still be some kinks in the package-manager-driven upgrade system.
After finding this post (which I was unaware of initially) and following the links about upgrading from a previous JetPack version, I see the instructions follow apt upgrade with apt install nvidia-jetpack, which I did not do. I’m unsure whether this is related to my inability to boot.
Worst case, I’ll just reflash with the new version directly, but I felt someone ought to know.
If you are upgrading from JetPack 4.4 DP, you will need to upgrade L4T as you did:
sudo apt update
sudo apt upgrade
and then install nvidia-jetpack again:
sudo apt install nvidia-jetpack
We will correct the L4T documentation to mention this extra step (installing nvidia-jetpack) for the case where you are upgrading the full JetPack.
Thank you. I’ve reflashed my SD card and will know to add that step in the future; luckily not much work was lost.
I have a lingering worry that this may trip up others before long: folks who will see package updates in the apt repos and run upgrades without ever thinking to consult the L4T documentation. I don’t know enough about apt to think of an obvious fix for that. Maybe a big warning screen as a post-upgrade hook?
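For what it’s worth, apt does have a generic hook mechanism that could carry such a reminder. A minimal sketch, assuming a made-up conf file name and message (this is nothing NVIDIA actually ships):

# Hypothetical reminder printed after every dpkg run
sudo tee /etc/apt/apt.conf.d/99-jetpack-upgrade-reminder >/dev/null <<'EOF'
DPkg::Post-Invoke { "echo 'If nvidia-l4t-* packages were upgraded, remember to also run: sudo apt install nvidia-jetpack'"; };
EOF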
I had created a Docker image based on deepstream-l4t:5.0-dp-20.04 to run the DeepStream Python app demos on my Xavier NX with JetPack 4.4 DP, and it worked perfectly. But when I switched to the 4.4 release (I flashed a new SD card) and ran the same code (/opt/nvidia/deepstream/deepstream-5.0/sources/python/apps/deepstream-test1-usbcam# python3 deepstream_test_1_usb.py /dev/video0), I got the following error and do not know how to fix it:
InferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1588> [UID = 1]: Trying to create engine from model files
Warning, setting batch size to 1. Update the dimension after parsing due to using explicit batch size.
ERROR: [TRT]: Network has dynamic or shape inputs, but no optimization profile has been defined.
ERROR: [TRT]: Network validation failed.
ERROR: Build engine failed from config file
I switched back to the 4.4 DP, and with my image from Docker Hub it still worked there.
DeepStream 5.0 Developer Preview is not supported on the JetPack 4.4 production release. To use DeepStream 5.0 DP, you will need to be on JetPack 4.4 DP.
The JetPack 4.4 production release supports the DeepStream 5.0 production release (expected by the end of this month).
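If it helps to confirm which release a given SD card image is running, the L4T release string distinguishes the two (JetPack 4.4 DP ships L4T R32.4.2, while the JetPack 4.4 production release ships R32.4.3):

# Print the L4T release string of the running image
cat /etc/nv_tegra_release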
Thank you and sorry to bother you again.
JetPack 4.4 supports TensorFlow in Python.
Does JetPack 4.4 also support TensorFlow in C? Is there any C API?