Downgrade JetPack 5.0.2 to JetPack 5.0 Developer Preview

Hello,

I currently have a Jetson Xavier NX (developer kit) flashed with JetPack 5.0.2. When running code that converts a PyTorch model to TensorRT with the torch2trt library (GitHub - NVIDIA-AI-IOT/torch2trt: An easy to use PyTorch to TensorRT converter), my RAM usage climbs to its maximum at some point and the process is killed.
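
For context, the conversion follows the usual torch2trt pattern, roughly like the sketch below (the model and input size here are placeholders, not my exact script):

```python
# Rough sketch of the conversion pattern that runs out of memory
# (placeholder model and input size, not the exact script I run).
import torch
import torchvision
from torch2trt import torch2trt

model = torchvision.models.densenet121(pretrained=True).eval().cuda()
x = torch.zeros((1, 3, 256, 256)).cuda()

# RAM usage peaks during this call and the process gets killed
model_trt = torch2trt(model, [x], fp16_mode=True)
```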

From what I recall, the conversion worked without any problem on JetPack 5.0 or 5.0.1. I would like to flash my device with one of these two versions to check whether the conversion works there.

When I run the command “sdkmanager --query noninteractive --logintype devzone”, the JetPack 5.0.1 and JetPack 5.0 versions are not listed. Therefore, I cannot run the command “sdkmanager --cli install --logintype devzone --product Jetson --version 5.0.1 --targetos Linux --host --target JETSON_XAVIER_NX_TARGETS --flash all”; it fails with the error “Jetson version 5.0.1 does not support Linux target OS on JETSON_XAVIER_NX_TARGETS”.

Do you have any idea how I can downgrade my JetPack version?

Thanks in advance.

PS: I’m on Ubuntu 20.04

Hi,

Do you have an ONNX model that you can share with us?
We want to check why it takes more memory on JetPack 5.0.2.

For SDK Manager, it should work with the user interface.
Do you need the command line mode?

Thanks.


Yes, the model used is available here: GitHub - NVIDIA-AI-IOT/trt_pose: Real-time pose estimation accelerated with NVIDIA TensorRT (the model named densenet121_baseline_att_256x256_B, 84 MB). If you prefer, here is the download link: densenet121_baseline_att_256x256_B_epoch_160.pth - Google Drive.

Actually, for the user interface, only version 5.0.2 is available (not 5.0 or 5.0.1).

Thanks a lot!

Example code: trt_pose/live_demo.ipynb at master · NVIDIA-AI-IOT/trt_pose · GitHub
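
The part of that notebook that fails on my side is essentially the model build, weight load, and torch2trt call, roughly like this (a minimal sketch; the channel counts, input size, and checkpoint path are my assumptions from the demo):

```python
# Sketch of the steps from live_demo.ipynb that fail on my device
# (channel counts, input size, and checkpoint path are assumptions).
import torch
import trt_pose.models
from torch2trt import torch2trt

# 18 keypoints and 21 links, as defined in human_pose.json in the trt_pose repo
model = trt_pose.models.densenet121_baseline_att(18, 2 * 21).cuda().eval()
model.load_state_dict(torch.load('densenet121_baseline_att_256x256_B_epoch_160.pth'))

# Dummy input matching the 256x256 checkpoint
data = torch.zeros((1, 3, 256, 256)).cuda()

# This is the call during which memory usage climbs until the process is killed
model_trt = torch2trt(model, [data], fp16_mode=True, max_workspace_size=1 << 25)
```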

Hi,

Would you mind adding some swap memory to see if it helps?
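
For example, something along these lines should add a 4 GB swap file (the size and path are just an example; adjust as needed):

```bash
# Create and enable a 4 GB swap file (size and path are examples)
sudo fallocate -l 4G /var/swapfile
sudo chmod 600 /var/swapfile
sudo mkswap /var/swapfile
sudo swapon /var/swapfile
# Optional: keep the swap file across reboots
echo '/var/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
# Check that the swap is active
free -h
```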

Thanks.
