Cannot convert onnx to trt on Jetson Xavier NX 16GB Production Module

Description

I’ve been trying to convert .onnx models to .trt models for object detection on my new Jetson Xavier NX 16GB production module, but I get this error:

Error[2]: [utils.cpp::checkMemLimit::380] Error Code 2: Internal Error (Assertion upperBound != 0 failed. Unknown embedded device detected. Please update the table with the entry: {{1794, 6, 16}, 12653},)

I bought the Xavier from this link and the specs are there: https://www.seeedstudio.com/Jetson-20-1-H2-p-5329.html

Since I have the same error as this person (TensorRT with Jetpack 4.6.2 on Xavier NX emmc 16gb version), I tried to upgrade JetPack to 5.0.1, but now I get a memory error when I try to run

sudo apt dist-upgrade

(see 1:20 of this tutorial Upgrade NVIDIA Jetson JetPack 5 - YouTube)

since the device came with a 16GB eMMC 5.1, of which 14.7GB is already used. Almost all of that is the OS and pre-installed modules, which I hesitate to remove, meaning that I simply cannot upgrade to JetPack 5.0.1 either. To make matters worse, since this is the production variant there is no microSD slot where I could simply plug in a 64GB card and run everything from there… What do I do? Is there a way to increase the storage, since 16GB is so, so limited?
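In case it matters, this is roughly how I have been checking where the space goes and reclaiming the little that can be reclaimed (standard Ubuntu commands, nothing specific to this board or to my setup):

# overall usage of the eMMC root filesystem
df -h /

# largest directories under / (slow on eMMC)
sudo du -xh / --max-depth=2 2>/dev/null | sort -rh | head -20

# drop the apt package cache and unused packages before retrying dist-upgrade
sudo apt clean
sudo apt autoremove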

Environment

TensorRT Version: 8.2.1
GPU Type: Jetson Xavier NX 16GB
CUDA Version: 10.2
CUDNN Version: 8.2.1
Operating System + Version: L4T 32.7.1
Python Version (if applicable): 3.6
PyTorch Version (if applicable): 1.8.0

Relevant Files

Download this folder (it includes the .onnx file):
https://drive.google.com/drive/folders/1BEzGUTiZWg0fuasd2DFpr-xkgxngkJgp?usp=sharing

Then run

python export.py -o './best.onnx' -e ./best-nms.trt -p fp16

while in this directory

Hi,
Request you to share the ONNX model and the script if not shared already so that we can assist you better.
Alongside, you can try a few things:

1) Validating your model with the below snippet:

check_model.py

import sys

import onnx

# Usage: python check_model.py best.onnx
filename = sys.argv[1]

# Load the model and verify that the graph is well formed
model = onnx.load(filename)
onnx.checker.check_model(model)
print(f"{filename} passed the ONNX checker")
2) Try running your model with the trtexec command.

In case you are still facing the issue, we request you to share the trtexec --verbose log for further debugging.
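For example, something along these lines (the file names are just taken from your export command above; on JetPack, trtexec is installed under /usr/src/tensorrt/bin):

/usr/src/tensorrt/bin/trtexec --onnx=best.onnx --saveEngine=best-nms.trt --fp16 --verbose 2>&1 | tee trtexec_verbose.log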
Thanks!

I have just shared the entire code repository needed to reproduce this, along with the .onnx model to convert. The ONNX model itself is guaranteed to work: I did exactly this on a Jetson Nano developer kit running the same setup (JetPack 4.6.1) and it worked perfectly. My issue lies exclusively with the Xavier NX 16GB being incompatible with JetPack 4.6.1, and with the fact that I can’t fix that by upgrading to JetPack 5.0 because of the storage issues.

Hi,

We couldn’t reproduce the above error on Ubuntu with TensorRT version 8.4.2, and could successfully build the TensorRT engine.
We are moving this post to the Jetson Xavier NX forum to get better help.

Thank you.

Hi,

This is a known issue for the Xavier NX 16GB with TensorRT 8.2.
Please downgrade to TensorRT 8.0 + JetPack 4.6 or upgrade to TensorRT 8.4 + JetPack 5.0.2 to avoid the error.
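If you are not sure which versions are currently installed, you can check roughly like this (generic L4T/apt commands, nothing specific to your image):

# L4T release the module was flashed with
cat /etc/nv_tegra_release

# installed TensorRT / cuDNN / JetPack packages
dpkg -l | grep -E 'nvidia-jetpack|tensorrt|libnvinfer|libcudnn'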

I’m checking the storage issue with our internal team.
Will share more information with you later.

Thanks.

Hi,

You may consider buying an NVMe PCIe SSD and moving the rootfs to it.
Thanks.

I am also considering this solution. Do you have any tutorial articles or videos explaining how to do this? All of the ones I could find deal with dev kits that have removable SD storage, not the built-in eMMC production modules… I’m completely new to this and very lost.

Hi,

Please check the below document for information:

https://docs.nvidia.com/jetson/archives/r35.1/DeveloperGuide/text/SD/FlashingSupport.html#flashing-to-an-nvme-drive
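As a rough sketch only: the external-storage flashing described there is driven from the Linux_for_Tegra directory on an x86 host, with the module connected over USB in forced-recovery mode. The board-config name and options below are placeholders and must be replaced with the exact values that page gives for the Xavier NX eMMC module:

sudo ./tools/kernel_flash/l4t_initrd_flash.sh \
    --external-device nvme0n1p1 \
    -c tools/kernel_flash/flash_l4t_external.xml \
    --showlogs \
    jetson-xavier-nx-devkit-emmc external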

Thanks.

Thanks, I’ve been able to go through that, and I also found a tutorial made by the manufacturer of the kit I bought at this link:

However, while going through the process the NVMe is not recognized by the device, so I’m stuck here again. Is the particular SSD model I bought incompatible, and do you have suggestions for alternatives that will work? The SSD I bought is this one:

Update: new SSD bought from this link works:

The previous one does not.
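For anyone hitting the same thing, these are the generic commands I used to check whether the board sees a drive at all; none of this is specific to a particular SSD model:

# is an NVMe controller enumerated on the PCIe bus?
lspci | grep -i nvme

# does the kernel expose an NVMe block device?
ls /dev/nvme*
lsblk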

Good to know this.
Thanks for the update.
