How to move the ONNX file from AGX to Nano properly? (dusty-nv jetson-inference)

Hi,

I want to train a custom object detection model on the AGX and move the ONNX file to the Nano, but the Nano gets stuck at
“[TRT] Could not register plugin creator - ::FlattenConcat_TRT version 1”
Other than the
labels.txt
ssd-mobilenet.onnx
in the models folder, what else do I need to move to the Nano?
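For reference, only the ONNX network and its labels file need to travel; the .engine file is regenerated by TensorRT on the target. A minimal sketch of the copy, using local stand-in folders instead of the real AGX/Nano paths (on real hardware you would use scp between the boards):

```shell
# Stand-in folders for this sketch; on real hardware SRC would be the AGX's
# jetson-inference model directory and DST the Nano's (copied via scp).
SRC=agx-models/ssd-mobilenet
DST=nano-models/ssd-mobilenet
mkdir -p "$SRC" "$DST"
# Pretend outputs of training on the AGX (names taken from this thread):
touch "$SRC/ssd-mobilenet.onnx" "$SRC/labels.txt" \
      "$SRC/ssd-mobilenet.onnx.1.1.8001.GPU.FP16.engine"

# Copy only the network and its labels; deliberately skip the *.engine file,
# which is tied to the GPU and TensorRT version that serialized it:
cp "$SRC/ssd-mobilenet.onnx" "$SRC/labels.txt" "$DST/"
ls "$DST"
```

On the Nano, the first run of detectnet will then take a few minutes while TensorRT builds a fresh engine from the ONNX.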

Thx

Hi,

If I also copy this file into the models folder
ssd-mobilenet.onnx.1.1.8001.GPU.FP16.engine

it fails with a “hashRead failed” CRC-32 error.

Thx
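That CRC error is consistent with the copied .engine being unusable on the Nano: serialized TensorRT engines are specific to the GPU and TensorRT version that built them and are not portable between devices. A sketch of clearing the stale engine so a fresh one is built on the next run (the model path is a stand-in):

```shell
MODELS=nano-engine-demo/ssd-mobilenet      # stand-in for the Nano's model dir
mkdir -p "$MODELS"
# Simulate the state after copying both the ONNX and the AGX-built engine:
touch "$MODELS/ssd-mobilenet.onnx" \
      "$MODELS/ssd-mobilenet.onnx.1.1.8001.GPU.FP16.engine"

# Delete any engine that came from another device; detectnet will serialize
# a new one from the ONNX on the next run (slow the first time on a Nano):
rm -f "$MODELS"/*.engine
ls "$MODELS"
```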

Hi,

Do you use the same jetson-inference source and the same TensorRT version on the Nano?
The output indicates some implementation (the FlattenConcat_TRT plugin) is missing.

Thanks.
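A quick way to act on this advice: a serialized engine only loads under the same TensorRT version that produced it, so compare the versions on both boards before reusing anything. A sketch with example values (on a Jetson you could read the real version with `dpkg -l | grep -i tensorrt`):

```shell
# Example values; on each board, read the installed version, e.g.:
#   dpkg -l | grep -i tensorrt
AGX_TRT="8.0.1"     # version reported on the AGX (example)
NANO_TRT="8.0.1"    # version reported on the Nano (example)

if [ "$AGX_TRT" = "$NANO_TRT" ]; then
    echo "TensorRT versions match"
else
    echo "TensorRT version mismatch: rebuild the engine on the Nano"
fi
```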

Hi,

Yes, I renamed the existing jetson-inference folder to jetson-inference-old on both the AGX and the Nano,
then did a fresh git clone of jetson-inference and rebuilt it on each device.
Thx

Hi,

Does the inference start after the message?
Based on the discussion on GitHub, that error only occurs on the first build.

Thanks.

Hi,

I copied the .pth file from the AGX and did the ONNX conversion on the Nano, and it works now.
Thx
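For anyone landing here later, the working route above can be sketched as follows. onnx_export.py is the exporter shipped in jetson-inference/python/training/detection/ssd; the paths are examples, and the guard makes this a no-op when run anywhere other than that directory on the Nano:

```shell
MODEL_DIR=models/ssd-mobilenet   # example: folder holding the copied .pth checkpoint

# Run ON the Nano, after copying only the .pth checkpoint from the AGX, so the
# ONNX (and later the TensorRT engine) is produced for the Nano's own
# GPU and TensorRT version:
if [ -f onnx_export.py ]; then
    python3 onnx_export.py --model-dir="$MODEL_DIR"
    STATUS="exported"
else
    STATUS="skipped: run from jetson-inference/python/training/detection/ssd on the Nano"
fi
echo "$STATUS"
```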