Running New ONNX Detection Models on Orin Nano Devkit with C++

I’m interested in running new object detection ONNX models in C++ beyond those featured in the Hello AI World project. My Orin Nano devkit is running JetPack 6.0 (more details: system_info2024.txt (2.4 KB)). The detection networks featured in the Hello AI World project run properly, except for the TAO models. Now I want to run other detection models, such as YOLOv3 and YOLOv4, found in the ONNX GitHub repository. What is the command-line method for running external models?

With the Jetson prebuilt networks, I’m able to run C++ object detection in this manner:

./detectnet --network=ssd-mobilenet-v2 images/peds_0.jpg images/test/output.jpg 

However, I’m not able to run the ssd-mobilenet-v2 example with the “--model” argument, which I believe is the argument needed to load new models. If I can’t make this work with a prebuilt network, there’s no way I can make it work with a new ONNX model. Here’s the command I tried:

./detectnet --model=/home/jet/jetson-inference/data/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff images/peds_0.jpg images/test/output.jpg

The error message is:

[TRT]    
[TRT]    3: Cannot find binding of given name: data
[TRT]    failed to find requested input layer data in network
[TRT]    device GPU, failed to create resources for CUDA engine
[TRT]    failed to create TensorRT engine for /home/jet/jetson-inference/data/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff, device GPU
[TRT]    detectNet -- failed to initialize.
detectnet:  failed to load detectNet model

Full detectnet output is here: detect_error.txt (10.5 KB)

The C++ detectnet program has many arguments (detectnet_help.txt (6.5 KB)). What am I missing to run a network using the “--model” argument?

Hi,

You should have a [NAME].onnx file instead of the [NAME].uff file.
Once you have the [NAME].onnx, please try feeding it to the detectnet binary with:

./detectnet --model=[NAME].onnx --input-blob=INPUT --output-cvg=COVERAGE --output-bbox=BOXES ...

where INPUT, COVERAGE, and BOXES are the names of the input layer, the coverage/confidence output layer, and the bounding-box output layer, respectively.
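For example, assuming a YOLO-style ONNX model whose layers happen to be named input, scores, and boxes (these names are hypothetical — the actual names depend on how the model was exported, so check them in your own file first), the full command might look like:

```shell
# Layer names below are hypothetical; verify them against your own .onnx file,
# e.g. by opening it in Netron or running: polygraphy inspect model yolov4.onnx
./detectnet --model=yolov4.onnx \
            --labels=labels.txt \
            --input-blob=input \
            --output-cvg=scores \
            --output-bbox=boxes \
            images/peds_0.jpg images/test/output.jpg
```

If the names don’t match what’s actually in the model, you’ll see the same “Cannot find binding of given name” error you hit with the UFF file.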

You can find more details in the source below:

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.