YOLOv11 Inference Example in C++ for JetPack 6.2

Has Nvidia developed a C++ object detection example with YOLOv11 or YOLOv8? Most of the inference examples are written in Python, not C++. Ultralytics says they don't have a C++ example. My robotics project requires very low latency, so I need to implement the object detection in C++.

I'm thinking the quickest and cleanest approach is to use the existing detectnet project from Hello AI World. But I would first need to adapt the YOLO model's inputs and outputs to match what detectnet expects (I'm not sure what those are), then load the model as an ONNX network (that part is straightforward).

Hi,

Does Deepstream work for you?
If yes, you can find a YOLOv8 C++ example in the topic below:

Thanks.

I tried installing DeepStream 7.1 per the Nvidia documentation. It was quite involved, and one of the examples ran extremely slowly. Something is off. Overall, DeepStream is overkill for my object detection goals.

The inference tools clearly documented in Hello AI World are the right fit. Unfortunately, the project is not fully operational on JetPack 6.2 the way it mostly was on 6.0. Is @dusty-nv still with Nvidia? He was a great, substantive helper and the Hello AI World writer.

It'd be extremely beneficial if your team could get the Hello AI World project working fully in 6.2 or the next version. I highly recommend replacing all the Caffe-based networks with ONNX versions, and also including YOLO11 networks in the example set.

Hi,

Our team is currently focused on the generative AI tutorials, so the resources for updating jetson-inference are relatively limited.

However, the l4t-jetpack:r36.3.0 container does work on JetPack 6.2.
Could you try whether jetson-inference works on top of it?

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.