Is an NVIDIA custom inference library available for YOLOv8 yet?

Hi everybody,

Has anybody been able to run YOLOv8 exported directly from the `yolo export` command? And could you run it entirely on the DLA engine?

I have been trying without success.

Could you give me a step-by-step guide to get YOLOv8 running successfully on the AGX Orin 64GB with DLA?



DLA needs to be used through the TensorRT API.
Does your model run successfully with TensorRT?
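A common way to try DLA is to build the engine with `trtexec` (shipped with TensorRT on JetPack), passing `--useDLACore` and `--allowGPUFallback`. Below is a minimal sketch that just assembles that command line; the file names are placeholders, and `trtexec_dla_cmd` is a hypothetical helper, not part of any NVIDIA tool.

```python
# Sketch: assemble a trtexec command line that targets DLA with GPU fallback.
# File names are placeholders; trtexec itself ships with TensorRT on JetPack.
def trtexec_dla_cmd(onnx_path, engine_path, dla_core=0):
    return [
        "trtexec",
        f"--onnx={onnx_path}",
        f"--saveEngine={engine_path}",
        f"--useDLACore={dla_core}",   # AGX Orin exposes DLA cores 0 and 1
        "--allowGPUFallback",         # let layers unsupported on DLA run on the GPU
        "--fp16",                     # DLA runs FP16/INT8 precision, not FP32
    ]

cmd = trtexec_dla_cmd("yolov8n.onnx", "yolov8n_dla.engine")
print(" ".join(cmd))
```

Run the printed command on the Jetson itself; without `--allowGPUFallback`, the build fails as soon as it hits a layer DLA cannot execute.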


Hi AastaLLL,

Yes, the model runs perfectly with TensorRT. But when I enable DLA, I get a lot of warnings saying that a given layer doesn't run on DLA and will run on the GPU instead.
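Those warnings tell you exactly which layers fell back. If you capture the build log, you can tally them with a small helper; this is a hypothetical sketch that assumes the warning text contains "falling back to GPU", which may vary between TensorRT versions, and the sample log line is illustrative only.

```python
# Hypothetical helper: collect TensorRT build-log lines that report a layer
# falling back from DLA to the GPU. The marker string is an assumption and
# may differ across TensorRT versions.
def gpu_fallback_layers(log_lines):
    marker = "falling back to GPU"
    return [line for line in log_lines if marker in line]

sample_log = [
    "[W] Layer 'Resize_123' is not supported on DLA, falling back to GPU",
    "[I] Engine build completed",
]
for warning in gpu_fallback_layers(sample_log):
    print(warning)
```

Counting the fallbacks is a quick way to see how much of the network actually runs on the DLA engine.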


Yes, since DLA is a hardware-based inference engine, its layer support is relatively limited compared to the GPU.
Please find the support matrix of DLA below:
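If you build the engine programmatically instead of with `trtexec`, the same DLA settings are exposed through the TensorRT Python API. The sketch below assumes TensorRT 8.x on JetPack; `build_dla_engine` is a hypothetical function name, and the import is deferred so the sketch parses on machines without TensorRT installed.

```python
# Sketch of enabling DLA via the TensorRT Python API (assumed TensorRT 8.x on
# JetPack). build_dla_engine is a hypothetical helper; paths are placeholders.
def build_dla_engine(onnx_path, dla_core=0):
    import tensorrt as trt  # deferred so this file parses without TensorRT

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError("failed to parse ONNX model")

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.FP16)          # DLA needs FP16 or INT8
    config.default_device_type = trt.DeviceType.DLA
    config.DLA_core = dla_core
    config.set_flag(trt.BuilderFlag.GPU_FALLBACK)  # unsupported layers -> GPU

    return builder.build_serialized_network(network, config)
```

This mirrors what `--useDLACore` and `--allowGPUFallback` do in `trtexec`; the serialized engine it returns can be saved and later deserialized for inference.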

Below is our tutorial for DLA inference for your reference:


@francisco23 check this: [jetson] Running yolov8 classfication model in DLA #1 – Why DLA? | by DeeperAndCheaper | Medium

Thanks AastaLLL, I will check it.

Perfect jhonnynuca14, thank you. When I get it running on DLA, I will give feedback.

There is also an alternative repository:

This one includes EfficientNMS, etc.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.