Inference gives no detections when I switch between different DeepStream 6.0 apps

CUDA 11.3 + cuDNN 8.2
Quadro RTX 5000, dual GPU
Driver Version: 470.82.00
CUDA Version (driver): 11.4
Ubuntu 18.04
Python 3.6

DeepStream 6.0

I trained a custom YOLOv4 model with TAO 3.0.
I can successfully run inference using deepstream_tao_apps on both Jetson Xavier and a Tesla dGPU.

But when I run it on:
or normal DeepStream,

it gives no detections, either when I use the serialized engine file directly or when I build the engine from the cal.bin/etlt…
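For reference, deploying a TAO YOLOv4 model through Gst-nvinfer typically hinges on a few `[property]` keys in the inference config. The sketch below is an assumption-laden illustration, not the poster's actual config: every path, the model key, and the custom-parser library name are placeholders, and the parser entries mirror the ones shipped with deepstream_tao_apps. A mismatch here (wrong `tlt-model-key`, missing custom bbox parser, or a stale cached engine) is a common cause of "no detections" when moving the same model to a different app.

```ini
[property]
gpu-id=0
# Encoded TAO model plus the key used at export time (placeholder values)
tlt-encoded-model=/path/to/yolov4.etlt
tlt-model-key=your_tao_export_key
# INT8 calibration cache; only needed for network-mode=1
int8-calib-file=/path/to/cal.bin
# Remove or regenerate this if the engine was built for a different GPU/app
model-engine-file=/path/to/yolov4.etlt_b1_gpu0_int8.engine
labelfile-path=/path/to/labels.txt
# 0=FP32, 1=INT8, 2=FP16
network-mode=1
num-detected-classes=4
# YOLOv4 needs the TAO custom output parser, not the default resnet parser
parse-bbox-func-name=NvDsInferParseCustomYoloV4TLT
custom-lib-path=/path/to/libnvds_infercustomparser_tao.so
```

If the custom parser entries are omitted, nvinfer falls back to its default detector parser, which silently yields zero detections for YOLOv4 outputs, so that is worth checking first when switching apps.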

For a DeepStream issue, please create a topic in the DeepStream forum. Thanks.

OK, thank you.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.