Create custom model yolov3 / onnx

Hello

Please, in order to have a customized model, I followed this tutorial:
PyLessons. I then converted the training output to an ONNX model to use with DeepStream 5,
following the guide at TensorRT/ONNX - eLinux.org. When I ran a test, I did not see any detection of the desired class.
My questions: do we lose accuracy in the network by doing all these conversions?
And what other network formats besides ONNX can we use with DeepStream 5?

PS: the training output is a TensorFlow checkpoint (yolov3_custom.data-00000-of-00001 and yolov3_custom.index); I managed to get some detections with these files directly.

Thanks,

Hi @sylia,

when I did a test I didn’t see any detection of the desired class,

Have you confirmed that the pre- and post-processing are correct?
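For reference, YOLOv3 pipelines typically expect letterboxed RGB input normalized to [0, 1]; a mismatch here is a common cause of zero detections. Below is a minimal sketch of that preprocessing using only NumPy (the nearest-neighbor resize stands in for the bilinear resize a real pipeline would use; the function name and the 416 input size are illustrative):

```python
import numpy as np

def letterbox(image, target=416):
    """Resize an HxWx3 uint8 image to target x target, preserving aspect
    ratio and padding with gray (128), as most YOLOv3 pipelines do."""
    h, w = image.shape[:2]
    scale = min(target / h, target / w)
    nh, nw = int(round(h * scale)), int(round(w * scale))
    # Nearest-neighbor resize via index sampling (stand-in for bilinear).
    ys = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    xs = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = image[ys][:, xs]
    canvas = np.full((target, target, 3), 128, dtype=np.uint8)
    top, left = (target - nh) // 2, (target - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    # Normalize to [0, 1] and add a batch dimension: 1 x target x target x 3.
    return canvas.astype(np.float32)[None] / 255.0

blob = letterbox(np.zeros((480, 640, 3), dtype=np.uint8))
print(blob.shape)   # (1, 416, 416, 3)
```

Whether the channel order is NHWC or NCHW, and whether normalization divides by 255 or also subtracts a mean, depends on how the model was exported, so check against the training code.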

Hi
Yes, I used that tutorial (TensorRT/ONNX - eLinux.org) to convert the output of my training to ONNX. I then adapted the ONNX network input to be compatible with DeepStream 5, and I succeeded in generating the .engine file. However, there is no detection. I also noticed that it is very slow.

Thank you

What’s the last layer of your yolov3 model?

208x208x64

Sorry for the confusion! I meant: what is the layer type? NMS?
How did you add the pre- and post-processing?
I don’t think the problem is in the ONNX model converted following TensorRT/ONNX - eLinux.org; I suspect it’s caused by the pre- and post-processing code.
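As background, the post-processing step most likely to go wrong is the greedy non-maximum suppression that YOLOv3 applies to its raw boxes. Here is a minimal pure-NumPy sketch of that step (boxes as [x1, y1, x2, y2]; the 0.45 threshold is a common but illustrative value):

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.45):
    """Greedy non-maximum suppression; returns indices of kept boxes."""
    order = np.argsort(scores)[::-1]          # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        if order.size == 1:
            break
        rest = order[1:]
        # Intersection of box i with each remaining box.
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou <= iou_threshold]    # drop heavily overlapping boxes
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]], float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))   # [0, 2] — the two overlapping boxes collapse to one
```

If NMS is baked into the exported ONNX graph, the DeepStream parser must not apply it a second time (and vice versa); doing both, or neither, can easily produce empty detections.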

Yes, it’s NMS.
Indeed, when I tried to run a test with the ONNX model in DeepStream 5, I got an error of this type from the YOLO parser library (nvdsinfer_custom_impl_Yolo):
dpxcam-app: nvdsparsebbox_Yolo.cpp:255: bool NvDsInferParseYoloV3(const std::vector&lt;NvDsInferLayerInfo&gt;&amp;, const NvDsInferNetworkInfo&amp;, const NvDsInferParseDetectionParams&amp;, std::vector&lt;NvDsInferParseObjectInfo&gt;&amp;, const std::vector&lt;float&gt;&amp;, const std::vector&lt;std::vector&lt;int&gt; &gt;&amp;): Assertion `layer.inferDims.numDims == 3' failed.
I tried to adjust it following suggestions from others on the forum, but without success. Then I deleted that assertion line and recompiled the YOLO library (a bit naive, sorry); it runs, but still without detection of the desired class.

Please, is there another way to convert the custom checkpoint (yolov3_custom.index and yolov3_custom.data-00000-of-00001) to another network format compatible with DeepStream 5?
Thanks
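As context for the failed assertion: the stock NvDsInferParseYoloV3 parser expects each output layer to be 3-dimensional, grid × grid × (anchors_per_scale × (5 + num_classes)). A quick sanity check of what the custom model’s outputs should look like (a sketch assuming the standard YOLOv3 configuration of a 416 input and 3 anchors per detection scale; the function name is illustrative):

```python
def yolov3_output_shapes(input_size=416, num_classes=1, anchors_per_scale=3):
    """Expected 3-D output shapes for the three YOLOv3 detection heads.

    Each head predicts, per grid cell and per anchor, 4 box coordinates,
    1 objectness score, and num_classes class scores.
    """
    channels = anchors_per_scale * (5 + num_classes)
    # The three heads run at strides 32, 16, and 8 relative to the input.
    return [(input_size // s, input_size // s, channels) for s in (32, 16, 8)]

# With one custom class, each head should have 3 * (5 + 1) = 18 channels.
print(yolov3_output_shapes(416, num_classes=1))
# [(13, 13, 18), (26, 26, 18), (52, 52, 18)]
```

If the exported ONNX graph ends in an NMS node instead of these raw heads, its outputs will not be 3-dimensional, which is consistent with the assertion firing.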

From your description above, the problem seems to be either in the model itself (given the error from nvdsparsebbox_Yolo.cpp) or in the pre- and post-processing.
So, sorry, I don’t think there is a quick way to work around these two issues just by converting the model to another format.

Also, ONNX is well supported by TensorRT and DeepStream.
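For an ONNX model, the Gst-nvinfer configuration typically points at the .onnx file directly and DeepStream builds the TensorRT engine on first run. An illustrative fragment (the file names, class count, and the custom parser function/library paths are placeholders for your setup; only standard nvinfer keys are used):

```
[property]
onnx-file=yolov3_custom.onnx
model-engine-file=yolov3_custom.onnx_b1_gpu0_fp16.engine
# network-mode: 0=FP32, 1=INT8, 2=FP16
network-mode=2
num-detected-classes=1
parse-bbox-func-name=NvDsInferParseCustomYoloV3
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
```

FP16 mode (network-mode=2) is also worth trying for the slowness you mentioned, if your GPU supports it.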

I have solved the problem with this project.

Thanks
