Questions about how to deploy yolov4-tiny in DeepStream

I referenced @kayccc's reply in YOLOV4 example in DeepStream - #4

That link does not cover yolov4-tiny. I want to run either the Darknet yolov4-tiny model or a PyTorch-trained model in DeepStream, but I found that neither the Darknet yolov4-tiny model nor the PyTorch-trained model can be converted to ONNX. How should I deal with this?

Sorry for the late response, we will look into it soon.

Hi,
Regarding exporting the model to ONNX, you could consult the TensorRT forum or the related GitHub repositories.

Thanks!

Thank you for your attention. If you find a good solution, please let me know.

Sure~

@18981275647

I found a repo that works, going from Darknet to ONNX to TensorRT. Could you try to verify it?
We found a GitHub repo for yolov4-tiny at https://github.com/linghu8812/tensorrt_inference/tree/master/Yolov4.
First, clone the repo:

git clone https://github.com/linghu8812/tensorrt_inference.git
Then download the yolov4-tiny weights, as described in the repo's README.md:

wget https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v4_pre/yolov4-tiny.weights
Following the README, convert the model to ONNX:
python3 export_onnx.py --cfg_file cfg/yolov4-tiny.cfg --weights_file yolov4-tiny.weights --output_file yolov4-tiny.onnx --strides 32 16 --neck FPN
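
Before building the engine, you can optionally sanity-check the exported ONNX file. This is only a minimal sketch using onnxruntime; the 1x3x416x416 input shape is an assumption taken from the default yolov4-tiny.cfg and may differ for your config:

import numpy as np
import onnxruntime as ort

# Load the exported model and inspect its input
sess = ort.InferenceSession("yolov4-tiny.onnx")
inp = sess.get_inputs()[0]
print("input:", inp.name, inp.shape)

# Run one dummy inference (1x3x416x416 is assumed, not verified here)
dummy = np.random.rand(1, 3, 416, 416).astype(np.float32)
for out in sess.run(None, {inp.name: dummy}):
    print("output shape:", out.shape)
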
Then use trtexec to convert the ONNX model to a TensorRT plan:
trtexec --onnx=yolov4-tiny.onnx --saveEngine=yolov4-tiny.plan --verbose
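
Before wiring the plan into DeepStream, you can also check that it deserializes on the target device. A minimal sketch with the TensorRT Python API (the binding enumeration below uses the pre-TensorRT-10 API, and the plan only loads with the same GPU and TensorRT version it was built with):

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(logger)
with open("yolov4-tiny.plan", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())

assert engine is not None, "failed to deserialize the engine"
# Print binding names and shapes; these can be handy when filling in
# the nvinfer config and the custom bbox parser for DeepStream.
for i in range(engine.num_bindings):
    print(engine.get_binding_name(i), engine.get_binding_shape(i))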

@18981275647
And I found another repo for this; you can also try it.

Tianxiaomo/pytorch-YOLOv4 (PyTorch, ONNX and TensorRT implementation of YOLOv4): https://github.com/Tianxiaomo/pytorch-YOLOv4
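
Since that repo also covers ONNX export, the PyTorch route is essentially torch.onnx.export. The sketch below is only illustrative: the YoloV4Tiny class is a hypothetical placeholder for however your training code builds the network, and the checkpoint name, 80 classes and 416x416 input are assumptions:

import torch

# Hypothetical import: replace with the model class from your own
# training code (or from the repo above).
from models import YoloV4Tiny

model = YoloV4Tiny(num_classes=80)  # assumption: 80 COCO-style classes
model.load_state_dict(torch.load("yolov4-tiny.pth", map_location="cpu"))
model.eval()

dummy = torch.randn(1, 3, 416, 416)  # assumed default yolov4-tiny input size

torch.onnx.export(
    model,
    dummy,
    "yolov4-tiny.onnx",
    opset_version=11,                # opset 11 is commonly paired with TensorRT 7.x
    input_names=["input"],
    output_names=["output"],
)

The resulting ONNX file can then go through the same trtexec command as above.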

Thanks, but what I need is to convert a yolov4-tiny model that was trained with PyTorch.