Integrate MobileNetSSD on Deepstream

Hi,
I am new here. I have a trained MobileNetSSD model written in Caffe. How can I integrate this model into a DeepStream pipeline using Python?

*Tesla T4
*DeepStream: 5.0
*TensorRT Version: 7.0

Thanks,

Hi,

Please check this GitHub repository for our Python-based DeepStream samples:

For SSD-based models, please check the deepstream-ssd-parser sample for more information.
Thanks.
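The deepstream-ssd-parser sample does this bounding-box parsing step in Python, reading the model's raw output tensors from pyds metadata. As a simplified, framework-free sketch of that parsing logic (the field layout follows the Caffe SSD DetectionOutput convention, and the threshold value is an assumption):

```python
# Simplified sketch of SSD output parsing, without DeepStream/pyds.
# Each detection is assumed to follow the Caffe SSD DetectionOutput layout:
# [image_id, class_id, confidence, x_min, y_min, x_max, y_max]
# with coordinates normalized to [0, 1].

CONF_THRESHOLD = 0.5  # assumption: a typical detection threshold

def parse_ssd_detections(raw, frame_width, frame_height):
    """Convert raw SSD detections into pixel-space boxes."""
    boxes = []
    for det in raw:
        image_id, class_id, conf, x1, y1, x2, y2 = det
        if conf < CONF_THRESHOLD:
            continue  # drop low-confidence detections
        boxes.append({
            "class_id": int(class_id),
            "confidence": conf,
            "left": x1 * frame_width,
            "top": y1 * frame_height,
            "width": (x2 - x1) * frame_width,
            "height": (y2 - y1) * frame_height,
        })
    return boxes

# Example: one confident detection, one below the threshold.
raw = [
    [0, 15, 0.92, 0.1, 0.2, 0.5, 0.8],
    [0, 7, 0.30, 0.0, 0.0, 0.1, 0.1],
]
print(parse_ssd_detections(raw, 640, 480))
```

In the real sample, the parsed boxes are then attached back to the frame as object metadata so that downstream elements (tracker, OSD) can use them.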

Hi @AastaLLL
Thanks for your feedback. If I change the backbone to MobileNetV3, do I need to implement the “nvdsinfer_custom_impl” plugin myself? What are the limitations of the Python bindings? Would using C++ give me more flexibility when developing applications with DeepStream?

Thanks again for your help.

Hi,

If the output bounding-box format stays the same, you can use the SSD parser directly.

C++ allows you to implement customized plugins.
If that is an option for you, we recommend using the C++ interface directly.

Thanks.
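For reference, a Caffe model is usually wired into DeepStream through the nvinfer config file, which points at the prototxt/caffemodel pair and names the output blob and parsing function. A minimal sketch (all file names, class count, and the parser function name are assumptions; adapt them to your model):

```
[property]
gpu-id=0
# Caffe model files (hypothetical names)
model-file=MobileNetSSD.caffemodel
proto-file=MobileNetSSD_deploy.prototxt
labelfile-path=labels.txt
batch-size=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
# assumption: 20 classes + background, as in the standard Caffe SSD
num-detected-classes=21
output-blob-names=detection_out
# Only needed if the output format differs from the stock parser:
# parse-bbox-func-name=NvDsInferParseCustomSSD
# custom-lib-path=libnvdsinfer_custom_impl_ssd.so
```

On first run, nvinfer builds a TensorRT engine from the Caffe files and caches it next to the model, so subsequent runs start faster.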

Hi @AastaLLL,

Thanks for your feedback. When running deepstream-ssd-parser, does the program use TensorRT to accelerate inference, or only the Triton Inference Server?

Thanks,

Hi,

The sample uses TensorRT for inference.

Thanks.