Integrating a custom-trained YOLOv4 model in DeepStream 5.1

Please provide the following information when requesting support.

• Hardware: Triton Inference Server
• Network Type : YOLOv4

Hi, I have custom trained a YOLOv4 model with TAO Toolkit version 3.22.05. The model was trained successfully. Now I am interested in integrating it with the existing DeepStream 5.1 SDK, which is already in production.

Can you guide me on how to go ahead with this integration? I have tried the following, but it failed:

  1. Exported the model, but could not find a compatible config file for integrating it with DeepStream 5.1. I tried modifying the parameters of the YOLOv3 config, but that gives an error.

  2. Downgraded nvidia-tao to make the model compatible with DeepStream 5.1 and tried exporting, but the export code does not work.
    (Do I need to retrain the model with the downgraded version of TAO so that it is compatible with DeepStream 5.1?)
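For reference, the export step on TAO 3.x is typically run with the `tao yolov4 export` sub-command. The sketch below is only illustrative; the paths, the `$KEY` variable, and the spec filename are placeholders for your own training setup, not values from this thread.

```
# Sketch of exporting a trained YOLOv4 .tlt model to .etlt (placeholder paths/key)
tao yolov4 export \
    -m /workspace/yolov4/weights/yolov4_model.tlt \   # trained model (placeholder path)
    -k $KEY \                                         # your model encode key
    -e /workspace/yolov4/specs/yolov4_train_spec.txt \# training spec file (placeholder)
    -o /workspace/yolov4/export/yolov4_model.etlt     # exported model for DeepStream
```

The resulting `.etlt` file is what the DeepStream nvinfer config points to.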

Kindly suggest.

There has been no update from you for a while, so we are assuming this is no longer an issue and closing this topic. If you need further support, please open a new one.

Please check whether it works after cloning the release/tao3.0 branch of the deepstream_tao_apps sample repository:
$ git clone -b release/tao3.0 https://github.com/NVIDIA-AI-IOT/deepstream_tao_apps.git
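After cloning, the sample apps and the custom bounding-box parser need to be built before the YOLOv4 config can be used. The steps below are a sketch based on the repository's usual build flow; the CUDA version is an assumption (DeepStream 5.1 ships against CUDA 11.1) and should be adjusted to your installation. Note also that YOLOv4 post-processing on DeepStream 5.1 generally requires the TensorRT OSS `BatchedNMS` plugin to be built and installed separately.

```
# Build the TAO sample apps and custom parser (CUDA version is an assumption)
cd deepstream_tao_apps
export CUDA_VER=11.1   # match the CUDA version of your DeepStream 5.1 install
make                   # builds the apps and the TAO post-processor library
```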

For the config file, refer to pgie_yolov4_tao_config.txt in the release/tao3.0 branch of the NVIDIA-AI-IOT/deepstream_tao_apps repository on GitHub.
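For orientation, the nvinfer config for a TAO YOLOv4 model follows the usual `[property]` group format. The fragment below is a hedged sketch, not a copy of the repository file: the model paths, encode key, input dimensions, and class count are placeholders you must replace with your own exported model's values.

```
[property]
gpu-id=0
# Placeholder paths and key: point these at your exported .etlt and labels
tlt-encoded-model=./models/yolov4/yolov4_model.etlt
tlt-model-key=<your_encode_key>
labelfile-path=./models/yolov4/labels.txt
# Input dims and class count must match your trained model (placeholders here)
infer-dims=3;384;1248
num-detected-classes=4
network-type=0
cluster-mode=3
# YOLOv4 TAO post-processing: BatchedNMS output parsed by the custom library
output-blob-names=BatchedNMS
parse-bbox-func-name=NvDsInferParseCustomBatchedNMSTLT
custom-lib-path=./post_processor/libnvds_infercustomparser_tao.so
```

The `parse-bbox-func-name` and `custom-lib-path` entries refer to the post-processor built from the deepstream_tao_apps repository; without them nvinfer cannot decode the BatchedNMS output tensor.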

Please share the log.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.