Custom Yolov4 model to run on deepstream + TensorRT

Please provide complete information as applicable to your setup.

**• Hardware Platform:** Jetson TX2
**• DeepStream Version:** 6.0.1
**• JetPack Version:** 4.6.1
**• TensorRT Version:** 8.2
**• Issue Type:** question


I am new to NVIDIA DeepStream (and TensorRT).

I want to take a custom YOLOv4 model I have created and test it with 1) DeepStream + TensorRT and 2) DeepStream only.

What are the quickest and most relevant steps I need to take to test both setups?


Zvika Shtorch

This looks more related to TAO. We are moving this post to the TAO forum to get better help.
Thank you.

You can refer to YOLOv4 — TAO Toolkit 3.22.05 documentation
or use GitHub - NVIDIA-AI-IOT/deepstream_tao_apps: Sample apps to demonstrate how to deploy models trained with TAO on DeepStream directly.
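For orientation, the key piece when running a TAO-exported YOLOv4 `.etlt` model on DeepStream is the `[property]` group of the nvinfer configuration file. Below is a minimal sketch; the model file name, label file, key, and input dimensions are placeholders you must replace with your own values, and the custom bbox parser library and function name are the ones built from the deepstream_tao_apps repository linked above:

```ini
[property]
gpu-id=0
# TAO YOLOv4 preprocessing: BGR input with per-channel mean subtraction
net-scale-factor=1.0
offsets=103.939;116.779;123.68
model-color-format=1
# Placeholder model/key/labels -- replace with your exported .etlt, key, and labels
tlt-encoded-model=yolov4_custom.etlt
tlt-model-key=your_key
labelfile-path=labels.txt
# Placeholder input dims (C;H;W) -- must match your training/export resolution
infer-dims=3;384;1248
batch-size=1
# 0 = FP32; use 1 (INT8, needs calibration file) or 2 (FP16) on TX2 for speed
network-mode=0
num-detected-classes=80
network-type=0
# Custom output parser compiled from deepstream_tao_apps
parse-bbox-func-name=NvDsInferParseCustomBatchedNMSTLT
custom-lib-path=post_processor/libnvds_infercustomparser_tao.so
```

On the first run DeepStream converts the `.etlt` model into a TensorRT engine (which can take several minutes on a TX2) and caches it, so subsequent runs start much faster.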

Thank you!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.