How should I combine tao with deepstream?

Does the inference of the nvinfer plugin support all pre-trained models in Nvidia TAO?

Theoretically, the DeepStream SDK can support all TAO models as well as other models. Some are supported directly, while others require customization. Currently, the DeepStream samples do not cover all TAO models.

For example, PeopleNet is one of the TAO pre-trained networks.
It should be supported by DeepStream directly. I downloaded the pre-trained model, which contains labels.txt, a ***.etlt file, and the network-related configuration txt files. How should I use it with DeepStream?
I would like a sample like deepstream-test.

This is our TAO model sample app, which shows users how to integrate TAO models into DeepStream: NVIDIA-AI-IOT/deepstream_tao_apps: Sample apps to demonstrate how to deploy models trained with TAO on DeepStream (github.com). The PeopleNet pre-trained model sample is included.
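For reference, a minimal nvinfer (pgie) configuration sketch for a PeopleNet .etlt download might look like the following. The file paths are placeholders, and the model key, class count, and parser function name are assumptions taken from the public PeopleNet model card and the sample repo; always check them against the model card for the version you actually downloaded:

```ini
[property]
gpu-id=0
# Placeholder paths: point these at the files from your download
tlt-encoded-model=models/peoplenet/resnet34_peoplenet.etlt
tlt-model-key=tlt_encode            # assumption: key published on the PeopleNet model card
labelfile-path=models/peoplenet/labels.txt
num-detected-classes=3              # assumption: Person, Bag, Face
network-mode=2                      # 0=FP32, 1=INT8, 2=FP16
# Custom output parser from the deepstream_tao_apps repo; some DetectNet_v2-based
# models work with the default parser, so check the sample config for your model
parse-bbox-func-name=NvDsInferParseCustomBatchedNMSTLT
custom-lib-path=post_processor/libnvds_infercustomparser_tao.so
```

The sample configs shipped in the deepstream_tao_apps repo are the authoritative versions of these settings for each supported model.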

custom-lib-path=… /… /post_processor/libnvds_infercustomparser_tao.so

Where should I go to download libnvds_infercustomparser_tao.so? I can’t find it.

I found it, it’s in the “/deepstream_tao_apps-master/post_processor” directory, and it’s open source and needs to be compiled. Thanks!
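For anyone following along, building the parser is a single make in that directory. This is a sketch of the steps under the repo layout mentioned in this thread; the CUDA_VER value is an assumption and must match the CUDA toolkit your DeepStream installation was built against:

```shell
# Sketch: build the open-source custom parser from the unpacked repo
cd deepstream_tao_apps-master/post_processor
export CUDA_VER=11.4   # assumption: set this to your installed CUDA version
make
# On success, libnvds_infercustomparser_tao.so appears in this directory;
# this is the file the custom-lib-path config entry should point to.
```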

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.