Running multiple inference programs at the same time

Can I run multiple inference programs at the same time? For example, program #1 runs car detection and program #2 runs face detection, and so on.

Thanks

Hi,
This is supported. Please try DeepStream SDK 4.0.2.

We have the sample config file source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt, which you can run with:

$ deepstream-app -c source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt

If you need to run two inference models, you can launch two instances:

$ deepstream-app -c config1.txt & deepstream-app -c config2.txt &

Thank you @DaneLLL for your response. However, I want to run two TensorRT inference models from two separate Python scripts at the same time, not run two models through DeepStream.

Hi,
The reference samples are not 100% the same as your use case. You may try the samples first and then customize them. You can install all required packages through SDK Manager.

For Python samples, please check
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps
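If the goal is simply to run two independent inference scripts concurrently from Python, one option is to give each model its own process so they do not share state (with GPU inference, each process would also get its own CUDA context). Below is a minimal sketch using the standard-library multiprocessing module; the worker body is a placeholder, not real TensorRT or DeepStream API, and the names car_detection/face_detection are just illustrative:

```python
# Sketch: run two independent inference "programs" concurrently from Python.
# Each worker would load its own model inside its own process, so the two
# models never share a context. The worker body below is a placeholder for
# an actual model-loading and inference loop.
import multiprocessing as mp


def worker(name, results):
    # Placeholder for: load engine/model named `name`, run inference loop.
    results[name] = f"{name}: done"


def run_both():
    manager = mp.Manager()
    results = manager.dict()  # shared dict to collect per-worker status
    procs = [
        mp.Process(target=worker, args=(n, results))
        for n in ("car_detection", "face_detection")
    ]
    for p in procs:
        p.start()      # both workers now run concurrently
    for p in procs:
        p.join()       # wait for both to finish
    return dict(results)


if __name__ == "__main__":
    print(run_both())
```

Alternatively, the two scripts can simply be started as separate OS processes from a shell, the same way the two deepstream-app instances are launched above.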

@DaneLLL, thanks for the info. I'll try it.