Does deepstream 6.1.0 support parallel inference?

• Hardware Platform (Jetson / GPU)
x86-64 Ubuntu 20.04 LTS machine with a GeForce RTX 3060
• DeepStream Version
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
Glad to hear that the new DeepStream version supports parallel inference. I want to know whether it can run on DeepStream 6.1.0, and where I can find guidance on using it.

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one.

No. It can only work on DeepStream 6.1.1 or above.

Please refer to NVIDIA-AI-IOT/deepstream_parallel_inference_app: a project demonstrating how to use nvmetamux to run multiple models in parallel.
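For context, the parallel topology in that app looks roughly like the sketch below: one batched stream is tee'd into several inference branches, and nvmetamux merges their metadata back together. This is a conceptual sketch only, not a command from the sample app: the file paths, element properties, and config file names are placeholders, and the assumption that nvmetamux takes a `config-file` property follows the sample project's configuration style rather than a verified gst-launch invocation.

```shell
# Conceptual topology only -- the real sample app builds these branches
# programmatically. Paths and property values below are placeholders.
gst-launch-1.0 \
  uridecodebin uri=file:///path/to/video.mp4 ! m.sink_0 \
  nvstreammux name=m batch-size=1 width=1280 height=720 ! tee name=t \
  t. ! queue ! nvinfer config-file-path=model_a_pgie.txt ! mux.sink_0 \
  t. ! queue ! nvinfer config-file-path=model_b_pgie.txt ! mux.sink_1 \
  nvmetamux name=mux config-file=metamux_config.txt ! \
  nvdsosd ! nveglglessink
```

The key idea is that each branch runs its own nvinfer instance with its own model, so the models execute in parallel on the same batched frames, and nvmetamux reconciles the per-branch metadata downstream.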

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.