• Hardware Platform (Jetson / GPU)
x86-64 Ubuntu 20.04 LTS machine with GeForce RTX 3060
• DeepStream Version
6.1
• TensorRT Version
8.2.5.1
• NVIDIA GPU Driver Version (valid for GPU only)
515.48.07
• Issue Type( questions, new requirements, bugs)
Glad to hear that the new DeepStream version supports parallel inference. I want to know whether it can run on DeepStream 6.1.0, and where I can find guidance on using it.
There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one.
Thanks
No. It only works on DeepStream 6.1.1 or above.
Please refer to NVIDIA-AI-IOT/deepstream_parallel_inference_app on GitHub, a project demonstrating how to use nvmetamux to run multiple models in parallel.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.