ERROR: <create_parallel_infer_bin:1199>: create_parallel_infer_bin failed

• Hardware Platform (Jetson / GPU)
x86-64 Ubuntu 20.04 LTS machine with a GeForce RTX 3060
• DeepStream Version
6.1.0
• TensorRT Version
8.2.5.1
• NVIDIA GPU Driver Version (valid for GPU only)
515.48.07
• Issue Type( questions, new requirements, bugs)
When I run the deepstream_parallel_inference_app, the following error comes up:
src_ids:0;1;2
src_ids:1;2;3
src_ids:1;2;3
** ERROR: <create_parallel_infer_bin:1010>: Failed to create 'infer_bin_muxer'
** ERROR: <create_parallel_infer_bin:1199>: create_parallel_infer_bin failed
creating parallel infer bin failed
Quitting
App run successful
My DeepStream version is 6.1.0. Is there anything wrong with my configuration?

You should use DeepStream 6.1.1 for deepstream_parallel_inference_app.
The nvdsmetamux plugin was first released in DS 6.1.1.
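As a quick check (a hedged sketch; the path assumes a default x86 DeepStream install), you can confirm the installed version and whether the plugin is present:

cat /opt/nvidia/deepstream/deepstream/version
gst-inspect-1.0 nvdsmetamux

On DS 6.1.0 the second command should report that no such element or plugin exists, which matches the "Failed to create 'infer_bin_muxer'" error above.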

I used to develop applications using deepstream-app, but parallel inference is only supported in this GitHub project. How should I migrate this to deepstream-app?

There has been no update from you for a period, so we are assuming this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

deepstream_parallel_inference_app follows the same APIs as deepstream-app. Which deepstream-app feature do you want to port to deepstream_parallel_inference_app? Both are open source, so you can read the details.
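If it helps to compare the two code bases side by side (a minimal sketch; the repository URL is assumed from the project name above, and the deepstream-app sources ship with the SDK):

git clone https://github.com/NVIDIA-AI-IOT/deepstream_parallel_inference_app.git
ls /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-app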
