Creating multi source bin failed

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Orin Development Kit
• DeepStream Version 7.1
• JetPack Version (valid for Jetson only) 6.1
The deepstream_parallel_inference_app application is used for my project, and starting/stopping the app is controlled through MQTT commands.
The first time the app was running, I tried to stop it, and it stopped properly.
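
Roughly, the quit handling looks like this (a minimal sketch with illustrative names; the real code uses the Paho C++ async client shown in the log below):

// Minimal sketch (illustrative names): an MQTT callback that quits the
// GLib main loop driving the DeepStream pipeline when 'quit' arrives.
#include <glib.h>
#include <mqtt/async_client.h>

static GMainLoop *main_loop = nullptr;   // set in main() before g_main_loop_run()

class control_callback : public virtual mqtt::callback {
    void message_arrived(mqtt::const_message_ptr msg) override {
        if (msg->get_payload_str() == "quit") {
            g_print("Deepstream Quitting\n");
            g_main_loop_quit(main_loop);   // returns control to main()
        }
    }
};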

Message arrived
	topic: 'rectitude'
	payload: 'quit'

Deepstream Quitting
Returned, stopping playback
Processing frame number = 68	
Processing frame number = 69	
Processing frame number = 70	
Deleting pipeline
[NvMultiObjectTracker] De-initialized
[NvMultiObjectTracker] De-initialized
[NvMultiObjectTracker] De-initialized
App run successful

That means deepstream_parallel_inference_app stopped properly.
But when I send the MQTT command to rerun the app, I get the messages below. The application crashes because creating the multi source bin fails.
All source bins were stopped properly.
Why does it fail?
The complete output from rerunning the application is:

src_ids:0;1;2
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
src_ids:1;2;3
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
src_ids:1;2;3
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
** ERROR: <create_multi_source_bin:1511>: Failed to create element 'multi_src_bin'
** ERROR: <create_multi_source_bin:1608>: create_multi_source_bin failed
creating multi source bin failed
Deepstream Quitting
App run successful

What could be wrong? The application is stopped by calling g_main_loop_quit(main_loop);

Why didn't the following messages appear when the app was being stopped?

nvstreammux: Successfully handled EOS for source_id=0
nvstreammux: Successfully handled EOS for source_id=2
nvstreammux: Successfully handled EOS for source_id=1
nvstreammux: Successfully handled EOS for source_id=3
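
From what I understand, those nvstreammux EOS lines only show up when an EOS event actually drains through the pipeline before teardown, something like the sketch below (not the app's actual stop code):

/* Sketch (not the app's actual code): send EOS into the pipeline on the
 * 'quit' command so each source drains, and quit the main loop only after
 * the bus reports GST_MESSAGE_EOS. */
#include <gst/gst.h>

static gboolean bus_call(GstBus *bus, GstMessage *msg, gpointer data)
{
    GMainLoop *loop = (GMainLoop *) data;
    if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_EOS)
        g_main_loop_quit(loop);            /* quit only after EOS has drained */
    return TRUE;
}

/* On receiving the 'quit' MQTT command: */
gst_element_send_event(GST_ELEMENT(pipeline), gst_event_new_eos());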

The full log is here.

atic@ubuntu:/opt/nvidia/deepstream/deepstream-7.1/sources/apps/sample_apps/deepstream_reference_apps/deepstream_parallel_inference_app/tritonclient/sample$ ./apps/deepstream-parallel-infer/deepstream-parallel-infer -c configs/apps/rectitude/rectitude_main.yml
Connecting to the MQTT server 'mqtt://localhost:1883'...
Connection success

Subscribing to topic 'rectitude'
	for client paho_cpp_async_subscribe using QoS1

Press Q<Enter> to quit

Subscription success for token: [1]
	token topic: 'rectitude', ...

Message arrived
	topic: 'rectitude'
	payload: 'Hello World!'

Message arrived
	topic: 'rectitude'
	payload: 'Hi there testing 123!'

Message arrived
	topic: 'rectitude'
	payload: 'Is anyone listening?'

Message arrived
	topic: 'rectitude'
	payload: 'test'


before run 
src_ids:0;1;2
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
src_ids:1;2;3
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
src_ids:1;2;3
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
NVDSMETAMUX_CFG_PARSER: Group 'user-configs' ignored
** INFO: <create_primary_gie_bin:147>: gpu-id: 0 in primary-gie group is ignored, only accept in nvinferserver's config
i:0, src_id_num:3
link_streamdemux_to_streammux, srid:0, mux:0
link_streamdemux_to_streammux, srid:1, mux:0
link_streamdemux_to_streammux, srid:2, mux:0
** INFO: <create_primary_gie_bin:147>: gpu-id: 0 in primary-gie group is ignored, only accept in nvinferserver's config
i:1, src_id_num:3
link_streamdemux_to_streammux, srid:1, mux:1
link_streamdemux_to_streammux, srid:2, mux:1
link_streamdemux_to_streammux, srid:3, mux:1
** INFO: <create_primary_gie_bin:147>: gpu-id: 0 in primary-gie group is ignored, only accept in nvinferserver's config
i:2, src_id_num:3
link_streamdemux_to_streammux, srid:1, mux:2
link_streamdemux_to_streammux, srid:2, mux:2
link_streamdemux_to_streammux, srid:3, mux:2
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-7.1/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-7.1/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
0:00:01.270154422  4590 0xaaab0034f410 WARN           nvinferserver gstnvinferserver_impl.cpp:365:validatePluginConfig:<primary_gie> warning: Configuration file batch-size reset to: 4
0:00:01.270288598  4590 0xaaab0034f410 WARN           nvinferserver gstnvinferserver_impl.cpp:371:validatePluginConfig:<primary_gie> warning: Configuration file unique-id reset to: 1
INFO: TrtISBackend id:1 initialized model: trafficcamnet
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-7.1/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
0:00:01.900407976  4590 0xaaab0034f410 WARN           nvinferserver gstnvinferserver_impl.cpp:365:validatePluginConfig:<primary_gie> warning: Configuration file batch-size reset to: 4
INFO: TrtISBackend id:3 initialized model: trafficcamnet
0:00:01.970678359  4590 0xaaab0034f410 WARN           nvinferserver gstnvinferserver_impl.cpp:365:validatePluginConfig:<primary_gie> warning: Configuration file batch-size reset to: 4
0:00:01.970757527  4590 0xaaab0034f410 WARN           nvinferserver gstnvinferserver_impl.cpp:371:validatePluginConfig:<primary_gie> warning: Configuration file unique-id reset to: 2
INFO: TrtISBackend id:2 initialized model: peoplenet
Running...
**PERF: 0.00 (0.00)	0.00 (0.00)	0.00 (0.00)	0.00 (0.00)	
Opening in BLOCKING MODE 
Opening in BLOCKING MODE 
Opening in BLOCKING MODE 
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 279 
NvMMLiteOpen : Block : BlockType = 279 
NvMMLiteOpen : Block : BlockType = 279 
NvMMLiteOpen : Block : BlockType = 279 
NvMMLiteBlockCreate : Block : BlockType = 279 
NvMMLiteBlockCreate : Block : BlockType = 279 
NvMMLiteBlockCreate : Block : BlockType = 279 
NvMMLiteBlockCreate : Block : BlockType = 279 
**PERF: 0.00 (0.00)	0.00 (0.00)	0.00 (0.00)	0.00 (0.00)	
Processing frame number = 0	
Processing frame number = 1	
Processing frame number = 2	
Processing frame number = 3	
Processing frame number = 4	
Processing frame number = 5	
Processing frame number = 6	
Processing frame number = 7	
Processing frame number = 8	
Processing frame number = 9	
Processing frame number = 10	
Processing frame number = 11	
Processing frame number = 12	
Processing frame number = 13	
Processing frame number = 14	
Processing frame number = 15	
**PERF: 17.96 (17.12)	17.96 (17.12)	17.96 (17.12)	17.96 (17.12)	
Processing frame number = 16	
Processing frame number = 17	
Processing frame number = 18	
Processing frame number = 19	
Processing frame number = 20	
Processing frame number = 21	
Processing frame number = 22	
Processing frame number = 23	
Processing frame number = 24	
Processing frame number = 25	
Processing frame number = 26	
Processing frame number = 27	
Processing frame number = 28	
Processing frame number = 29	
Processing frame number = 30	
Processing frame number = 31	
Processing frame number = 32	
Processing frame number = 33	
**PERF: 17.68 (17.59)	17.68 (17.59)	17.68 (17.59)	17.68 (17.59)	
Processing frame number = 34	
Processing frame number = 35	
Processing frame number = 36	
Processing frame number = 37	
Processing frame number = 38	
Processing frame number = 39	
Processing frame number = 40	
Processing frame number = 41	
Processing frame number = 42	
Processing frame number = 43	
Processing frame number = 44	
Processing frame number = 45	
Processing frame number = 46	
Processing frame number = 47	
Processing frame number = 48	
Processing frame number = 49	
**PERF: 16.92 (17.03)	16.92 (17.03)	16.92 (17.03)	16.92 (17.03)	
Processing frame number = 50	
Processing frame number = 51	
Processing frame number = 52	
Processing frame number = 53	
Processing frame number = 54	
Processing frame number = 55	
Processing frame number = 56	
Processing frame number = 57	
Processing frame number = 58	
Processing frame number = 59	
Processing frame number = 60	
Processing frame number = 61	
Processing frame number = 62	
Processing frame number = 63	
Processing frame number = 64	
Processing frame number = 65	
Processing frame number = 66	
Processing frame number = 67	
Message arrived
	topic: 'rectitude'
	payload: 'Hello World!'

Message arrived
	topic: 'rectitude'
	payload: 'Hi there testing 123!'

Message arrived
	topic: 'rectitude'
	payload: 'Is anyone listening?'

Message arrived
	topic: 'rectitude'
	payload: 'quit'

Deepstream Quitting
Returned, stopping playback
Processing frame number = 68	
Processing frame number = 69	
Processing frame number = 70	
Deleting pipeline
[NvMultiObjectTracker] De-initialized
[NvMultiObjectTracker] De-initialized
[NvMultiObjectTracker] De-initialized
App run successful
Message arrived
	topic: 'rectitude'
	payload: 'Hello World!'

Message arrived
	topic: 'rectitude'
	payload: 'Hi there testing 123!'

Message arrived
	topic: 'rectitude'
	payload: 'Is anyone listening?'

Message arrived
	topic: 'rectitude'
	payload: 'test'


before run 
src_ids:0;1;2
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
src_ids:1;2;3
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
src_ids:1;2;3
Unknown key enable-batch-process for tracker
Unknown key enable-past-frame for tracker
** ERROR: <create_multi_source_bin:1511>: Failed to create element 'multi_src_bin'
** ERROR: <create_multi_source_bin:1608>: create_multi_source_bin failed
creating multi source bin failed
Deepstream Quitting
App run successful

atic@ubuntu:/opt/nvidia/deepstream/deepstream-7.1/sources/apps/sample_apps/deepstream_reference_apps/deepstream_parallel_inference_app/tritonclient/sample$

Because the video didn't finish when you stopped it manually.
Could you add GST_DEBUG=4 in front of your command to get more log info?
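For example (based on the command in your log):

GST_DEBUG=4 ./apps/deepstream-parallel-infer/deepstream-parallel-infer -c configs/apps/rectitude/rectitude_main.yml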

I managed the application start/stop in another way. It is settled now.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.