Using the config below, I’m trying to send two streams’ messages to two separate topics, but the topics don’t receive any messages. Without msg-broker-comp-id, both topics receive messages from both streams. I want to send the first stream’s messages to the first topic and the second stream’s messages to the second topic. I’m working with deepstream-test5 and a custom config file.
[message-converter]
enable=1
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=1
msg-conv-msg2p-new-api=1
msg-conv-frame-interval=5
#Name of library having custom implementation.
msg-conv-msg2p-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_msgconv.so
#Id of component in case only selected messages are to be parsed.
#msg-conv-comp-id=
I also edited the generate_event_msg_meta function in sources/apps/sample_apps/deepstream-test5/deepstream_test5_app_main.c, adding these lines:
if (stream_id == 0) {
meta->componentId = 1;
} else if (stream_id == 1) {
meta->componentId = 2;
}
and ran the make command. But nothing changed. I also wonder how I can see stream_id’s actual value.
I have added msg-conv-comp-id=1 to [message-converter]. Now sink0’s topic receives messages from all streams. I want to send the first stream’s messages to the first topic and the second stream’s messages to the second topic. I know I need to fill the componentId field of the NvDsEventMsgMeta structure to separate the topics, but the changes I made in deepstream_test5_app_main.c don’t change anything. I have added g_print(), g_message(), and printf() calls to see stream_id, class_id, or any string, and exported GST_DEBUG (tried all levels); nothing is shown on the console. It feels like the original version always runs, and because of this I can’t set componentId. I did run make after the changes.
To reproduce issue:
Config file: multi_topic.txt
[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl
[tiled-display]
enable=0
rows=2
columns=5
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0
#Set to 1 to automatically tile in Square Grid
square-seq-grid=1
#Note: [source-list] now support REST Server with use-nvmultiurisrcbin=1
[source-list]
num-source-bins=0
use-nvmultiurisrcbin=1
#To display stream name in FPS log, set stream-name-display=1
stream-name-display=1
#sensor-id-list vector is one to one mapped with the uri-list
#identifies each sensor by a unique ID
#sensor-id-list=UniqueSensorId1;UniqueSensorId2
#Optional sensor-name-list vector is one to one mapped with the uri-list
#sensor-name-list=UniqueSensorName1;UniqueSensorName2
max-batch-size=10 # Maximum number of streams that can be added to the pipeline
http-ip=localhost
http-port=9000
#Set low latency mode for bitstreams having I and IPPP frames on decoder
#low-latency-mode=0
#sgie batch size is number of sources * fair fraction of number of objects detected per frame per source
#the fair fraction of number of object detected is assumed to be 4
sgie-batch-size=40
#Set the below key to keep the application running at all times
[source-attr-all]
enable=1
type=3 #1: Camera (V4L2) 2: URI 3: MultiURI 4: RTSP 5: Camera (CSI) (Jetson only)
num-sources=0 #Valid only when type=3
gpu-id=0
cudadec-memtype=0
#drop-frame-interval=5
#latency=100
#rtsp-reconnect-interval-sec=10
#Limit the rtsp reconnection attempts
#rtsp-reconnect-attempts=4
[streammux]
gpu-id=0
live-source=1
batch-size=6
batched-push-timeout=40000
width=1920
height=1080
enable-padding=0
nvbuf-memory-type=0
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvdrmvideosink 6=MsgConvBroker
type=6
disable-msgconv = 1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_amqp_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=rabbit.app_network;5672;guest;guest
topic=annotationData.65705714ec359113d519d892.6506f4a4-91de-4d33-bbfd-4ab16db91952.66fbd771f01fe320cb3da26b
msg-broker-comp-id=1
msg-conv-comp-id=1
#Optional:
msg-broker-config=/opt/nvidia/deepstream/deepstream/sources/libs/amqp_protocol_adaptor/cfg_amqp.txt
#new-api=0
#(0) Use message adapter library api's
#(1) Use new msgbroker library api's
[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvdrmvideosink 6=MsgConvBroker
type=6
disable-msgconv = 1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_amqp_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=rabbit.app_network;5672;guest;guest
topic=deepstream.65705714ec359113d519d892.6506f4a4-91de-4d33-bbfd-4ab16db91952.66fbd771f01fe320cb3da26b
msg-broker-comp-id=2
msg-conv-comp-id=1
#Optional:
msg-broker-config=/opt/nvidia/deepstream/deepstream/sources/libs/amqp_protocol_adaptor/cfg_amqp.txt
# sink type = 6 by default creates msg converter + broker.
# To use multiple brokers use this group for converter and use
# sink type = 6 with disable-msgconv = 1
[message-converter]
enable=1
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=1
msg-conv-msg2p-new-api=1
msg-conv-frame-interval=5
# Name of library having custom implementation.
msg-conv-msg2p-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_msgconv.so
# Id of component in case only selected message to parse.
msg-conv-comp-id=1
# Configure this group to enable cloud message consumer.
[message-consumer0]
enable=0
proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
conn-str=<host>;<port>
config-file=<broker config file e.g. cfg_kafka.txt>
subscribe-topic-list=<topic1>;<topic2>;<topicN>
# Use this option if message has sensor name as id instead of index (0,1,2 etc.).
#sensor-list-file=dstest5_msgconv_sample_config.txt
[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Arial
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0
[primary-gie]
enable=1
#interval=5
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=/opt/nvidia/deepstream/deepstream/DeepStream-Yolo/config_infer_vehicle.txt
#infer-raw-output-dir=../../../../../samples/primary_detector_raw_output/
[tracker]
enable=1
# For the NvDCF and NvDeepSORT trackers, tracker-width and tracker-height must each be a multiple of 32
tracker-width=960
tracker-height=544
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
# ll-config-file required to set different tracker types
# ll-config-file=../../../../../samples/configs/deepstream-app/config_tracker_IOU.yml
# ll-config-file=../../../../../samples/configs/deepstream-app/config_tracker_NvSORT.yml
ll-config-file=../../../../../samples/configs/deepstream-app/config_tracker_NvDCF_perf.yml
# ll-config-file=../../../../../samples/configs/deepstream-app/config_tracker_NvDCF_accuracy.yml
# ll-config-file=../../../../../samples/configs/deepstream-app/config_tracker_NvDeepSORT.yml
gpu-id=0
display-tracking-id=1
I run this config file with: deepstream-test5-app -c configs/multi_topic.txt
The msg-conv-comp-id is set for the nvmsgconv and the msg-broker-comp-id is set for the nvmsgbroker.
For the details, you can refer to our source code directly: sources/gst-plugins/gst-nvmsgbroker/gstnvmsgbroker.cpp and sources/gst-plugins/gst-nvmsgconv/gstnvmsgconv.cpp.
You can try to set the parameters like:
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvdrmvideosink 6=MsgConvBroker
type=6
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_amqp_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=rabbit.app_network;5672;guest;guest
topic=annotationData.65705714ec359113d519d892.6506f4a4-91de-4d33-bbfd-4ab16db91952.66fbd771f01fe320cb3da26b
msg-broker-comp-id=1
msg-conv-comp-id=1
#Optional:
msg-broker-config=/opt/nvidia/deepstream/deepstream/sources/libs/amqp_protocol_adaptor/cfg_amqp.txt
#new-api=0
#(0) Use message adapter library api's
#(1) Use new msgbroker library api's
[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvdrmvideosink 6=MsgConvBroker
type=6
disable-msgconv = 1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_amqp_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=rabbit.app_network;5672;guest;guest
topic=deepstream.65705714ec359113d519d892.6506f4a4-91de-4d33-bbfd-4ab16db91952.66fbd771f01fe320cb3da26b
msg-broker-comp-id=2
msg-conv-comp-id=2
Yes. The pipeline still fails because of: NvDsMsg2pCtx* nvds_msg2p_ctx_create(const gchar*, NvDsPayloadType): assertion 'file' failed. To fix it I added msg-conv-payload-type=1. This time the pipeline runs, but the topics don’t get any messages. I have read all the similar topics, but none of them work for me. Can you share your full config file and how you set componentId step by step, with related code blocks? Thanks
Thank you for your answer. I’m finally able to send messages to the topics without duplicates. The problem was that I wasn’t launching deepstream-test5-app from its current directory with ./, so my rebuilt binary was never the one running.
I’m using msg-conv-msg2p-new-api=1. It uses NvDsFrameMeta instead of NvDsEventMsgMeta, so I need to set componentId on the NvDsFrameMeta structure. But NvDsFrameMeta has no field named componentId. Which steps should I follow to set componentId for NvDsFrameMeta?
If you want to use NvDsFrameMeta, you can use its NvDsUserMetaList *frame_user_meta_list member to record your own values.
You also need to modify our source code in sources/gst-plugins/gst-nvmsgbroker/gstnvmsgbroker.cpp and sources/gst-plugins/gst-nvmsgconv/gstnvmsgconv.cpp to filter the messages. You can skim our source code first and then make your changes there.