No data from deepstream-app when using sink type 6 for msg broker/conv

I have been able to run the deepstream test4 app and show the msg broker working with AMQP. When I try to use sink type 6 within the main deepstream-app, it runs, but no data shows up at the broker.

Sink settings:
[sink2]
enable=1
type=6
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_amqp_proto.so
msg-broker-config=/home/ddi/Desktop/deepstream_sdk_v4.0.1_jetson/sources/apps/sample_apps/deepstream-test4/cfg_amqp.txt
msg-broker-conn-str=localhost;5672;guest
msg-conv-config=/home/ddi/msg_conv_config.txt
msg-conv-payload-type=0

Broker config settings:
[message-broker]
password = guest
#optional
hostname = localhost
username = guest
port = 5672
exchange = test2
topic = topicname

Message conv config:
[sensor0]
enable=1
type=Camera
id=CAMERA_ID

Am I missing something? These are the same settings that work with the test4 app. Thanks for your help.

deepstream-app does not support the message broker; you can try test5.

I use the test5 app.

sudo rabbitmqctl list_queues
Listing queues
myqueue	3505

The app does send the message, but how can I debug the msgbroker?
I added some code in gstnvmsgbroker.c:

printf("sendMsg");
  GST_DEBUG_OBJECT(self,"start send data");
GST_DEBUG_OBJECT (self, "connStr is :%s" ,self->connStr);
    GST_DEBUG_OBJECT (self, "config File is :%s" ,self->configFile);

I rebuilt the plugin with make and make install, but there is no output. Can you help me?

Hi,
Please follow the test4 README to enable logging:

  1. Enable logging:
     Go through the README to set up and enable logs for the messaging libraries (kafka, azure, amqp)
     $ cat …/…/…/tools/nvds_logger/README

Or you can run with GST_DEBUG=5, putting it at the beginning of your command when you run the sample, to get more debug logs.
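
For example (the app name and config path here are only placeholders for whatever you normally run; GStreamer debug output goes to stderr):

$ GST_DEBUG=5 ./deepstream-test5-app -c <your_test5_config.txt> 2> gst_debug.log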

What changes can I make to deepstream-app to support messaging using AMQP? I want to send a string.

You can refer to the test4 app code.
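
For reference, here is a minimal sketch of the test4 pattern for attaching an event message to a frame: batch_meta/frame_meta are assumed to come from your own pad probe, the field values are illustrative, and meta_copy_func/meta_free_func are the deep-copy and release callbacks from the sample (a sketch of those appears further down this thread).

/* Minimal sketch loosely based on deepstream-test4: attach an NvDsEventMsgMeta
 * so that nvmsgconv/nvmsgbroker can turn it into a payload and publish it. */
#include "gstnvdsmeta.h"
#include "nvdsmeta_schema.h"

static void
attach_event_msg (NvDsBatchMeta * batch_meta, NvDsFrameMeta * frame_meta)
{
  NvDsEventMsgMeta *msg_meta = (NvDsEventMsgMeta *) g_malloc0 (sizeof (NvDsEventMsgMeta));

  msg_meta->type = NVDS_EVENT_MOVING;            /* event type, illustrative */
  msg_meta->objType = NVDS_OBJECT_TYPE_VEHICLE;  /* object type, illustrative */
  msg_meta->sensorId = 0;
  msg_meta->frameId = frame_meta->frame_num;
  msg_meta->ts = (gchar *) g_malloc0 (64);       /* test4 fills this with a timestamp string */

  NvDsUserMeta *user_event_meta = nvds_acquire_user_meta_from_pool (batch_meta);
  if (user_event_meta) {
    user_event_meta->user_meta_data = (void *) msg_meta;
    user_event_meta->base_meta.meta_type = NVDS_EVENT_MSG_META;
    user_event_meta->base_meta.copy_func = (NvDsMetaCopyFunc) meta_copy_func;
    user_event_meta->base_meta.release_func = (NvDsMetaReleaseFunc) meta_free_func;
    nvds_add_user_meta_to_frame (frame_meta, user_event_meta);
  }
}

Note that the text that finally reaches the broker is whatever nvmsgconv generates for the configured msg-conv-payload-type, so publishing an arbitrary custom string may require a custom payload/converter rather than the default schema.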

Is this still not supported? I am having the same problem.

When will it be supported?

You can refer to the test5 sample, which builds on top of the deepstream-app sample to demonstrate how to:

  • Use “nvmsgconv” and “nvmsgbroker” plugins in the pipeline.
  • Create NVDS_META_EVENT_MSG type of meta and attach it to the buffer.
  • Use NVDS_META_EVENT_MSG for different types of objects, e.g. vehicle, person, etc.
  • Provide copy / free functions if the meta data is extended through the “extMsg” field (a rough sketch follows below).
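
For the last point, a rough sketch of what such copy / free callbacks can look like (loosely following the test4/test5 samples; this assumes only ts, sensorStr and extMsg were allocated, so deep-copy and free exactly the fields your own code allocates):

#include "gstnvdsmeta.h"
#include "nvdsmeta_schema.h"

static gpointer
meta_copy_func (gpointer data, gpointer user_data)
{
  NvDsUserMeta *user_meta = (NvDsUserMeta *) data;
  NvDsEventMsgMeta *src = (NvDsEventMsgMeta *) user_meta->user_meta_data;
  NvDsEventMsgMeta *dst = (NvDsEventMsgMeta *) g_memdup (src, sizeof (NvDsEventMsgMeta));

  if (src->ts)
    dst->ts = g_strdup (src->ts);
  if (src->sensorStr)
    dst->sensorStr = g_strdup (src->sensorStr);
  if (src->extMsgSize > 0)
    /* a flat byte copy is only correct if extMsg has no nested pointers */
    dst->extMsg = g_memdup (src->extMsg, src->extMsgSize);

  return dst;
}

static void
meta_free_func (gpointer data, gpointer user_data)
{
  NvDsUserMeta *user_meta = (NvDsUserMeta *) data;
  NvDsEventMsgMeta *src = (NvDsEventMsgMeta *) user_meta->user_meta_data;

  g_free (src->ts);
  g_free (src->sensorStr);
  if (src->extMsgSize > 0)
    g_free (src->extMsg);
  g_free (src);
  user_meta->user_meta_data = NULL;
}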

I have used this and tried to integrate the pipeline with the YoloV3 source. The Kafka broker seems to start and send messages, but they don't reach my topic.

I have already opened a topic describing the exact problem. Could you have a look at it?

This is my problem-topic:
https://devtalk.nvidia.com/default/topic/1072838/deepstream-sdk/workflow-to-combine-yolov3-sample-and-output-data-to-kafka/post/5436104/#5436104

Hi,
is this still the case for DS 5.1?

Hi a7med.hish,

Please open a new topic with more details about your issue. Thanks.