Detection data isn't being sent to the RabbitMQ server

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): RTX 6000
• DeepStream Version: DS 6.3 (nvcr.io/nvidia/deepstream:6.3-gc-triton-devel)
• JetPack Version (valid for Jetson only)
• TensorRT Version: 8.5.3-1+cuda11.8
• NVIDIA GPU Driver Version (valid for GPU only): 535.113.01
• Issue Type (questions, new requirements, bugs): the message queue is empty on the RabbitMQ management portal
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.) I'm using a YOLOv8 model (GitHub - marcoslucianops/DeepStream-Yolo: NVIDIA DeepStream SDK 6.3 / 6.2 / 6.1.1 / 6.1 / 6.0.1 / 6.0 / 5.1 implementation for YOLO models)
• Requirement details (This is for new requirements. Include the module name, i.e. for which plugin or sample application, and the function description)

deepstream-config.txt


[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
# Type - 1=CameraV4L2 2=URI 3=MultiURI
type=2
uri=file:///models/edgeai-cfg/cam01.mp4
num-sources=1
gpu-id=0
cudadec-memtype=0
#camera-width=640
#camera-height=480
#camera-fps-n=30
#camera-fps-d=1
#camera-v4l2-dev-node=0


[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=3
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0
codec=1
output-file=outputs/office5.mp4
container=1
bitrate=2500000
nvbuf-memory-type=0

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming 5=nvdrmvideosink
type=4
sync=0
source-id=0
gpu-id=0
#1=h264 2=h265
codec=1
bitrate=4000000
#encoder type 0=Hardware 1=Software
enc-type=0
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
# set profile only for hw encoder, sw encoder selects profile based on sw-preset
profile=0
rtsp-port=1888
udp-port=5400

[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=msgconv_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM   - Custom schema payload
msg-conv-payload-type=0
#msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_kafka_proto.so
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_amqp_proto.so
#Provide your msg-broker-conn-str here
#msg-broker-conn-str=localhost;29092;test
#msg-broker-conn-str=localhost;5672;1888
#topic=
msg-broker-config=rmq_cfg.txt
iframeinterval=30

[osd]
enable=1
gpu-id=0
border-width=5
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=0
batch-size=1
batched-push-timeout=40000
width=1920
height=1080
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV8.txt

[tracker]
enable=1
# For NvDCF and NvDeepSORT tracker, tracker-width and tracker-height must be a multiple of 32, respectively
tracker-width=960
tracker-height=544
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
# ll-config-file required to set different tracker types
# ll-config-file=config_tracker_IOU.yml
# ll-config-file=config_tracker_NvSORT.yml
ll-config-file=config_tracker_NvDCF_perf.yml
# ll-config-file=config_tracker_NvDCF_accuracy.yml
# ll-config-file=config_tracker_NvDeepSORT.yml
gpu-id=0
display-tracking-id=1

[tests]
file-loop=0

msgconv_config.txt

[camera0]
name=cam01
location=Office
description=Sample Setup

rmq_config.txt

[message-broker]
username=guest
password=guest
hostname=localhost
exchange=edge_exchange
topic=edge.routing_key
port=5672
amqp-framesize = 131072

This is how I’m running the DeepStream Docker container:

sudo docker run --runtime nvidia -it --rm --network host \
    -v /tmp/.X11-unix/:/tmp/.X11-unix \
    -v /tmp/argus_socket:/tmp/argus_socket \
    -v ~/my_apps:/dli/task/my_apps \
    -v $PWD/models:/models \
    --device /dev/video0 \
    nvcr.io/nvidia/deepstream:6.3-gc-triton-devel
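
A quick sanity check on broker reachability may help here; this is my own suggestion, not part of the original post. Since the container uses --network host and the broker publishes 5672 on the host, the broker should be reachable from inside the DeepStream container at localhost:5672. A minimal probe, assuming python3 is available in the container:

python3 -c "import socket; socket.create_connection(('localhost', 5672), timeout=5); print('broker reachable on 5672')"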

This is how I’m running the RabbitMQ container:

version: '3'
services:
  rabbitmq:
    image: rabbitmq:3.12-management
    container_name: analytics-server
    ports:
      - 15672:15672
      - 5672:5672
      - 5671:5671
networks:
  messaging:
    driver: bridge

This is how I’m setting up the exchanges and queues:

rabbitmqadmin declare exchange name=edge_exchange type=topic
rabbitmqadmin declare queue name=edge_queue durable=true
rabbitmqadmin declare binding source=edge_exchange destination_type="queue" destination=edge_queue routing_key=edge.routing_key
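
To verify the exchange, queue, and binding independently of DeepStream, a hand-published message should show up in edge_queue. A minimal sketch using the same rabbitmqadmin tool (the names match the declarations above; adjust if yours differ):

rabbitmqadmin publish exchange=edge_exchange routing_key=edge.routing_key payload="test message"
rabbitmqadmin list bindings
rabbitmqadmin get queue=edge_queue

If the hand-published message is not delivered either, the problem is on the broker/binding side rather than in the DeepStream configuration.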
1. If using filesink or eglsink, can you see the bounding boxes? Wondering whether detection works.
2. If using the broker, can you share more client logs?
3. You can test the code in /opt/nvidia/deepstream/deepstream/sources/libs/amqp_protocol_adaptor to check whether the configuration works.
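
Regarding item 3, a rough outline of how that check might be run; the exact file names and build steps are assumptions on my side, so treat the README in that directory as authoritative:

cd /opt/nvidia/deepstream/deepstream/sources/libs/amqp_protocol_adaptor
ls
cat README   # assumed to document the test application, build steps, and cfg file format
make         # assuming a Makefile for the test program is provided, as with the other adaptor samples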

Yes, the detections are visible. I have two sinks, one for RTSP and one for saving the output file, and both are working.

I’ll attach a file below

I tried this and it didn’t work for me.

ds_log.txt (19.3 KB)
These are the DeepStream logs.

Sorry for the late reply. From the configuration, it seems you did not set msg-broker-conn-str? The nvmsgconv plugin and the low-level library are open source. Please add logging in generate_event_message/generate_dsmeta_message to check whether these functions are called; these functions are used to generate the JSON string.
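
For reference, a minimal sketch of what setting the connection string in [sink2] could look like for the AMQP adaptor, assuming the host;port;username format used by libnvds_amqp_proto (the password stays in the broker config file referenced by msg-broker-config):

msg-broker-conn-str=localhost;5672;guest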