DeepStream output file metadata and metrics

• Hardware Platform (Jetson / GPU) NVIDIA L4-12Q
• DeepStream Version 7.1
• TensorRT Version 10.3.0.26
• NVIDIA GPU Driver Version (valid for GPU only) 550.54.14

Hello,

I’m currently using DeepStream 7.1 with a deepstream_app configuration file (deepstream_appv8_config.txt).

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5

[tiled-display]
enable=1
rows=1
columns=1
width=640
height=640
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=4
uri=rtsp://url:port/mystream
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=200

[sink0]
enable=1
type=1
sync=1
gpu-id=0
nvbuf-memory-type=0

[sink2]
enable=1
type=4
gpu-id=0
rtsp-port=8554
enc-type=1
#udp-port=5400
bitrate=8000000
profile=0
iframeinterval=10
nvbuf-memory-type=0

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=1
batch-size=1
batched-push-timeout=40000
width=640
height=640
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yolov8.txt
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1

The pipeline is working correctly and outputs an RTSP stream as expected.

However, I would like to enable the output of detection metadata and metrics, such as:

  • Bounding box information
  • Object coordinates
  • Class IDs, confidence scores
  • Any other inference-related data

My goals are:

  1. To generate a file (e.g., JSON or CSV) at runtime containing these detection results.
  2. Nice to have: to publish this metadata to a message broker such as MQTT or Kafka.

Could someone please clarify:

  • Which sections (e.g., [message-converter], [message-broker], [tracker]) and parameters do I need to add to the config file to enable this?
  • Is it possible to write a local file and publish metadata to MQTT/Kafka at the same time using deepstream-app?
  • Do I need to write a custom probe function, or can this be configured entirely via the config file?

Thanks in advance for your help!

Best regards,

For the msgbroker and msgconv, you can refer to our sample sources\apps\sample_apps\deepstream-test5\configs.
For producing a local file, you can simply set gie-kitti-output-dir=streamscl in the config file.

Hello,
I followed the guide that you provided and I am now able to produce messages on Kafka topics.
However, it is not clear how to interpret the format of the output posted on the topic.

I am sending you an example of the output that I am getting:

kafka:

{
  "version": "4.0",
  "id": "150",
  "@timestamp": "2025-06-26T08:23:05.368Z",
  "sensorId": "0",
  "objects": [
    "18446744073709551615|0.871582|38.8585|636.922|598.878|Defected"
  ]
}
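In case it helps to decode these entries downstream, here is a minimal stdlib-only Python sketch. Note the field order (tracking ID, confidence, bbox values, class label) is inferred from this one sample, not from the schema definition, so it should be verified against the nvmsgconv source:

```python
# Sketch: parse one entry from the "objects" array of DeepStream's
# minimal payload (payload-type=1). Field layout is ASSUMED from the
# sample above: tracking id | confidence | bbox values | class label.
def parse_minimal_object(obj_str):
    fields = obj_str.split("|")
    return {
        "tracking_id": int(fields[0]),
        "confidence": float(fields[1]),
        "bbox": [float(v) for v in fields[2:-1]],  # coordinate values
        "label": fields[-1],
    }

sample = "18446744073709551615|0.871582|38.8585|636.922|598.878|Defected"
parsed = parse_minimal_object(sample)
print(parsed["label"], parsed["confidence"])
```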

I would like to extract all the metadata included in the files created in the folder set by the "gie-kitti-output-dir" parameter and publish it to Kafka.

example of output in gie-kitti-output-dir:

Defected 0.0 0 0.0 49.589722 41.170227 640.000000 603.489441 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.823895
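For reference, these lines can be decoded with a short stdlib-only sketch like the one below, assuming the standard 16-field KITTI label layout, in which deepstream-app appears to fill only the label, the bbox, and the trailing confidence score (the rest are zeros):

```python
# Sketch: parse one line of the KITTI-format files that deepstream-app
# writes to gie-kitti-output-dir. Assumes the standard 16-field KITTI
# label layout; only label, bbox and score carry real values here.
def parse_kitti_line(line):
    f = line.split()
    return {
        "label": f[0],
        "bbox": [float(v) for v in f[4:8]],  # left, top, right, bottom
        "confidence": float(f[15]),
    }

line = ("Defected 0.0 0 0.0 49.589722 41.170227 640.000000 603.489441 "
        "0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.823895")
print(parse_kitti_line(line))
```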

My current config file looks like this:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
gie-kitti-output-dir=/opt/nvidia/deepstream/deepstream-7.1/yolo/logs

[tiled-display]
enable=1
rows=1
columns=1
width=640
height=640
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
#type=3
#uri=udp://127.0.0.1:5000
type=4
uri=rtsp://url:port/mystream
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=200

[sink0]
enable=1
type=1
sync=1
gpu-id=0
nvbuf-memory-type=0

[sink2]
enable=1
type=4
gpu-id=0
rtsp-port=8554
enc-type=1
#udp-port=5400
bitrate=8000000
profile=0
iframeinterval=10
nvbuf-memory-type=0

[sink3]
enable=1
type=6
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
msg-broker-conn-str=url;port
topic=detections
msg-conv-config=config_msg_kafka.txt
msg-broker-comp-id=1
msg-conv-comp-id=1
msg-conv-payload-type=1
new-api=1
msg-conv-msg2p-new-api=1

config_msg_kafka.txt:

[msg-conv-msg2p]
enable=1
payload-type=1
msg2p-include-objects=1
msg2p-include-bbox=1
bbox-min-width=1
bbox-min-height=1
frame-interval=1
msg2p-include-extended=1

Could you please help me?

Thanks @yuweiw

You need to modify our source code to send all the data. You can refer to https://forums.developer.nvidia.com/t/how-to-add-custom-data-to-analytics-or-send-data-via-kafka-inside-deep-stream/247944.
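As an interim workaround that avoids touching the C sources, one could post-process the KITTI files from a side script and publish richer JSON messages through a separate Kafka client. A hedged stdlib-only sketch of the payload-building part follows; the actual send call (e.g. via kafka-python, not shown) and the 16-field KITTI layout are assumptions:

```python
import json

# Sketch: turn KITTI lines from gie-kitti-output-dir into JSON payloads
# that a separate Kafka producer (e.g. kafka-python, not included here)
# could publish. Assumes the standard 16-field KITTI layout, where
# deepstream-app fills only the label, the bbox, and the trailing score.
def kitti_to_payload(frame_id, lines):
    objects = []
    for line in lines:
        f = line.split()
        objects.append({
            "label": f[0],
            "bbox": {"left": float(f[4]), "top": float(f[5]),
                     "right": float(f[6]), "bottom": float(f[7])},
            "confidence": float(f[15]),
        })
    return json.dumps({"frame": frame_id, "objects": objects})

payload = kitti_to_payload(150, [
    "Defected 0.0 0 0.0 49.589722 41.170227 640.000000 603.489441 "
    "0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.823895",
])
print(payload)
# A real script would then publish it, e.g.:
#   producer.send("detections", payload.encode("utf-8"))
```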

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.