How to transmit inference results and object metadata to an external application in DeepStream 6.3

Hi,
I want to send the inference results to an external application. How do I achieve that?

• Hardware Platform (Jetson / GPU) - jetson orin NX developer kit
• DeepStream Version - 6.3
• JetPack Version (valid for Jetson only) - 5.1.2

rtsp_stream.txt

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720

[source0]
enable=1
type=4
uri=rtsp://admin:Netcon%40123@192.168.10.200:8554/cam/realmonitor?channel=1&subtype=0
num-sources=1
latency=2000

[sink0]
enable=1
type=2
sync=0

[osd]
enable=1
border-width=2
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0

[streammux]
live-source=1
batch-size=1
batched-push-timeout=40000
width=1280
height=720

[primary-gie]
enable=1
model-engine-file=../../models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
batch-size=1
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
gie-unique-id=1
config-file=config_infer_primary_yoloV8.txt

[tests]
file-loop=0

config_infer_primary_yoloV8.txt
[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0
custom-network-config=yolov8s.cfg
model-file=yolov8s.wts
model-engine-file=model_b1_gpu0_fp32.engine
#int8-calib-file=calib.table
labelfile-path=labels.txt
batch-size=1
network-mode=0
num-detected-classes=80
interval=0
gie-unique-id=1
process-mode=1
network-type=0
cluster-mode=2
maintain-aspect-ratio=1
symmetric-padding=1
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet

[class-attrs-all]
nms-iou-threshold=0.45
pre-cluster-threshold=0.25
topk=300

Kindly help me understand how to transmit inference results and object metadata to an external application in DeepStream 6.3.

Sorry for the late reply! All inference results are saved in the metadata. DeepStream provides nvmsgconv to convert the metadata to a JSON string and nvmsgbroker to send that string to a message broker. Please refer to
/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/configs/test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt for how to configure a broker sink.
BTW, nvmsgconv and nvmsgbroker are open source.
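As a rough sketch, the broker sink in the test5 sample config looks something like the fragment below. Treat the proto library, connection string, and msgconv config file name as placeholders you must adapt; this is based on the test5 sample, not your exact setup:

```
[sink1]
enable=1
# type=6 selects the message broker sink (nvmsgconv + nvmsgbroker)
type=6
# Schema/conversion settings consumed by nvmsgconv
msg-conv-config=dstest5_msgconv_sample_config.txt
# 0 = full DeepStream schema payload, 1 = minimal payload
msg-conv-payload-type=0
# Broker adapter library, e.g. Kafka (swap in AMQP/MQTT/Azure lib as needed)
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
# Connection string format depends on the adapter; for Kafka: host;port;topic
msg-broker-conn-str=<host>;<port>;<topic>
topic=<topic>
```

You can add this sink alongside your existing [sink0] display sink, so the pipeline renders locally while also publishing object metadata to the broker.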

Thank you, I will try it and let you know.