How to use the GST-nvdsanalytics plugin in Deepstream-YOLO?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson)
• DeepStream Version: 6.3
• JetPack Version: 5.1.2 (valid for Jetson only)
• TensorRT Version: 5.1.2
• NVIDIA GPU Driver Version (valid for GPU only)


I’m currently using DeepStream-Yolo for road traffic detection. My configuration files, config_infer_primary_yoloV8.txt and deepstream_app_config.txt, are as follows:

config_infer_primary_yoloV8.txt

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0
onnx-file=yolov8n.pt.onnx
model-engine-file=model_b1_gpu0_fp32.engine
#int8-calib-file=calib.table
labelfile-path=labels.txt
batch-size=1
network-mode=0
num-detected-classes=80
interval=0
gie-unique-id=1
process-mode=1
network-type=0
cluster-mode=2
maintain-aspect-ratio=1
symmetric-padding=1
#workspace-size=2000
parse-bbox-func-name=NvDsInferParseYolo
#parse-bbox-func-name=NvDsInferParseYoloCuda
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet

[class-attrs-all]
nms-iou-threshold=0.45
pre-cluster-threshold=0.25
topk=300

deepstream_app_config.txt

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=3
uri=file:///home/cvai/ultralytics/main/video/LT_cloudy_morning_test_sub.mp4
num-sources=1
cudadec-memtype=0
gpu-id=0


[sink0]
enable=1
type=2
sync=0
gpu-id=0
nvbuf-memory-type=0

[osd]
enable=1
gpu-id=0
border-width=5
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=0
batch-size=1
batched-push-timeout=40000
width=1280
height=720
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV8.txt

[tests]
file-loop=0

Then, following the DeepStream-Yolo instructions, I run the command deepstream-app -c deepstream_app_config.txt, and it works.

But I want to use the GST-nvdsanalytics plugin in DeepStream-Yolo for ROI filtering, overcrowding detection, direction detection, line crossing, and more.

I didn’t find documentation on how to use it. Do I need to modify
/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-app/deepstream_app_main.cpp
and then adjust the configuration files, or can I enable this functionality just by adding entries to config_infer_primary_yoloV8.txt and deepstream_app_config.txt?

How exactly do I implement these features in DeepStream-Yolo? Who can help me?

Please see the note "The nvdsanalytics plugin can also be used along with reference deepstream-app…" in /opt/nvidia/deepstream/deepstream-7.1/sources/apps/sample_apps/deepstream-nvdsanalytics-test/README.
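In case it helps, here is a minimal sketch of the two pieces involved, modeled on the deepstream-nvdsanalytics-test sample config. The file name, coordinates, and labels below are placeholders to adapt to your own stream; the -stream-0 suffix refers to source index 0, and a [tracker] group also needs to be enabled, because line crossing and direction detection rely on tracked object IDs. In deepstream_app_config.txt, add:

[nvds-analytics]
enable=1
config-file=config_nvdsanalytics.txt

And in config_nvdsanalytics.txt, something along these lines:

[property]
enable=1
config-width=1280
config-height=720
osd-mode=2
display-font-size=12

[roi-filtering-stream-0]
enable=1
# ROI polygon as x;y vertex pairs (placeholder values)
roi-RF=295;643;579;634;642;913;56;828
inverse-roi=0
# -1 applies to all classes
class-id=-1

[overcrowding-stream-0]
enable=1
roi-OC=295;643;579;634;642;913;56;828
# object count that triggers the overcrowding flag
object-threshold=3
class-id=-1

[direction-detection-stream-0]
enable=1
# two points defining the direction vector for the "South" label
direction-South=284;840;360;662
class-id=-1

[line-crossing-stream-0]
enable=1
# first two points give the crossing direction, last two define the line
line-crossing-Entry=1072;911;1143;1058;944;1020;1297;1020
class-id=0
extended=0
mode=loose

The plugin attaches its per-frame results (ROI counts, overcrowding status, directions, line-crossing counts) as NvDsAnalyticsFrameMeta and NvDsAnalyticsObjInfo user metadata; the probe in deepstream-nvdsanalytics-test shows how to read them.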

Thank you very much for your reply. Following your comments, I got it working!

But I have a new problem. I have DeepStream run detection on the video and output the result as an RTSP stream. When I check the output via VLC (rtsp://localhost:8554/ds-test), it plays successfully.
However, what I actually want is playback in the browser, so I deployed a MediaMTX server to relay the stream to the web. The problem is that the RTSP stream output by DeepStream seems to contain B-frames, which WebRTC does not accept.
How do I get the output RTSP stream to contain no B-frames? Which parameter should I configure to fix this?

It has run successfully!

(base) cva@ubuntu:~/ultralytics/DeepStream-Yolo-master$ deepstream-app -c deepstream_app_file_test_config.txt

 *** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***

gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
0:00:05.690593861  4667 0xaaaae2a4a4c0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1988> [UID = 1]: deserialized trt engine from :/home/cvai/lintao/ultralytics/DeepStream-Yolo-master/model_b1_gpu0_fp32.engine
WARNING: [TRT]: The getMaxBatchSize() function should not be used with an engine built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. This function will always return 1.
INFO: [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input           3x640x640
1   OUTPUT kFLOAT output          8400x6

0:00:05.910875977  4667 0xaaaae2a4a4c0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2091> [UID = 1]: Use deserialized engine model: /home/cvai/lintao/ultralytics/DeepStream-Yolo-master/model_b1_gpu0_fp32.engine
0:00:05.920582026  4667 0xaaaae2a4a4c0 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/home/cvai/lintao/ultralytics/DeepStream-Yolo-master/config_infer_primary_yoloV8.txt sucessfully

Runtime commands:
	h: Print this help
	q: Quit

	p: Pause
	r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.


**PERF:  FPS 0 (Avg)
**PERF:  0.00 (0.00)
** INFO: <bus_callback:239>: Pipeline ready

Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NvMMLiteBlockCreate : Block : BlockType = 261
** INFO: <bus_callback:225>: Pipeline running

**PERF:  49.46 (43.91)
**PERF:  58.11 (45.66)
**PERF:  59.91 (47.24)
**PERF:  57.28 (48.11)
**PERF:  41.02 (47.46)
**PERF:  44.42 (47.25)
**PERF:  42.41 (46.92)

I checked whether the RTSP stream contains B-frames, and it does:

ffprobe -v error -show_frames "rtsp://localhost:8554/ds-test" | grep "pict_type=B"

pict_type=B
pict_type=B
pict_type=B
pict_type=B
pict_type=B
pict_type=B
pict_type=B

This is the output when I run the MediaMTX streaming server:

./mediamtx
2025/03/09 18:50:55 INF MediaMTX v1.11.3
2025/03/09 18:50:55 INF configuration loaded from /home/cvai/lintao/ultralytics/mediamtx_v1.11.3_linux_arm64v8/mediamtx.yml
2025/03/09 18:50:55 INF [path deepstream] [RTSP source] started
2025/03/09 18:50:55 INF [RTSP] listener opened on :8555 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2025/03/09 18:50:55 INF [RTMP] listener opened on :1935
2025/03/09 18:50:55 INF [HLS] listener opened on :8888
2025/03/09 18:50:55 INF [WebRTC] listener opened on :8889 (HTTP), :8189 (ICE/UDP)
2025/03/09 18:50:55 INF [SRT] listener opened on :8890 (UDP)
2025/03/09 18:50:55 INF [path deepstream] [RTSP source] ready: 1 track (H264)
2025/03/09 18:51:11 WAR [path deepstream] [RTSP source] 3 RTP packets lost
2025/03/09 18:51:24 INF [WebRTC] [session aaa42eee] created by 192.168.101.153:58894
2025/03/09 18:51:25 INF [WebRTC] [session aaa42eee] peer connection established, local candidate: host/udp/127.0.0.1/8189, remote candidate: prflx/udp/192.168.101.153/51810
2025/03/09 18:51:25 INF [WebRTC] [session aaa42eee] is reading from path 'deepstream', 1 track (H264)
2025/03/09 18:51:25 INF [WebRTC] [session aaa42eee] closed: WebRTC doesn't support H264 streams with B-frames

How do I remove B-frames from the RTSP stream output by DeepStream? Can this be resolved with configuration parameters? If so, which of the following parameters should I modify?

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=1

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=2
uri=file:///home/cvai/lintao/ultralytics/main/video/LT_cloudy_morning_sub_output2.mp4
#uri=file:///home/cvai/lintao/ultralytics/main/video/traffic.mp4
num-sources=1
num-extra-surfaces=10
drop-frame-interval=0
nvbuf-memory-type=0
gpu-id=0

[sink0]
#enable=1
#type=2
#sync=0
#gpu-id=0
#nvbuf-memory-type=0

enable=1
type=4
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0
rtsp-port=8554
udp-port=5400
codec=1
profile=0
width=1280
height=720
iframeinterval=30
bitrate=2000000
udp-buffer-size=100000
enc-type=1

[tracker]
enable=1
tracker-width=640
tracker-height=640
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
ll-config-file=/home/cvai/lintao/ultralytics/DeepStream-Yolo-master/config_tracker_NvDCF_perf.yml
gpu-id=0
display-tracking-id=1
user-meta-pool-size=256

[osd]
enable=1
gpu-id=0
border-width=5
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=1
clock-x-offset=1000
clock-y-offset=30
clock-text-size=15
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=1
batch-size=1
batched-push-timeout=40000
width=1280
height=720
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV8.txt

[tests]
file-loop=0

[nvds-analytics]
#unique-id=0
enable=1
config-file=config_nvdsanalytics_yolov8.txt


I tried setting profile=0 to see whether that would help, but the same problem still occurs!

From "enc-type=1", the application is using the software encoder x264enc. Please set bframes=0 for x264enc in create_udpsink_bin of /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_sink_bin.c.

Thank you for your patient answer!
I followed your path to check the file, and it does exist, but there is no parameter ‘bframes’ in create_udpsink_bin.

Also, there are two folders in /opt/nvidia/deepstream, which one should I modify? I checked deepstream_sink_bin.c in both folders with the same path, and I couldn’t find the parameter ‘bframes’.

I have uploaded /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_sink_bin.c.

deepstream_sink_bin.txt (30.8 KB)

If I change the code in deepstream_sink_bin.c, do I need to recompile it? How do you do that?

On Jetson, it seems that enc-type can only be set to 1!

In create_udpsink_bin, add the new code shown below, then rebuild deepstream-app according to /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-app/README.

    //bitrate is in kbits/sec for software encoder x264enc and x265enc
    g_object_set (G_OBJECT (bin->encoder), "bitrate", config->bitrate / 1000,
        NULL);
    // new code: disable B-frames so the encoded stream is WebRTC-compatible
    g_object_set (G_OBJECT (bin->encoder), "bframes", 0, NULL);
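For the rebuild, here is a sketch of the steps from that README, assuming DeepStream 6.3 on JetPack 5.1.2 (where CUDA_VER is 11.4); deepstream_sink_bin.c from apps-common is compiled as part of the deepstream-app build, so rebuilding the app picks up the change:

cd /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-app
export CUDA_VER=11.4
make
sudo make install

After reinstalling, rerun the pipeline and repeat the earlier ffprobe check; once x264enc runs with bframes=0, no pict_type=B lines should appear.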

I think I should change /opt/nvidia/deepstream/deepstream-6.3/sources/apps/apps-common/src/deepstream_sink_bin.c.

Also, /opt/nvidia/deepstream/deepstream/ seems to be a symlink, although it also contains the source code.

Or should I just change /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_sink_bin.c?

You really are a lifesaver!
Following your instructions, I modified /opt/nvidia/deepstream/deepstream-6.3/sources/apps/apps-common/src/deepstream_sink_bin.c and it compiled! Now it plays on the web! Thank you very much again!

Glad to know you fixed it, thanks for the update! If you need further support, please open a new topic. Thanks.