Stream the output displayed on the NVOSD window to an RTSP server

• Hardware Platform (Jetson / GPU) = Jetson Xavier Nx
• DeepStream Version = 6.1.1
• JetPack Version (valid for Jetson only) = 35.1.0
• TensorRT Version = 8.4.1.5
• CUDA Version (valid for GPU only) = 11.4
• Issue Type( questions, new requirements, bugs) = Question

Hi,
I am using the Python bindings of DeepStream to perform detection and tracking on a video.
For detection, I am using a custom YOLOv5 model, and for tracking, I am using the DeepStream tracker plugin.
(I am following this repo: "NVIDIA DeepStream SDK 6.1 / 6.0.1 / 6.0 configuration for YOLO-v5 & YOLO-v7 models" on GitHub to perform detection and tracking using the DeepStream pipeline.)

I can observe seamless detection of vehicles with their corresponding tracking IDs on the display window, as shown below.

[Screenshot: OSD window showing detected vehicles with bounding boxes and tracking IDs]

My target:
Stream the output displayed on this OSD window to an RTSP server.

I tried:
To achieve the target,
I am sending the inference-output frames (i.e., ndarrays) to Redis, then receiving and decoding them on the other end.
However, converting each ndarray to a list is slow and causes lag in the inferencing.
Also, because inferencing runs in a thread in the Python bindings, Redis has to be initialized inside that thread as well.
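For what it's worth, much of that lag likely comes from the list conversion itself: `tolist()` boxes every pixel into a Python object, whereas serializing the raw bytes keeps the data in one contiguous buffer. A minimal sketch of the difference using the stdlib `array` module as a stand-in for an ndarray (with NumPy, the equivalents are `ndarray.tobytes()` and `np.frombuffer()`; the frame size here is an illustrative assumption):

```python
import array

# Simulated flattened uint8 frame (assumption: stand-in for a decoded frame).
frame = array.array("B", range(256)) * 100  # 25,600 "pixels"

# Slow path (what tolist() does): one Python int object per pixel.
as_list = frame.tolist()

# Fast path: a single contiguous copy; send these bytes over the wire.
payload = frame.tobytes()

# Receiver side: rebuild without any per-element work.
restored = array.array("B")
restored.frombytes(payload)

print(restored == frame)          # True
print(len(payload) == len(frame)) # True: one byte per pixel, no boxing
```

This only reduces the serialization cost, though; it does not remove the GPU-to-CPU copy or the Redis dependency.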

I am looking for a way to stream my inference output to an RTSP server without the dependency on Redis.

Looking forward to some insights on this.
Thank you.

You can refer to the link below to learn how to send the stream to an RTSP server.
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/blob/master/apps/deepstream-test1-rtsp-out/deepstream_test1_rtsp_out.py


When you convert the frames into an ndarray, you are performing a memory copy from GPU to CPU, which can be very costly, especially for high-resolution frames. This operation should not be done frequently in the buffer probe.


As both issues have been explained, this topic is closed. If you need further support, please open a new topic. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.