Question about RTSP Server and client

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) : Jetson Orin
• DeepStream Version : 6.3
• JetPack Version (valid for Jetson only) : 5.1.3
• Issue Type (questions, new requirements, bugs) : questions

Hello, I’m currently studying RTSP communication and planning to build the flow below.

Camera (RTSP server) → NVIDIA DeepStream on Orin (RTSP client)
NVIDIA DeepStream on Orin (RTSP server) → Our system (RTSP client)

Question

  1. Is it possible to configure DeepStream as an RTSP server [sink2] and an RTSP client [source0] simultaneously using ‘source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt’?

  2. Is it possible to send the camera data, object tracker, and bbox info from the Orin to another system over RTSP?

Thank you,


Yes. The source can be set to RTSP when you set “type=4” and “uri=rtsp://xxxxxxxxxx” in the source group (DeepStream Reference Application - deepstream-app — DeepStream 6.4 documentation).
The RTSP output can be enabled with “type=4” and the corresponding settings in the sink group of the same document.
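For example, a minimal sketch of what this could look like in ‘source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt’ (the camera URI, ports and group indices below are placeholders, not values from your file):

[source0]
enable=1
# type 4 = RTSP input (DeepStream acts as RTSP client)
type=4
uri=rtsp://<camera-ip>:554/<stream-path>
num-sources=1
gpu-id=0

[sink2]
enable=1
# type 4 = RTSP streaming output (DeepStream acts as RTSP server)
type=4
codec=1
bitrate=4000000
rtsp-port=8554
udp-port=5400
gpu-id=0

Both groups can be enabled in the same config file, so the app can read from the camera over RTSP and re-stream the annotated output over RTSP at the same time.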

Do you mean sending bbox coordinates and tracking IDs, or just sending the video with bboxes drawn on it?

I want to receive both: the video with bboxes drawn on it, and the bbox/tracker ID info.

RTSP cannot carry data such as bbox coordinates; see RFC 2326 - Real Time Streaming Protocol (RTSP) (ietf.org). If you want a custom RTSP payload, you need to design and implement the protocol yourself.

DeepStream supports sending the video with bboxes drawn on it. There is a sample in /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-app; please refer to DeepStream Reference Application - deepstream-app — DeepStream 6.4 documentation.

DeepStream also supports sending object bbox coordinates to a cloud server through a message broker (Kafka, MQTT, …); see Gst-nvmsgbroker — DeepStream 6.4 documentation.
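As a rough sketch (the connection string, topic and msgconv config name below are placeholders; the same keys are used in the deepstream-test5 sample configs), a message-broker sink in the deepstream-app config looks something like this:

[sink3]
enable=1
# type 6 = message broker output (Gst-nvmsgconv + Gst-nvmsgbroker)
type=6
msg-conv-config=msgconv_config.txt
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
msg-broker-conn-str=<broker-ip>;9092
topic=<topic-name>

With this, per-object metadata (bbox coordinates, tracking IDs, class labels) is serialized and published to the broker, while the RTSP sink carries only the rendered video.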

Thank you for answering my question. I have a few additional questions.
My ultimate goal is to verify the camera’s data, such as video (images) and object recognition information, via RTSP communication.

  1. Which camera-related data can I see using the RTSP protocol?

  2. Is there any way to see DeepStream’s output (video with bboxes drawn on it)?

  3. My ‘source4_1080p_dec_infer-resnet_tracker…’ config file doesn’t mention ‘track-output-dir’ at all. Would adding a ‘track-output-dir’ setting allow me to get the tracker information over RTSP?

The RTSP protocol supports video and audio payloads; see RTP payload formats - Wikipedia.

Yes. DeepStream supports outputting RTSP video streams with bboxes and other overlays drawn on the video. If you are using the deepstream-app sample application, you can configure this output with type 4 in the sink group (DeepStream Reference Application - deepstream-app — DeepStream 6.4 documentation).
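Assuming the default deepstream-app mount point and the rtsp-port configured in the sink group (both of which you should verify for your setup), the annotated stream can then be opened from your other system with any RTSP-capable player, for example:

ffplay rtsp://<orin-ip>:8554/ds-test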

Please check the explanation of the different “track-output-dir” configurations in DeepStream Reference Application - deepstream-app — DeepStream 6.4 documentation.
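For reference, and as far as I can tell from the deepstream-app documentation, these are keys in the [application] group that dump tracker results to local files in KITTI format; they do not send tracker data over RTSP (the directory path below is just an example):

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
# write per-frame tracker output (bboxes + tracking IDs) as KITTI text files
kitti-track-output-dir=/tmp/track-output

So adding this option gives you tracker info as files on the Orin, while the bbox/ID metadata itself would still need the message broker path mentioned earlier to reach another system.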

Please note that DeepStream is an SDK that includes many APIs, sample/template plugins, and sample applications.

Since there has been no update from you for a while, we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.