How to get the video output using RTSP

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version 6.3
• JetPack Version (valid for Jetson only) 5.1.3
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)

• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)

Hello, I have just managed to view the video output with bounding boxes over the RTSP protocol, using both an Ethernet camera and NVIDIA’s sample video.
My next step is to view this video (with bounding boxes) in another player (VLC or OpenCV).

Can you give me any advice on how to achieve this?

• Requirement details (This is for a new requirement. Include the module name, i.e. for which plugin or which sample application, and the function description.)

  1. Which sample are you testing or referring to? Do you mean that, with an RTSP source or a local file as the input source, you can see the output video with bounding boxes?
  2. What do you mean by “getting this video (with bbox) to see other player”? Could you share the media pipeline and the use case? Thanks!

Thanks for your quick response.

I’m rephrasing my question.

  1. The camera (connected via Ethernet) is sending video to the NVIDIA Orin over the RTSP protocol.
  2. On the Orin, when I run ‘deepstream-app -c /source4…’, it shows the live video with bounding boxes.
  3. On the Orin, I plan to feed the video with bounding boxes into another system such as ROS.
    How can I feed the video with bounding boxes into another system?
  1. Set type=4 in the [sinkx] group of /source4…. deepstream-app will then set up an RTSP server, and the third-party system can pull the RTSP stream generated by deepstream-app. A sketch of such a sink group is shown after this list.
  2. If you want to use another protocol, the deepstream-app code is open source; you can modify create_sink_bin in /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_sink_bin.c to customize it.
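
For reference, a minimal sketch of what such a sink group could look like; the group index, port, codec, and bitrate values are example placeholders, not values taken from your actual config file:

[sink1]
enable=1
# type=4 selects RTSP streaming output in deepstream-app
type=4
# codec: 1=H.264, 2=H.265
codec=1
bitrate=4000000
# the stream is published at rtsp://<orin-ip>:<rtsp-port>/ds-test
rtsp-port=8554
# UDP port used internally between the encoder and the RTSP server
udp-port=5400

With a group like this enabled, a third-party system (for example a ROS node wrapping an RTSP client) can simply connect to that URL.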

Sorry for asking a lot…

Regarding your answer 1, is there any way to view the RTSP stream generated by deepstream-app with a different player tool (VLC, etc.)?

*** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***
As the log shows, once DeepStream has started, the app prints the RTSP address. The port is configurable in [sinkx]. Any player that supports the RTSP protocol can play this address.
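
A few example client commands, assuming the default port 8554 and mount point /ds-test from the log above, and replacing <orin-ip> with the Orin’s address (all three are standard RTSP-capable players, not DeepStream-specific tools):

vlc rtsp://<orin-ip>:8554/ds-test
ffplay -rtsp_transport tcp rtsp://<orin-ip>:8554/ds-test
gst-launch-1.0 playbin uri=rtsp://<orin-ip>:8554/ds-test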

Hello,
I’m trying to speed up the RTSP video streaming using ‘sync’, because a 2-second delay occurs.
But the manual mentions that when I use RTSP, sync=1 (synchronous) is unavailable.

Is there any advice on how to speed up the RTSP video streaming?
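
For context, the sync flag referred to here lives in the [sinkx] group of the config file; a minimal sketch, with the group index and remaining keys depending on your setup:

[sink0]
enable=1
# type=4 is the RTSP streaming sink
type=4
# sync=0 pushes frames out as fast as possible; sync=1 paces output to the pipeline clock,
# which, per the manual cited above, is reported as unavailable for the RTSP sink
sync=0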

  1. If you use nveglglessink, does the output video have a delay?
  2. If not, the issue is related to the RTSP server and the player. To rule out a network issue, if you play the output RTSP stream on the same machine that is running DeepStream, does the player still have a 2-second delay? See the test pipeline sketched after this list.
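
A sketch of such a local playback test, assuming the stream is H.264 and uses the default rtsp-port 8554 and mount point /ds-test printed in the log earlier; latency=0 and sync=false minimize client-side buffering:

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/ds-test latency=0 ! rtph264depay ! h264parse ! nvv4l2decoder ! nv3dsink sync=false
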
  1. There’s no nveglglessink in my DeepStream 6.3 setup.
  2. I’m using DeepStream 6.3 (command: deepstream-app -c ./source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt).
  3. When I execute the command, it displays the camera video with bounding boxes around the objects, but there is a 2-second delay.
  1. Did you add latency in the source setting?
  2. To narrow down this issue, you can enable only streammux to check whether the delay still exists.
  3. Please refer to this FAQ to measure the latency; a sketch of the typical steps is shown after this list.
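
The FAQ itself is not quoted in this thread, but as a hedged sketch, deepstream-app latency measurement is typically enabled through environment variables before launching the app (treat the exact variable names as assumptions to verify against the FAQ):

# print overall pipeline latency per frame (assumed variable name; confirm against the FAQ)
export NVDS_ENABLE_LATENCY_MEASUREMENT=1
# optionally also print per-component latency
export NVDS_ENABLE_COMPONENT_LATENCY_MEASUREMENT=1
deepstream-app -c ./source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt
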
  1. I didn’t set the latency.
  2. When I play the default video (sample_1080p_h264.mp4) it doesn’t happen, but when I set the RTSP source [source0: type=4], there is a 3-second delay.

Hello, can you reply to the question below?
Thanks!

  1. Could you share source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt?
  2. Does the following command still have a 3-second delay?
gst-launch-1.0 rtspsrc location=rtsp://xxx ! rtph264depay ! h264parse ! nvv4l2decoder ! nv3dsink
  1. Please refer to my last comment. Did you check points 2 and 3?
  1. File:
    source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt (6.0 KB)

  2. There is no delay when I use the gst-launch command. The delay only happens when using the DeepStream resnet_tracker config (the file above).

  3. I’ll check it and let you know soon.

  1. I just ran the command and still have a 2-3 second delay.
    (You can compare the left and right sides by watching the walking people and the bus.)


  2. I just checked the latency. Here is my screenshot.

  1. From the screenshot, the latency of the whole pipeline is 144 ms, which is minimal.
  2. Regarding “I just ran the command and still have a 2-3 second delay”: yes, I can see the right side has some delay compared to the left. Is the left side the result of “deepstream-app -c source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt”? What is the launch command line of the right side? Was the right side playing the RTSP URL of the camera, or the one generated by the left?
  3. If the right side was playing the RTSP URL generated by the left, did you try other players? See the low-latency playback sketch after this list.
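
If the right side is a third-party player pulling the deepstream-app stream, part of a 2-3 second gap can come from the player’s own network buffering; a hedged sketch of playback with reduced client-side caching (the cache values are examples, and the URL assumes the default port and mount point):

# VLC with a smaller network cache (in milliseconds)
vlc --network-caching=200 rtsp://127.0.0.1:8554/ds-test
# ffplay with input buffering minimized
ffplay -fflags nobuffer -flags low_delay -rtsp_transport tcp rtsp://127.0.0.1:8554/ds-test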

Dear Fanzh,

Thanks for considering my questions.
I solved this problem by setting ‘latency=0’ in my source4…txt; a sketch of the relevant source group follows.
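
For anyone hitting the same delay, a sketch of where that key goes in the config; the URI and other values here are placeholders rather than lines copied from the attached file:

[source0]
enable=1
# type=4 is an RTSP source in deepstream-app
type=4
uri=rtsp://<camera-ip>:<port>/<stream>
num-sources=1
# jitter-buffer latency (ms) of the underlying rtspsrc; 0 removes the extra buffering delay
latency=0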

Thanks for the update!
