Lag in RTSP with CSI Camera using Deepstream

Continuing the discussion from Unable to run inference pipeline using csi camera source in deepstream python app:

Hello, I thought I’d raise an issue from a previous topic (please see the link above).

Looking at the code provided by zhliunycm2, the RTSP stream in the VLC player exhibits heavy lag. I’m wondering whether this is due to the VLC player or to the RTSP settings in the deepstream_test Python files. Does anyone have any recommendations on what I should look at or adjust in that code to make it run more smoothly?

I’m using a 4GB Jetson Nano with a Raspberry Pi Camera V2.

Thank You.

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name — which plugin or which sample application — and the function description.)

• Hardware Platform (Jetson / GPU) - Jetson Nano Developer Kit 4GB/Maxwell GPU
• DeepStream Version - 6.0
• JetPack Version (valid for Jetson only) - 4.6.1-b110
• TensorRT Version - 4.6.1-b110
• NVIDIA GPU Driver Version (valid for GPU only) - CUDA driver and runtime version - 10.2
• Issue Type (questions, new requirements, bugs) - Question (why do I see lag in the real-time inference stream in the VLC player with the DeepStream SDK?)
• How to reproduce the issue? - Essentially I am following the Building Video Applications using DeepStream with Jetson Nano course, specifically tutorial 7 with DSWebcam.ipynb. The files are deepstream_test_1_usb.py (changed) and dstest1_pgie_config.txt (unchanged). I’ve only changed the deepstream_test_1_usb.py code to match the one in the link at the top of this thread, so that it works with the CSI Raspberry Pi Camera V2.
• Requirement details - The idea is to run resnet10 and perform live inference on the Jetson Nano using the Raspberry Pi Camera V2, but I’m seeing quite a lot of lag in the VLC player and I’m unsure why that happens.
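For reference, the CSI-camera change from the linked topic amounts to swapping the USB source (v4l2src) for the Argus CSI source at the front of the pipeline. The sketch below builds the element chain as a gst-launch-style description string; element names follow the DeepStream samples, but the resolution and framerate values are assumptions — adjust them to your camera mode. This is a sketch of the wiring, not the exact Python from that thread.

```python
# Sketch of the CSI front-end used in place of v4l2src in
# deepstream_test_1_usb.py. Element and property names follow the
# DeepStream sample apps; width/height/fps defaults are assumptions.
def build_csi_pipeline_desc(width=1280, height=720, fps=30):
    parts = [
        # nvarguscamerasrc reads the Raspberry Pi Camera V2 via Argus
        "nvarguscamerasrc",
        # Caps keep the buffers in NVMM (GPU) memory as NV12
        f"video/x-raw(memory:NVMM),width={width},height={height},"
        f"framerate={fps}/1,format=NV12",
        # nvvidconv handles any conversion the source requires
        "nvvidconv",
        # nvstreammux batches frames for the inference engine
        f"mux.sink_0 nvstreammux name=mux batch-size=1 "
        f"width={width} height={height}",
        # nvinfer runs the resnet10 model from dstest1_pgie_config.txt
        "nvinfer config-file-path=dstest1_pgie_config.txt",
    ]
    return " ! ".join(parts)

print(build_csi_pipeline_desc())
```

In the actual Python app these are created with `Gst.ElementFactory.make` and linked element by element, but the chain is the same.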

Thanks

What is the lag? Is it the delay between the camera and the VLC player? How did you measure the lag?

The lag is between the camera and the VLC player. I noticed this because the frame-number parameter (which I print in a Jupyter notebook) during real-time streaming is larger than the frame number shown in the VLC player. I’ve not been able to measure it in software (only with a timer), but it’s pretty evident that it is larger than 2 seconds.

However, I’ve managed to decrease it by reducing rtp-caching to 0, but it’s not perfect, and I’ve realized the VLC player cannot achieve near-zero latency.
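For anyone reproducing this, the caching reduction can also be done from the command line with VLC's `--network-caching` option (milliseconds of jitter buffer). The stream address below is a placeholder, assuming the default port and mount point from the DeepStream RTSP samples — substitute your Nano's IP:

```shell
# Hypothetical stream address -- replace <nano-ip> with the Nano's IP;
# 8554/ds-test matches the defaults in the DeepStream RTSP-out samples.
STREAM="rtsp://<nano-ip>:8554/ds-test"

# --network-caching sets VLC's buffer in ms; 0 minimises buffering at
# the cost of smoothness. Even at 0, VLC's demux/decode path still
# adds noticeable delay, which matches what is observed above.
CMD="vlc $STREAM --network-caching=0"
echo "$CMD"
```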

Since this is the case, what do you recommend I use to view the inference output on the Jetson Nano directly, rather than streaming over RTP? If there is another way, could you direct me to the documentation so I can replace the RTP elements in the pipeline with ones that let me display the stream on the Nano’s desktop?

Thank You.

The network is one way to transfer data between two devices, and no protocol can guarantee zero delay in transmission.

Understood, but if I don’t want to transfer the live stream between two devices and only want to view it on the Jetson Nano, would I still need to use RTSP? Or are there other methods to display the video locally on the Jetson device?

Thanks.

No, you don’t need to output an RTSP stream. You can display on screen (if there is a monitor connected to your board), you can save to a video file, or you can simply discard the output, …

We have samples that display the output. To save files or do other things, please study GStreamer yourself; that part is not specific to DeepStream.
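For the on-screen option, the change amounts to replacing the RTSP tail of the pipeline (encoder → RTP payloader → udpsink) with the display sink used in the DeepStream samples. A minimal sketch, assuming the element names from those samples (`nvegltransform` is needed before `nveglglessink` on Jetson); the file-save tail is shown too, and the helper function is hypothetical:

```python
# The RTSP output tail from the test apps, and two common
# replacements. Element names follow the DeepStream samples; treat
# this as a sketch of the substitution, not the exact sample code.
RTSP_TAIL = "nvv4l2h264enc ! rtph264pay ! udpsink"

# On-screen display; Jetson needs nvegltransform before nveglglessink
DISPLAY_TAIL = "nvegltransform ! nveglglessink sync=0"

# Save to an MP4 file instead of displaying
FILE_TAIL = "nvv4l2h264enc ! h264parse ! qtmux ! filesink location=out.mp4"

def swap_tail(pipeline_desc, new_tail, old_tail=RTSP_TAIL):
    """Replace the RTSP branch of a gst-launch-style description."""
    return pipeline_desc.replace(old_tail, new_tail)

desc = ("nvarguscamerasrc ! nvstreammux ! nvinfer ! "
        "nvvideoconvert ! nvdsosd ! " + RTSP_TAIL)
print(swap_tail(desc, DISPLAY_TAIL))
```

In the Python apps the equivalent change is to create and link the display (or file) sink elements instead of the encoder/payloader/udpsink chain, and to drop the GstRtspServer setup entirely.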

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.