Better way to visualise an RTSP stream?

Hi,

I’ve managed to set up an RTSP stream using the DeepStream Python app. However, when I visualise it using VLC, I get roughly 700 ms of latency. The pipeline basically captures the image, converts it to NVIDIA memory, encodes it, sends it to udpsink, and then runs an RTSP server on top. Just wondering if there’s any other way we can visualise the RTSP stream? Our use case requires very low latency, and 700 ms is not really acceptable for us… Thanks!
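
For context, the sending side is roughly equivalent to the gst-launch sketch below (the camera device, resolution, multicast address and encoder settings are placeholders rather than our exact configuration; the RTSP server then serves a udpsrc on the same port):

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1280,height=720,framerate=30/1 ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! nvv4l2h264enc insert-sps-pps=true iframeinterval=30 ! h264parse ! rtph264pay pt=96 ! udpsink host=224.224.255.255 port=5400 sync=false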

Hi,
You may try running a GStreamer pipeline. On an x86 PC, you can try:

gst-launch-1.0 rtspsrc location=rtsp://<server-ip>:8554/test ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink sync=false

Thanks for the quick reply. Sorry, I should have made it clearer. What we need is to stream a USB camera and send it through an Ethernet cable to a monitor, and the monitor doesn’t have the capability to run commands or scripts. So it would be like a normal screen used to view an IP camera…

Hi,
Usually we connect the monitor to a hardware device that can do H.264/H.265 decoding. For example, if the PC has an x86 CPU + NVIDIA GPU, you can install certain packages to enable software or hardware decoding (a rough example is sketched below). If it is only a monitor, it may not be able to decode the H.264/H.265 stream. We don’t quite understand your use case. Please share more information.
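
On such a setup, if the GStreamer nvcodec plugin is available, a hardware-decoding viewer pipeline could look roughly like this sketch (the URL is a placeholder and the exact elements depend on the installed GStreamer and driver packages):

gst-launch-1.0 rtspsrc location=rtsp://<server-ip>:8554/test latency=0 ! rtph264depay ! h264parse ! nvh264dec ! glimagesink sync=false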

Hi,

We have a surveillance monitor that can display an IP camera given its RTSP address. Now we want to mimic that IP camera with a Jetson Nano GStreamer RTSP stream. Just wondering if that’s possible to achieve?

Hi,
You may try test-launch to launch an RTSP server. Please check the Jetson Nano FAQ:
Q: Is there any example of running RTSP streaming?

If the DeepStream pipeline can be launched through gst-launch-1.0, you can apply it to test-launch (see the sketch below). Several posts about DeepStream pipelines:
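
As a rough illustration only (the camera device, caps and encoder settings below are assumptions, not a verified configuration), applying a USB-camera pipeline to test-launch looks like:

./test-launch "v4l2src device=/dev/video0 ! video/x-raw,width=1280,height=720,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM),format=NV12 ! nvv4l2h264enc insert-sps-pps=true ! h264parse ! rtph264pay name=pay0 pt=96"

test-launch serves the stream at rtsp://<jetson-ip>:8554/test by default, so the surveillance monitor can be pointed at that address like any other IP camera.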