Video streaming (over Wi-Fi) method with minimum delay/latency


Which protocol should I use to achieve minimum-latency video streaming over Wi-Fi? I tried RTSP as follows:
Transmitter (AGX):
$ ./test-launch "v4l2src device=/dev/video0 ! video/x-raw,format=UYVY,width=1280,height=720 ! nvvidconv ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"

gst-launch-1.0 uridecodebin uri=rtsp:// ! nvvidconv ! 'video/x-raw(memory:NVMM), width=640, height=360' ! nvegltransform ! nveglglessink window-x=1981 window-y=180

and there is around 2-3 seconds of delay. Could I use some other protocol, or other properties and caps in the GStreamer pipeline, to achieve a latency of less than 300 ms?
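For reference, one lower-latency alternative is to skip the RTSP session layer entirely and send the RTP stream directly over UDP. The sketch below assumes the receiver's IP address (192.168.1.100) and port (5000) as placeholders; the caps on the receiver side must match the payloader, and a small rtpjitterbuffer replaces the RTSP client's default buffering:

```shell
# Sender (Jetson AGX): capture, encode, and push RTP/H.264 over plain UDP.
# insert-sps-pps=1 makes the stream decodable even if the receiver joins late.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! video/x-raw,format=UYVY,width=1280,height=720 \
  ! nvvidconv ! nvv4l2h264enc maxperf-enable=1 insert-sps-pps=1 \
  ! h264parse ! rtph264pay pt=96 config-interval=1 \
  ! udpsink host=192.168.1.100 port=5000

# Receiver: depacketize and decode with hardware acceleration.
# rtpjitterbuffer latency=100 trades 100 ms of buffering for smoothness;
# sync=false renders frames as soon as they are decoded.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" \
  ! rtpjitterbuffer latency=100 ! rtph264depay ! h264parse \
  ! nvv4l2decoder ! nvegltransform ! nveglglessink sync=false
```

These pipelines are a sketch for the hardware described above (Jetson with the nv* plugins); on a lossy Wi-Fi link you may need to raise the jitter-buffer latency.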

Discussion in this topic thread may help:

Please take a look.

Hi @DaneLLL ,

Many thanks.

Are these the correct pipelines to test minimum delay?

$ ./test-launch "v4l2src device=/dev/video0 ! video/x-raw,format=UYVY,width=1280,height=720 ! nvvidconv ! nvv4l2h264enc maxperf-enable=1 ! h264parse ! rtph264pay name=pay0 pt=96"

gst-launch-1.0 uridecodebin uri=rtsp:// ! nvvidconv ! 'video/x-raw(memory:NVMM), width=640, height=360' ! nvegltransform ! nveglglessink

or should I prefer UDP over RTSP?

We have tried RTSP with the default camera source:

We set latency=500 on rtspsrc and the result looks fine, so RTSP should be OK, although this was done on the r28 release. On r32 we have deprecated nvcamerasrc and the omx plugins, but the result should be the same.
We suggest you check the source latency, to clarify how much of the 2-3 seconds comes from the source. If you have the default camera board, you may also try it as a reference.
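Note that uridecodebin does not expose rtspsrc's latency property on the command line, so to try latency=500 the client pipeline has to name rtspsrc explicitly. A sketch, keeping the elided rtsp:// URL and decoding with the r32-era nvv4l2decoder:

```shell
# RTSP client with an explicit rtspsrc so latency can be set directly.
# rtspsrc's default latency is 2000 ms, which alone accounts for much of
# the observed 2-3 s delay; latency=500 caps the jitter buffer at 500 ms.
gst-launch-1.0 rtspsrc location=rtsp:// latency=500 \
  ! rtph264depay ! h264parse ! nvv4l2decoder \
  ! nvvidconv ! 'video/x-raw(memory:NVMM), width=640, height=360' \
  ! nvegltransform ! nveglglessink
```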

Thanks @DaneLLL

Which is the default camera board?

The board is

It can be connected to TX1, TX2, Xavier developer kits.

Is this the one? I cannot see a link anywhere.

The camera board comes with the TX1/TX2 developer kits; it is not included with the Xavier devkit. Are you able to measure the glass-to-glass latency with your camera, to find out what the delay is without setting latency? Then set latency=500 for a try.
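A simple way to measure glass-to-glass latency is to point the camera at a monitor showing a running millisecond timer, display the received stream next to it, and photograph both windows at once; the difference between the two on-screen timestamps is the end-to-end delay. A hedged sketch of the timer side, using standard GStreamer elements:

```shell
# Display a running timer with millisecond resolution on the monitor
# that the camera is pointed at. timeoverlay renders the stream's
# running time onto each frame.
gst-launch-1.0 videotestsrc pattern=black \
  ! video/x-raw,width=640,height=360,framerate=60/1 \
  ! timeoverlay halignment=center valignment=center font-desc="Sans, 48" \
  ! autovideosink sync=false
```

A single photo of the physical screen then captures both the live timer and the delayed copy arriving through the streaming pipeline.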