GStreamer tcpserversink 2-3 seconds of latency

Hello,
I’m trying to send a video stream over TCP, but I get 2-3 seconds of latency, and I’m looking to reduce it as much as possible. The received stream sometimes freezes on a gray image and then receives a burst of frames played back at accelerated speed.
We have a Jetson Xavier NX devkit on JetPack 4.5.1. I’m using it with nvpmodel -m 2 and jetson_clocks --fan.
I enabled the VIC to run at max clock as described here.

The command v4l2-ctl --list-formats-ext -d /dev/video0 gives:
ioctl: VIDIOC_ENUM_FMT
Index       : 0
Type        : Video Capture
Pixel Format: ‘YUYV’
Name        : YUYV 4:2:2
    Size: Discrete 1920x1080
        Interval: Discrete 0.017s (60.000 fps)

Here is my pipeline:
std::string gst_out = "appsrc ! videoconvert ! queue ! nvvidconv ! nvv4l2h265enc maxperf-enable=1 ! h265parse ! matroskamux ! tcpserversink port=AAAA host=XXX.XXX.XXX.XXX sync=false async=false";

And my cv::VideoWriter:
cv::VideoWriter out(gst_out, cv::VideoWriter::fourcc('F', 'M', 'P', '4'), double(fps), cv::Size(int(w), int(h)));
I tried to remove videoconvert by setting caps video/x-raw, format=YUYV, width=1920, height=1080, framerate=60/1, but the pipeline doesn’t work with that setup.
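I suspect this is because the frames appsrc receives from OpenCV are BGR rather than YUYV, so if I specify caps on appsrc they would presumably have to state BGR, something like this (untested sketch on my side):

appsrc ! video/x-raw,format=BGR,width=1920,height=1080,framerate=60/1 ! videoconvert ! nvvidconv ! nvv4l2h265enc maxperf-enable=1 ! h265parse ! matroskamux ! tcpserversink port=AAAA host=XXX.XXX.XXX.XXX sync=false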

Video stream check with fpsdisplaysink:

gst-launch-1.0 v4l2src ! videoconvert ! fpsdisplaysink text-overlay=0 video-sink=fakesink sink=0 -v
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 32, dropped: 0, current: 61,95, average: 61,95
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 63, dropped: 0, current: 60,01, average: 60,98
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 93, dropped: 0, current: 60,00, average: 60,66
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 124, dropped: 0, current: 60,00, average: 60,49
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 154, dropped: 0, current: 60,00, average: 60,40
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 185, dropped: 0, current: 60,00, average: 60,33
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 216, dropped: 0, current: 60,01, average: 60,28
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 247, dropped: 0, current: 59,99, average: 60,25
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 278, dropped: 0, current: 60,00, average: 60,22
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 309, dropped: 0, current: 60,00, average: 60,20
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 339, dropped: 0, current: 60,00, average: 60,18
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 370, dropped: 0, current: 60,00, average: 60,16
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 401, dropped: 0, current: 60,00, average: 60,15
^Chandling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:00:07.000306499
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Thank you !

Hi,
Please adjust these two properties of nvv4l2h265enc and try again:

  iframeinterval      : Encoding Intra Frame occurance frequency
                        flags: readable, writable, changeable only in NULL or READY state
                        Unsigned Integer. Range: 0 - 4294967295 Default: 30
  idrinterval         : Encoding IDR Frame occurance frequency
                        flags: readable, writable, changeable only in NULL or READY state
                        Unsigned Integer. Range: 0 - 4294967295 Default: 256

The default settings may not be good for streaming; try idrinterval=30 or 15.
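For example, applied to your pipeline string, with only the two properties added (a sketch, values to be tuned):

appsrc ! videoconvert ! queue ! nvvidconv ! nvv4l2h265enc maxperf-enable=1 iframeinterval=30 idrinterval=15 ! h265parse ! matroskamux ! tcpserversink port=AAAA host=XXX.XXX.XXX.XXX sync=false async=false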

Hi DaneLLL,
I tried multiple values for idrinterval (15, 30) and iframeinterval (5, 10, 20, 50, 100, 200, 300, 600), but nothing changed.
Here is my code:
gstreamer_tcp.cpp (2.7 KB)

Where can I find the documentation for all the NVIDIA GStreamer properties like idrinterval? I can’t find them in the ACCELERATED GSTREAMER USER GUIDE.
Here is a sample of what I am getting:

Don’t you think it could also come from the videoconvert plugin?

Hi,
Are you able to try UDP streaming? We have tried UDP with a TX2 NX as server and an x86 PC as client.
Server command:

$ gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! tee name=t ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! h264parse ! rtph264pay ! udpsink host=10.19.106.10 port=5000 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0

Client command:

$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! xvimagesink sync=0

The result (left is TX2 NX, right is x86 PC):

Could you try server commands in the cases:

  1. gst-launch-1.0 videotestsrc in UDP
  2. gst-launch-1.0 videotestsrc in TCP
  3. gst-launch-1.0 OpenCV in UDP

Trying these cases should help clarify where the latency comes from.

And please run gst-inspect-1.0 nvv4l2h265enc to get all properties.

Hi DaneLLL,
I am not able to stream over UDP with these commands. I get the same screen as you when streaming from the Xavier NX, but my Ubuntu 18.04 x86 PC doesn’t receive the test video:

gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! xvimagesink sync=0
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:17.859331593
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

And no image is shown.

Also, gst-launch-1.0 nvv4l2h265enc gives me this on my Xavier NX:

$ gst-launch-1.0 nvv4l2h265enc
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
Opening in BLOCKING MODE 
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:07.792176179
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Hi,
The command was not correct and I have fixed it. It should be gst-inspect-1.0.

For streaming to your host PC, did you change host=10.19.106.10 to the IP address of the host PC?

Hi DaneLLL, thanks for fast reply,

It should be gst-inspect-1.0 .

It works, thanks.

  1. gst-launch-1.0 videotestsrc in UDP
    It works great, I get about 20 ms of latency.

  2. gst-launch-1.0 videotestsrc in TCP
    Please correct me if I’m wrong: I have to specify the IP address of my NX, but do I use tcpclientsrc or tcpserversrc?

gst-launch-1.0 tcpserversrc host=NX.Address.IP port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! xvimagesink sync=0

I don’t receive any image with this.
Transmitter pipeline (NX):

gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! tee name=t ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! h264parse ! rtph264pay ! tcpserversink host=NX.Address.IP port=5000 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0
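Since tcpserversrc also listens rather than connects, maybe two servers simply can’t reach each other. If I fall back to the container approach from my first post on the sender (matroskamux ! tcpserversink), the matching receiver would presumably look like this (untested sketch, NX.Address.IP is a placeholder):

gst-launch-1.0 tcpclientsrc host=NX.Address.IP port=5000 ! matroskademux ! h264parse ! avdec_h264 ! xvimagesink sync=0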
  3. gst-launch-1.0 OpenCV in UDP
std::string gst_out = "appsrc ! video/x-raw,width=1920,height=1080 ! videoconvert ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1920,height=1080' ! tee name=t ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! h264parse ! rtph264pay ! udpsink host=PC.Address.IP port=5000 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0

It works with 0-20 ms latency (measured from a few photos). Note the videoconvert; without it, no frames would be pushed into the pipeline.
I also get some frames that are all blurred. It doesn’t happen too often, though, and it could come from so many things; do you have any idea about this?
So, it appears that the latency came from tcpserversink.

Thank you so much DaneLLL !

Now the transmission is working really well (I also changed the encoder from H.264 to H.265 and the quality is much higher), but I can’t receive my stream with VLC using udp://@NX.Address.IP:PORT .
It seems that I need to define the packet size, but I can’t find which parameter does that. Can you tell me if this is the correct way to read the stream with VLC, or is there a simpler way?

Update: I managed to stream my video over UDP with GStreamer from an application to VLC with a .sdp file, but I now have 500 ms of latency. I use the same pipeline as before:

std::string gst_out = "appsrc ! video/x-raw,width=1920,height=1080 ! videoconvert ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! video/x-raw(memory:NVMM),width=1920,height=1080 ! tee name=t ! nvv4l2h265enc insert-sps-pps=1 idrinterval=15 ! h265parse ! rtph265pay ! udpsink host=PC.Address.IP port=5000 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0

and my .sdp file :

v=0
m=video #PORT RTP/AVP 96
c=IN IP4 192.168.1.21
a=rtpmap:96 H265/90000
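As a sanity check, a GStreamer receiver for this stream, mirroring the earlier H.264 client command with the H.265 element names (assuming port 5000 and payload 96):

gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H265,payload=96' ! rtph265depay ! avdec_h265 ! xvimagesink sync=0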

Can I reduce this latency ?

Hi,
Not sure, but it seems there is a buffering mechanism in VLC. This would need other users to share their experience.
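If that is the case, one knob worth trying is VLC’s network cache, which can be lowered from the command line; the value is in milliseconds, and stream.sdp stands for your .sdp file (a suggestion we have not verified):

vlc --network-caching=100 stream.sdp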

It may help to run in CBR and set the virtual buffer size. This avoids bitrate bursts between I and P frames and may offer better stability over limited network bandwidth. Please check the example of setting it:
Random blockiness in the picture RTSP server-client -Jetson TX2 - #5 by DaneLLL

Hi DaneLLL,
Thanks for the advice,

it seems like there is buffering mechanism in VLC

Indeed, it looks like it; I could see some blurred frames regularly when I fed the previous pipeline into VLC. It got corrected by adjusting the bitrate and setting the vbv-size as you suggested.
Now that I can use UDP, I have 1-2 seconds of latency with this pipeline:

std::string gst_out = "appsrc ! video/x-raw,width=1920,height=1080 ! videoconvert ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! video/x-raw(memory:NVMM),width=1920,height=1080 ! tee name=t ! nvv4l2h265enc EnableTwopassCBR=1 insert-sps-pps=1 idrinterval=2 bitrate=64000000 vbv-size=1600000 maxperf-enable=1 ! h265parse ! rtph265pay config-interval=1 pt=96 ! udpsink host=PC.Address.IP port=PORT sync=0 async=0  t. ! nvegltransform ! nveglglessink max-lateness=11000 sync=0";

I tried removing these parameters: max-lateness=11000, EnableTwopassCBR=1, insert-sps-pps=1, idrinterval=2; it doesn’t seem to affect the stream at all.

I didn’t use VLC in my solution; we receive the stream with the GStreamer library on Windows, and I get something like 30 ms of latency with udpsink.
Here is a pipeline that works for me if someone wants to send to VLC, but I couldn’t get better than 1 s of latency:

std::string gst_out = "appsrc is-live=1 ! video/x-raw, width=1920, height=1080 ! videoconvert ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! video/x-raw(memory:NVMM), width=1920, height=1080 ! tee name=t ! nvv4l2h265enc EnableTwopassCBR=1 insert-sps-pps=1 idrinterval=15 iframeinterval=1000 bitrate=64000000 vbv-size=1600000 maxperf-enable=1 preset-level=1 ! h265parse ! rtph265pay config-interval=1 pt=96 ! udpsink host=IP.address port=PORT sync=0 async=0 t. ! nvegltransform ! nveglglessink max-lateness=11000 sync=0";

The receiver needs to open a .sdp file to read that stream. Adjusting the bitrate helps suppress blurred frames.

Thank you DaneLLL for the help.
