I am working on a project that needs to capture the desktop and stream it to a browser.

For capturing the screen I use the NVIDIA Video Codec SDK and produce H.264 frames that I send out through a TCP port (that part is a C++ program). For streaming I have tried both GStreamer and FFmpeg: from a different Linux box that acts as a server, I connect to the TCP port of the NVIDIA machine, receive the frames, and create either HLS or DASH segments. For playback I have tried video.js and dash.js in HTML.

All of these trials worked and I can see the desktop in the browser, but there is buffering. It seems my segments and/or playlist files (the m3u8 or mpd manifests) are created too slowly; the browser requests the playlist a few times until it gets an updated one, and only then resumes the video. Basically, every couple of seconds there is 2-3 seconds of buffering. I suspect the issue is the slowness of creating and writing those segments. When I decreased the resolution from 3840x2160 to 1920x1200 the problem lessened, but I could still see regular, if shorter, buffering in the browser.
The only way I see no buffering/delay is if I stream directly to another desktop (just decode and send to autovideosink), i.e. no re-encoding, no HLS or DASH segments, and no playlist.

These are a few of the pipelines I have tried for streaming to the browser; all of them showed the delays.
gst-launch-1.0 -v tcpclientsrc port=8081 host=nvidia_linuxIP \
  ! h264parse ! avdec_h264 ! videoconvert \
  ! x264enc ! h264parse \
  ! hlssink2 playlist-root=http://other_linuxIP:8082/gst_tcp location=gst_tcp/segment_%05d.ts playlist-location=gst_tcp/playlist.m3u8 target-duration=5 max-files=15
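For reference, this is a sketch of a lower-latency variant I have not yet verified: `tune=zerolatency` on x264enc, a fixed keyframe interval, and 1-second segments are my guesses at what would shrink the playlist update lag:

```shell
# Sketch (unverified): zerolatency x264 tuning, keyframe every 30 frames,
# and 1-second HLS segments so the playlist updates more often.
gst-launch-1.0 -v tcpclientsrc port=8081 host=nvidia_linuxIP \
  ! h264parse ! avdec_h264 ! videoconvert \
  ! x264enc tune=zerolatency speed-preset=ultrafast key-int-max=30 \
  ! h264parse \
  ! hlssink2 playlist-root=http://other_linuxIP:8082/gst_tcp \
      location=gst_tcp/segment_%05d.ts \
      playlist-location=gst_tcp/playlist.m3u8 \
      target-duration=1 max-files=15
```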
ffmpeg -i tcp://nvidia_linuxIP:8081 -c:v:0 libx264 -x264-params "nal-hrd=cbr:force-cfr=1" \
  -b:v:0 5M -maxrate:v:0 5M -minrate:v:0 5M -bufsize:v:0 10M -preset slow \
  -g 48 -keyint_min 48 -sc_threshold 0 \
  -f segment -segment_list master.m3u8 -segment_list_type hls -segment_list_size 10 \
  -segment_list_flags +live -segment_time 2 out%03d.ts
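On the FFmpeg side, a variant I have been considering (again unverified) uses the dedicated hls muxer with short segments and low-latency x264 settings instead of the generic segment muxer; the specific segment length and flags here are assumptions:

```shell
# Sketch (unverified): -f h264 reads the raw H.264 stream from TCP;
# the hls muxer cuts 1-second segments and prunes old ones.
ffmpeg -f h264 -i tcp://nvidia_linuxIP:8081 \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -g 30 -keyint_min 30 -sc_threshold 0 \
  -f hls -hls_time 1 -hls_list_size 10 \
  -hls_flags delete_segments+independent_segments \
  master.m3u8
```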
This pipeline has no delay/buffering (and does not use a browser):

gst-launch-1.0 -v tcpclientsrc port=8081 host=nvidia_linuxIP ! video/x-h264,mapping=/stream1,width=1920,height=1080,framerate=30/1 ! decodebin ! autovideosink sync=false
My question is whether NVIDIA provides a better way to do the capturing and streaming. Currently, when I get the encoded data from the TCP connection, I decode it and then re-encode it (with GStreamer or FFmpeg) before segmenting. This seems redundant, but I couldn't find a better way.
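To make the redundancy concrete, what I would ideally like is to remux the already-encoded H.264 straight into HLS segments with a stream copy, no decode/re-encode at all. A sketch of what I mean (unverified; I suspect it only works if my C++ capture program inserts IDR frames often enough for segment cuts):

```shell
# Sketch (unverified): -c:v copy remuxes the incoming H.264 without
# re-encoding; segments can only be cut on keyframe boundaries, so the
# source must emit IDR frames at least every -hls_time seconds.
ffmpeg -f h264 -i tcp://nvidia_linuxIP:8081 \
  -c:v copy \
  -f hls -hls_time 2 -hls_list_size 10 -hls_flags delete_segments \
  master.m3u8
```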