Streaming the desktop with a GStreamer RTSP server

Here is a pipeline showing how to record the desktop using GStreamer:

gst-launch-1.0 -v ximagesrc use-damage=0 ! nvvidconv ! 'video/x-raw(memory:NVMM),alignment=au,format=I420,framerate=25/1,pixel-aspect-ratio=1/1' ! omxh264enc ! 'video/x-h264,stream-format=byte-stream' ! filesink location="test.h264" -e
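
To verify the recording, the raw H.264 elementary stream can be played back afterwards, for example with a software decoder (just a sketch):

gst-launch-1.0 filesrc location=test.h264 ! h264parse ! avdec_h264 ! xvimagesink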

Could you please advise how to stream the desktop with the GStreamer RTSP server using test-launch?

Trial 1

sudo apt-get install libgstrtspserver-1.0-dev libgstreamer1.0-dev
wget https://gstreamer.freedesktop.org/src/gst-rtsp/gst-rtsp-server-1.14.1.tar.xz
tar -xvf gst-rtsp-server-1.14.1.tar.xz
cd gst-rtsp-server-1.14.1
cd examples
gcc test-launch.c -o test-launch $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)

./test-launch "ximagesrc use-damage=0"
stream ready at rtsp://127.0.0.1:8554/test

(test-launch:24295): GLib-GObject-WARNING **: 07:38:33.448: invalid cast from 'GstXImageSrc' to 'GstBin'
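
The warning presumably appears because test-launch parses the quoted description into a bin and expects it to end in an RTP payloader named pay0; a bare ximagesrc satisfies neither requirement. As a minimal sanity check of the server itself (assuming the software x264enc element is installed), something like this should work:

./test-launch "videotestsrc is-live=true ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96"
# then, in another terminal:
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink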

Trial 2

./test-launch "ximagesrc use-damage=0 ! nvvidconv ! 'video/x-raw(memory:NVMM),alignment=au,format=I420,framerate=25/1,pixel-aspect-ratio=1/1' ! omxh264enc ! 'video/x-h264,stream-format=byte-stream'"
stream ready at rtsp://127.0.0.1:8554/test

The objective of this attempt is to pair desktop streaming with x2x in order to get a local-desktop experience of a remote Xavier device, so that the local keyboard and mouse can be extended to the remote desktop being streamed.
x2x controls:

ssh -X -p 12345 user@0.tcp.ngrok.io 'x2x -west -to :1'

However, the issue is that clients cannot connect to the RTSP stream, neither VLC nor a GStreamer client like:

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! decodebin ! nvoverlaysink

I’m unable to test for now, but you could try:

gst-launch-1.0 -v ximagesrc use-damage=0 ! nvvidconv ! omxh264enc ! video/x-h264, profile=baseline ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay ! fakesink

If this works, you can adapt it for test-launch:

test-launch "ximagesrc use-damage=0 ! nvvidconv ! omxh264enc ! video/x-h264, profile=baseline ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay name=pay0 pt=96 "

Thank you for your response!

 gst-launch-1.0 -v ximagesrc use-damage=0 ! nvvidconv ! omxh264enc ! video/x-h264, profile=baseline ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstXImageSrc:ximagesrc0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)25/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)25/1, pixel-aspect-ratio=(fraction)1/1
Framerate set to : 25 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
H264: Profile = 66, Level = 40 
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)25/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)25/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, parsed=(boolean)true
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, parsed=(boolean)true
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, parsed=(boolean)true
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, parsed=(boolean)true
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, alignment=(string)au, profile=(string)baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, alignment=(string)au, profile=(string)constrained-baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, alignment=(string)au, profile=(string)constrained-baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, alignment=(string)au, profile=(string)constrained-baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-h264, alignment=(string)au, profile=(string)constrained-baseline, level=(string)4, stream-format=(string)byte-stream, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, ssrc=(uint)908002652, timestamp-offset=(uint)2314467118, seqnum-offset=(uint)10059, a-framerate=(string)25
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, ssrc=(uint)908002652, timestamp-offset=(uint)2314467118, seqnum-offset=(uint)10059, a-framerate=(string)25
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, sprop-parameter-sets=(string)"Z0JAKJWgHgCJ+VA\=\,aM48gA\=\=", payload=(int)96, seqnum-offset=(uint)10059, timestamp-offset=(uint)2314467118, ssrc=(uint)908002652, a-framerate=(string)25
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, sprop-parameter-sets=(string)"Z0JAKJWgHgCJ+VA\=\,aM48gA\=\=", payload=(int)96, seqnum-offset=(uint)10059, timestamp-offset=(uint)2314467118, ssrc=(uint)908002652, a-framerate=(string)25
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 2314575118
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 10059

test-launch seems to start the server;
however, clients still fail to connect to it.
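
One thing worth checking (just a sketch) is whether the server process is actually listening on the RTSP port, and whether that port is reachable from the client machine:

ss -tlnp | grep 8554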

Back on my NX, it seems to work. I used this pipeline for receiving on the host:

gst-launch-1.0 rtspsrc location=rtsp://<Jetson_IP>:8554/test ! application/x-rtp, media=video, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink

Like this?

server

./test-launch "ximagesrc use-damage=0 ! nvvidconv ! omxh264enc ! video/x-h264, profile=baseline ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay name=pay0 pt=96 "

client

 gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, media=video, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink

Or like this:
server

gst-launch-1.0 -v ximagesrc use-damage=0 ! nvvidconv ! omxh264enc ! video/x-h264, profile=baseline ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay ! fakesink

client

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, media=video, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink

Is it one of these, both, or neither?

The server requires test-launch. The gst-launch command was just for checking that the pipeline runs fine, because inside test-launch error messages are filtered out.

test-launch "ximagesrc use-damage=0 ! nvvidconv ! omxh264enc ! video/x-h264, profile=baseline ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay name=pay0 pt=96 "

Client:

gst-launch-1.0 rtspsrc location=rtsp://<Jetson_IP>:8554/test ! application/x-rtp, media=video, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink

Note that if you run the client on the Jetson itself, the display will show a cascade of windows, because ximagesrc captures the whole screen while xvimagesink displays it in a smaller window.
Also note that the quality is not great; you may want to try various encoder options, or try nvv4l2h264enc as well.
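
For example, an untested variant of the same server pipeline using nvv4l2h264enc with an explicit bitrate and SPS/PPS insertion (the property values are only a starting point):

./test-launch "ximagesrc use-damage=0 ! nvvidconv ! nvv4l2h264enc bitrate=8000000 insert-sps-pps=true ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay name=pay0 pt=96"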

On the AGX, for some reason, it throws:

 gst-launch-1.0 rtspsrc location=rtsp://127.0.01:8554/test ! application/x-rtp, media=video, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.01:8554/test
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

I shall try on the NX.

Might just be a typo in the IP address lacking a dot!

exactly!
good catch!
Thank you very much!

Will it work with H.265?

./test-launch "ximagesrc use-damage=0 ! nvvidconv ! omxh265enc ! video/x-h265, profile=baseline ! h265parse ! video/x-h265, stream-format=byte-stream ! rtph265pay name=pay0 pt=96

Yes. You may get better quality.
Server:

./test-launch "ximagesrc use-damage=0 ! nvvidconv ! nvv4l2h265enc ! h265parse ! video/x-h265, stream-format=byte-stream ! rtph265pay name=pay0 pt=96 "

Client:

gst-launch-1.0 -v rtspsrc location=rtsp://<Jetson_IP>:8554/test ! application/x-rtp, media=video, encoding-name=H265 ! rtph265depay ! avdec_h265 ! xvimagesink

# a local client can use the Jetson NVDEC
 gst-launch-1.0 -v rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, media=video, encoding-name=H265 ! rtph265depay ! nvv4l2decoder ! nvvidconv ! xvimagesink
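
A non-GStreamer client such as VLC should also be able to open the same URL, e.g.:

vlc rtsp://<Jetson_IP>:8554/test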

Thx @Andrey1984

I was wondering if a source existed to capture the screen. That’s very useful to me.

Add x2x and you will get a full remote-desktop solution.

@Andrey1984

Yeah. I was thinking of looking for some existing solution using GStreamer that could be modified to use NVIDIA’s elements. Usually decode/encode bins can handle it automatically based on element rank, but not always. From experience, encodebin is more difficult to use than decodebin. If you’re aware of anything close to what I am thinking of, let me know.
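
Whether decodebin picks the NVIDIA decoder automatically depends on element rank; a quick way to inspect that (a sketch):

# compare the ranks of the hardware and software H.264 decoders;
# decodebin prefers the higher-ranked factory
gst-inspect-1.0 nvv4l2decoder | grep -i rank
gst-inspect-1.0 avdec_h264 | grep -i rank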

Streaming from a headless Nano virtual display: is it possible?
Steps to reproduce:

  1. Put the NX into multi-user mode & reboot:
     sudo systemctl set-default multi-user.target
  2. SSH into the NX & execute:
     sudo startx &
     DISPLAY=:0 sudo xhost +
     export DISPLAY=:0
     ./test-launch "ximagesrc use-damage=0 ! nvvidconv ! nvv4l2h265enc ! h265parse ! video/x-h265, stream-format=byte-stream ! rtph265pay name=pay0 pt=96 "

From another network device connect with

gst-launch-1.0 -v rtspsrc location=rtsp://<Jetson_IP>:8554/test ! application/x-rtp, media=video, encoding-name=H265 ! rtph265depay ! avdec_h265 ! xvimagesink

Then add x2x mouse and keyboard from the host [AGX] to the NX virtual display:

ssh -X user@ipaddress 'x2x -west -to :0'

/usr/bin/xauth:  timeout in locking authority file /home/nvidia/.Xauthority
X11 connection rejected because of wrong authentication.
x2x - error: can not open display localhost:11.0
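
A guess at the cause: since startx was launched via sudo, the X authority cookie may be owned by root, which would explain the locking timeout. Two things commonly tried in that situation (assumptions, not a verified fix):

# hand the authority file back to the normal user (path taken from the error above)
sudo chown nvidia:nvidia /home/nvidia/.Xauthority
# or, less securely, allow local connections to the X server on the Jetson side
DISPLAY=:0 sudo xhost +local: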

I understand that an equivalent of the following command might be required, but it does not seem applicable to Jetson OS:

sudo nvidia-xconfig --allow-empty-initial-configuration --enable-all-gpus  --force-generate


Probably it should somehow be possible to run an app on the display shown on the right?
Here they refer to using nvidia-xconfig, but the latter does not seem to be present on the system:
https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide/window_system_x11.html

It seems I can change the underlying desktop resolution from the terminal with the command below

 xrandr --fb 1280x960
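
If a specific mode is needed rather than just a larger framebuffer, a hedged sketch using cvt and xrandr (the output name HDMI-0 is an assumption; check the names reported by a bare xrandr call on the device):

cvt 1280 960 60
# copy the Modeline values printed by cvt into --newmode, then:
xrandr --newmode "1280x960_60.00" <values printed by cvt>
xrandr --addmode HDMI-0 "1280x960_60.00"
xrandr --output HDMI-0 --mode "1280x960_60.00"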