RTSP server GStreamer pipeline

I want to stream live camera video from the TX1 using GStreamer.
On the receiver side I am able to view the stream in VLC, ffmpeg, or MPlayer.

I can get a TCP server pipeline working using the command below.

gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080, framerate=60/1' ! videoconvert ! nvjpegenc ! multipartmux ! tcpserversink host= port=5000

But I want to stream H.264-encoded video over RTSP instead of TCP.
I have tried many pipelines from the net without success.
Is there a streaming guide for the NVIDIA platform?

Hi RiteshPanchal,

In the past we had a similar issue, so we created an element called rtspsink.
It uses gst-rtsp-server. The version currently available is for 0.10; you can give the evaluation version a try by compiling it on your Tegra:


This week we are releasing the GStreamer 1.0 version. As soon as I get the wiki up I will post it here, but for the time being you can prototype with 0.10; it works the same way. I will add an example pipeline to the wiki to make it easier for you:

Thanks for the guide…
But can you tell me how to compile gst-rr-rtsp-sink?
I followed the procedure below…

root@tegra-ubuntu:/home/ubuntu/gst_1.8.0/gst-rr-rtsp-sink/src# ls
autogen.sh  configure.ac  Makefile.am  src
root@tegra-ubuntu:/home/ubuntu/gst_1.8.0/gst-rr-rtsp-sink/src# ./autogen.sh 
libtoolize: putting auxiliary files in `.'.
libtoolize: copying file `./ltmain.sh'
libtoolize: Consider adding `AC_CONFIG_MACRO_DIR([m4])' to configure.ac and
libtoolize: rerunning libtoolize, to keep the correct libtool macros in-tree.
libtoolize: Consider adding `-I m4' to ACLOCAL_AMFLAGS in Makefile.am.
configure.ac:22: installing './compile'
configure.ac:26: installing './config.guess'
configure.ac:26: installing './config.sub'
configure.ac:16: installing './install-sh'
configure.ac:16: installing './missing'
src/Makefile.am: installing './depcomp'
root@tegra-ubuntu:/home/ubuntu/gst_1.8.0/gst-rr-rtsp-sink/src# ls
aclocal.m4  autogen.sh  autom4te.cache  compile  config.guess  config.h.in  config.sub  configure  configure.ac  depcomp  install-sh  ltmain.sh  Makefile.am  Makefile.in  missing  src
root@tegra-ubuntu:/home/ubuntu/gst_1.8.0/gst-rr-rtsp-sink/src# ./configure --prefix=/usr/lib/arm-linux-gnueabihf/gstreamer-0.10/

It checks for various things and then at the end gives this error:

checking for GST... no
configure: error: 
      You need to install or upgrade the GStreamer development
      packages on your system. On debian-based systems these are
      libgstreamer0.10-dev and libgstreamer-plugins-base0.10-dev.
      on RPM-based systems gstreamer0.10-devel, libgstreamer0.10-devel
      or similar. The minimum version required is 0.10.8.

So what am I missing?


You are currently using GStreamer 1.0 (1.8), but the code that you downloaded is for GStreamer 0.10. Does your system have GStreamer 0.10 too?

We are in the process of releasing the 1.0 version; it is ready, but we are still generating the evaluation version, etc.

Meanwhile, it should work if you have GStreamer 0.10. For instance, on my laptop I ran:

git clone https://github.com/RidgeRun/eval-sdk-imx6/
cd eval-sdk-imx6/proprietary/gst-rr-rtsp-sink/src
sudo apt-get install libgstrtspserver-0.10-dev
./autogen.sh   # generates the configure script, as in the log above
./configure --libdir=/usr/lib/x86_64-linux-gnu/
make
sudo make install
gst-inspect-0.10 | grep rtspsink

and got:

rtspsinkplugin: rtspsink: RR Rtsp sink element

It should work on the Tegra as well.

Actually, you can request the 1.0 binaries by sending an email to support@ridgerun.com.


Yes, I have GStreamer 0.10 on my Jetson TX1 at /usr/lib/arm-linux-gnueabihf/gstreamer-0.10.

I had cloned just the gst-rr-rtsp-sink folder instead of the whole eval-sdk-imx6 repository, which is why I was getting the build error. I followed your instructions and got rtspsink compiled successfully on the Jetson TX1.

Thanks for your guidance.

rtspsink with gstreamer-0.10 works with the following pipeline:

gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)1920, height=(int)1080, framerate=30/1' ! nv_omx_h264enc ! rtspsink mapping="/mystream" service=554

And I can see the stream on a client PC using MPlayer/VLC.

But the problem now is that about 33 seconds after the stream starts in MPlayer, gst-launch stops with the following error:

root@tegra-ubuntu:/home/ubuntu/Videos# gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)1920, height=(int)1080, framerate=30/1' ! nv_omx_h264enc ! rtspsink mapping="/mystream" service=554
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingSetting pipeline to PAUSED ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
===== MSENC blits (mode: 1) into tiled surfaces =====

(gst-launch-0.10:3101): GLib-ERROR **: /build/buildd/glib2.0-2.40.2/./glib/gmem.c:103: failed to allocate 3110668 bytes
Trace/breakpoint trap

And at 1280x720 resolution the stream stops after about 24 seconds:

(gst-launch-0.10:3199): GLib-ERROR **: /build/buildd/glib2.0-2.40.2/./glib/gmem.c:103: failed to allocate 1382668 bytes

Glad to hear that you got it working. Do you see the problem if you run the following pipeline?

gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)1920, height=(int)1080, framerate=30/1' ! nv_omx_h264enc ! fakesink

No, this works without giving any error.

For now I have found an alternate solution for RTSP streaming.
I am using gst-rtsp-server-1.8.0, which works perfectly.
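For reference, this is roughly what gst-rtsp-server usage looks like through its GObject introspection (Python) bindings. This is only a sketch: the encoder name (omxh264enc), caps, and mount point are illustrative assumptions, not the exact pipeline used above, and it requires the gir1.2-gst-rtsp-server-1.0 package.

```python
# Minimal RTSP server sketch using the GstRtspServer bindings.
# Assumptions: GStreamer 1.x, omxh264enc available (Jetson), port 8554 free.
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

server = GstRtspServer.RTSPServer()          # listens on port 8554 by default
factory = GstRtspServer.RTSPMediaFactory()
# A gst-launch-style description; the RTP payloader must be named pay0.
factory.set_launch('( videotestsrc ! video/x-raw,width=1280,height=720,framerate=30/1 '
                   '! omxh264enc ! rtph264pay name=pay0 pt=96 )')
factory.set_shared(True)                     # one pipeline shared by all clients
server.get_mount_points().add_factory('/mystream', factory)
server.attach(None)

# Clients can now connect to rtsp://<tx1-ip>:8554/mystream
GLib.MainLoop().run()
```

A client would then play it with, e.g., `vlc rtsp://<tx1-ip>:8554/mystream`.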

Yes, what rtspsink does is make use of gst-rtsp-server while letting you use it as an element in the pipeline.

I really appreciate your support… :)

So is there any limitation in gst-rtsp-server compared to rtspsink?
I'm asking because I don't know much about gst-rtsp-server.

There is no limitation; all the features available in gst-rtsp-server can be exposed in rtspsink. rtspsink has gst-rtsp-server's features and capabilities while adding the flexibility of a GStreamer element, so it can easily be integrated into existing applications and pipelines like any other sink element.

Do you have a GStreamer 1.0 version of the RTSP server sink?

I am looking for an RTSP server like the one above.

Let me know where to download the 1.0 version and how to use it.

Appreciate your help.

Thanks and Regards,

Hi Giri,

Sorry for my slow response; I've been kind of busy with some Tegra X1 customer projects, a lot of fun! ;)

Yes, we have one version for 0.10 and another for the latest GStreamer 1.0. Please send the request through our contact form:

An engineer will reply and help you get the eval version so you can play with it on your Tegra X1. We are also in the process of updating the website a bit, and soon all the eval versions will be downloadable directly from the site, but they are still working on the details:


Let me know if you need anything else,

Hello DavidSoto, I want to use GStreamer in my OpenCV code to decode an Hikvision IP camera's RTSP video stream, but I don't know how to write the code. I tried VideoCapture cap("gst-launch-1.0 rtspsrc location=rtsp://admin:admin… latency=10 ! decodebin ! autovideosink"), but the capture does not open. However, when I run "gst-launch-1.0 rtspsrc location=rtsp://admin:admin… latency=10 ! decodebin ! autovideosink" in an Ubuntu terminal, I can preview the camera successfully. Do you know the reason? PS: my OpenCV is 2.4.13 opencv4tegra, my GStreamer is 1.0, and my board is a Jetson TX1.

Hi Sulli,

Just to be clear, were you able to run a gst-launch pipeline on the TX1 and get the stream displayed? It is not clear whether your problem is creating the pipeline or actually writing the application that implements it.


Hello DavidSoto, I am able to run a gst-launch pipeline on the TX1 and get the stream displayed, but I cannot use it successfully in my OpenCV code. Please check the link https://devtalk.nvidia.com/default/topic/1003211/how-can-i-use-gstreamer-in-my-opencv-program-on-tx1-board-/#5126281. A kind person there has helped me a lot, but we still have not gotten a successful result.
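One likely cause, offered as a guess: OpenCV's VideoCapture with the GStreamer backend expects a bare pipeline description ending in appsink, not a full gst-launch-1.0 command line with a display sink like autovideosink. A sketch under that assumption (the camera URL is a placeholder, and it requires an OpenCV build with GStreamer support, which opencv4tegra has):

```python
# Sketch: reading an RTSP stream into OpenCV via the GStreamer backend.
# The pipeline must deliver raw BGR frames to appsink -- no "gst-launch-1.0"
# prefix and no display sink such as autovideosink.
import cv2

pipeline = ('rtspsrc location=rtsp://admin:admin@<camera-ip> latency=10 ! '
            'decodebin ! videoconvert ! video/x-raw,format=BGR ! appsink')
cap = cv2.VideoCapture(pipeline)

while cap.isOpened():
    ok, frame = cap.read()       # each read() pulls one decoded frame
    if not ok:
        break
    cv2.imshow('rtsp', frame)
    if cv2.waitKey(1) == 27:     # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The display is then done by OpenCV itself (cv2.imshow), which is why the pipeline no longer needs a video sink element.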