GStreamer RTSP server dynamic pipelines

Hello,

I want to build pipelines for RTSP streaming via gst-rtsp-server, recording, and live preview of a camera source. There can only be one instance of the camera source. The recording branch should only be activated when needed; the other pipelines can run all the time.

nvarguscamerasrc
|
tee - omxvp8enc - matroskamux - filesink
|
tee - omxh264enc - rtph264pay
|
nvoverlaysink

Is it possible to give the RTSP server a pipeline that always runs (not only when a client is connected)? If so, I could construct and start the live preview and streaming pipelines up front, and add the recording pipeline only when needed.
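For context, what I have in mind on the server side is roughly the following sketch (based on the test-launch example; the launch string and mount point are placeholders). As far as I understand, gst_rtsp_media_factory_set_shared() only makes all clients use one pipeline instance and does not by itself keep the pipeline running when no client is connected, which is exactly the part I am missing:

[code]
/* Sketch of a minimal gst-rtsp-server with a shared media factory
 * (based on the test-launch example; launch string and mount point
 * are placeholders). */
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);

  GstRTSPServer *server = gst_rtsp_server_new ();
  GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points (server);

  GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();
  gst_rtsp_media_factory_set_launch (factory,
      "( nvarguscamerasrc ! queue ! omxh264enc ! rtph264pay name=pay0 pt=96 )");
  /* all clients share one pipeline instance instead of getting their own */
  gst_rtsp_media_factory_set_shared (factory, TRUE);

  gst_rtsp_mount_points_add_factory (mounts, "/camera", factory);
  g_object_unref (mounts);

  gst_rtsp_server_attach (server, NULL);
  g_main_loop_run (loop);
  return 0;
}
[/code]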

I have already looked at the interpipe elements from RidgeRun, but I ran into some problems with them and am now trying a lower-level approach.

Thanks in advance

Marc

Hi,
Please utilize udpsink. Here is a reference.
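A typical sketch of that approach might look like the following (resolution, host and port are only examples, not taken from the reference):

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12' ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=<client-ip> port=5000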

Hi,

thanks for your answer, but we want to use RTSP, since we only want to send the stream to the clients who request it. The RTSP server works with the following pipeline:

./gst-rtsp-server/examples/test-launch "nvarguscamerasrc ! queue ! omxh264enc ! rtph264pay name=pay0 pt=96 audiotestsrc is-live=0 ! queue ! audioconvert ! audio/x-raw,rate=(int)8000,channels=2 ! alawenc ! rtppcmapay pt=97 name=pay1"

But now we don't know how to dynamically add the recording branch, since the RTSP server pipeline is destroyed when the client disconnects.

1. Is there a way to construct and start a static pipeline ourselves and tell the RTSP server to use it (and not delete it when the client disconnects)? Then tee elements could be used to add/remove the recording branch (a rough sketch of this is at the end of this post).

2. Another approach would be to use appsink and appsrc elements, similar to the RTSP server example [url]https://github.com/GStreamer/gst-rtsp-server/blob/master/examples/test-appsrc2.c[/url].

Something like this:

nvarguscamerasrc - nvvidconv - appsink (static pipeline that always runs)

appsrc - omxh264enc - rtph264pay name=pay0 pt=96 (constructed and deleted by the RTSP server)

appsrc - omxvp8enc - matroskamux - filesink (constructed and deleted upon recording request)

Buffers would then have to be handed over manually from the appsink to the appsrc elements. I don't know whether several appsrc elements can be fed from one appsink.
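What I imagine for the hand-over is roughly this (a sketch only; the Outputs struct, the NULL checks and the locking mentioned below are my own assumptions, and the caps on each appsrc would have to match the appsink output):

[code]
/* Sketch: hand buffers from the camera pipeline's appsink over to one or
 * more appsrc elements. */
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>

typedef struct {
  GstElement *streaming_appsrc;  /* appsrc of the RTSP pipeline, or NULL      */
  GstElement *recording_appsrc;  /* appsrc of the recording pipeline, or NULL */
} Outputs;

/* Called for every sample the camera pipeline's appsink produces. */
static GstFlowReturn
on_new_sample (GstAppSink *appsink, gpointer user_data)
{
  Outputs *out = user_data;
  GstSample *sample = gst_app_sink_pull_sample (appsink);
  if (sample == NULL)
    return GST_FLOW_ERROR;

  GstBuffer *buffer = gst_sample_get_buffer (sample);

  /* gst_app_src_push_buffer() takes ownership, so add a ref per consumer */
  if (out->streaming_appsrc != NULL)
    gst_app_src_push_buffer (GST_APP_SRC (out->streaming_appsrc),
        gst_buffer_ref (buffer));
  if (out->recording_appsrc != NULL)
    gst_app_src_push_buffer (GST_APP_SRC (out->recording_appsrc),
        gst_buffer_ref (buffer));

  gst_sample_unref (sample);
  return GST_FLOW_OK;
}

/* Register the callback on the appsink of the static camera pipeline. */
static void
attach_forwarder (GstElement *appsink, Outputs *out)
{
  GstAppSinkCallbacks callbacks = { 0 };
  callbacks.new_sample = on_new_sample;
  gst_app_sink_set_callbacks (GST_APP_SINK (appsink), &callbacks, out, NULL);
}
[/code]

The pointers in Outputs would be set and cleared (with appropriate locking) whenever the RTSP or recording pipeline is created or destroyed, so feeding several appsrc elements from one appsink looks doable in principle.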

RidgeRun has solved this problem with their interpipe elements: [url]https://developer.ridgerun.com/wiki/index.php?title=GstInterpipe_-_Example_2:_Digital_Camera[/url]. That is exactly our use case.
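For option 1, attaching the recording branch to a tee in a running pipeline would presumably look something like this (a rough sketch; element names, the file location and the stop procedure in the comments are my assumptions, not tested code):

[code]
/* Sketch: attach the recording branch to a running pipeline through a tee. */
#include <gst/gst.h>

static void
start_recording (GstElement *pipeline, GstElement *tee)
{
  /* recording branch: queue ! omxvp8enc ! matroskamux ! filesink */
  GstElement *queue = gst_element_factory_make ("queue", NULL);
  GstElement *enc   = gst_element_factory_make ("omxvp8enc", NULL);
  GstElement *mux   = gst_element_factory_make ("matroskamux", NULL);
  GstElement *sink  = gst_element_factory_make ("filesink", NULL);
  g_object_set (sink, "location", "/tmp/recording.mkv", NULL);

  gst_bin_add_many (GST_BIN (pipeline), queue, enc, mux, sink, NULL);
  gst_element_link_many (queue, enc, mux, sink, NULL);

  /* bring the new elements up to the running pipeline's state first */
  gst_element_sync_state_with_parent (sink);
  gst_element_sync_state_with_parent (mux);
  gst_element_sync_state_with_parent (enc);
  gst_element_sync_state_with_parent (queue);

  /* then request a new src pad from the tee and link it to the branch;
   * keep teepad around for gst_element_release_request_pad() later */
  GstPad *teepad = gst_element_get_request_pad (tee, "src_%u");
  GstPad *sinkpad = gst_element_get_static_pad (queue, "sink");
  gst_pad_link (teepad, sinkpad);
  gst_object_unref (sinkpad);

  /* Stopping would go the other way round: block teepad with an idle pad
   * probe, unlink, push EOS into the queue so matroskamux can finalize the
   * file, then release the tee pad and remove the elements. */
}
[/code]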

Thanks in advance

Marc

Hi,
For RTSP we rely heavily on the GStreamer implementation. Your case is complicated and we don't have much experience with it. Please ask at http://gstreamer-devel.966125.n4.nabble.com/

You may construct the pipeline with videotestsrc and x264enc while prototyping. Once you have a working solution for the use case, it should work fine with nvarguscamerasrc and omxh264enc.
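For example, a prototype with test-launch could look like this (parameters are only illustrative):

./gst-rtsp-server/examples/test-launch "( videotestsrc is-live=1 ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"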