GStreamer dynamic pipeline

I want to use the TX1 as a MediaBox application with the following functionalities:

  • Camera Preview on HDMI
  • Camera Video Recording on File
  • Recorded video playback
  • Streaming on Ethernet + WiFi.

Currently I am using GStreamer 1.0 (v1.2.4) pipelines to do all of this. I can run a pipeline for an individual function, or a pipeline combining two or more functionalities.

But my requirement is to switch any functionality on or off at runtime, in response to an external event.
So for that, I have to add or remove pipeline elements dynamically.
I searched the net for the GStreamer SDK, Python GStreamer, Qt GStreamer, and OpenCV GStreamer.

So which would be easiest to learn and most suitable for my application? Or is there an easier way to build a dynamic pipeline driven by external events?


Hi

I looked at a similar use case (input preview, streaming, recording) about half a year ago, so I can give you a few pointers to what I found out; hopefully it is useful to you too.

Generally I found the written documentation for GStreamer sometimes hard to process, because it covers so many different use cases and software versions, while the communication over the mailing lists is at times cut short. But for this topic there is actually some good documentation: [0]

There is also a nice repository of videos from GStreamer conferences: [1]
Including videos about dynamic pipelines, e.g.: [2]
And there is a nice blog post from a GStreamer developer: [3]

In general I found that working with dynamic pipelines requires listening to pad probes and calling a callback function at the right time to modify the pipeline. The general idea is that you must make sure you don’t modify a part of the pipeline while data is being exchanged in that element. [3] There are EVENT pad probes and (blocking) BUFFER pad probes that can be used, but it is better to use the EVENT pad probes (see [4]).
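As a rough illustration of that pattern, here is a sketch of my own (not taken from the documentation) of removing a hypothetical recording branch behind a tee: a blocking probe stops data flow on the tee's source pad, and the actual pipeline modification happens inside the probe callback. The element name "recorder" is made up; a production version would additionally send EOS into the branch so a muxer can finalize its file, and release the tee request pad, as described in [3].

```c
#include <gst/gst.h>

/* Probe callback: runs once data flow on the pad is blocked. */
static GstPadProbeReturn
pad_blocked_cb (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);
  /* "recorder" is a hypothetical bin holding the recording branch. */
  GstElement *recorder =
      gst_bin_get_by_name (GST_BIN (pipeline), "recorder");

  /* Data flow on this pad is stopped, so the branch can be
   * shut down and removed safely. */
  gst_element_set_state (recorder, GST_STATE_NULL);
  gst_bin_remove (GST_BIN (pipeline), recorder);
  gst_object_unref (recorder);  /* ref held from gst_bin_get_by_name() */

  return GST_PAD_PROBE_REMOVE;  /* drop the probe, unblocking the pad */
}

/* Called on the external "stop recording" event. */
void
stop_recording (GstElement *pipeline, GstPad *tee_src_pad)
{
  gst_pad_add_probe (tee_src_pad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
                     pad_blocked_cb, pipeline, NULL);
}
```

Adding a branch works the same way in reverse: block the pad, add and sync the new elements, link them, then remove the probe.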

If you don’t want to implement all that on your own, you might consider Interpipes, but I don’t have any experience with that (see [6] / [7]).

Regards
Tobias

[0] Pipeline manipulation
[1] Channels - GStreamer conferences
[2] https://gstconf.ubicast.tv/videos/how-to-use-bins-to-keep-state-local-ans-easily-manage-dynamic-pipelines-part-1
[3] GStreamer Dynamic Pipelines – coaxion.net – slomo's blog
[4] https://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-block.txt?h=0.11
[5] #13237 (Use Gstreamer event probe instead of pad blocking when removing ended call from pipeline) – Pidgin
[6] https://devtalk.nvidia.com/default/topic/935614/jetson-tx1/csi-camera-rendering-video-in-qt5-and-starting-stopping-a-recording-/
[7] GstInterpipe - RidgeRun Developer Connection

Hi Ritesh

Indeed, GStreamer dynamic pipelines are very tricky business. If you don’t get it right, you can start getting odd effects, or even worse, a complete stall of the whole pipeline.

At RidgeRun we have been working with these kinds of requirements for a long time now. To achieve dynamic pipeline handling, we created an open source project which you can give a try. It’s called GstInterpipe, and it achieves precisely that: independent pipeline control. It was designed with simple usage in mind.

Give it a try and don’t hesitate to ask if you need help.

Michael

The project is hosted at GitHub - RidgeRun/gst-interpipe: GStreamer plug-in for interpipeline communication

Very interested in interpipes as well, but is there any example code, or are there more tutorials beyond the wiki?
I’m writing an application for switching between two camera streams (+ maybe an option to change resolution).

I have compiled the interpipe plugin on the TX1.

But can you give some example of how to use it in a GStreamer pipeline?
For example, say a camera preview pipeline is running. Can I use this plugin to turn recording on/off on a user event?

The pipeline below shows the preview.

gst-launch-1.0 videotestsrc ! interpipesink name=camera interpipesrc listen-to="camera" ! autovideosink

But if I run them separately, both run without error but there is no preview.

gst-launch-1.0 videotestsrc ! interpipesink name=camera
gst-launch-1.0 interpipesrc listen-to="camera" ! autovideosink

I think the problem is that when you use two separate gst-launch commands, they run in separate “contexts” and don’t know about each other. Therefore you must somehow add both pipelines to the same superordinate pipeline.

To do this you could either launch the two pipelines using the GStreamer API (e.g. in C or Python) and make sure to add them to the same overall Pipeline. Simplified example in Python:

...
pipeline = Gst.Pipeline.new("overallPipeline")
pipeline.add(srcBin)
pipeline.add(sinkBin)
...

Or alternatively you can use GStreamer Daemon (from RidgeRun as well [1]) to launch the two pipelines.
For example, this is how I can use interpipes with GStreamer Daemon:

export GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0
LD_LIBRARY_PATH=/usr/local/lib/ gstd &

gstd-client pipeline_create no_src "videotestsrc ! video/x-raw, width=1920, height=1080, format=UYVY ! interpipesink name=src_pipe"
gstd-client pipeline_create out_pipe "interpipesrc listen-to=src_pipe ! xvimagesink sync=false"

gstd-client pipeline_play no_src
gstd-client pipeline_play out_pipe
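To connect this back to the original recording on/off requirement: with the preview pipelines above running, a recording branch can in principle be created and destroyed on an external event without touching the preview. The pipeline name, encoder, muxer, and file location below are placeholders I chose, not from this thread, and the commands assume a running gstd:

```shell
# Create a recording branch that listens to the same interpipesink
gstd-client pipeline_create rec_pipe "interpipesrc listen-to=src_pipe ! x264enc ! matroskamux ! filesink location=/tmp/record.mkv"

# Start recording on the external event...
gstd-client pipeline_play rec_pipe

# ...and later stop it again; the preview pipeline keeps running untouched.
# Depending on the muxer, sending EOS first may be needed so the
# file is finalized (if your gstd version supports event_eos).
gstd-client event_eos rec_pipe
gstd-client pipeline_stop rec_pipe
gstd-client pipeline_delete rec_pipe
```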

[1] GStreamer Daemon 1.x on GitHub: RidgeRun/gstd-1.x - GStreamer Daemon is a GStreamer framework for controlling audio and video streaming using TCP messages. This version is based on GStreamer 1.x

I compiled GStreamer Daemon 1.x from the RidgeRun/gstd-1.x GitHub repository.

But the command below gives me an error:

gstd-client pipeline_create no_src "videotestsrc ! video/x-raw, width=1920, height=1080, format=UYVY ! interpipesink name=src_pipe"
Could not connect to localhost: Connection refused

So what am I missing?

You need to first run the GStreamer Daemon in the background.
This is the command I use on my installation (maybe you don’t need the LD_LIBRARY_PATH environment variable):

LD_LIBRARY_PATH=/usr/local/lib/ gstd &

Thanks. Now it’s working.

I am running the following commands:

LD_LIBRARY_PATH=/usr/local/lib/ gstd &
gstd-client pipeline_create no_src "videotestsrc ! interpipesink name=src_pipe"
gstd-client pipeline_create out_pipe "interpipesrc listen-to=src_pipe ! autovideosink sync=false"
gstd-client pipeline_play no_src
gstd-client pipeline_play out_pipe

But after around 25 seconds the preview stopped.

So, is gstd a time-limited demo version?

I am also trying to launch the two pipelines using the GStreamer API in C.
I am new to the GStreamer API approach. I have taken a gst-sdk example for gstreamer-1.0.

Below is my code:

#include <gst/gst.h>

int main(int argc, char *argv[]) {
  GstElement *pipeline, *pipe1, *pipe2;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  pipe1 = gst_parse_launch ("videotestsrc ! interpipesrc name=camera", NULL);
  pipe2 = gst_parse_launch ("interpipesrc name=src listen-to=camera ! autovideosink", NULL);

  /* Create the empty pipeline */
  pipeline = gst_pipeline_new ("test-pipeline");

  if (!pipeline || !pipe1 || !pipe2) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline */
  gst_bin_add_many (GST_BIN (pipeline), pipe1, pipe2, NULL);
  if (gst_element_link (pipe1, pipe2) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }


  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (pipeline);
    return -1;
  }

  /* Wait until error or EOS */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  /* Parse message */
  if (msg != NULL) {
    GError *err;
    gchar *debug_info;

    switch (GST_MESSAGE_TYPE (msg)) {
      case GST_MESSAGE_ERROR:
        gst_message_parse_error (msg, &err, &debug_info);
        g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
        g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
        g_clear_error (&err);
        g_free (debug_info);
        break;
      case GST_MESSAGE_EOS:
        g_print ("End-Of-Stream reached.\n");
        break;
      default:
        /* We should not reach here because we only asked for ERRORs and EOS */
        g_printerr ("Unexpected message received.\n");
        break;
    }
    gst_message_unref (msg);
  }

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}

I am getting the following error:

Elements could not be linked.

So can you help me out?

For the GStreamer Daemon approach:
I ran into an “Out of memory” problem where it stopped after, I think, 10-20 seconds. But I could work around that by using “xvimagesink sync=false” as the video sink. (In order to use xvimagesink, you first need to “export DISPLAY=:0”.)
Other than that, the pipeline should not stop after 25 seconds. It is not time limited.
Could you try these commands:

export DISPLAY=:0
export GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0
LD_LIBRARY_PATH=/usr/local/lib/ gstd &

gstd-client pipeline_create no_src "videotestsrc ! video/x-raw, width=1920, height=1080, format=UYVY ! interpipesink name=src_pipe"
gstd-client pipeline_create out_pipe "interpipesrc listen-to=src_pipe ! xvimagesink sync=false"

gstd-client pipeline_play no_src
gstd-client pipeline_play out_pipe

About the approach with GStreamer API:
The way I understand interpipes, you do not need to link the two pipelines. Only add them to the overall pipeline and run all of them (pipe1, pipe2, pipeline). This is exactly the advantage of using interpipes: you can just add a new pipeline and connect it with interpipesrc and interpipesink; there is no need to link them.

Thanks for all your replies. They saved me a lot of time.
Yes, gstd is not time limited.
If I add 1080p resolution with autovideosink, it does not stop at 25 seconds.
It may have been some memory issue with the previous pipeline.

gstd-client pipeline_create no_src "videotestsrc ! video/x-raw, width=1920, height=1080, format=I420, framerate=60/1 ! interpipesink name=src_pipe"
gstd-client pipeline_create out_pipe "interpipesrc listen-to=src_pipe ! autovideosink sync=false"

gstd-client pipeline_play no_src
gstd-client pipeline_play out_pipe

About the approach with GStreamer API:
How do I add pipe1 and pipe2 to the overall pipeline?

I tried to run the above code without gst_element_link (pipe1, pipe2), and I got the following error.

  pipe1 = gst_parse_launch ("videotestsrc ! interpipesrc name=camera", NULL);
  pipe2 = gst_parse_launch ("interpipesrc name=src listen-to=camera ! autovideosink", NULL);

  /* Create the empty pipeline */
  pipeline = gst_pipeline_new ("test-pipeline");

  if (!pipeline || !pipe1 || !pipe2) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline */
  gst_bin_add_many (GST_BIN (pipeline), pipe1, pipe2, NULL);


  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (pipeline);
    return -1;
  }

Error received from element videotestsrc0: Internal data flow error.
Debugging information: gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:test-pipeline/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0:
streaming task paused, reason not-linked (-1)

Sorry, I am asking very basic questions.
But can you give any code snippet for this C GStreamer application?

I think you are on the right path; what you are looking for is adding bins to a pipeline: Bins

In your case pipe1 and pipe2 are bins, which you want to add to the pipeline. You don’t need the function “gst_bin_add_many()”, since it is for adding elements to a bin, not for adding bins to a pipeline. You don’t need to do this, since the elements were already created, added, and linked into pipe1 and pipe2 by the “gst_parse_launch()” calls.

I suggest replacing this line:

gst_bin_add_many (GST_BIN (pipeline), pipe1, pipe2, NULL);

by:

gst_bin_add (GST_BIN (pipeline), pipe1);
gst_bin_add (GST_BIN (pipeline), pipe2);

Unfortunately I don’t have an easy C example ready, but I think your code should work with very few changes…

Thanks for the Reply.

I tried replacing gst_bin_add_many with gst_bin_add as you suggested, but I am still getting the same error.

Hey, I just found that there are some C examples in the files that you used to compile interpipes. Have a look at “gst-interpipe/tests/check/gst/”.

A relatively simple example is “test_set_caps.c”. It looks like you don’t even need to add the different bins (pipe1, pipe2) to an overall pipeline. Just create them with “gst_parse_launch()” and run them with “gst_element_change_state()”.

Sorry for the slight misinformation!

Hello,

Thanks for your continued help.

I tried compiling test_set_caps.c without any modification, but it gives me an error.
So I took test_set_caps.c as a reference and modified the code as below.

#include <gst/gst.h>
#include <unistd.h> /* for sleep() */

int main(int argc, char *argv[]) {
  GstPipeline *sink;
  GstPipeline *src1;
  GError *error = NULL;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create one sink and three source pipelines */
  sink = GST_PIPELINE (gst_parse_launch ("videotestsrc ! interpipesink name=sink", &error));
  src1 = GST_PIPELINE (gst_parse_launch ("interpipesrc listen-to=sink ! autovideosink", &error));

  /* Play the pipelines */
  //gst_element_set_state (GST_ELEMENT (sink), GST_STATE_PLAYING);
  //gst_element_set_state (GST_ELEMENT (src1), GST_STATE_PLAYING);
  gst_element_change_state (GST_ELEMENT (sink), GST_STATE_CHANGE_PAUSED_TO_PLAYING);
  gst_element_change_state (GST_ELEMENT (src1), GST_STATE_CHANGE_PAUSED_TO_PLAYING);

  sleep(10);

  /* Stop pipelines */
  gst_element_change_state (GST_ELEMENT (sink), GST_STATE_CHANGE_READY_TO_NULL);
  gst_element_change_state (GST_ELEMENT (src1), GST_STATE_CHANGE_READY_TO_NULL);

  /* Cleanup */
  g_object_unref (sink);
  g_object_unref (src1);

  return 0;
}

But only one frame is displayed, the preview is stuck, and after 10 seconds the code stops.

If I try to launch a single preview pipeline using gst_element_change_state, the same thing happens: the preview is stuck after 1 frame.

sink = GST_PIPELINE (gst_parse_launch ("videotestsrc ! autovideosink", &error));
gst_element_change_state (GST_ELEMENT (sink), GST_STATE_CHANGE_PAUSED_TO_PLAYING);

But replacing gst_element_change_state with gst_element_set_state plays the preview perfectly for the full 10 seconds.

sink = GST_PIPELINE (gst_parse_launch ("videotestsrc ! autovideosink", &error));
gst_element_set_state (GST_ELEMENT (sink), GST_STATE_PLAYING);

Hi Ritesh

Glad to hear you got the test pipelines running. I’m going to add some examples to the project and documentation so users can use them as a reference. The code in tests/check/ is oriented toward testing specific features and hence may not always be the best application code.

Anyway, now that you have this example working, were you able to get your application running? Did you encounter any other problems?

Hello Michael,

I am still having a problem using the interpipe plugins in a GStreamer C application.
I can use them with gstd.

In the C application,

sink = GST_PIPELINE (gst_parse_launch ("videotestsrc ! interpipesink name=sink", &error));
src1 = GST_PIPELINE (gst_parse_launch ("interpipesrc listen-to=sink ! autovideosink", &error));

gst_element_change_state (GST_ELEMENT (sink), GST_STATE_CHANGE_PAUSED_TO_PLAYING);
gst_element_change_state (GST_ELEMENT (src1), GST_STATE_CHANGE_PAUSED_TO_PLAYING);

This shows only 1 frame and gets stuck.
So how do I play 2 separate pipelines in a GStreamer C application?
As I am new to this C application approach, I am facing these difficulties.
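For what it’s worth, based on the earlier observation in this thread that gst_element_set_state works where gst_element_change_state does not, a minimal untested sketch of running the two interpipe pipelines from C would be:

```c
#include <gst/gst.h>
#include <unistd.h> /* for sleep() */

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* Two independent pipelines; the interpipesrc finds the interpipesink
   * by its name, so no gst_element_link() and no shared parent needed.
   * Assumes the gst-interpipe plugin is installed. */
  GstElement *sink_pipe = gst_parse_launch (
      "videotestsrc ! interpipesink name=camera", NULL);
  GstElement *src_pipe = gst_parse_launch (
      "interpipesrc listen-to=camera ! autovideosink", NULL);

  /* gst_element_set_state() walks through all intermediate state
   * transitions; gst_element_change_state() performs only the single
   * transition you name, which is why the preview froze after 1 frame. */
  gst_element_set_state (sink_pipe, GST_STATE_PLAYING);
  gst_element_set_state (src_pipe, GST_STATE_PLAYING);

  sleep (10); /* a real application would run a GMainLoop instead */

  gst_element_set_state (src_pipe, GST_STATE_NULL);
  gst_element_set_state (sink_pipe, GST_STATE_NULL);
  gst_object_unref (src_pipe);
  gst_object_unref (sink_pipe);
  return 0;
}
```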

Hello, now I am testing interpipe in Qt GStreamer.

The following works perfectly.

m_pipeline      = QGst::Pipeline::create();

QGst::BinPtr srcBin = QGst::Bin::fromDescription("videotestsrc ! interpipesink name=camera");
QGst::BinPtr sinkBin = QGst::Bin::fromDescription("interpipesrc listen-to=camera ! autovideosink");

m_pipeline->add(srcBin);
m_pipeline->add(sinkBin);

Now I want to take a snapshot while recording is running. Below is my initialization code:

m_pipeline      = QGst::Pipeline::create();
m_pipeline1     = QGst::Pipeline::create();

QGst::BinPtr srcBin = QGst::Bin::fromDescription("videotestsrc ! interpipesink name=camera");
QGst::BinPtr sinkBin = QGst::Bin::fromDescription("interpipesrc listen-to=camera ! autovideosink");
snapBin = QGst::Bin::fromDescription("interpipesrc listen-to=camera num-buffers=1 ! nvjpegenc ! filesink location=/home/ubuntu/1.jpg");

m_pipeline->add(srcBin);
m_pipeline->add(sinkBin);

m_pipeline1->add(snapBin);

And the code for taking a snapshot:

void Player::snap()
{
    m_pipeline1->setState(QGst::StatePlaying);
    QThread::msleep(100);
    m_pipeline1->setState(QGst::StateNull);
}

But the problem is that the first snapshot is taken OK. But if I try to take a snapshot a second time, the image file comes out empty.
So how do I restart m_pipeline1 so that it takes and stores a snapshot every time?
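One possible workaround (a guess on my part, not verified): rebuild the snapshot pipeline for every snapshot instead of reusing m_pipeline1, so that the interpipesrc's num-buffers counter and the filesink both start from a clean state and the file is reopened each time:

```cpp
// Hypothetical rework of Player::snap(): build a fresh snapshot
// pipeline per call rather than restarting a cached one.
void Player::snap()
{
    QGst::PipelinePtr snapPipeline = QGst::Pipeline::create();
    QGst::BinPtr snapBin = QGst::Bin::fromDescription(
        "interpipesrc listen-to=camera num-buffers=1 ! nvjpegenc "
        "! filesink location=/home/ubuntu/1.jpg");
    snapPipeline->add(snapBin);

    snapPipeline->setState(QGst::StatePlaying);
    QThread::msleep(100);  // crude wait; watching the bus for EOS
                           // before tearing down would be more robust
    snapPipeline->setState(QGst::StateNull);
    // snapPipeline is freed when the smart pointer goes out of scope
}
```

The fixed sleep is a weakness carried over from the original code; waiting for the EOS message on the pipeline's bus would guarantee the JPEG is fully written before the pipeline is stopped.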