GStreamer Interpipe won't run on Jetson Nano

I have successfully installed GStreamer interpipes.
I tried the commands below to run a test:

gst-launch-1.0 v4l2src device=/dev/video0 ! interpipesink name=vid

gst-launch-1.0 interpipesrc listen-to=vid ! xvimagesink
I am unable to see a preview.
Below is the output when the above command is executed:
Setting pipeline to PAUSED …
Pipeline is PREROLLING …

What can be the solution for this?

My application needs to preview, record and stream the video from the source. I tried the tee element, but when the stream failed the whole pipeline stopped. Hence I started to look into interpipes.

How can I achieve this using gstreamer interpipes?

Hi,
These are custom plugins. Please contact RidgeRun to get further help.


Hello @pulzappcheck890,

RidgeRun is the developer of the gst-interpipe plugin. We can provide you further help with the plugin.

Please follow these instructions:

1- Install the GStreamer Daemon, which makes it easy to handle the interpipe pipelines. Follow the instructions here:
https://developer.ridgerun.com/wiki/index.php?title=GStreamer_Daemon_-_Building_GStreamer_Daemon

2- After installation, run the gstreamer daemon with the following command

gstd

3- Create the camera src pipeline (pipe1), defining caps and decoding. Here we are using a JPEG camera, therefore we need parsing and decoding, for example:

gst-client pipeline_create pipe1 v4l2src device=/dev/video0 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! jpegparse ! jpegdec ! interpipesink name=vid sync=false

4- Create the receiving pipeline (pipe2)

gst-client pipeline_create pipe2 interpipesrc listen-to=vid ! xvimagesink sync=false

5- Start the pipelines in the following order

gst-client pipeline_play pipe1
gst-client pipeline_play pipe2
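For convenience, steps 1 to 5 above can be collected into a single script. This is only a sketch: it assumes gstd and gst-client are on the PATH and that /dev/video0 is a JPEG camera, as in the example above.

```shell
#!/bin/sh
# Sketch: bring up the interpipe source/display pair through gstd.
set -e

gstd &        # start the daemon in the background
sleep 1       # give it a moment to start accepting connections

# Source pipeline: capture, parse and decode JPEG, publish on node "vid"
gst-client pipeline_create pipe1 v4l2src device=/dev/video0 \
  ! image/jpeg,width=1920,height=1080,framerate=30/1 \
  ! jpegparse ! jpegdec ! interpipesink name=vid sync=false

# Display pipeline: listen to "vid" and render
gst-client pipeline_create pipe2 interpipesrc listen-to=vid ! xvimagesink sync=false

# Start the source first, then the listener
gst-client pipeline_play pipe1
gst-client pipeline_play pipe2
```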

Regards,
Fabian
www.ridgerun.com


Thank you @fabian.solano , I’ve tried the below method.

My requirement is to run two or more separate pipelines using a single source, and if one pipeline fails the others must not stop and the failed pipeline must have the ability to be restarted without affecting the running pipelines.

I tried the below example

I have a camera source, and I started to play and record from it with the command below, using GStreamer interpipes.

gst-launch-1.0 -e v4l2src device=/dev/video0 ! interpipesink name=vid interpipesrc listen-to=vid ! 'video/x-raw, framerate=30/1, format=YUY2' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! omxh264enc ! mpegtsmux name=mux ! filesink location=usb.ts interpipesrc listen-to=vid ! xvimagesink

The pipeline works: the recording happens while the preview is playing.

Problem:
As soon as I close the preview, the recording continues, but within 2-3 minutes the RAM usage of the Jetson Nano increases and the whole thing gets stuck.

Is there a method to solve this?

Thank You

Hi @fabian.solano,

I successfully installed gstd and tried the below pipelines.

gst-client pipeline_create pipe1 v4l2src device=/dev/video0 ! interpipesink name=vid sync=false

gst-client pipeline_create pipe2 interpipesrc listen-to=vid ! xvimagesink sync=false

gst-client pipeline_create pipe3 interpipesrc listen-to=vid ! 'video/x-raw, framerate=30/1, format=YUY2' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! omxh264enc ! mpegtsmux name=mux ! filesink location=cam.ts sync=false

And started the pipelines in the following order

gst-client pipeline_play pipe1
gst-client pipeline_play pipe2
gst-client pipeline_play pipe3

When gst-client pipeline_play pipe3 is executed I get the below error

{
"code" : 14,
"description" : "State error",
"response" : null
}

How can I solve this?

Hi @pulzappcheck890,

You are probably getting "State error" because the pipeline is not linking correctly. Please try running with the environment flag GST_DEBUG=4, and provide the log of the error.
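One caveat worth noting: gst-client only sends requests to the daemon, so exporting GST_DEBUG on the client side generally has no effect on the pipelines, which run inside gstd. A sketch of enabling debug output where the pipelines actually run (the debug_* commands are assumed to be available in your gstd version):

```shell
# Option 1: restart the daemon itself with debugging enabled
gstd -k                 # kill any running daemon
GST_DEBUG=4 gstd        # relaunch with GStreamer debug level 4

# Option 2: toggle debugging on an already-running daemon
gst-client debug_enable true
gst-client debug_threshold "*:4"
```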

Regards,
Fabian
www.ridgerun.com

Thank you @fabian.solano

I tried it with GST_DEBUG=4 as below

GST_DEBUG=4 gst-client pipeline_play pipe3

Still I get the same error

Below I have attached the response when the pipe3 is created

gst-client pipeline_create pipe3 interpipesrc listen-to=vid ! 'video/x-raw, framerate=30/1, format=YUY2' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! omxh264enc ! mpegtsmux name=mux ! filesink location=cam.ts sync=false
{
"code" : 0,
"description" : "Success",
"response" : {
"properties" : [
{
"name" : "name",
"value" : "\"pipe3\"",
"param" : {
"description" : "The name of the current Gstd session",
"type" : "gchararray",
"access" : "((GstdParamFlags) READ | 234)"
}
},
{
"name" : "description",
"value" : "\"interpipesrc listen-to=vid ! video/x-raw, framerate=30/1, format=YUY2 ! nvvidconv ! video/x-raw(memory:NVMM),format=NV12 ! omxh264enc ! mpegtsmux name=mux ! filesink location=cam.ts sync=false\"",
"param" : {
"description" : "The gst-launch like pipeline description",
"type" : "gchararray",
"access" : "((GstdParamFlags) READ | 234)"
}
},
{
"name" : "elements",
"value" : "((GstdList*) 0x7fa0145ab0)",
"param" : {
"description" : "The elements in the pipeline",
"type" : "GstdList",
"access" : "((GstdParamFlags) READ | 224)"
}
},
{
"name" : "bus",
"value" : "((GstdPipelineBus*) 0x7fa02414e0)",
"param" : {
"description" : "The bus callback for this element",
"type" : "GstdPipelineBus",
"access" : "((GstdParamFlags) READ | 224)"
}
},
{
"name" : "state",
"value" : "((GstdState*) 0x7fa022b6a0)",
"param" : {
"description" : "The state of the pipeline",
"type" : "GstdState",
"access" : "((GstdParamFlags) READ | UPDATE | 226)"
}
},
{
"name" : "event",
"value" : "((GstdEventHandler*) 0x7fa023d000)",
"param" : {
"description" : "The event handler of the pipeline",
"type" : "GstdEventHandler",
"access" : "((GstdParamFlags) READ | 224)"
}
},
{
"name" : "position",
"value" : 0,
"param" : {
"description" : "The query position of the pipeline",
"type" : "gint64",
"access" : "((GstdParamFlags) READ | 224)"
}
},
{
"name" : "duration",
"value" : 0,
"param" : {
"description" : "The duration of the media stream pipeline",
"type" : "gint64",
"access" : "((GstdParamFlags) READ | 224)"
}
},
{
“name” : “graph”,
“value” : ““digraph pipeline {\n rankdir=LR;\n fontname=\“sans\”;\n fontsize=\“10\”;\n labelloc=t;\n nodesep=.1;\n ranksep=.2;\n label=\”\\npipe3\\n[0]\”;\n node [style=\“filled,rounded\”, shape=box, fontsize=\“9\”, fontname=\“sans\”, margin=\“0.0,0.0\”];\n edge [labelfontsize=\“6\”, fontsize=\“9\”, fontname=\“monospace\”];\n \n legend [\n pos=\“0,0!\”,\n margin=\“0.05,0.05\”,\n style=\“filled\”,\n label=\“Legend\\lElement-States: [~] void-pending, [0] null, [-] ready, [=] paused, [>] playing\\lPad-Activation: [-] none, [>] push, [<] pull\\lPad-Flags: [b]locked, [f]lushing, [b]locking, [E]OS; upper-case is set\\lPad-Task: [T] has started task, [t] has paused task\\l\”,\n ];\n subgraph cluster_capsfilter1_0x7fa022c440 {\n fontname=\“Bitstream Vera Sans\”;\n fontsize=\“8\”;\n style=\“filled,rounded\”;\n color=black;\n label=\“GstCapsFilter\\ncapsfilter1\\n[0]\\nparent=(GstPipeline) pipe3\\ncaps=video/x-raw(memory:NVMM), format=(string)NV12\”;\n subgraph cluster_capsfilter1_0x7fa022c440_sink {\n label=\”\";\n style=\“invis\”;\n capsfilter1_0x7fa022c440_sink_0x7fa022e150 [color=black, fillcolor=\"#aaaaff\", label=\“sink\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n subgraph cluster_capsfilter1_0x7fa022c440_src {\n label=\"\";\n style=\“invis\”;\n capsfilter1_0x7fa022c440_src_0x7fa022e3a0 [color=black, fillcolor=\"#ffaaaa\", label=\“src\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n capsfilter1_0x7fa022c440_sink_0x7fa022e150 -> capsfilter1_0x7fa022c440_src_0x7fa022e3a0 [style=\“invis\”];\n fillcolor=\"#aaffaa\";\n }\n\n capsfilter1_0x7fa022c440_src_0x7fa022e3a0 -> omxh264enc_omxh264enc0_0x7fa021a670_sink_0x7fa00270a0 [labeldistance=\“10\”, labelangle=\“0\”, label=\" \", taillabel=\“ANY\”, headlabel=\“video/x-raw(memory:NVMM)\\l format: { (string)I420, (str… }\\l width: [ 1, 2147483647 ]\\l height: [ 1, 2147483647 ]\\l framerate: [ 0/1, 2147483647/1 ]\\lvideo/x-raw\\l format: { (string)I420, (str… }\\l width: [ 1, 2147483647 ]\\l height: 
[ 1, 2147483647 ]\\l framerate: [ 0/1, 2147483647/1 ]\\l\”]\n subgraph cluster_capsfilter0_0x7fa022c100 {\n fontname=\“Bitstream Vera Sans\”;\n fontsize=\“8\”;\n style=\“filled,rounded\”;\n color=black;\n label=\“GstCapsFilter\\ncapsfilter0\\n[0]\\nparent=(GstPipeline) pipe3\\ncaps=video/x-raw, framerate=(fraction)30/1, format=(string)YUY2\”;\n subgraph cluster_capsfilter0_0x7fa022c100_sink {\n label=\"\";\n style=\“invis\”;\n capsfilter0_0x7fa022c100_sink_0x7fa00279e0 [color=black, fillcolor=\"#aaaaff\", label=\“sink\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n subgraph cluster_capsfilter0_0x7fa022c100_src {\n label=\"\";\n style=\“invis\”;\n capsfilter0_0x7fa022c100_src_0x7fa0027c30 [color=black, fillcolor=\"#ffaaaa\", label=\“src\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n capsfilter0_0x7fa022c100_sink_0x7fa00279e0 -> capsfilter0_0x7fa022c100_src_0x7fa0027c30 [style=\“invis\”];\n fillcolor=\"#aaffaa\";\n }\n\n capsfilter0_0x7fa022c100_src_0x7fa0027c30 -> nvvconv0_0x7fa0212110_sink_0x7fa0026c00 [labeldistance=\“10\”, labelangle=\“0\”, label=\" \", taillabel=\“ANY\”, headlabel=\“video/x-raw(memory:NVMM)\\l format: { (string)I420, (str… }\\l width: [ 1, 2147483647 ]\\l height: [ 1, 2147483647 ]\\l framerate: [ 0/1, 2147483647/1 ]\\lvideo/x-raw\\l format: { (string)I420, (str… }\\l width: [ 1, 2147483647 ]\\l height: [ 1, 2147483647 ]\\l framerate: [ 0/1, 2147483647/1 ]\\l\”]\n subgraph cluster_filesink0_0x7fa0228f90 {\n fontname=\“Bitstream Vera Sans\”;\n fontsize=\“8\”;\n style=\“filled,rounded\”;\n color=black;\n label=\“GstFileSink\\nfilesink0\\n[0]\\nparent=(GstPipeline) pipe3\\nsync=FALSE\\nlocation=\\\“cam.ts\\\”\”;\n subgraph cluster_filesink0_0x7fa0228f90_sink {\n label=\"\";\n style=\“invis\”;\n filesink0_0x7fa0228f90_sink_0x7fa0027790 [color=black, fillcolor=\"#aaaaff\", label=\“sink\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n fillcolor=\"#aaaaff\";\n }\n\n subgraph cluster_mux_0x7fa021e060 {\n 
fontname=\“Bitstream Vera Sans\”;\n fontsize=\“8\”;\n style=\“filled,rounded\”;\n color=black;\n label=\“MpegTsMux\\nmux\\n[0]\\nparent=(GstPipeline) pipe3\”;\n subgraph cluster_mux_0x7fa021e060_sink {\n label=\"\";\n style=\“invis\”;\n mux_0x7fa021e060_sink_65_0x7fa022e5f0 [color=black, fillcolor=\"#aaaaff\", label=\“sink_65\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,dashed\”];\n }\n\n subgraph cluster_mux_0x7fa021e060_src {\n label=\"\";\n style=\“invis\”;\n mux_0x7fa021e060_src_0x7fa0027540 [color=black, fillcolor=\"#ffaaaa\", label=\“src\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n mux_0x7fa021e060_sink_65_0x7fa022e5f0 -> mux_0x7fa021e060_src_0x7fa0027540 [style=\“invis\”];\n fillcolor=\"#aaffaa\";\n }\n\n mux_0x7fa021e060_src_0x7fa0027540 -> filesink0_0x7fa0228f90_sink_0x7fa0027790 [labeldistance=\“10\”, labelangle=\“0\”, label=\" \", taillabel=\“video/mpegts\\l systemstream: true\\l packetsize: { (int)188, (int)192 }\\l\”, headlabel=\“ANY\”]\n subgraph cluster_omxh264enc_omxh264enc0_0x7fa021a670 {\n fontname=\“Bitstream Vera Sans\”;\n fontsize=\“8\”;\n style=\“filled,rounded\”;\n color=black;\n label=\“GstOMXH264Enc-omxh264enc\\nomxh264enc-omxh264enc0\\n[0]\\nparent=(GstPipeline) pipe3\\niframeinterval=0\”;\n subgraph cluster_omxh264enc_omxh264enc0_0x7fa021a670_sink {\n label=\"\";\n style=\“invis\”;\n omxh264enc_omxh264enc0_0x7fa021a670_sink_0x7fa00270a0 [color=black, fillcolor=\"#aaaaff\", label=\“sink\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n subgraph cluster_omxh264enc_omxh264enc0_0x7fa021a670_src {\n label=\"\";\n style=\“invis\”;\n omxh264enc_omxh264enc0_0x7fa021a670_src_0x7fa00272f0 [color=black, fillcolor=\"#ffaaaa\", label=\“src\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n omxh264enc_omxh264enc0_0x7fa021a670_sink_0x7fa00270a0 -> omxh264enc_omxh264enc0_0x7fa021a670_src_0x7fa00272f0 [style=\“invis\”];\n fillcolor=\"#aaffaa\";\n }\n\n omxh264enc_omxh264enc0_0x7fa021a670_src_0x7fa00272f0 -> 
mux_0x7fa021e060_sink_65_0x7fa022e5f0 [labeldistance=\“10\”, labelangle=\“0\”, label=\" \", taillabel=\“video/x-h264\\l width: [ 16, 4096 ]\\l height: [ 16, 4096 ]\\l stream-format: { (string)byte-strea… }\\l alignment: au\\l\”, headlabel=\“video/mpeg\\l parsed: true\\l mpegversion: { (int)1, (int)2, (i… }\\l systemstream: false\\lvideo/x-dirac\\limage/x-jpc\\lvideo/x-h264\\l stream-format: byte-stream\\l alignment: { (string)au, (strin… }\\lvideo/x-h265\\l stream-format: byte-stream\\l alignment: { (string)au, (strin… }\\laudio/mpeg\\l parsed: true\\l mpegversion: { (int)1, (int)2 }\\laudio/mpeg\\l framed: true\\l mpegversion: 4\\l stream-format: adts\\laudio/mpeg\\l mpegversion: 4\\l stream-format: raw\\laudio/x-lpcm\\l width: { (int)16, (int)20, … }\\l rate: { (int)48000, (int)9… }\\l channels: [ 1, 8 ]\\l dynamic_range: [ 0, 255 ]\\l emphasis: { (boolean)false, (b… }\\l mute: { (boolean)false, (b… }\\laudio/x-ac3\\l framed: true\\laudio/x-dts\\l framed: true\\laudio/x-opus\\l channels: [ 1, 8 ]\\l channel-mapping-family: { (int)0, (int)1 }\\lsubpicture/x-dvb\\lapplication/x-teletext\\lmeta/x-klv\\l parsed: true\\limage/x-jpc\\l profile: [ 0, 49151 ]\\l\”]\n subgraph cluster_nvvconv0_0x7fa0212110 {\n fontname=\“Bitstream Vera Sans\”;\n fontsize=\“8\”;\n style=\“filled,rounded\”;\n color=black;\n label=\“Gstnvvconv\\nnvvconv0\\n[0]\\nparent=(GstPipeline) pipe3\”;\n subgraph cluster_nvvconv0_0x7fa0212110_sink {\n label=\"\";\n style=\“invis\”;\n nvvconv0_0x7fa0212110_sink_0x7fa0026c00 [color=black, fillcolor=\"#aaaaff\", label=\“sink\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n subgraph cluster_nvvconv0_0x7fa0212110_src {\n label=\"\";\n style=\“invis\”;\n nvvconv0_0x7fa0212110_src_0x7fa0026e50 [color=black, fillcolor=\"#ffaaaa\", label=\“src\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n nvvconv0_0x7fa0212110_sink_0x7fa0026c00 -> nvvconv0_0x7fa0212110_src_0x7fa0026e50 [style=\“invis\”];\n fillcolor=\"#aaffaa\";\n }\n\n 
nvvconv0_0x7fa0212110_src_0x7fa0026e50 -> capsfilter1_0x7fa022c440_sink_0x7fa022e150 [labeldistance=\“10\”, labelangle=\“0\”, label=\" \", taillabel=\“video/x-raw(memory:NVMM)\\l format: { (string)I420, (str… }\\l width: [ 1, 2147483647 ]\\l height: [ 1, 2147483647 ]\\l framerate: [ 0/1, 2147483647/1 ]\\lvideo/x-raw\\l format: { (string)I420, (str… }\\l width: [ 1, 2147483647 ]\\l height: [ 1, 2147483647 ]\\l framerate: [ 0/1, 2147483647/1 ]\\l\”, headlabel=\“ANY\”]\n subgraph cluster_interpipesrc1_0x7fa0150c30 {\n fontname=\“Bitstream Vera Sans\”;\n fontsize=\“8\”;\n style=\“filled,rounded\”;\n color=black;\n label=\“GstInterPipeSrc\\ninterpipesrc1\\n[0]\\nparent=(GstPipeline) pipe3\\nemit-signals=FALSE\\nlisten-to=\\\“vid\\\”\”;\n subgraph cluster_interpipesrc1_0x7fa0150c30_src {\n label=\"\";\n style=\“invis\”;\n interpipesrc1_0x7fa0150c30_src_0x7fa00269b0 [color=black, fillcolor=\"#ffaaaa\", label=\“src\\n[-][bFb]\”, height=\“0.2\”, style=\“filled,solid\”];\n }\n\n fillcolor=\"#ffaaaa\";\n }\n\n interpipesrc1_0x7fa0150c30_src_0x7fa00269b0 -> capsfilter0_0x7fa022c100_sink_0x7fa00279e0 [label=\“ANY\”]\n}\n"",
"param" : {
"description" : "The pipeline graph on GraphViz dot format",
"type" : "gchararray",
"access" : "((GstdParamFlags) READ | 224)"
}
},
{
"name" : "verbose",
"value" : false,
"param" : {
"description" : "Verbose state for the media stream pipeline",
"type" : "gboolean",
"access" : "((GstdParamFlags) READ | 2)"
}
}
]
}
}

Are you able to validate this pipeline using gst-launch-1.0, just to rule out that it is something related to pipeline tuning? Otherwise I would recommend sending an inquiry to support@ridgerun.com
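One way to do that validation is to replace the interpipesrc with the real capture source, so that the rest of pipe3 runs standalone under gst-launch-1.0. A sketch, with caps taken from the pipe3 description above:

```shell
gst-launch-1.0 -e v4l2src device=/dev/video0 \
  ! 'video/x-raw, framerate=30/1, format=YUY2' \
  ! nvvidconv \
  ! 'video/x-raw(memory:NVMM),format=NV12' \
  ! omxh264enc ! mpegtsmux name=mux \
  ! filesink location=/tmp/cam.ts sync=false
```

If this standalone pipeline records correctly, the problem is in the interpipe handoff rather than in the encode/mux chain.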

Regards,
Fabian
www.ridgerun.com

Hi @fabian.solano

I was able to sort out the State error problem. The issue was the file location; it must be changed to a location other than the root.

Now the pipeline starts, but a 0-byte file is created. What can be the cause of this?

Thank You

Hi @fabian.solano

I tried the below pipelines

gst-client pipeline_create pipe1 v4l2src device=/dev/video0 ! interpipesink name=vid sync=false

gst-client pipeline_create pipe2 interpipesrc listen-to=vid ! xvimagesink sync=false

gst-client pipeline_create pipe3 interpipesrc listen-to=vid ! 'video/x-raw, framerate=30/1, format=YUY2' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! omxh264enc ! mpegtsmux name=mux ! filesink location=/tmp/cam.ts sync=false

And started the pipelines in the following order

gst-client pipeline_play pipe1
gst-client pipeline_play pipe2
gst-client pipeline_play pipe3

pipe3 (file recording) won't change to the PLAYING state; it always stays in the READY state, and the file that is created is 0 bytes.

Thank You

Hi @fabian.solano,

I am trying a picture in picture pipeline recording

gst-client pipeline_create pipe5 interpipesrc name=src format=time listen-to=ch1 accept-events=false accept-eos-events=false enable-sync=false caps="video/x-raw, format=(string)YUY2, width=(int)1920, height=(int)1080, framerate=(fraction)30/1" ! comp.sink_0 interpipesrc name=src2 format=time listen-to=ch2 accept-events=false accept-eos-events=false enable-sync=false caps="video/x-raw, framerate=(fraction)30/1" ! comp.sink_1 nvcompositor name=comp sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=1500 sink_1::ypos=800 sink_1::width=320 sink_1::height=240 ! nvvidconv ! nvv4l2h264enc maxperf-enable=1 bitrate=4000000 profile=4 ! h264parse ! mux. pulsesrc device="alsa_input.usb-VXIS_Inc_ezcap_U3_capture-02.analog-stereo" ! 'audio/x-raw, format=(string)S16LE, layout=(string)interleaved, rate=44100, channels=(int)2' ! queue ! audioconvert ! voaacenc ! aacparse ! mpegtsmux name=mux ! filesink location="/tmp/pipinters.ts"

{
"code" : 2,
"description" : "Bad pipeline description",
"response" : null
}

I get the above error when executing the gstd command. What can be the reason?

Hi @pulzappcheck890,

The "Bad pipeline description" error is related to a syntax error, or to a pipeline that cannot be built due to missing elements or incorrect parameters. Please always try to verify your pipeline with gst-launch, and after that use it with gstd.
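A workflow sketch for that advice, using a trivial test pipeline (the pipeline name demo and the videotestsrc description are placeholders, not the user's pipeline):

```shell
# 1) Check that the description itself is buildable and runs:
gst-launch-1.0 videotestsrc num-buffers=30 ! 'video/x-raw,format=I420' ! fakesink

# 2) Only once it runs standalone, hand the same description to the daemon:
gst-client pipeline_create demo videotestsrc num-buffers=30 \
  ! 'video/x-raw,format=I420' ! fakesink
gst-client pipeline_play demo
gst-client pipeline_delete demo
```

Also keep in mind that gst-client receives the description through your shell, so caps containing parentheses or spaces, such as video/x-raw(memory:NVMM), must be quoted or they will be mangled before gstd ever sees them.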

Regards,

Fabian
www.ridgerun.com


Hi @fabian.solano

I was able to sort it out; the problem was incorrect parameters. Thank you for your help.

I am trying to do the following tasks using interpipe and jetson nano.

I have 2 video sources usb/rtsp
Below are my tasks that have to run simultaneously

  1. 1080p PIP recording to USB disk
  2. 1080p PIP streaming over RTMP
  3. Preview PIP in a small window (144x80)
  4. Preview video source 1 in a small window (144x80)
  5. Preview video source 2 in a small window (144x80)

All of 1, 2, 3, 4 and 5 have to run together.

Will I be able to do this using interpipes with the gstd daemon on a Jetson Nano?

Hi @pulzappcheck890,

Yes, you will be able to use the gstd daemon and interpipes in that environment.

Regards,
Fabian

Thank you @fabian.solano

I've tried rtmpsink, but when the RTMP link fails the device RAM starts to fill up until it overflows within seconds.

Is there a method to stop buffers from accumulating in RAM when using interpipes?
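Since interpipesink is built on top of appsink, one commonly suggested way to bound memory is to let the sink drop old buffers when no listener is consuming them. A sketch, assuming your gst-interpipe version exposes the appsink-style max-buffers and drop properties:

```shell
# Keep at most 3 buffers queued in the interpipe node and drop the oldest
# instead of accumulating when a downstream pipeline stalls or fails.
gst-client pipeline_create pipe1 v4l2src device=/dev/video0 \
  ! interpipesink name=vid sync=false max-buffers=3 drop=true
```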