How to use appsrc to feed the deepstream pipeline

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) AGX Xavier
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 4.4

I used /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test1/deepstream_test1_app.c as a template and modified the source from "filesrc" to "appsrc" to feed into the same pipeline, that is:


Question: how do we set up the caps for appsrc in this pipeline?

I tried the following setup, but it doesn't seem to work:

    appsrc = gst_element_factory_make ("appsrc", "app-source");
    g_object_set (G_OBJECT (appsrc), "caps",
        gst_caps_new_simple ("video/x-raw",
            "format", G_TYPE_STRING, "RGBA",
            "width", G_TYPE_INT, 1920,
            "height", G_TYPE_INT, 1080,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL), NULL);

How do I set up the appsrc caps correctly? What should the "format" field of the caps be for this pipeline? What am I missing? Please advise. Thank you.

The deepstream-appsrc-test sample shows how to use appsrc.

It depends on the data you want to send with appsrc. Given your pipeline, the easiest way is to run "gst-inspect-1.0 h264parse" to see what the h264parse sink pad accepts, and then set the appsrc caps to match.

These are basic GStreamer concepts: caps, caps negotiation, pads and capabilities. Please become familiar with GStreamer before you start your DeepStream development work.
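For example, if appsrc replaces filesrc in the otherwise unmodified test1 pipeline, the next element downstream is h264parse, so the caps should describe the encoded H.264 stream rather than raw video. A minimal sketch of such a caps string (the stream-format and alignment values are assumptions for a raw Annex-B .h264 file; check them against your actual stream):

```
video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
```

This string can be passed to gst_caps_from_string() and set on the appsrc "caps" property in the same way as in the code above.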

I checked gst-inspect-1.0 nvstreammux, and its sink caps accept:

format: { (string)NV12, (string)RGBA }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]

Since my appsrc reads frames from a.h264 through OpenCV and converts them to RGBA, I shortened the pipeline to appsrc ! nvstreammux ! nvinfer ! nvvideoconvert ! nvdsosd ! nvegltransform ! nveglglessink
and set the appsrc (src) caps as:

    g_object_set (G_OBJECT (appsrc), "caps",
        gst_caps_new_simple ("video/x-raw",
            "format", G_TYPE_STRING, "RGBA",
            "width", G_TYPE_INT, 1920,
            "height", G_TYPE_INT, 1080,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL), NULL);

I can compile the new code without errors.

But when I run it, it complains:

Elements could not be linked: 2. Exiting.

Looking at the code, it happens at:

    if (!gst_element_link_many (appsrc, streammux, pgie,
        nvvidconv, nvosd, sink, NULL)) {
      g_printerr ("Elements could not be linked: 2. Exiting.\n");
      return -1;
    }
So I believe the appsrc src caps now match the nvstreammux sink caps, so why can't they be linked at runtime? Is anything else needed to set up an nvstreammux pipeline?
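Two things are worth noting here. First, gst-inspect-1.0 shows the nvstreammux sink caps with the memory:NVMM caps feature, i.e. video/x-raw(memory:NVMM), so a system-memory appsrc cannot link to it directly. Second, nvstreammux exposes only request sink pads (sink_%u), which gst_element_link_many cannot create by itself; deepstream_test1_app.c requests the pad explicitly before linking the rest of the chain. A sketch of that part, with element variable names assumed to match the sample:

```
GstPad *sinkpad, *srcpad;

/* nvstreammux sink pads are request pads and must be created explicitly. */
sinkpad = gst_element_get_request_pad (streammux, "sink_0");
srcpad = gst_element_get_static_pad (appsrc, "src");
if (gst_pad_link (srcpad, sinkpad) != GST_PAD_LINK_OK)
  g_printerr ("Failed to link appsrc to stream muxer.\n");
gst_object_unref (sinkpad);
gst_object_unref (srcpad);

/* Then link only the downstream chain: */
gst_element_link_many (streammux, pgie, nvvidconv, nvosd, sink, NULL);
```

This mirrors the pad-request pattern in the unmodified deepstream-test1 source; it is a fragment, not a complete program.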

Any suggestion would be appreciated. Thank you.

one step closer:

I added an nvvideoconvert between appsrc and streammux, which solved the "Elements could not be linked" issue. However, when running the pipeline, it complains:

ERROR from element stream-muxer: Input buffer number of surfaces (-198919072) must be equal to mux->num_surfaces_per_frame (1)
Set nvstreammux property num-surfaces-per-frame appropriately

Question: how do I set num_surfaces_per_frame at the nvvideoconvert src to be 1?

I checked gst-inspect-1.0 nvvideoconvert and there is no num-surfaces-per-frame property. The closest two are output-buffers and nvbuf-memory-type.
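As the error message itself hints, num-surfaces-per-frame is a property of nvstreammux, not of nvvideoconvert. The garbage surface count usually indicates that nvstreammux received a plain system-memory buffer where it expected an NVMM NvBufSurface. One common remedy is a capsfilter with the memory:NVMM feature between nvvideoconvert and streammux, forcing the conversion into NVMM memory; a hedged sketch under that assumption (the element name "nvmm-caps" is made up for illustration):

```
GstElement *capsfilter;
GstCaps *nvmm_caps;

/* Force nvvideoconvert to output NVMM (NvBufSurface) buffers. */
capsfilter = gst_element_factory_make ("capsfilter", "nvmm-caps");
nvmm_caps = gst_caps_from_string ("video/x-raw(memory:NVMM), format=RGBA");
g_object_set (G_OBJECT (capsfilter), "caps", nvmm_caps, NULL);
gst_caps_unref (nvmm_caps);

/* Link order: appsrc ! nvvideoconvert ! capsfilter ! streammux (sink_0) */
```

Without the capsfilter, nvvideoconvert is free to negotiate plain system memory on both sides, in which case nvstreammux misreads the buffer as an NvBufSurface.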

Your help will be appreciated. Thanks.

Found the solution!
