Cannot link Streammux to Capsfilter and Nvinfer

I am trying to create this pipeline:
appsrc(rgb jpeg) -> nvvidconv -> capsfilter(video/x-raw(memory:NVMM), format=(string)RGBA) -> nvstreammux -> nvinfer -> nvjpegenc -> appsink.

But for some reason I cannot link streammux to capsfilter and nvinfer.

Below is the code:

pipelineData.pipeline = gst_pipeline_new ("rtsp-pipeline");
pipelineData.source = gst_element_factory_make ("appsrc", "source");
pipelineData.convert = gst_element_factory_make ("nvvidconv", "videoconvert");
pipelineData.capsfilter = gst_element_factory_make ("capsfilter", "capsfilter");
pipelineData.nvstreammux = gst_element_factory_make ("nvstreammux", "nvstreammux");
pipelineData.nvinfer = gst_element_factory_make ("nvinfer", "nvinfer");
pipelineData.jpegenc = gst_element_factory_make ("nvjpegenc", "nvjpegenc");
pipelineData.sink = gst_element_factory_make ("appsink", "sink");

if (!pipelineData.source || !pipelineData.convert || !pipelineData.capsfilter ||
    !pipelineData.nvstreammux || !pipelineData.nvinfer || !pipelineData.jpegenc || !pipelineData.sink) {
	std::cout << "Pipeline elements could not be created" << std::endl;
	terminate = TRUE;
}

GstCaps *caps = gst_caps_from_string ("video/x-raw(memory:NVMM), format=(string)RGBA");
g_object_set (G_OBJECT (pipelineData.capsfilter), "caps", caps, NULL);
gst_caps_unref (caps);

gst_bin_add_many (GST_BIN (pipelineData.pipeline), pipelineData.source , pipelineData.convert, pipelineData.capsfilter, pipelineData.nvstreammux, pipelineData.nvinfer, pipelineData.jpegenc, pipelineData.sink, NULL);
if (!gst_element_link_many (pipelineData.source, pipelineData.convert, pipelineData.capsfilter, NULL)) {
	std::cout << "Elements could not be linked." << std::endl;
	gst_object_unref (pipelineData.pipeline);
	terminate = TRUE;
}
if (!gst_element_link_many (pipelineData.nvstreammux, pipelineData.nvinfer, pipelineData.jpegenc, pipelineData.sink, NULL)) {
	std::cout << "Elements could not be linked." << std::endl;
	gst_object_unref (pipelineData.pipeline);
	terminate = TRUE;
}

// nvstreammux sink pads are request pads, so one must be requested explicitly.
GstPad *streammux_pad = gst_element_get_request_pad (pipelineData.nvstreammux, "sink_0");
if (!streammux_pad) {
	std::cout << "Streammux request sink pad failed. Exiting." << std::endl;
	terminate = TRUE;
}

GstPad *caps_pad = gst_element_get_static_pad (pipelineData.capsfilter, "src");
if (!caps_pad) {
	std::cout << "Could not get capsfilter src pad. Exiting." << std::endl;
	terminate = TRUE;
}
if (gst_pad_link (caps_pad, streammux_pad) != GST_PAD_LINK_OK) { // Fails here
	std::cout << "Failed to link caps_pad to stream muxer. Exiting." << std::endl;
	terminate = TRUE;
}
gst_object_unref (caps_pad);
gst_object_unref (streammux_pad);

Any help will be appreciated.

I also tried
g_signal_connect (pipelineData.capsfilter, "pad-added", G_CALLBACK (pad_added_handler), &pipelineData);
but it never gets fired.

Could you set GST_DEBUG=3 and share the log with us?

Below are the logs.

Pipeline state changed from NULL to READY:
0:00:00.151996590 18355 0x7f982f3370 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:source:src Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.152177168 18355 0x7f982f3370 FIXME videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out: Sub-class should implement drain()
ApSrc needs data!!!
Pushing Data.
Pushed buffer to appsrc
0:00:15.018156897 18355 0x7f982f3370 WARN videodecoder gstvideodecoder.c:2443:gst_video_decoder_chain: Received buffer without a new-segment. Assuming timestamps start from 0.
0:00:15.018229972 18355 0x7f982f3370 FIXME videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out: Sub-class should implement drain()
0:00:15.025113152 18355 0x7f982f3370 WARN basetransform gstbasetransform.c:1355:gst_base_transform_setcaps: transform could not transform video/x-raw(memory:NVMM), width=(int)720, height=(int)576, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)RGBA in anything we support
0:00:15.025306750 18355 0x7f982f3370 WARN basetransform gstbasetransform.c:1355:gst_base_transform_setcaps: transform could not transform video/x-raw(memory:NVMM), width=(int)720, height=(int)576, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)RGBA in anything we support
0:00:15.025339355 18355 0x7f982f3370 WARN GST_PADS gstpad.c:4226:gst_pad_peer_query:videoconvert:src could not send sticky events
0:00:15.027595866 18355 0x7f982f3370 WARN basetransform gstbasetransform.c:1355:gst_base_transform_setcaps: transform could not transform video/x-raw(memory:NVMM), width=(int)720, height=(int)576, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)RGBA in anything we support
0:00:15.065020638 18355 0x7f982f3370 ERROR nvvideoconvert gstnvvideoconvert.c:2551:gst_nvvideoconvert_transform: Input buffer is not NvBufSurface
0:00:15.065075535 18355 0x7f982f3370 ERROR nvvideoconvert gstnvvideoconvert.c:2930:gst_nvvideoconvert_transform: buffer transform failed
0:00:15.065148453 18355 0x7f982f3370 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop: error: Internal data stream error.
0:00:15.065176579 18355 0x7f982f3370 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop: error: streaming stopped, reason error (-5)
Gst Message Error!!!
Exiting from pipeline!!!
0:00:15.065672945 18355 0x7f982f3370 WARN basetransform gstbasetransform.c:1355:gst_base_transform_setcaps: transform could not transform video/x-raw(memory:NVMM), width=(int)720, height=(int)576, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)RGBA in anything we support
0:00:15.065865918 18355 0x7f982f3370 WARN basetransform gstbasetransform.c:1355:gst_base_transform_setcaps: transform could not transform video/x-raw(memory:NVMM), width=(int)720, height=(int)576, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)RGBA in anything we support

(node:18355): GLib-CRITICAL **: 11:13:36.496: g_source_remove: assertion ‘tag > 0’ failed

Please take note of:
Input buffer is not NvBufSurface

The input is a jpeg image.

Which platform are you using?
Is it possible to replace appsrc with uridecodebin and give it a try?

I am using the Jetson Nano (TX1). With uridecodebin it works, but that would not work for my use case.

I am creating a binding for Node.js, so there will be multiple pipelines, i.e.:

uridecodebin[…n]-> nvconvert -> jpegenc -> appsink

Feed all to nvinfer, etc

next pipeline

appsrc(rgb jpeg) -> nvvidconv -> capsfilter(video/x-raw(memory:NVMM), format=(string)RGBA) -> nvstreammux -> nvinfer -> nvjpegenc -> appsink.

"appsrc(rgb jpeg)"
Did you decode the JPEG image to RGB format in appsrc?

Please use nvvideoconvert in the DeepStream pipeline, not nvvidconv, which does not carry the DeepStream flag (it does not produce NvBufSurface buffers).
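To illustrate the suggestion, here is a hedged gst-launch sketch of feeding a decoded JPEG into nvinfer via nvvideoconvert instead of nvvidconv. The file name, frame dimensions, and nvinfer config path are placeholders, not values from this thread:

```shell
# Sketch only: jpegdec outputs system-memory RGB/I420; nvvideoconvert (the
# DeepStream converter) moves it into NVMM as RGBA so nvstreammux accepts it.
gst-launch-1.0 filesrc location=frame.jpg ! jpegdec ! videoconvert ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM),format=(string)RGBA' ! \
  mux.sink_0 nvstreammux name=mux batch-size=1 width=720 height=576 ! \
  nvinfer config-file-path=config_infer_primary.txt ! fakesink
```

The key point is that the buffer crossing into nvstreammux must already be an NvBufSurface, which only nvvideoconvert (not nvvidconv) guarantees in a DeepStream pipeline.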


The JPEG image is in RGB. Besides, nvvidconv should do the conversion if it is not.

Is it possible for you to provide a working pipeline that uses appsrc and nvinfer, even if it's in gst-launch?

Is there no example of running nvinfer with appsrc? I cannot seem to solve this problem.

I made a pipeline uridecodebin -> nvstreammux -> nvinfer -> appsink, dumped the pipeline graph, and noticed an nvv4l2decoder before the streammux. So I changed my pipeline and it works.

New Pipeline:
appsrc->nvv4l2decoder->streammux->nvinfer->appsink.
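For reference, an equivalent of this working pipeline can be sketched in gst-launch by substituting filesrc + jpegparse for appsrc (gst-launch cannot push application buffers). The file name, dimensions, and nvinfer config path are placeholders, and on some Jetson releases nvv4l2decoder may need mjpeg=1 to accept JPEG input:

```shell
# Sketch only: nvv4l2decoder decodes the JPEG directly into NVMM memory,
# so nvstreammux receives a proper NvBufSurface and linking succeeds.
gst-launch-1.0 filesrc location=frame.jpg ! jpegparse ! nvv4l2decoder mjpeg=1 ! \
  m.sink_0 nvstreammux name=m batch-size=1 width=720 height=576 ! \
  nvinfer config-file-path=config_infer_primary.txt ! fakesink
```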