Need help with gst pipeline

Please provide complete information as applicable to your setup.

• Hardware Platform (RTX 2080 Ti)
• Operating System (Ubuntu 20.04)
• NVIDIA GPU Driver Version (455.38)
• Issue Type (question)

Hi folks!

I am trying to build a GStreamer pipeline that starts with ‘filesrc’ (the program I am building will take either an RTSP stream or an elementary H.264 stream) and ends with ‘appsink’ (I want access to the buffer so I can operate on each frame on the fly). This is what I have come up with so far:

‘gst-launch-1.0 filesrc location=sample.264 ! decodebin ! videoconvert ! appsink wait-on-eos=false drop=true max-buffers=60’

It runs for the duration of the file and then frees the pipeline cleanly, but there is no display. When I implement the same pipeline as a Python script, I get the following error:

Error: gst-stream-error-quark: Internal data stream error. (1): gstbaseparse.c(3634): gst_base_parse_loop (): /GstPipeline:pipeline0/GstDecodeBin:decoder/GstH264Parse:h264parse0:
streaming stopped, reason not-linked (-1)

I am really confused here; is there something wrong with my pipeline or with the Python implementation? Your help would be much appreciated. Here is my Python script:

****************************************
import sys
import gi
gi.require_version('Gst', '1.0')
from gi.repository import GObject, Gst, GLib

def bus_call(bus, message, loop):
    t = message.type
    if t == Gst.MessageType.EOS:
        sys.stdout.write("End-of-stream\n")
        loop.quit()
    elif t==Gst.MessageType.WARNING:
        err, debug = message.parse_warning()
        sys.stderr.write("Warning: %s: %s\n" % (err, debug))
    elif t == Gst.MessageType.ERROR:
        err, debug = message.parse_error()
        sys.stderr.write("Error: %s: %s\n" % (err, debug))
        loop.quit()
    return True

def main(args):
	if len(args) != 2:
		sys.stderr.write("usage: %s <media file or uri>\n" % args[0])
		sys.exit(1)
		
	# ~ GObject.threads_init()
	Gst.init(None)
	
	print("Creating Pipeline \n ")
	pipeline = Gst.Pipeline()
	
	if not pipeline:
		sys.stderr.write(" Unable to create Pipeline \n")
		
	print("Creating Source \n ")
	source = Gst.ElementFactory.make("filesrc", "file-source")
	if not source:
		sys.stderr.write(" Unable to create Source \n")
	
	print("Creating Decoder \n")
	decoder = Gst.ElementFactory.make("decodebin", "decoder")
	if not decoder:	
		sys.stderr.write(" Unable to create Decoder \n")
	
	print("Creating Video Converter \n")
	converter = Gst.ElementFactory.make("videoconvert", "converter")
	if not converter:	
		sys.stderr.write(" Unable to create Video Converter \n")
		
	print("Creating AppSink \n ")
	sink = Gst.ElementFactory.make("appsink", 'NULL')
	if not sink:
		sys.stderr.write(" Unable to create AppSink \n")
	
	print("Playing file %s " %args[1])
	source.set_property('location', args[1])
	
	sink.set_property('wait-on-eos', False)
	sink.set_property('drop', True)
	sink.set_property('max-buffers', 60)
	
	caps = Gst.caps_from_string("video/x-raw, format=(string){BGR, GRAY8}; video/x-bayer,format=(string){rggb,bggr,grbg,gbrg}")
	sink.set_property("caps", caps)
	
	print("Adding elements to Pipeline \n")
	pipeline.add(source)
	pipeline.add(decoder)
	pipeline.add(converter)
	pipeline.add(sink)
	
	print("Linking elements in the Pipeline \n")
	source.link(decoder)
	decoder.link(converter)
	converter.link(sink)
	
	# ~ loop = GObject.MainLoop()
	loop = GLib.MainLoop()
	
	bus = pipeline.get_bus()
	bus.add_signal_watch()
	bus.connect ("message", bus_call, loop)
	
	print("Starting pipeline \n")
	
	pipeline.set_state(Gst.State.PLAYING)
	try:
		loop.run()
	except:
		pass
		
	pipeline.set_state(Gst.State.NULL)
	
if __name__ == '__main__':
	sys.exit(main(sys.argv))
****************************************
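
(For completeness, since my goal is per-frame access: below is a rough sketch of how I plan to pull frames out of the appsink once the pipeline links. It assumes ‘emit-signals’ is set to True on the sink; the callback is illustrative and not part of the script above.)

****************************************
# Illustrative sketch only: pull each decoded frame from the appsink.
def on_new_sample(appsink):
    sample = appsink.emit("pull-sample")
    if sample is None:
        return Gst.FlowReturn.ERROR
    buf = sample.get_buffer()
    ok, mapinfo = buf.map(Gst.MapFlags.READ)
    if ok:
        # mapinfo.data holds the raw frame bytes in the negotiated format
        print("frame: %d bytes, caps: %s" % (mapinfo.size, sample.get_caps().to_string()))
        buf.unmap(mapinfo)
    return Gst.FlowReturn.OK

# in main(), after creating the sink:
# sink.set_property("emit-signals", True)
# sink.connect("new-sample", on_new_sample)
****************************************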

The pipeline you list cannot work.

You may try:

gst-launch-1.0 --gst-debug=v4l2videodec:7 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder num-extra-surfaces=1 ! queue ! nvvideoconvert ! appsink wait-on-eos=false drop=true max-buffers=1 enable-last-sample=false eos=false sync=0 async=0

The Python script failure is caused by the wrong caps setting. On JetPack and DeepStream systems, decodebin selects the hardware decoder, and the hardware decoder does not support the caps you set:
https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvvideo4linux2.html
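
If you want to keep the logic in Python, a rough sketch is below (element names taken from the command above; it assumes the DeepStream plugins are installed and that nvvideoconvert can negotiate system-memory RGBA on your setup, so treat it as a starting point rather than a verified example):

****************************************
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
# Same elements as the gst-launch command above, with an appsink that emits signals.
pipeline = Gst.parse_launch(
    "filesrc location=sample.264 ! h264parse ! nvv4l2decoder ! "
    "nvvideoconvert ! video/x-raw,format=RGBA ! "
    "appsink name=sink wait-on-eos=false drop=true max-buffers=60 emit-signals=true"
)
sink = pipeline.get_by_name("sink")
# connect the sink's "new-sample" signal here to process each frame
pipeline.set_state(Gst.State.PLAYING)
****************************************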

‘nvv4l2decoder’ and ‘nvvideoconvert’ are not installed on my system for some reason (I checked using ‘gst-inspect-1.0’). I installed the GStreamer plugin packages with the commands below, but the NVIDIA elements are still missing.

sudo add-apt-repository universe
sudo add-apt-repository multiverse
sudo apt-get update
sudo apt-get install gstreamer1.0-tools gstreamer1.0-alsa \
  gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
  gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \
  gstreamer1.0-libav
sudo apt-get install libgstreamer1.0-dev \
  libgstreamer-plugins-base1.0-dev \
  libgstreamer-plugins-good1.0-dev \
  libgstreamer-plugins-bad1.0-dev

How do I install the missing plugins?
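
(A quick way to confirm from Python whether those elements are visible to GStreamer — illustrative only:)

****************************************
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
# Report whether the NVIDIA elements are registered with this GStreamer install.
for name in ("nvv4l2decoder", "nvvideoconvert"):
    found = Gst.ElementFactory.find(name) is not None
    print("%s: %s" % (name, "available" if found else "not found"))
****************************************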

I used ‘decodebin’ and ‘autovideoconvert’ in place of the missing plugins and got the following output on the console:

Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Got context from element 'glcolorconvertelement0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
Got EOS from element "pipeline0".
Execution ended after 0:00:00.120029357
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Xlib:  extension "NV-GLX" missing on display ":1".

Is there a way to resolve this?

You need to install the DeepStream SDK as instructed in the document: Quickstart Guide — DeepStream 6.1.1 Release documentation