Erroneous pipeline: could not link nvarguscamerasrc0 to rawvideoparse0

Hi,

I’m trying to stream a video capture from my camera on a Jetson Nano to a file. I’m using this pipeline:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=640, height=480, format=(string)RGBA, framerate=(fraction)30/1' ! rawvideoparse use-sink-caps=true ! filesink location=capture.raw_rgba

But, it returns this error:
WARNING: erroneous pipeline: could not link nvarguscamerasrc0 to rawvideoparse0, neither element can handle caps video/x-raw(memory:NVMM), width=(int)640, height=(int)480, format=(string)RGBA, framerate=(fraction)30/1

I saw this post and tried adding bufapi-version=1, but that didn’t work. I also tried adding a connection to mx.sink_0 based on this reply to the post, like so:
gst-launch-1.0 nvarguscamerasrc bufapi-version=1 ! 'video/x-raw(memory:NVMM), width=640, height=480, format=(string)RGBA, framerate=(fraction)30/1' ! mx.sink_0 nvstreammux width=640 height=480 batch-size=1 name=mx ! rawvideoparse use-sink-caps=true ! filesink location=capture.raw_rgba

But, that didn’t work either.

Any suggestions on how I should proceed?

The pipeline only works when there’s no rawvideoparse element and the format is NV12, i.e. the following works:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=640, height=480, format=(string)NV12, framerate=(fraction)30/1' ! filesink location=capture.raw_rgba

Can someone explain why that is? I’m new to the GStreamer API. Any links to good documentation/examples/tutorials would be greatly appreciated!

Hi,
You can use the nvvidconv plugin to convert NVMM buffers to CPU buffers and save them to a file. Please try:

gst-launch-1.0 nvarguscamerasrc num-buffers=10 ! 'video/x-raw(memory:NVMM), width=640, height=480, format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! video/x-raw,format=RGBA ! filesink location=capture.raw_rgba
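
Background on the original error: nvarguscamerasrc produces buffers in NVMM (DMA) memory, which CPU-based elements such as rawvideoparse cannot consume directly, so the caps negotiation fails. nvvidconv copies/converts the frames into regular video/x-raw system memory, which filesink (or any other CPU element) can then handle.

Each RGBA frame at 640x480 is 640 x 480 x 4 = 1,228,800 bytes, so with num-buffers=10 the file should contain 10 such frames. To verify the capture, a playback pipeline along these lines should work (a sketch, assuming the frames were written tightly packed with no stride padding):

gst-launch-1.0 filesrc location=capture.raw_rgba ! rawvideoparse width=640 height=480 format=rgba framerate=30/1 ! videoconvert ! autovideosink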