Grab video using nvarguscamerasrc

Hey,
I have a Xavier AGX with two cameras:

ll /dev/video*
crw-rw----+ 1 root video 81, 0 Feb 15 19:03 /dev/video0
crw-rw----+ 1 root video 81, 2 Feb 15 19:03 /dev/video1

I can successfully grab video using v4l2src:

gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! queue ! ximagesink sync=false -vvv

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)4112, height=(int)3008, framerate=(fraction)13/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)4112, height=(int)3008, framerate=(fraction)13/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)4112, height=(int)3008, framerate=(fraction)13/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)4112, height=(int)3008, framerate=(fraction)13/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)4112, height=(int)3008, framerate=(fraction)13/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstXImageSink:ximagesink0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)4112, height=(int)3008, framerate=(fraction)13/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, interlace-mode=(string)progressive

The framerate is 13 fps at 4112x3008.
I would like to increase the framerate (the specification describes 26 fps).
When I try nvarguscamerasrc instead, I receive a black frame at 30 fps (1920x1080):

gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! nvvidconv ! xvimagesink -vvv

nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1

I am not familiar with GStreamer; what am I doing wrong?
What is the best way to save 4K video at the maximal frame rate?
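For context, this is roughly the recording pipeline I had in mind (an untested sketch; I am assuming the Jetson hardware encoder element nvv4l2h265enc is available on my JetPack version, and out.mkv is just a placeholder path):

```shell
# Sketch: capture at the sensor's full resolution, hardware-encode to H.265,
# and mux into an MKV file. The caps values are taken from the v4l2src log above.
gst-launch-1.0 -e v4l2src device=/dev/video1 \
  ! video/x-raw,format=BGRx,width=4112,height=3008 \
  ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' \
  ! nvv4l2h265enc ! h265parse ! matroskamux \
  ! filesink location=out.mkv
```

I picked H.265 rather than H.264 because I understand the H.264 encoder has a lower maximum frame width, but I have not verified this pipeline on the device.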

What’s the sensor type?

v4l2-ctl --list-formats-ext

nvidia@nvidia:~$ v4l2-ctl -d /dev/video1 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'TP31'
	Name        : 0x31 MIPI DATATYPE
		Size: Discrete 4112x3008
			Interval: Discrete 0.073s (13.722 fps)

	Index       : 1
	Type        : Video Capture
	Pixel Format: 'RGGB'
	Name        : 8-bit Bayer RGRG/GBGB
		Size: Discrete 4112x3008
			Interval: Discrete 0.073s (13.722 fps)

	Index       : 2
	Type        : Video Capture
	Pixel Format: 'RG16'
	Name        : 16-bit Bayer RGRG/GBGB (Exp.)
		Size: Discrete 4112x3008
			Interval: Discrete 0.073s (13.722 fps)

	Index       : 3
	Type        : Video Capture
	Pixel Format: 'RG16'
	Name        : 16-bit Bayer RGRG/GBGB (Exp.)
		Size: Discrete 4112x3008
			Interval: Discrete 0.073s (13.722 fps)

	Index       : 4
	Type        : Video Capture
	Pixel Format: 'BX24'
	Name        : 32-bit XRGB 8-8-8-8
		Size: Discrete 4112x3008
			Interval: Discrete 0.073s (13.722 fps)

	Index       : 5
	Type        : Video Capture
	Pixel Format: 'XR24'
	Name        : 32-bit BGRX 8-8-8-8
		Size: Discrete 4112x3008
			Interval: Discrete 0.073s (13.722 fps)

	Index       : 6
	Type        : Video Capture
	Pixel Format: 'VYUY'
	Name        : VYUY 4:2:2
		Size: Discrete 4112x3008
			Interval: Discrete 0.073s (13.722 fps)

############################################
I have tried a minimal pipeline to check whether frames arrive from the camera, and no frames arrive:

nvidia@nvidia:~$ GST_DEBUG="GST_TRACER:10" GST_TRACERS="latency(flags=pipeline+element+reported)" gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! fakesink
0:00:00.049104355  9194   0x55b1b88240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55b1957cc0 (latency)
0:00:00.049241093  9194   0x55b1b88240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55b1957c00 (log)
0:00:00.049300550  9194   0x55b1b88240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55b1957b40 (rusage)
0:00:00.049339558  9194   0x55b1b88240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55b1957a80 (stats)
0:00:00.049378311  9194   0x55b1b88240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55b19579c0 (leaks)
0:00:00.049559562  9194   0x55b1b88240 TRACE             GST_TRACER gsttracerrecord.c:111:gst_tracer_record_build_format: latency.class, src=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", sink=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", time=(structure)"value\,\ type\=\(type\)guint64\,\ description\=\(string\)\"time\\\ it\\\ took\\\ for\\\ the\\\ buffer\\\ to\\\ go\\\ from\\\ src\\\ to\\\ sink\\\ ns\"\,\ min\=\(guint64\)0\,\ max\=\(guint64\)18446744073709551615\;", ts=(structure)"value\,\ type\=\(type\)guint64\,\ description\=\(string\)\"ts\\\ when\\\ the\\\ latency\\\ has\\\ been\\\ logged\"\,\ min\=\(guint64\)0\,\ max\=\(guint64\)18446744073709551615\;";
0:00:00.049647595  9194   0x55b1b88240 DEBUG             GST_TRACER gsttracerrecord.c:125:gst_tracer_record_build_format: new format string: latency, src=(string)%s, sink=(string)%s, time=(guint64)%lu, ts=(guint64)%lu;
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:03.969152062
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
^C
nvidia@nvidia:~$ GST_DEBUG="GST_TRACER:10" GST_TRACERS="latency(flags=pipeline+element+reported)" gst-launch-1.0 v4l2src device=/dev/video1 ! fakesink
0:00:00.047472133  9277   0x55d1a1b240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55d17eacc0 (latency)
0:00:00.047597415  9277   0x55d1a1b240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55d17eac00 (log)
0:00:00.047640679  9277   0x55d1a1b240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55d17eab40 (rusage)
0:00:00.047675336  9277   0x55d1a1b240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55d17eaa80 (stats)
0:00:00.047709224  9277   0x55d1a1b240 DEBUG             GST_TRACER gsttracer.c:164:gst_tracer_register:<registry0> update existing feature 0x55d17ea9c0 (leaks)
0:00:00.047881898  9277   0x55d1a1b240 TRACE             GST_TRACER gsttracerrecord.c:111:gst_tracer_record_build_format: latency.class, src=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", sink=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", time=(structure)"value\,\ type\=\(type\)guint64\,\ description\=\(string\)\"time\\\ it\\\ took\\\ for\\\ the\\\ buffer\\\ to\\\ go\\\ from\\\ src\\\ to\\\ sink\\\ ns\"\,\ min\=\(guint64\)0\,\ max\=\(guint64\)18446744073709551615\;", ts=(structure)"value\,\ type\=\(type\)guint64\,\ description\=\(string\)\"ts\\\ when\\\ the\\\ latency\\\ has\\\ been\\\ logged\"\,\ min\=\(guint64\)0\,\ max\=\(guint64\)18446744073709551615\;";
0:00:00.047923307  9277   0x55d1a1b240 DEBUG             GST_TRACER gsttracerrecord.c:125:gst_tracer_record_build_format: new format string: latency, src=(string)%s, sink=(string)%s, time=(guint64)%lu, ts=(guint64)%lu;
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.239868059  9277   0x55d1a379e0 TRACE             GST_TRACER :0:: latency, src=(string)v4l2src0_src, sink=(string)fakesink0_sink, time=(guint64)0, ts=(guint64)239807450;
0:00:00.316766664  9277   0x55d1a379e0 TRACE             GST_TRACER :0:: latency, src=(string)v4l2src0_src, sink=(string)fakesink0_sink, time=(guint64)0, ts=(guint64)316696455;
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:00.274203516
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Your driver reports too many pixel formats.

What do you mean?
It shows 7 types of pixel formats.

Why does the driver report 7 pixel formats? nvarguscamerasrc only supports 10/12/14-bit Bayer formats, and none of those appear in your list.
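A quick way to check is to grep the format list for the 10/12-bit Bayer fourcc codes (a sketch; the exact codes a driver would expose, e.g. RG10 or RG12, depend on the sensor's Bayer order):

```shell
# Argus needs a raw 10/12/14-bit Bayer mode; look for the corresponding
# V4L2 fourcc codes in the driver's format list (RG10/RG12/BA10/BG10 as examples).
v4l2-ctl -d /dev/video1 --list-formats-ext | grep -E 'RG10|RG12|BA10|BG10'
```

No output here means the driver exposes no mode that nvarguscamerasrc can consume.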

So there is no way to achieve the frame rate described in the specification (26 fps)?
https://www.alliedvision.com/en/products/cameras/detail/Alvium%201800%20C/-2050/action/pdf.html

nvidia@nvidia:~$ v4l2-ctl --all
Driver Info (not using libv4l2):
	Driver name   : avt_tegra_csi2
	Card type     : ALVIUM 1800 C-2050c 2-3c
	Bus info      : platform:15c10000.vi:0
	Driver version: 4.9.140
	Capabilities  : 0x85200001
		Video Capture
		Read/Write
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps   : 0x05200001
		Video Capture
		Read/Write
		Streaming
		Extended Pix Format
Priority: 2
Video input : 0 (Camera 0: ok)
Format Video Capture:
	Width/Height      : 5376/3672
	Pixel Format      : 'BX24'
	Field             : None
	Bytes per Line    : 21504
	Size Image        : 78962688
	Colorspace        : sRGB
	Transfer Function : Default (maps to sRGB)
	YCbCr/HSV Encoding: Default (maps to ITU-R 601)
	Quantization      : Default (maps to Full Range)
	Flags             : 
Crop Capability Video Capture:
	Bounds      : Left 0, Top 0, Width 5496, Height 3672
	Default     : Left 0, Top 0, Width 5496, Height 3672
	Pixel Aspect: 1/1
Crop: Left 0, Top 0, Width 5376, Height 3672
Selection: crop, Left 0, Top 0, Width 5376, Height 3672
Selection: crop_default, Left 0, Top 0, Width 5496, Height 3672
Selection: crop_bounds, Left 0, Top 0, Width 5496, Height 3672
Selection: native_size, Left 0, Top 0, Width 5496, Height 3672
Streaming Parameters Video Capture:
	Capabilities     : timeperframe
	Frames per second: 2.113 (2113/1000)
	Read buffers     : 0

Is there another camera source that I can use?

Your device does not look like a raw sensor; a normal sensor usually supports only one or two similar pixel formats.
If you need Argus, you need to check whether this sensor can output 10/12-bit Bayer, then modify the driver for it.
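If driver changes are not an option, one alternative (not Argus; a software-debayer sketch using GStreamer's stock bayer2rgb element, which handles 8-bit Bayer) is to capture the RGGB mode your driver already exposes:

```shell
# Sketch: grab the 8-bit Bayer mode directly and debayer on the CPU.
# The format string matches the 'RGGB' mode from --list-formats-ext.
gst-launch-1.0 v4l2src device=/dev/video1 \
  ! video/x-bayer,format=rggb,width=4112,height=3008 \
  ! bayer2rgb ! videoconvert ! ximagesink
```

Since 8-bit Bayer moves a quarter of the bytes of BGRx, this might help if the bottleneck is memory or CSI bandwidth rather than the sensor mode itself; I cannot say whether it will on your hardware.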

I would like to verify that v4l2src is not the element limiting the frame rate to 13 fps, so I want to check whether another video source can grab faster.
I found nvarguscamerasrc, which is hardware-accelerated by NVIDIA, so I wanted to check that grabber's performance.
Are there other grabbers you would suggest I check?
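One test I plan to try, to take GStreamer out of the loop entirely, is v4l2-ctl's own streaming mode (a sketch; it measures the achieved frame rate directly at the kernel interface):

```shell
# Stream 100 frames with memory-mapped buffers; v4l2-ctl prints the
# measured fps, bypassing GStreamer completely.
v4l2-ctl -d /dev/video1 \
  --set-fmt-video=width=4112,height=3008,pixelformat=RGGB \
  --stream-mmap --stream-count=100
```

If this also tops out near 13 fps, the limit would be in the driver or sensor mode, not in v4l2src.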