How to use v4l2src

My CMOS sensor is the SC130GS and its output is RAW10 Y (luminance-only) data, 1280x1024 at 60 fps.
The sensor works well, and I can see the image with the command below:
nvidia@tegra-ubuntu:~$ gst-launch-1.0 nvcamerasrc fpsRange="6.0 60.0" sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)1024, format=(string)I420, framerate=(fraction)60/1' ! nvegltransform ! nveglglessink -ev
But when I run the following command, this problem happens:
nvidia@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device="/dev/video0" ! video/x-raw,width=1280,height=1024,format=GRAY16_BE,framerate=60/1 ! xvimagesink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.000147897
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

So I want to know: how can I solve this problem? (The GStreamer version is 1.14.)

nvcamerasrc uses a special path through the ISP that is able to convert into several formats such as I420.

v4l2src only uses the V4L2 interface. Check the formats/modes that your driver provides:

v4l2-ctl -d /dev/video0 --list-formats-ext

(v4l2-ctl is provided by the v4l-utils package.)
You would first try one of the listed formats/modes.
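For example, assuming the driver lists a 1280x1024 BG10 mode (adjust width, height and pixelformat to whatever --list-formats-ext actually reports for your sensor), you could first check raw capture with v4l2-ctl alone, before involving GStreamer:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=1280,height=1024,pixelformat=BG10 --stream-mmap --stream-count=60 --stream-to=test.raw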

Hi @huangsaishuai

Your pipeline fails due to a caps negotiation error. Please check the formats supported by your sensor driver and the capabilities of the elements involved in the pipeline.

Please run the following pipeline with the verbose option (-v) to check the negotiated caps:

gst-launch-1.0 -v v4l2src device="/dev/video0" ! video/x-raw,width=1280,height=1024,format=GRAY16_BE,framerate=60/1 ! xvimagesink

I suspect that xvimagesink does not handle the GRAY16_BE format, so you will need to add a conversion to a format that xvimagesink does support. Try the following pipeline:

gst-launch-1.0 -v v4l2src device="/dev/video0" ! video/x-raw,width=1280,height=1024,format=GRAY16_BE,framerate=60/1 ! videoconvert ! xvimagesink

Keep in mind that if your sensor provides a Bayer RAW format, you must use nvcamerasrc, nvarguscamerasrc or libargus to capture from the sensor and go through the ISP unit, which performs the debayering. The v4l2src element bypasses the ISP unit, so you will grab the frames exactly as the camera sensor provides them.
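For reference, here is a sketch of such a capture through the ISP (assuming your JetPack release ships nvarguscamerasrc and the sensor mode is registered with Argus; element availability and exact caps may differ on your setup):

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=1280, height=1024, format=NV12, framerate=60/1' ! nvegltransform ! nveglglessink -e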

@Honey_Patouceul
I have run the command you suggested; the result is as follows:

nvidia@tegra-ubuntu:~$ v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'BG10'
Name : 10-bit Bayer BGBG/GRGR
Size: Discrete 1280x1024
Interval: Discrete 0.017s (60.000 fps)

nvidia@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device="/dev/video0" ! video/x-raw,width=1280,height=1024,format=BG10,framerate=60/1 ! xvimagesink
WARNING: erroneous pipeline: could not link v4l2src0 to xvimagesink0
The error still exists. What should I do?

Duplicate. Answered here.

My CMOS sensor is the SC130GS (a mono CMOS), and its output is RAW10 Y value data.

In the CMOS driver, the configuration is as follows:
Pixel Format: BG10
Name: 10-bit Bayer BGBG/GRGR

And the V4L2 code configuration is as follows:
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
fmt.fmt.pix.width = IMAGE_WIDTH;   //320;
fmt.fmt.pix.height = IMAGE_HEIGHT; //240;
fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SBGGR10; // 10-bit Bayer BGGR, matching what the driver reports
fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;      // see the reply below: likely should be V4L2_FIELD_NONE (progressive)

My question is: why is the image data I get from memory Bayer, when the CMOS output is in fact Y values only? Does V4L2 do some conversion from Y to Bayer? Or is the fmt.fmt.pix.pixelformat I set not correct? Or is it something else?

I'd say it is because your driver reports it that way. That is probably a way for it to be handled by nvcamerasrc, but indeed the color interpretation would be wrong for a monochrome sensor.

You may tell us more about how this sensor is connected to the TX2 (CSI, USB, …) and what driver, firmware, or device-tree patch you have installed.

V4L2 should probably report something like the Y10 format for your sensor. AFAIU it should also be progressive, not interlaced; check that, too. You may try to add such a format to your driver.
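As a minimal user-space sketch of what the request would then look like (assuming the driver ends up exposing the standard V4L2_PIX_FMT_Y10 format, i.e. 10-bit greyscale samples stored in 16-bit little-endian words; the driver-side change itself is not shown here):

#include <linux/videodev2.h>
#include <string.h>
#include <sys/ioctl.h>

/* Request 10-bit greyscale, progressive capture on an already-open device fd. */
static int set_y10_format(int fd)
{
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 1280;
    fmt.fmt.pix.height = 1024;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_Y10; /* 10-bit grey, one sample per 16-bit word */
    fmt.fmt.pix.field = V4L2_FIELD_NONE;        /* progressive, not interlaced */
    return ioctl(fd, VIDIOC_S_FMT, &fmt);       /* fails unless the driver supports Y10 */
}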

@Honey_Patouceul
I have got the raw data through v4l2src, but a new problem has appeared.
The output data of the sensor is RAW10 Y values, which range from 0 to 1023.
In the driver code, the configured format is BG10. As a result, the values of the data from v4l2src are not in the 0-1023 range; they seem to be multiplied by some factor (larger than 1023).
Do you know what the problem is?

Hi @huangsaishuai.

You have a mismatch between the format configured in the sensor driver and the real format that the sensor provides. You have to modify your driver so that it reports the real format the sensor provides, in this case RAW10 Y.

V4L2 will prepare the memory layout for the captured frames according to the format specified in the driver, so if you have a mismatch here, V4L2 will fill the frames with the incoming data but with the wrong layout.
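In the meantime, one possible explanation for values above 1023 (this is an assumption, verify it against a captured frame) is that each 10-bit sample ends up left-justified in a 16-bit word, which looks like a multiplication by 64. A minimal sketch to recover the 0-1023 range under that assumption:

#include <stdint.h>
#include <stddef.h>

/* Undo an assumed 6-bit left shift of 10-bit samples stored in 16-bit words. */
static void unpack_raw10_from_16(const uint16_t *src, uint16_t *dst, size_t n_pixels)
{
    for (size_t i = 0; i < n_pixels; i++)
        dst[i] = src[i] >> 6; /* back to 0-1023 if the samples really are left-justified */
}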
