I have a camera streaming images in RAW10 format. Capturing images with v4l2-ctl works, but I can't make it work with gst-launch-1.0 and v4l2src. This is how it looks:
nvidia@nvidia-desktop:~/Pictures$ v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'RG10'
    Name        : 10-bit Bayer RGRG/GBGB
        Size: Discrete 1920x1080
            Interval: Discrete 0.020s (50.000 fps)
This comes from the device tree, where I've set:
mode_type = "bayer";
pixel_phase = "rggb";
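For context, the surrounding mode node in my sensor's device tree looks roughly like this. The property names follow the Jetson sensor driver programming guide; the values other than mode_type and pixel_phase are placeholders for my particular sensor, not necessarily what anyone else should use:

mode0 {
    /* CSI/serializer wiring -- placeholder values */
    tegra_sinterface = "serial_a";
    num_lanes = "2";
    mclk_khz = "24000";

    /* Format description: this is the part in question */
    mode_type = "bayer";
    pixel_phase = "rggb";
    csi_pixel_bit_depth = "10";

    active_w = "1920";
    active_h = "1080";
};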
I've set it up this way because I see no way to tell the device tree that the data is plain RAW. The Sensor Driver Programming Guide lists the possible values for mode_type, and none of them describes non-Bayer raw data.
I guess single-image capture works because v4l2-ctl bypasses the ISP. Should I do something similar with the video streaming pipeline? If so, how? Or do I need to modify the device tree instead?
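For reference, here is roughly what I mean by the two paths. The first command is a guess at the caps v4l2src would need (I'm not certain bayer2rgb or the video/x-bayer caps accept 10-bit data at all on my GStreamer version); the second is the ISP path through nvarguscamerasrc, which as I understand it debayers in hardware:

```shell
# Direct V4L2 path (bypasses the ISP): this is the kind of pipeline that
# fails for me. My understanding is that video/x-bayer caps historically
# only covered 8-bit Bayer formats, which would exclude the sensor's RG10.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! 'video/x-bayer,format=rggb,width=1920,height=1080,framerate=50/1' \
    ! bayer2rgb ! videoconvert ! fakesink

# ISP path: nvarguscamerasrc outputs already-debayered frames in NVMM
# memory, so no Bayer caps are needed downstream of the source.
gst-launch-1.0 nvarguscamerasrc sensor-id=0 \
    ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=50/1' \
    ! nvvidconv ! 'video/x-raw,format=I420' ! fakesink
```

Both commands obviously need the actual camera attached, so they are sketches of intent rather than something I can verify offline.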