How to change capture format (from YV12 to YUYV) for CSI video source

Hi,

I am trying to capture data from a YUV image sensor connected to the Jetson TX2. I am writing my own driver.

I am able to capture an image using:

nvidia@tegra-ubuntu:~$ v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat=UYVY --set-ctrl bypass_mode=0 --stream-mmap --stream-count=3 --stream-to=imx334.raw -d /dev/video0

I can look at the image output in an image viewer and it looks good, with the correct resolution.

However, I want to stream this data. But when I run the following command, I get the error message below.

nvidia@tegra-ubuntu:~$ gst-launch-1.0 -v v4l2src ! "video/x-raw" ! videoconvert ! autovideosink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Device '/dev/video0' cannot capture in the specified format
Additional debug info:
gstv4l2object.c(3482): gst_v4l2_object_set_format_full (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Tried to capture in YU12, but device returned format YUYV
Execution ended after 0:00:00.001255315
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …
How can I get v4l2src to capture with the expected data format of YUYV?
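
I assume I need to force the caps on v4l2src, but I have not found a caps string that the device will accept. Something along these lines is what I am attempting (YUY2 being GStreamer's name for the V4L2 YUYV format; the exact caps string is my guess):

nvidia@tegra-ubuntu:~$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! "video/x-raw, format=YUY2, width=1920, height=1080" ! videoconvert ! autovideosink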

gst-device-monitor-1.0 returns this portion for the camera module input:

nvidia@tegra-ubuntu:~$ gst-device-monitor-1.0

Device found:

name  : vi-output, imx324 2-001a
class : Video/Source
caps  : video/x-raw, format=(string)I420, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 16, 16384 ], height=(int)[ 32, 32768 ], interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1;
        video/x-raw, format=(string)YV12, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 16, 16384 ], height=(int)[ 32, 32768 ], interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1;
        video/x-raw, format=(string)BGR, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 16, 16384 ], height=(int)[ 32, 32768 ], interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB;
        video/x-raw, format=(string)RGB, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 16, 16384 ], height=(int)[ 32, 32768 ], interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB;
properties:
	udev-probed = true
	device.bus_path = platform-15700000.vi
	sysfs.path = /sys/devices/13e10000.host1x/15700000.vi/video4linux/video0
	device.subsystem = video4linux
	device.product.name = "vi-output\,\ imx324\ 2-001a"
	device.capabilities = :capture:
	device.api = v4l2
	device.path = /dev/video0
	v4l2.device.driver = tegra-video
	v4l2.device.card = "vi-output\,\ imx324\ 2-001a"
	v4l2.device.bus_info = platform:15700000.vi:0
	v4l2.device.version = 263206 (0x00040426)
	v4l2.device.capabilities = 2233466881 (0x85200001)
	v4l2.device.device_caps = 85983233 (0x05200001)

Where is gst-device-monitor-1.0 finding these device capabilities? I can't seem to find where they are defined in the rootfs...
I find it especially confusing that it expects YV12 when I am stating in the device tree and driver that the media bus format is YUV 2x8.
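
Is gst-device-monitor-1.0 simply reporting whatever formats the V4L2 driver enumerates? If so, I assume the equivalent query (just my guess at it) would be:

nvidia@tegra-ubuntu:~$ v4l2-ctl --list-formats-ext -d /dev/video0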

Thanks for any help you can offer.
Regards,
Daniel

hello dmcginley3,

you might check the device tree settings for these sensor capabilities.
for example, you could check the kernel sources under the path below.
thanks

$l4t-r32.2/kernel_src/hardware/nvidia/platform/t18x/common/kernel-dts/
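
for a YUV sensor, the mode node should declare the YUV output format. the fragment below is only an illustration of the relevant properties (names per the Sensor Software Driver Programming Guide; the values are placeholders you would set for your own sensor mode):

	mode0 {
		mode_type = "yuv";
		pixel_phase = "yuyv";		/* or "uyvy", matching what the sensor actually outputs */
		csi_pixel_bit_depth = "16";
		active_w = "1920";
		active_h = "1080";
	};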

Hi Jerry,

Thanks for the suggestion. If I understand correctly, you want me to check the device tree files? I believe the device tree is configured correctly, because I can capture images properly. The problem only arises when I try to stream the video data. Am I wrong in assuming that?

I am focusing on how to display the UYVY data with GStreamer, but I am having trouble setting up the command to do so.

Any other suggestions would be helpful.

Regards,
Daniel

hello dmcginley3,

you should ensure the device tree property settings reflect your sensor output format.
then, you might refer to the Applications Using GStreamer with V4L2 Source Plugin section for the commands to launch the camera sensor.
add the "nvvidconv" plug-in to convert the source to your expected output format.
thanks
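
for example, a pipeline along these lines should preview the sensor (only a sketch: use format=YUY2 if the mode really outputs YUYV, or UYVY to match your earlier v4l2-ctl capture, and adjust the resolution to your sensor mode):

gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw, format=YUY2, width=1920, height=1080" ! nvvidconv ! "video/x-raw(memory:NVMM), format=I420" ! nvoverlaysink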