Getting Video4Linux (v4l2src) to work on Jetson Xavier NX with the IMX219

Hi,

I am trying to get the v4l2loopback kernel module working on the Jetson Xavier NX so that a single camera input can be used by multiple processes.
As you might know, most of the TensorRT examples use GStreamer pipelines for input. This applies to the jetson-inference and tensorrt_demos repositories. (An example: https://github.com/jkjung-avt/tensorrt_demos/blob/master/utils/camera.py)
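
For reference, this is roughly how I create the loopback node; the node number and card label below are arbitrary choices on my side:

sudo modprobe v4l2loopback devices=1 video_nr=2 card_label="camera-loopback" exclusive_caps=1
v4l2-ctl --list-devices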

Using the conventional approach, I am able to view the camera input with:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1' ! nvoverlaysink -e

However, the nvarguscamerasrc plugin does not let me specify which /dev/videoX device to use. Therefore, I want to use v4l2src instead, for example:

gst-launch-1.0 v4l2src device=/dev/video1 ! nvoverlaysink

This causes the following error:

root@jetson-xavier-nx-devkit:~# gst-launch-1.0 v4l2src device=/dev/video1 ! nvoverlaysink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
../gstreamer-1.18.1/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
ERROR: pipeline doesn't want to preroll.
Execution ended after 0:00:00.000517895
Setting pipeline to NULL ...
Freeing pipeline ...

Despite all my experimentation with the v4l2src plugin, I couldn't manage to make it work.

The Jetson Linux Developer Guide (34.1) documentation notes that I should be able to use the following command:

gst-launch-1.0 v4l2src device="/dev/video0" ! \
  "video/x-raw, width=640, height=480, format=(string)YUY2" ! \
  xvimagesink -e

or by extension something like:

XDG_RUNTIME_DIR=/run/user/0 gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw, width=640, height=480, format=(string)YUY2" ! waylandsink -e

However, the same error occurs for this command as well.
Simply put, I want to access the imx219 camera using v4l2src plugin, not nvarguscamerasrc.

I am using a custom distro with Wayland, based on Linux4Tegra R32.4.4 and JetPack 4.4.1. Being able to do this on the Ubuntu distro that comes with the board is sufficient for me; I can integrate it into my custom distro afterwards.

Taking a look at the imx219.c driver, I can see that NVIDIA introduced a couple of changes to the Raspberry Pi IMX219 camera driver for Jetson; I am thinking that might be the reason why I am not able to use the v4l2src plugin.

If there is a workaround I can apply, or a simple GStreamer plugin that lets me specify the /dev/videoX device, please let me know.

Any help is appreciated. Thanks in advance.

V4L2 capture from a Bayer sensor only supports RAW output, and the "YUY2" format in your command is not supported by the IMX219.

I didn't quite get it. So how can I use v4l2src for the IMX219? Can you or your colleagues please suggest formats / video caps that will work with the v4l2src plugin? Thanks

You can check the supported pixel formats with v4l2-ctl --list-formats-ext.
Since you can only get RAW data out, you need a software debayer element to process it (the way the ISP would) before sending it to a sink element for display.
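
For a sensor that exposes 8-bit Bayer, such a software debayer pipeline would look roughly like this (bayer2rgb comes from gst-plugins-bad; the caps here are only an illustration, and this will not negotiate with the IMX219's 10-bit RG10 output):

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-bayer, format=rggb, width=640, height=480, framerate=30/1' ! bayer2rgb ! videoconvert ! xvimagesink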

Okay, let me ask from a different angle.
Here is a very basic pipeline, which does not work:

gst-launch-1.0 videotestsrc ! nvoverlaysink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0: Internal data stream error.
Additional debug info:
../gstreamer-1.18.1/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

Why is that? Can't I show videotestsrc on the display at all? If I should be able to, which command should I use? Thanks

I get the following formats:

root@jetson-xavier-nx-devkit:~# v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Type: Video Capture

	[0]: 'RG10' (10-bit Bayer RGRG/GBGB)
		Size: Discrete 3264x2464
			Interval: Discrete 0.048s (21.000 fps)
		Size: Discrete 3264x1848
			Interval: Discrete 0.036s (28.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.017s (60.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.017s (60.000 fps)

That tells us nvoverlaysink does not support videotestsrc's default output format.
Maybe the pipeline below will work.

gst-launch-1.0 videotestsrc ! video/x-raw, format=RGBA ! nvoverlaysink

I got a similar issue with a USB camera; I got it to work after adding videoconvert to my pipeline, like:

gst-launch-1.0 videotestsrc ! videoconvert ! nvoverlaysink

Using my cam (I didn't specify the device ID):

gst-launch-1.0 v4l2src ! videoconvert ! xvimagesink

xvimagesink expects buffers in standard CPU memory (video/x-raw).
nvoverlaysink expects buffers in NVMM memory (video/x-raw(memory:NVMM), that is, contiguous DMA-able memory), not standard CPU memory.
nvvidconv copies/converts between the two.
So you would try:

gst-launch-1.0 -v videotestsrc ! nvvidconv ! nvoverlaysink

videotestsrc and v4l2src output into standard CPU memory (video/x-raw).
nvarguscamerasrc outputs into NVMM memory.
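
For the opposite direction (NVMM back to CPU memory, for example for xvimagesink), a sketch would be (resolution and framerate are just example values):

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! 'video/x-raw, format=I420' ! xvimagesink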

Getting back to your initial question: you cannot use v4l2src with the IMX219. The reason is that it provides a 10-bit Bayer format. GStreamer doesn't support 10-bit Bayer formats, so you cannot debayer with the bayer2rgb plugin; it only works with 8-bit Bayer formats.

nvarguscamerasrc is the most efficient solution for this camera on Jetson, as it debayers with the ISP and provides frames in NVMM memory, ready for HW encoding, processing, or display with nvoverlaysink.
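
If your end goal is still to share one camera between multiple processes through v4l2loopback, one possible sketch is to let nvarguscamerasrc and the ISP do the debayering, copy out of NVMM with nvvidconv, and publish the frames to the loopback node with v4l2sink (here /dev/video2 is assumed to be your v4l2loopback device; resolution and format are only examples):

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1, format=NV12' ! nvvidconv ! 'video/x-raw, format=I420' ! v4l2sink device=/dev/video2

Each consumer process can then read from /dev/video2 with a plain v4l2src. Depending on your GStreamer version, you may need an extra videoconvert or identity drop-allocation=true in front of v4l2sink.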