Reading frames from Jetson Nano CSI camera with V4L2

Hello everyone,
I’m currently working on a very light (as light as possible) jetson nano system build using buildroot.

In order to keep things light, I want to read frames from an IMX219 camera using v4l2 (no egl / mesa / x11 / wayland).
From what I’ve read in the documentation, this should be supported (as long as I don’t use the ISP).

So far, the only working examples I’ve seen use libargus and some eglDisplay.
Every suggestion on using v4l2 returns either a green or black frame as a result.

Does anyone have any working example with plain v4l2 and a csi camera?

Thanks in advance.

Best Regards,
Juan Pablo

The problem is probably the video format: the IMX219 is a Bayer sensor, and the driver only provides Bayer formats.
Debayering can be done by the ISP, but that requires the Argus library, e.g. the nvarguscamerasrc plugin with GStreamer.
You can disable auto-exposure and the other automatic adjustments in Argus and still set your camera through V4L2 controls.
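For reference, debayering can also be done in software on the CPU. A minimal sketch (assuming 8-bit RGGB samples and simple nearest-neighbor averaging; this is an illustration, not what the ISP does) might look like:

```c
/* Collapse each 2x2 RGGB Bayer cell
 *   R G
 *   G B
 * into one RGB pixel, averaging the two greens.
 * Assumes 8-bit samples and even width/height. */
#include <stdint.h>

/* out must hold (width/2) * (height/2) * 3 bytes */
void debayer_rggb_nn(const uint8_t *bayer, int width, int height, uint8_t *out)
{
    for (int y = 0; y < height; y += 2) {
        for (int x = 0; x < width; x += 2) {
            uint8_t r  = bayer[y * width + x];
            uint8_t g1 = bayer[y * width + x + 1];
            uint8_t g2 = bayer[(y + 1) * width + x];
            uint8_t b  = bayer[(y + 1) * width + x + 1];
            uint8_t *p = out + ((y / 2) * (width / 2) + x / 2) * 3;
            p[0] = r;
            p[1] = (uint8_t)(((int)g1 + g2) / 2); /* average the greens */
            p[2] = b;
        }
    }
}
```

This halves the resolution; a real demosaic interpolates each channel at full resolution, but this is enough to verify that the sensor data is sane.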

What kind of use case can’t be handled by Argus, that you need plain v4l2?
You should choose a YUV sensor instead of a Bayer sensor if you want to use the V4L2 API directly.
Using Argus, you can still control the gain/exposure manually.

Hi everyone,
@Honey_Patouceul, I’m wondering how a bayer pattern would give me a completely (100%) black or green image.
I’ve picked out incorrect image formats in the past and they have resulted in weird images, but I could tell the sensor was working.

I don’t want to include gstreamer since that would increase the rootfs image size by a fair amount.
I would consider Argus if it did not depend on OpenGL (which seems to require mesa3d + X11).
I can’t rely on Argus to do any work (on the final system) since it is not included.

@ShaneCCC, just picking a YUV sensor is not an acceptable solution, since I will need to set up more complex hardware in the future (like the DS90UB954-Q1 deserializer) and will be writing drivers for it.
I’ve done a fair amount of work on different platforms and reading frames should be an easy call to v4l2-ctl.

I have already checked the commands from these 3 posts (with the parameters adjusted for my sensor) with no success:

I can also confirm that the GStreamer nvarguscamerasrc pipeline works (so it is not a sensor connection issue), but the v4l2-ctl commands do nothing (on JetPack 4.1).

I’m currently reading the “Sensor Software Driver Programming Guide” to see if I can find something there but it doesn’t completely match what I see on the sources for L4T version 32.3.1.

I would appreciate it if someone could suggest either C code or a series of commands (not involving GStreamer or Argus) that gives me something that looks like a working raw image.
I would also appreciate comments on parameters that could be wrongly configured and how to check / fix them (e.g. device tree source files).

Best Regards,
Juan Pablo

If you don’t want to use Argus, you need to debayer the raw sensor data yourself.
You can check the C code below for the V4L2 API; again, you will need to debayer the data from this capture.

https://linuxtv.org/downloads/v4l-dvb-apis/uapi/v4l/capture.c.html

If you get black frames, isn’t your gain or exposure set too low?

Green frames are what you get if you treat black frames as YUV frames.

Hello again,
@ShaneCCC, thanks for the suggestion, but I’ve already tried a few variations of that code.

Just to mention some:

@phdm

I’ve tried changing the gain and exposure but I might not be getting them right, so I will check some more.
And a lucky mistake tells me you are probably right about the green frames being YUV-converted black ones.

I’ve also checked Section 31 of the “Tegra X1 Technical Reference Manual” for some ideas.
If someone has working code that enables / disables test patterns and checks the VI configuration, that would be a great help.

Best Regards,
Juan Pablo

I think you can use Argus to check the gain/exposure/frame-rate settings, then use v4l2-ctl to set them to the same configuration and capture the raw data with v4l2-ctl to confirm it.
You may need v4l2-ctl --stream-skip=20 … to make sure the settings take effect.

Hi everyone,
So far I have tried a few new things.

First, I enabled the vi tpg module in order to check v4l2 usage and debayering. This worked fine.
Second, I modified the imx219 driver to output the sensor test pattern. This also worked fine.
After that, I decided to do the same (in a few different ways) with the imx219 sensor image. This was not fine.

@ShaneCCC, I tried your suggestions as well as a few other things. It doesn’t really matter what I do; I always end up with a black frame.

A few observations:

Best Regards,
Juan Pablo

Could you attach the raw image here so I can check?

@ShaneCCC,

The command I used for the captures followed this pattern:
v4l2-ctl --set-fmt-video=width=3264,height=2464,pixelformat="RG10" --set-ctrl bypass_mode=0 --device=/dev/video0 --stream-skip=100 --stream-count=1 --stream-mmap --stream-to=file.raw

I’ve attached a capture of a single frame and five consecutive frames.

Since I couldn’t attach the raws with the upload button, here are the google links for them:
https://drive.google.com/open?id=1ARDoxg25x4ExP5ZpAAhULtYUj0o52AuE
https://drive.google.com/open?id=1X6AtNISu5mG51yoe6VgyE7x-mXc_6yk3
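For anyone inspecting these captures: RG10 stores one 10-bit Bayer sample per 16-bit little-endian word, so the file can’t be viewed directly as 8-bit data. A sketch of scaling it down to 8-bit grayscale for a quick look (assuming the 10 valid bits sit in the low bits of each word; layouts can differ between L4T versions, so adjust the mask/shift if the result looks wrong):

```c
/* Turn an RG10 capture (one 10-bit Bayer sample per little-endian
 * 16-bit word) into 8-bit grayscale for quick inspection.
 * Assumes the 10 valid bits occupy the low bits of each word. */
#include <stdint.h>
#include <stddef.h>

/* raw holds 2 * npixels bytes; gray must hold npixels bytes */
void rg10_to_gray8(const uint8_t *raw, size_t npixels, uint8_t *gray)
{
    for (size_t i = 0; i < npixels; i++) {
        /* assemble the little-endian word, keep the 10 valid bits */
        uint16_t v = (uint16_t)(raw[2 * i] | (raw[2 * i + 1] << 8)) & 0x3FF;
        gray[i] = (uint8_t)(v >> 2); /* 10 bits -> 8 bits */
    }
}
```

Viewing the result as a width x height grayscale image (e.g. with a raw image viewer) is enough to tell a genuinely black frame from a merely dark one.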

I got the raw file with the command below and read it with 7yuv, which gave the result below.

And your raw is as below. It looks a little dark, but otherwise fine.

v4l2-ctl --set-fmt-video=width=3264,height=2464,pixelformat="RG10" --set-ctrl bypass_mode=0 --device=/dev/video0 --stream-skip=10 --stream-count=1 --stream-mmap --stream-to=file.ra

@ShaneCCC, after a large number of tests, I managed to get a decent result with the following command sequence:

v4l2-ctl -d /dev/video0 -c sensor_mode=2 --set-fmt-video=width=1920,height=1080,pixelformat="RG10"
v4l2-ctl -d /dev/video0 -c gain=170 -c override_enable=1 -c bypass_mode=0 -c exposure=33333 -c frame_rate=30000000
v4l2-ctl -d /dev/video0 --stream-skip=100 --stream-count=1 --stream-mmap --stream-to=frame.raw

If I don’t follow that exact order, I usually see that some parameters are not set and have to run things a few times.

I also gave your suggestions a second try and realized that gstreamer was setting the override_enable flag.
With this flag set, I was able to confirm that the camera was working correctly (and a little out of focus).

I’ve attached the raw and (really bad) resulting image in case someone else finds it useful.

frame.log (4.0 MB)

Best Regards,
Juan Pablo.