Hello everyone,
I’m currently working on a very light (as light as possible) Jetson Nano system built with Buildroot.
To keep things light, I want to read frames from an IMX219 camera using plain V4L2 (no EGL / Mesa / X11 / Wayland).
From what I’ve read in the documentation, this should be supported (as long as I don’t use the ISP).
So far, the only working examples I’ve seen use libargus and some eglDisplay.
Every suggestion I’ve tried for using V4L2 results in either a green or a black frame.
Does anyone have any working example with plain v4l2 and a csi camera?
The problem is probably the video format. The IMX219 is a Bayer sensor, and the driver only provides Bayer formats.
Debayering can be done by the ISP, but that requires the Argus library, e.g. the nvarguscamerasrc plugin with GStreamer.
You can disable auto-exposure and the other automatic adjustments in Argus and set up your camera through V4L2 controls.
What kind of use case can’t be handled by Argus, that you need V4L2 for?
You should choose a YUV sensor instead of a Bayer sensor if you want to use the V4L2 API directly.
You can still control the gain/exposure manually that way.
Hi everyone, @Honey_Patouceul, I’m wondering how a Bayer pattern would give me a completely (100%) black or green image.
I’ve picked out incorrect image formats in the past and they have resulted in weird images, but I could tell the sensor was working.
I don’t want to include gstreamer since that would increase the rootfs image size by a fair amount.
I would consider Argus if it did not depend on OpenGL (which seems to require mesa3d + X11).
I can’t rely on Argus to do any work (on the final system) since it is not included.
@ShaneCCC, just picking a YUV sensor is not an acceptable solution, since I will need to set up more complex things in the future (like the DS90UB954-Q1 deserializer) and will be writing drivers for those parts.
I’ve done a fair amount of work on different platforms and reading frames should be an easy call to v4l2-ctl.
I have already checked the commands from these three posts (with the parameters corresponding to my sensor), with no success:
I can also confirm that the GStreamer nvarguscamerasrc pipeline works (so it is not a sensor connection issue), but the v4l2-ctl commands do nothing (on JetPack 4.1).
I’m currently reading the “Sensor Software Driver Programming Guide” to see if I can find something there, but it doesn’t completely match what I see in the sources for L4T 32.3.1.
I would appreciate if someone could suggest either C code or a series of commands (not involving gstreamer or argus) that gives me something that looks like a working raw image.
I also appreciate comments on parameters that could be wrongly configured and how to check / fix them (eg: device tree source files).
If you don’t want to use Argus, you need to debayer the raw sensor data yourself.
You can check the C code below for the V4L2 API; again, you will need to debayer the data from this capture.
Hello again, @ShaneCCC, thanks for the suggestion, but I’ve already tried a few variations of that code.
Just to mention some:
Code I wrote based on the V4L2 documentation, which I have successfully used in the past on a Raspberry Pi as well as on an i.MX6 processor (it calls libv4lconvert to handle the Bayer conversion)
I’ve tried changing the gain and exposure but I might not be getting them right, so I will check some more.
And a lucky mistake tells me you are probably right about the green frames being YUV-converted black ones.
I’ve also checked Section 31 of the “Tegra X1 Technical Reference Manual” for some ideas.
If someone has working code to enable/disable test patterns and check the VI configuration, that would be a great help.
I think you can use Argus to check the gain/exposure/frame-rate settings, then use v4l2-ctl to set them to the same configuration and capture the raw data with v4l2-ctl to confirm it.
You may need v4l2-ctl --stream-skip=20 … to make sure the settings take effect.
Hi everyone,
So far I have tried a few new things.
First, I enabled the vi tpg module in order to check v4l2 usage and debayering. This worked fine.
Second, I modified the imx219 driver to output the sensor test pattern. This also worked fine.
After that, I decided to do the same (in a few different ways) with the imx219 sensor image. This was not fine.
@ShaneCCC, I tried your suggestions as well as a few other things. It doesn’t really matter what I do, I always end up with a black frame in the end.
A few observations:
I’ve been more successful with the 3264x2464 image size; other image sizes seem to fail because the original image format apparently remains 3264x2464.
The command I used for the captures had the following pattern: v4l2-ctl --set-fmt-video=width=3264,height=2464,pixelformat="RG10" --set-ctrl bypass_mode=0 --device=/dev/video0 --stream-skip=100 --stream-count=1 --stream-mmap --stream-to=file.raw
I’ve attached a capture of a single frame and one of five consecutive frames.
If I don’t follow that exact order, I usually see that some parameters are not set and have to run things a few times.
I also gave your suggestions a second try and realized that GStreamer was setting the override_enable flag.
With this flag set, I was able to confirm that the camera was working correctly (and a little out of focus).
I’ve attached the raw and (really bad) resulting image in case someone else finds it useful.