Hi everyone,
@Honey_Patouceul, I’m wondering how a Bayer pattern would give me a completely (100%) black or green image.
I’ve selected incorrect image formats in the past and they resulted in weird images, but I could still tell the sensor was working.
I don’t want to include GStreamer since that would increase the rootfs image size by a fair amount.
I would consider Argus if it did not depend on OpenGL (which seems to require mesa3d + X11).
I can’t rely on Argus for any of this work since it will not be included on the final system.
@ShaneCCC, just picking a YUV sensor is not an acceptable solution, since I will need to set up more complex devices in the future (like the DS90UB954-Q1 deserializer) and will be writing drivers for them.
I’ve done a fair amount of work on other platforms, and reading frames should be as simple as a v4l2-ctl call.
I have already tried the commands from these 3 posts (with the parameters corresponding to my sensor) with no success:
- How to specify RAW format in device tree - #13 by euskadi
- v4l2src with gstreamer not working - #15 by euskadi
- support for grayscale sensors is missing
I can also confirm that the GStreamer nvarguscamerasrc pipeline works (so it is not a sensor connection issue), but the v4l2-ctl commands do nothing (on JetPack 4.1).
I’m currently reading the “Sensor Software Driver Programming Guide” to see if I can find something there, but it doesn’t completely match what I see in the sources for L4T version 32.3.1.
I would appreciate it if someone could suggest either C code or a series of commands (not involving GStreamer or Argus) that produces something that looks like a valid raw image.
I would also appreciate comments on parameters that could be misconfigured and how to check/fix them (e.g. the device tree source files).
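To make the request concrete, this is roughly the level of code I am hoping will work: the standard V4L2 mmap capture sequence. The device node, resolution and pixel format below are placeholders for my actual setup, and most error handling is trimmed for brevity:

```c
/* Minimal V4L2 mmap capture sketch: set format, map one buffer,
 * grab a single frame and dump it to disk as raw Bayer data. */
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

static int xioctl(int fd, unsigned long req, void *arg)
{
    int r;
    do { r = ioctl(fd, req, arg); } while (r == -1 && errno == EINTR);
    return r;
}

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);       /* placeholder device node */
    if (fd < 0) { perror("open"); return 1; }

    /* Capture format: width/height/fourcc are placeholders for the sensor. */
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 1920;
    fmt.fmt.pix.height = 1080;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB10;  /* 10-bit Bayer, adjust */
    fmt.fmt.pix.field = V4L2_FIELD_NONE;
    if (xioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

    /* Request and map a single buffer. */
    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count = 1;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (xioctl(fd, VIDIOC_REQBUFS, &req) < 0) { perror("VIDIOC_REQBUFS"); return 1; }

    struct v4l2_buffer buf;
    memset(&buf, 0, sizeof(buf));
    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    buf.index = 0;
    if (xioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) { perror("VIDIOC_QUERYBUF"); return 1; }

    void *mem = mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED,
                     fd, buf.m.offset);
    if (mem == MAP_FAILED) { perror("mmap"); return 1; }

    /* Queue the buffer, start streaming, wait for one frame. */
    if (xioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); return 1; }
    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (xioctl(fd, VIDIOC_STREAMON, &type) < 0) { perror("VIDIOC_STREAMON"); return 1; }
    if (xioctl(fd, VIDIOC_DQBUF, &buf) < 0) { perror("VIDIOC_DQBUF"); return 1; }

    /* Dump the raw frame so it can be inspected offline. */
    FILE *out = fopen("frame.raw", "wb");
    if (!out) { perror("fopen"); return 1; }
    fwrite(mem, 1, buf.bytesused, out);
    fclose(out);
    printf("wrote %u bytes to frame.raw\n", buf.bytesused);

    xioctl(fd, VIDIOC_STREAMOFF, &type);
    munmap(mem, buf.length);
    close(fd);
    return 0;
}
```

Something of that shape (or the equivalent v4l2-ctl invocation) dumping a recognizable Bayer frame to disk would be enough for me to continue.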
Best Regards,
Juan Pablo