Raw image with RaspberryPi HQ IMX477 and Jetson Nano

I would like to capture the raw image from a Raspberry Pi HQ IMX477 camera. I am able to capture images through a GStreamer pipeline, so I know the camera is working, but I don’t believe these are the true raw images.

I have the same setup as the OP here (JetPack 4.6.1, updated to L4T 32.7.3): Proper way of capturing RAW 10 or 12-bit images from IMX477 on Jetson Nano with JetPack 4.6

I believe I am trying to achieve the same thing.

However, when looking at the solution, I do not have the files that the patch modifies:
— a/drivers/media/i2c/imx477_mode_tbls.h
+++ b/drivers/media/i2c/imx477_mode_tbls.h

I looked around for these files but could not find them. I am also wondering if there is a more up-to-date solution, since that one is from January 2022.



Can you please share the GStreamer pipeline that you are using to capture the images? In general, if you use GStreamer to capture RAW images you need to use v4l2src instead of nvarguscamerasrc (which outputs YUV images instead).

Have you tried capturing with v4l2-ctl directly?

v4l2-ctl --set-fmt-video=width=3840,height=2160,pixelformat=RG10 --stream-mmap --stream-count=1 -d /dev/video0 --stream-to=IMX477.raw

Please note that if you attempt to use GStreamer you will most likely need to patch the v4l2src element to be able to capture 10- or 12-bit formats. You can find some instructions on how to set this up here: Compile GStreamer on Jetson TX1 and TX2 | RidgeRun - RidgeRun Developer Connection

Jafet Chaves,
Embedded SW Engineer at RidgeRun
Contact us: support@ridgerun.com
Developers wiki: https://developer.ridgerun.com/
Website: www.ridgerun.com

I used this command:

v4l2-ctl --set-fmt-video=width=4032,height=3040,pixelformat=RG10 --stream-mmap --stream-count=1 -d /dev/video0 --stream-to=test.raw

That did produce a .raw file, but I have been unable to read it or extract the individual RGB channels to verify it has the data I need. Is there a guide for this particular image format coming from this camera?

I am able to read the file in MATLAB with fread, but the number of cells in the array does not match the total number of pixels (and I am also not sure how to extract the three channels from each cell value). I imagine there might be a data buffer or padding as well, but I am not entirely sure.

I see. The VI subsystem applies a specific pixel data format that is typical of RAW buffers captured on NVIDIA SOMs. You probably need to take this pixel data format into account in your MATLAB script. There are some related topics about this in the forum already.
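In case it helps with the MATLAB issue, here is a rough sketch in Python of how such a buffer might be unpacked. The details are assumptions, not a definitive description of the Jetson format: it assumes each 10-bit sample occupies the low bits of a little-endian 16-bit word and that each row may be padded to a 64-byte alignment boundary, so adjust the constants to whatever your platform actually uses:

```python
import array

def unpack_rg10(raw, width, height, align=64):
    """Unpack a RAW10-in-16-bit buffer into a height x width list of pixel
    values. The 64-byte row alignment is an assumption; change it if your
    platform pads rows differently."""
    stride = ((width * 2 + align - 1) // align) * align  # padded bytes per row
    words = array.array("H")                 # native-endian 16-bit words
    words.frombytes(raw[: stride * height])
    frame = []
    for y in range(height):
        start = y * (stride // 2)
        # Drop the row padding and keep only the 10 significant bits.
        frame.append([w & 0x3FF for w in words[start : start + width]])
    return frame

def split_bayer(frame):
    """Split a Bayer mosaic into its four colour planes.
    An RGGB layout is assumed here for the RG10 format."""
    r  = [row[0::2] for row in frame[0::2]]
    g1 = [row[1::2] for row in frame[0::2]]
    g2 = [row[0::2] for row in frame[1::2]]
    b  = [row[1::2] for row in frame[1::2]]
    return r, g1, g2, b
```

With the capture command above you would read test.raw, call unpack_rg10(data, 4032, 3040), then split_bayer to get the four colour planes. Note that a Bayer sensor stores one value per pixel (R, G, or B), not three, which is why a raw file is much smaller than width x height x 3 would suggest.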

If you need to inspect the image visually, you could try using vooya.

Also, I have never used it personally, but I know there is a tool (nvraw) provided by the NVIDIA BSP to strip the additional data from the image buffer and get a true RAW buffer. Some documentation here.

Thank you for the suggestions. I did try the nvraw tool, and it may be what I want, since it additionally allows me to set the exposure time and gain manually. However, it does not show the desired resolution as an available mode. I should be able to run 4032x3040 on the IMX477, but the largest mode I am seeing is 3840x2160.

Any idea how to set the capture resolution manually, or why I am not seeing the full available sensor format?

It could be that your device driver lacks the support for the full array (4056x3040) mode. You can check the supported modes in your driver with the following command:

v4l2-ctl -d /dev/video0 --list-formats-ext

So I guess I didn’t see as many available modes as I thought. When I run

v4l2-ctl -d /dev/video0 --list-formats-ext

you’re right: I only see two modes, and the highest available resolution is 3840x2160.

However, when I run

gst-launch-1.0 -e nvarguscamerasrc num-buffers=1 sensor-id=0 ! "video/x-raw(memory:NVMM),width=4032,height=3040,framerate=30/1" ! nvjpegenc ! multifilesink location=test.jpeg

the result is a 4032x3040 image when I check the file properties. Is this just artificially upscaling the resolution rather than producing a true 4032x3040 image? Or is there a difference between the utilities?

If it is a driver issue, is there an easy driver replacement to get the full resolution? Is 3840x2160 the maximum supported natively by the nano?

I am not entirely sure why the pipeline that you provided gives a 4032x3040 image as you claim; I do not see any upscaling element in the pipeline description, for example. Also, as far as I know, the maximum resolution available in the L4T driver is 3840x2160. Can you please provide the output of the gst-launch command with the verbose option enabled (add -v at the end of the command) so we can check the caps negotiation?

Can you please provide more details about how you checked the size of the frame? One trick that I typically use is to capture a RAW buffer with something like:

v4l2-ctl --set-fmt-video=width=3840,height=2160,pixelformat=RG10 --stream-mmap --stream-count=1 -d /dev/video0 --stream-to=IMX477.raw

Then check the file size, with the ls command for example. What is the output in your case?
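As a rough sketch, the size you should expect can be computed with a few lines of Python. The 2 bytes per pixel for RG10 follows from the 16-bit storage discussed above; the 64-byte row alignment is a hypothetical value, and the real padding on your platform may differ:

```python
def expected_raw_size(width, height, bytes_per_px=2, align=64):
    # Round each row up to the assumed alignment boundary,
    # then multiply by the number of rows.
    stride = ((width * bytes_per_px + align - 1) // align) * align
    return stride * height

print(expected_raw_size(3840, 2160))  # 16588800: a 3840-wide row needs no padding
```

If the size ls reports matches this, the capture likely has the resolution you requested; a noticeably larger file would suggest extra per-row padding or metadata.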
