I would like to take the raw image from a Raspberry Pi HQ (IMX477) camera. I can capture images through a GStreamer pipeline, but I don't believe these are the true raw images; I do know the camera is working.
Can you please share the GStreamer pipeline that you are using to capture the images? In general, to capture RAW images with GStreamer you need to use v4l2src instead of nvarguscamerasrc (which outputs ISP-processed YUV images instead).
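As a sketch, a v4l2src capture could look like the following. The device node, resolution, and pixel formats here are assumptions, so check them against your system; also note that GStreamer's video/x-bayer caps only describe 8-bit Bayer layouts, so for the sensor's 10-bit modes v4l2-ctl is often the simpler route:

```shell
# Hypothetical RAW capture via v4l2src (device node and caps are assumptions):
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! \
    "video/x-bayer,format=rggb,width=3840,height=2160" ! \
    filesink location=frame.raw

# For 10-bit Bayer modes, v4l2-ctl can grab a frame directly from the driver:
v4l2-ctl --device=/dev/video0 \
    --set-fmt-video=width=3840,height=2160,pixelformat=RG10 \
    --stream-mmap --stream-count=1 --stream-to=frame.raw
```

You can list the formats your driver actually exposes with `v4l2-ctl --device=/dev/video0 --list-formats-ext` before picking one.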
That did produce a .raw file, but I have been unable to read it or extract the individual RGB channels to verify it has the data I need. Is there a guide for the particular image format coming from this camera?
I am able to read the file in MATLAB with fread, but the number of cells in the array does not match the total number of pixels (and I am also not sure how to extract the three channels from each cell value). I imagine there might be some padding or extra buffer data as well, but I am not entirely sure.
I see. The VI (Video Input) subsystem applies a specific pixel data layout that is typical of RAW buffers captured on NVIDIA SOMs. You probably need to take this layout into account in your MATLAB script. There are some related topics about this in the forum already, for example:
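To illustrate the idea, here is a minimal sketch of what such a script needs to do, assuming 10-bit Bayer samples stored in 16-bit little-endian words with the line pitch padded up to a 64-byte boundary. Both of those numbers are assumptions, so verify them against the linked topics and your actual sensor mode:

```python
import numpy as np

def unpack_vi_raw(raw_bytes, width, height, bytes_per_px=2, align=64):
    """Strip per-line padding from a VI RAW capture and return a
    (height, width) uint16 Bayer frame. bytes_per_px=2 and align=64 are
    assumptions: 10-bit samples in 16-bit little-endian words, with the
    line pitch rounded up to a 64-byte boundary."""
    pitch = -(-width * bytes_per_px // align) * align  # round row size up to align
    rows = np.frombuffer(raw_bytes, dtype=np.uint8)[: pitch * height]
    rows = rows.reshape(height, pitch)[:, : width * bytes_per_px]  # drop padding
    return rows.copy().view("<u2").reshape(height, width)

# Usage sketch: the result is a Bayer mosaic, not interleaved RGB, so the
# "channels" are the four CFA planes (RGGB order is an assumption here):
# frame = unpack_vi_raw(open("frame.raw", "rb").read(), 3840, 2160)
# r, g1 = frame[0::2, 0::2], frame[0::2, 1::2]
# g2, b = frame[1::2, 0::2], frame[1::2, 1::2]
```

A quick sanity check is to compare the file size against pitch * height: if they match, the stride guess is probably right.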
If you need to inspect the image visually, you could try vooya.
Also, I have never used it personally, but I know the NVIDIA BSP provides a tool (nvraw) to strip the additional data from the image buffer and get a true RAW buffer. Some documentation here.
Thank you for the suggestions. I did try the nvraw tool, and it may be what I want, as it additionally allows me to manually set the exposure time and gain. However, it does not show the desired resolution as an available mode. I should be able to run 4032x3040 on the IMX477, but the largest mode I am seeing is 3840x2160.
Any idea how to manually set the capture area, or why I am not seeing the full available sensor format?
I am not entirely sure why the pipeline you provided gives a 4032x3040 image as you describe; I do not see any upscaling element in the pipeline description, for example. Also, as far as I know, the maximum resolution available in the L4T driver is 3840x2160. Can you please provide the output of the gst-launch command with the verbose option enabled (add -v at the end of the command) so we can check the caps negotiation?
Can you please provide more details about how you checked the size of the frame? One trick that I typically use is to capture a RAW buffer with something like: