CSI frames malformed when using Jetson NX

As the title suggests, I am having issues receiving frames with a Jetson NX. The images I receive have a skew that repeats periodically; it looks as if the lines are wrapping at the end. I do not have this issue when using the same camera and driver with the Jetson Nano. The Jetson Nano instead appears to append a small buffer of black pixels to the right edge of the image.

Here are some images showing the issue with capture:

RGB


Before Demosaicing

I was wondering whether this may be due to the line padding settings in kernel/nvidia/include/media/vi2_registers.h. I was under the impression that the Jetson Xavier has a different CSI PHY and would be configured differently. Is there a different file I should be looking at?

hello benm12y6,

may I know which sensor module you’re working with?
may I also know whether this sensor is in the camera list of modules supported by Jetson Camera Partners on the Jetson platform?
thanks

hello JerryChang,

The module isn’t on that list, as it was made in-house. I am confident that the sensor is MIPI compliant, and I do get correct images from a Jetson Nano; however, when I use the same driver with the Jetson Xavier NX I get the issues shown in my original post.

If you are capturing via Argus this shouldn’t be an issue. If you are capturing raws using v4l2-ctl, be sure that you set an appropriate stride length. At D3 we see this with our imx390 camera modules, so the difference between Nano and NX doesn’t surprise me. You should also consult the TRM for each SOM, as the significant bits of your data are very likely in different positions within each 16-bit container.
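
As a very rough Python/NumPy sketch of what I mean by the bit-position issue (the filename is just a placeholder, and the alignment flag is an assumption you would confirm against the VI chapter of the TRM for your particular SOM):

import numpy as np

# Toy example: RAW12 samples that have been unpacked into 16-bit containers.
# Depending on the platform, the 12 significant bits may sit in the high
# (MSB-aligned) or the low (LSB-aligned) part of each container.
raw16 = np.fromfile("capture.raw", dtype=np.uint16)  # hypothetical raw dump

MSB_ALIGNED = True  # assumption: check the TRM for your SOM

if MSB_ALIGNED:
    pixels = raw16 >> 4        # drop the 4 low padding bits
else:
    pixels = raw16 & 0x0FFF    # mask off the 4 high padding bits

print(pixels.min(), pixels.max())  # RAW12 values should stay within 0..4095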

For D3 imx390 cameras we capture raw data using the following example. Note that the stride length is specified in bytes, not pixels.

v4l2-ctl -c preferred_stride=4096 --stream-mmap --stream-count=1 --stream-skip=16 --stream-to=imx390.raw

The image resolution for that camera is 1936x1100. When we view the raw data we use 2048x1100.
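
To make the 2048 vs. 1936 point concrete, here is a rough Python/NumPy sketch of how that raw dump can be viewed; the geometry comes from the command above, and the assumption is that each RAW12 sample sits in its own 16-bit container:

import numpy as np

PADDED_WIDTH = 2048   # containers per line: 4096-byte stride / 2 bytes each
ACTIVE_WIDTH = 1936   # real pixels per line
HEIGHT = 1100

# One frame captured with the v4l2-ctl command above.
raw = np.fromfile("imx390.raw", dtype=np.uint16)
frame = raw.reshape(HEIGHT, PADDED_WIDTH)

# Drop the padding on the right edge to keep only the active image.
active = frame[:, :ACTIVE_WIDTH]
print(active.shape)   # (1100, 1936)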

Regards,
Greg

Thanks for pointing this out! Could you please explain to me where the stride of 4096 comes from? It looks like twice the horizontal resolution you are using, which I guess means the camera outputs two bytes per pixel?

Also, would you mind sharing your Argus streaming command?

To be completely honest, the 4096 value was a bit of an educated guess. The image sensor in the example is a Sony IMX390, a RAW12 sensor with a native image resolution of 1936x1100. At 12 bits per pixel it eventually gets unpacked into 16 bits per pixel: over CSI it’s really 12 bits per pixel, but the VI writes it to memory as 16 bits. Anyhow, the 4096 value comes from rounding the width up to a nice power of two (2048 pixels) and then doubling, because the stride length unit is bytes.
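
Written out as a quick back-of-the-envelope calculation (purely illustrative):

# Back-of-the-envelope for the preferred_stride guess above.
width_pixels = 1936          # native IMX390 width
bytes_per_container = 2      # RAW12 unpacked into 16-bit containers by the VI

# Round the width up to the next power of two, then convert to bytes.
padded_width = 1 << (width_pixels - 1).bit_length()    # 2048
stride_bytes = padded_width * bytes_per_container      # 4096
print(padded_width, stride_bytes)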

For Argus streaming I almost always use argus_camera, which comes with the Jetson Multimedia API examples. It’s a great tool for quick verification, and the full source code is included in the nvidia-l4t-jetson-multimedia-api package.

Greg

Thanks for the help! The stride setting seemed to fix my problem (I also used an educated guess).

Glad to hear it! Happy hacking!