While implementing a driver for a custom camera sensor over MIPI CSI-2 on JetPack 32.1, I came across a question:
Are there any limitations or constraints on the width and height of the video resolution that could prevent L4T from accepting incoming frames on the Jetson AGX?
If a camera is configured to send frames of size 135x121, is that acceptable to the Jetson AND the L4T drivers?
I failed to implement support for 3120x876 RAW14 frames: the last 16 pixels of each line were always garbage, even though the camera sensor actually sent correct pixel values for them. Since we have an FPGA in the CSI-2 path between the camera and the Jetson AGX, we could append 16 dummy pixels to each line and create 3136x876 frames. That worked, but I have no clue why. Is the width aligned to 32 pixels (of 2 bytes each)?
I also need to implement support for frames of size 600x600 and 600x800. Will this be possible?
If the 32-pixel width alignment rule applies, these resolutions will not be accepted.
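The behaviour described above is consistent with a 64-byte line-stride requirement (32 pixels at 2 bytes each). A quick sketch, assuming RAW14 and RAW10 each occupy 2 bytes per pixel in memory (which matches the observed 16-pixel padding):

```python
BYTES_PER_PIXEL = 2  # assumption: RAW14/RAW10 stored as 2 bytes per pixel

def line_bytes(width_px):
    """Size of one line in memory, in bytes."""
    return width_px * BYTES_PER_PIXEL

def is_64b_aligned(width_px):
    """True if a line of this width is a whole number of 64-byte atoms."""
    return line_bytes(width_px) % 64 == 0

print(line_bytes(3120), is_64b_aligned(3120))  # 6240 False (6240 % 64 = 32)
print(line_bytes(3136), is_64b_aligned(3136))  # 6272 True (exactly 98 * 64)
print(line_bytes(600), is_64b_aligned(600))    # 1200 False
```

This would explain why padding each 3120-pixel line out to 3136 pixels made capture work, and why 600-pixel-wide modes fail without further adjustment.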
there’s a sensor mode limitation: only even image widths/heights are supported.
so, you might consider dropping 1 line and 1 pixel from your 135x121 sensor output frames.
v4l2 standard controls should follow the VI requirements; the VI only requires atom (64-byte) aligned line strides.
however, you might enable the preferred_stride v4l2 CID control for your stride adjustments, via the v4l2 standard controls:
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=2592,height=1944,pixelformat=RG10 --set-ctrl bypass_mode=0 --set-ctrl preferred_stride=256 --stream-mmap --stream-count=10
since this change was not included in the l4t-r32.1 code-line, you may upgrade your release to l4t-r32.4 for the preferred_stride support.
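Given the 64-byte atom requirement above, a suitable preferred_stride value can be computed by rounding the line size up to the next multiple of 64. A sketch, assuming 2 bytes per pixel for the RAW10/RAW14 formats discussed in this thread (the helper name is illustrative, not part of any API):

```python
def preferred_stride(width_px, bytes_per_pixel=2):
    """Round the line size in bytes up to the next 64-byte atom boundary."""
    line = width_px * bytes_per_pixel
    return (line + 63) // 64 * 64

print(preferred_stride(600))   # 1216 (1200 bytes rounded up)
print(preferred_stride(134))   # 320  (268 bytes rounded up)
print(preferred_stride(3136))  # 6272 (already aligned)
```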
Thanks for your quick response.
To make it clear to me, for the 135x121 mode:
-> Odd widths or odd heights are not supported; I shall ensure that both the width AND the height are even numbers?
-> Assuming 134x120 RAW14 (2 bytes per pixel, using even numbers for width and height), this can only work if I set preferred_stride (to 134, for example), because the default 64-byte alignment does not hold for 134 pixels. Correct?
that’s correct; the VI engine does not support image resolutions with odd numbers. please slightly crop one pixel for verification.
I used the proposed solution of preferred_stride.
The goal is to capture 600x800 RAW10 frames, but the line size in bytes (600*2 = 1200) is not a multiple of 64, so I configured preferred_stride to 1216 (the next multiple of 64). I can now capture all pixels from the sensor, but the returned V4L2 frames actually have lines of 1216 bytes, with 8 trailing black pixels per line.
Did I do something wrong, or is this the expected behavior?
If it is the expected behavior, how can I get rid of these 8 trailing black pixels, given that the V4L2 cropping function (VIDIOC_S_CROP) is not implemented in the NVIDIA (channel.c) driver?
(about the V4L2 ioctls VIDIOC_CROPCAP and VIDIOC_S_CROP)
Thank you for your help,
yes, please implement cropping of those trailing black pixels on your side.
you may open another new forum discussion thread for further support.
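Cropping the stride padding can be done with a simple copy after dequeuing each buffer. A minimal sketch, assuming the mmap'ed V4L2 buffer is stride * height bytes and that the function and parameter names are hypothetical (not from any NVIDIA or V4L2 API):

```python
def crop_stride(frame, stride, width_bytes, height):
    """Drop per-line stride padding from a captured frame.

    frame: raw buffer of at least stride * height bytes (e.g. from mmap).
    stride: bytes per line as captured (preferred_stride value).
    width_bytes: useful payload bytes per line (width_px * bytes_per_pixel).
    Returns a tightly packed buffer of width_bytes * height bytes.
    """
    assert len(frame) >= stride * height
    out = bytearray()
    for row in range(height):
        start = row * stride
        out += frame[start:start + width_bytes]
    return bytes(out)

# For the 600x800 RAW10 case in this thread (2 bytes/px, stride 1216):
#   tight = crop_stride(frame, stride=1216, width_bytes=1200, height=800)
# which discards the 16 padding bytes (8 black pixels) at the end of each line.
```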