Can you confirm again whether it is OK for the format in the device tree and driver settings to differ from the actual incoming data format on the stream?
I tried the serializer RGB888 test pattern again. It works if the device tree and driver settings are set to RGB888. However, if I set the device tree and driver settings to RAW12, and set the active image width to 3840 (2x the actual width of 1920), I reproduced the short frame error. It seems the CSI/VI channel multiplexers require the incoming data to match the settings in the device tree and sensor driver.
It seems that the main problem was the data type in the CSI packet header. We changed the data type to match the RAW10 data type that we are using in the driver and device tree settings.
We were able to get the data with the image width at 3840 and 9830400 bytes per image (4 bytes/pixel x 1920 x 1280). This seems to indicate that each 10-bit sample is packed into a 16-bit word in TX2 memory. We just need to decode those words to extract the actual 12-bit/8-bit data for each channel of our sensor.
We just need to clarify the byte order used when writing the 16-bit memory buffer on TX2. The TX2 TRM only shows an example for T_R16_I (see attached picture). Do the other compatible RAW10 memory formats, such as T_R32 or T_L32_F, follow the same memory layout? Also, is the RAW10 data first stored in a CSI/VI memory buffer before being written to DRAM in little-endian byte order? Or is it written directly to DRAM with the first bit treated as the LSB, and therefore stored little-endian?
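While waiting for confirmation, here is a minimal sketch of the unpacking we have in mind. It assumes each 10-bit sample occupies one little-endian 16-bit word; the `shift` parameter is our own placeholder covering both a low-aligned layout (shift = 0) and an MSB-aligned one (e.g. shift = 6), since that alignment is exactly what we would like confirmed:

```python
import struct

# Minimal sketch (assumption): each 10-bit sample occupies one little-endian
# 16-bit word in the capture buffer. `shift` is a placeholder for the (still
# unconfirmed) bit alignment: 0 if the sample is low-aligned in the word,
# 6 if it is MSB-aligned.
def unpack_raw10(buf, num_pixels, shift=0):
    words = struct.unpack("<%dH" % num_pixels, buf[:2 * num_pixels])
    return [(w >> shift) & 0x3FF for w in words]

# Round-trip check with synthetic data: four low-aligned samples.
buf = struct.pack("<4H", 0, 64, 512, 1023)
print(unpack_raw10(buf, 4))  # -> [0, 64, 512, 1023]
```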
For Jetson TX2,
we used the T_R16_I data type for RAW12/14 CSI capture, which is the S1.14 fixed-point format.
There is no support for data types beyond RAW14 on Jetson TX2.
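For reference, one reading of "S1.14" (our illustration, not NVIDIA code): a 16-bit two's-complement word with 1 sign bit, 1 integer bit, and 14 fractional bits, so the numeric value is the signed integer divided by 2^14. A quick sketch of that interpretation:

```python
import struct

# Illustration only (our reading of "S1.14"): a 16-bit two's-complement
# word with 1 sign bit, 1 integer bit, and 14 fractional bits, i.e. the
# value is the signed 16-bit integer divided by 2**14.
def s114_to_float(word):
    (signed,) = struct.unpack("<h", struct.pack("<H", word))
    return signed / (1 << 14)

print(s114_to_float(0x4000))  # -> 1.0
print(s114_to_float(0x2000))  # -> 0.5
print(s114_to_float(0xC000))  # -> -1.0
```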
thanks
You may refer to the TRM; please check [Chapter-27: Video Input (VI)] for the raw memory formats.
Please also note that the pixel format in memory is T_R16_I for TX2 now.
thanks