CSI camera: issue changing from RGB888 format to Bayer format

The architecture is camera → FPGA → Jetson AGX Xavier.

The FPGA outputs CSI. RGB888 format settings:

mode_type = rgb;
pixel_phase = rgb888;
csi_pixel_bit_depth = 24;

Video can be seen normally with:
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, format=(string)BGRA, width=(int)400, height=(int)400 ! videoconvert ! xvimagesink -ev

But after modifying the CSI camera to output Bayer format:

Bayer format settings:

mode_type = bayer;
pixel_phase = bggr;
csi_pixel_bit_depth = 10;

gst-launch-1.0 -vvv nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)400, height=(int)400, framerate=30/1, format=(string)NV12' ! nvvidconv flip-method=0 ! nvoverlaysink -e

it shows this error:

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:725 No cameras available
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
Got EOS from element "pipeline0".
Execution ended after 0:00:00.027721069
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Is there any error in this modification?

It looks like the devname in tegra-camera-platform{} doesn't match the driver name.
Dump the dtb to confirm it.

The devname needs to match the output of v4l2-ctl --list-devices; in the example below, it must match "ov5693 30-0036".

nvidia@nvidia-desktop:~$ v4l2-ctl --list-devices
vi-output, ov5693 30-0036 (platform:15700000.vi:0):
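
A quick way to confirm what is actually in the flashed dtb, as a sketch (assumes the dtc tool is installed; the output file name is arbitrary):

# Dump the running device tree from /proc and search for the devname entries
dtc -I fs -O dts -o extracted.dts /proc/device-tree 2>/dev/null
grep -n "devname" extracted.dts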

root@user-desktop:/home/user# v4l2-ctl --list-devices
vi-output, gen3 0-001e (platform:15c10000.vi:0):
/dev/video0

vi-output, gen3 0-002e (platform:15c10000.vi:2):
/dev/video1

video1 is OK … running RGB888 800x800 works.

Check the devname in tegra-camera-platform{} in the device tree.

My dts:
tegra194-camera-gen3.dtsi (19.5 KB)

devname = "gen3_a 0-001e";
and
devname = "gen3_b 0-002e";

Modify the devname to "gen3 0-001e" to match what the driver reports (i.e., remove "_a" and "_b").

tegra194-camera-gen3.dtsi (19.5 KB)

Other error messages now appear:

gst-launch-1.0 -vvv nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)400, height=(int)400, framerate=30/1, format=(string)NV12' ! nvvidconv flip-method=0 ! nvoverlaysink -e

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
Caught SIGSEGV
#0 0x0000007f979a3e28 in __GI___poll (fds=0x55ae922530, nfds=548005417864, timeout=) at …/sysdeps/unix/sysv/linux/poll.c:41
#1 0x0000007f97ab0f58 in () at /usr/lib/aarch64-linux-gnu/libglib-2.0.so.0
#2 0x00000055ae8173f0 in ()
Spinning. Please run 'gdb gst-launch-1.0 9554' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.

Remove the following from tegra-camera-platform{}:

			module1 {
				badge = "gen3_top_i2c1";
				position = "centerleft";
				orientation = "0";
				drivernode0 {
					pcl_id = "v4l2_sensor";   // Declare PCL support driver (classically known as guid)
					devname = "gen3 0-002e";  // Driver's v4l2 device name
					/* Declare the device-tree hierarchy to driver instance */
					proc-device-tree = "/proc/device-tree/i2c@3160000/gen3@2e";
				};
			};

Also confirm capture with the command below:

v4l2-ctl -c bypass_mode=0 --stream-mmap  -d /dev/video0

Executing v4l2-ctl -c bypass_mode=0 --stream-mmap -d /dev/video0 crashes the system
0217_bayer_err.txt (6.7 KB)

Edited to keep only one channel:
tegra194-camera-gen3.dtsi (3.6 KB)

It looks like it is unable to capture valid data from the sensor.
Check the trace log to see if you can find any clue.

https://elinux.org/Jetson/l4t/Camera_BringUp

echo 1 > /sys/kernel/debug/tracing/tracing_on
echo 30720 > /sys/kernel/debug/tracing/buffer_size_kb
echo 1 > /sys/kernel/debug/tracing/events/tegra_rtcpu/enable
echo 1 > /sys/kernel/debug/tracing/events/freertos/enable
echo 2 > /sys/kernel/debug/camrtc/log-level
echo 1 > /sys/kernel/debug/tracing/events/camera_common/enable
echo > /sys/kernel/debug/tracing/trace
cat /sys/kernel/debug/tracing/trace

Run the command:
gst-launch-1.0 -vvv nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)400, height=(int)400, framerate=30/1, format=(string)NV12' ! nvvidconv flip-method=0 ! nvoverlaysink -e

0217_bayer_trace.txt (14.2 KB)

The trace log says no valid data was received from the MIPI bus.
You may need to probe the sensor output signal to confirm it.

Now data can be received using v4l2-ctl:

root@user-desktop:/home/user# v4l2-ctl -c bypass_mode=0 --stream-mmap -d /dev/video0
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.09 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.04 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.13 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.09 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.07 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.06 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.05 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.04 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.04 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.06 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.06 fps

But gst-launch-1.0 still shows "No cameras available":

gst-launch-1.0 -vvv nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)400, height=(int)400, framerate=30/1, format=(string)NV12' ! nvvidconv flip-method=0 ! nvoverlaysink -e

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:725 No cameras available
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)400, height=(int)400, format=(string)NV12, framerate=(fraction)30/1
Got EOS from element "pipeline0".
Execution ended after 0:00:00.028445393
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

dts settings:

mode_type = bayer;
pixel_phase = bggr;
csi_pixel_bit_depth = 10;

root@user-desktop:/home/user# v4l2-ctl -d /dev/video0 --all
Driver Info (not using libv4l2):
Driver name : tegra-video
Card type : vi-output, gen3 0-001e
Bus info : platform:15c10000.vi:0
Driver version: 4.9.253
Capabilities : 0x84200001
Video Capture
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x04200001
Video Capture
Streaming
Extended Pix Format
Priority: 2
Video input : 0 (Camera 0: ok)
Format Video Capture:
Width/Height : 400/400
Pixel Format : 'BG10'
Field : None
Bytes per Line : 800
Size Image : 320000
Colorspace : sRGB
Transfer Function : Default (maps to sRGB)
YCbCr/HSV Encoding: Default (maps to ITU-R 601)
Quantization : Default (maps to Full Range)
Flags :

Camera Controls

                 group_hold 0x009a2003 (bool)   : default=0 value=0 flags=execute-on-write
                 hdr_enable 0x009a2004 (intmenu): min=0 max=1 default=0 value=0
                    fuse_id 0x009a2007 (str)    : min=0 max=12 step=2 value='303132333435' flags=read-only, has-payload
                       gain 0x009a2009 (int64)  : min=0 max=0 step=0 default=0 value=0 flags=slider
                   exposure 0x009a200a (int64)  : min=0 max=0 step=0 default=0 value=30 flags=slider
                 frame_rate 0x009a200b (int64)  : min=0 max=0 step=0 default=0 value=15000000 flags=slider
             exposure_short 0x009a200c (int64)  : min=0 max=0 step=0 default=0 value=30 flags=slider
       sensor_configuration 0x009a2032 (u32)    : min=0 max=0 step=0 default=0 flags=read-only, volatile, has-payload
     sensor_mode_i2c_packet 0x009a2033 (u32)    : min=0 max=0 step=0 default=0 flags=read-only, volatile, has-payload
  sensor_control_i2c_packet 0x009a2034 (u32)    : min=0 max=0 step=0 default=0 flags=read-only, volatile, has-payload
                bypass_mode 0x009a2064 (intmenu): min=0 max=1 default=0 value=0
            override_enable 0x009a2065 (intmenu): min=0 max=1 default=0 value=0
               height_align 0x009a2066 (int)    : min=1 max=16 step=1 default=1 value=1
                 size_align 0x009a2067 (intmenu): min=0 max=2 default=0 value=0
           write_isp_format 0x009a2068 (int)    : min=1 max=1 step=1 default=1 value=1
   sensor_signal_properties 0x009a2069 (u32)    : min=0 max=0 step=0 default=0 flags=read-only, has-payload
    sensor_image_properties 0x009a206a (u32)    : min=0 max=0 step=0 default=0 flags=read-only, has-payload
  sensor_control_properties 0x009a206b (u32)    : min=0 max=0 step=0 default=0 flags=read-only, has-payload
          sensor_dv_timings 0x009a206c (u32)    : min=0 max=0 step=0 default=0 flags=read-only, has-payload
           low_latency_mode 0x009a206d (bool)   : default=0 value=0
           preferred_stride 0x009a206e (int)    : min=0 max=65535 step=1 default=0 value=0
               sensor_modes 0x009a2082 (int)    : min=0 max=30 step=1 default=30 value=1 flags=read-only

root@user-desktop:/home/user#

"No cameras available" indicates that something is incorrect in tegra-camera-platform{}.
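
One way to get more detail, as a sketch (not from this thread): stop the Argus service and run the daemon in the foreground so its messages print while the pipeline starts.

# Terminal 1: run the Argus daemon in the foreground to see its log output
sudo systemctl stop nvargus-daemon
sudo nvargus-daemon
# Terminal 2: start a minimal pipeline and watch terminal 1 for errors
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! fakesink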

Measured the MIPI signals …

mode_type = bayer;
pixel_phase = bggr;
csi_pixel_bit_depth = 10;
How should the MIPI data sent by the FPGA be arranged?

dts file
tegra194-camera-gen3.dtsi (3.7 KB)

Using the v4l2-ctl command:
v4l2-ctl -d /dev/video0 --stream-mmap --stream-count=1 --stream-skip=3 --stream-to=test.raw

Data is received, but it appears to be offset or missing:
bggr.bmp (468.8 KB)

021818_v4l2-ctl_trace.log (28.5 KB)

Testing the gst-launch-1.0 command … unable to display an image.

gst-launch-1.0 -vvv nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)400, height=(int)400, framerate=30/1, format=(string)NV12' ! nvvidconv flip-method=0 ! nvoverlaysink -e

021818_gst_trace.log (581.9 KB)

It shows the error:
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: TIMEOUT
Additional debug info:
Argus Error Status
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS…

Any suggestions, please?

Does this FPGA need a long initialization time before it outputs data to MIPI?
Try reducing the initialization time.

1. Does this FPGA need a long initialization time before it outputs data to MIPI?
… After changing the format to RGB888 it can be displayed normally,
so there should be no problem with the FPGA initialization time being too long.

2. How do I convert Bayer 10 raw data to BMP?
Typical conversion software handles formats like bayer_bggr8 or bayer_bggr16le …

Do you mean tools to display the Bayer format? I used 7yuv,
or the GStreamer element bayer2rgb.
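
For the bayer2rgb route, a minimal sketch (bayer2rgb only accepts 8-bit video/x-bayer caps, so the sensor mode must expose an 8-bit Bayer format rather than the 10-bit one):

# 8-bit Bayer preview through GStreamer's software debayer element
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-bayer, format=bggr, width=400, height=400, framerate=30/1' ! bayer2rgb ! videoconvert ! xvimagesink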

Using the v4l2-ctl command:
v4l2-ctl -d /dev/video0 --stream-mmap --stream-count=1 --stream-skip=3 --stream-to=test.raw

I need to convert it to a BMP file.
The format is:
mode_type = bayer;
pixel_phase = bggr;
csi_pixel_bit_depth = 10;
I can't find suitable conversion software. Do you have any suggestions?
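
One possibility, as a sketch: V4L2's BG10 stores each 10-bit sample in a 16-bit little-endian word, so the capture can be treated as bayer_bggr16le and the levels scaled up to 16 bits (multiply by 64, since 65535/1023 ≈ 64). The factor assumes the samples are low-justified in the 16-bit words; adjust it if the VI packs them differently.

# View the RAW10 capture directly (will look very dark without scaling)
ffplay -f rawvideo -video_size 400x400 -pixel_format bayer_bggr16le -i test.raw
# Debayer, scale the 10-bit samples up to 16 bits, and write one frame as BMP
ffmpeg -f rawvideo -video_size 400x400 -pixel_format bayer_bggr16le -i test.raw \
	-vf 'lutrgb=r=val*64:g=val*64:b=val*64' -frames:v 1 test.bmp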

So I changed the format and added the corresponding code:
mode_type = bayer;
pixel_phase = bggr;
csi_pixel_bit_depth = 8;

v4l2-ctl -d /dev/video0 --list-formats-ext

ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'BA81'
Name : 8-bit Bayer BGBG/GRGR
Size: Discrete 400x400
Interval: Discrete 0.033s (30.000 fps)

v4l2-ctl -d /dev/video0 \
	--set-fmt-video=width=400,height=400,pixelformat=BA81 \
	--set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 \
	--stream-skip=0 --stream-to=bg8.raw

bg8.raw (156.3 KB)

ffplay -f rawvideo -video_size 400x400 -pixel_format bayer_bggr8 -i bg8.raw

Screenshot from 2022-02-25 13-37-40.bmp (2.3 MB)

The FPGA generates a test pattern, so the data should be:
except for the first line, everything else is red.
But the actually received data seems to have lost data.

Originally the frame should be 400x400,
but each line comes out with a different offset.
Have you encountered similar problems?

Could you generate 640x480 to verify it again? (Likely relevant: a 400-pixel RAW8 line is 400 bytes, which is not 64-byte aligned, and the VI pads each captured line out to an aligned stride, which can appear as a per-line shift; a 640-byte line is already aligned.)
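
For reference, a retest sketch at the aligned size, mirroring the 400x400 capture commands above:

v4l2-ctl -d /dev/video0 \
	--set-fmt-video=width=640,height=480,pixelformat=BA81 \
	--set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 \
	--stream-skip=0 --stream-to=bg8_640.raw
ffplay -f rawvideo -video_size 640x480 -pixel_format bayer_bggr8 -i bg8_640.raw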

Thanks