Hi,
I have developed a driver for a camera that supports 3 modes: GREY, Y14, and Y16. The camera outputs 1680x1050 at 90, 45, or 30 fps. The issue is that when I set use_sensor_mode_id = false
and use a command like v4l2-ctl -d /dev/video1 --set-fmt-video=width=1680,height=1050,pixelformat='Y14 ' --stream-mmap --stream-to=frame.bin
it is unable to select the correct Y14 mode; it always selects mode 0.
If I use use_sensor_mode_id = true
and add --set-ctrl=sensor_mode=1
then it works as expected. But my understanding from the Sensor Software Driver Programming section of the Jetson Linux Developer Guide 34.1 documentation is that when use_sensor_mode_id is false, the driver uses the default mode selection logic, which selects a mode based on resolution, color format, and frame rate.
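For reference, I toggle this with the use_sensor_mode_id property in the sensor node of my device tree (node name and address below are placeholders; surrounding properties omitted):

dominite_a@10 {
    ...
    use_sensor_mode_id = "false";    /* set to "true" for the working case above */
    ...
};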
Here are some of my relevant code snippets:
You can see the modes defined in my camera_common_frmfmt table.
static const struct camera_common_frmfmt dominite_frmfmt[] = {
    /* {size, framerates, num_framerates, hdr_en, mode} */
    {{1680, 1050}, dominite_m1_fps, 3, 0, DOMINITE_MODE_1680X1050_08_2L},
    {{1680, 1050}, dominite_m1_fps, 3, 0, DOMINITE_MODE_1680X1050_14_4L},
    {{1680, 1050}, dominite_m1_fps, 3, 0, DOMINITE_MODE_1680X1050_16_4L},
};
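For context, the frame-rate array and mode identifiers used in that table are roughly as follows (trimmed sketch; the enum values map to sensor modes 0, 1, and 2 in order):

/* 90/45/30 fps, matching the frame intervals reported by v4l2-ctl below */
static const int dominite_m1_fps[] = {90, 45, 30};

/* Mode indices: 0 = GREY (8-bit), 1 = Y14 (14-bit), 2 = Y16 (16-bit) */
enum {
    DOMINITE_MODE_1680X1050_08_2L = 0,
    DOMINITE_MODE_1680X1050_14_4L,
    DOMINITE_MODE_1680X1050_16_4L,
};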
In the device tree, there are 3 modes defined:
mode0 {
    ...
    active_w = "1680";
    active_h = "1050";
    mode_type = "raw";
    pixel_phase = "y";
    csi_pixel_bit_depth = "8";
    ...
};
mode1 {
    ...
    active_w = "1680";
    active_h = "1050";
    dynamic_pixel_bit_depth = "16";
    mode_type = "gray14";
    pixel_phase = "y";
    csi_pixel_bit_depth = "14";
    ...
};
mode2 {
    ...
    active_w = "1680";
    active_h = "1050";
    dynamic_pixel_bit_depth = "16";
    mode_type = "gray16";
    pixel_phase = "y";
    csi_pixel_bit_depth = "16";
    ...
};
Listing the formats with v4l2-ctl gives:
v4l2-ctl -d 1 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'GREY' (8-bit Greyscale)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
[1]: 'Y14 ' (14-bit Greyscale)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
[2]: 'Y16 ' (16-bit Greyscale)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1680x1050
Interval: Discrete 0.011s (90.000 fps)
Interval: Discrete 0.022s (45.000 fps)
Interval: Discrete 0.033s (30.000 fps)
I did some digging, and looking at camera_common.c:camera_common_try_fmt(...), it appears to check only whether the resolution matches
...
if (mf->width == frmfmt[i].size.width &&
mf->height == frmfmt[i].size.height) {
...
s_data->mode_prop_idx = i;
...
}
to select the mode. Should it also check the pixel format? How is the pixel format supposed to be set? Is there an additional camera driver function needed?
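To illustrate the kind of check I have in mind, here is a rough sketch (purely hypothetical, not existing camera_common.c code; mode_mbus_code would be a new per-mode table that the current structures do not provide):

/*
 * Hypothetical sketch only -- not what camera_common.c does today.
 * Idea: pick the first mode that matches both the requested resolution
 * and the requested media bus code, assuming a per-mode media bus code
 * table existed alongside the frmfmt[] entries.
 */
static int find_mode_by_size_and_code(struct camera_common_data *s_data,
                                      struct v4l2_mbus_framefmt *mf,
                                      const unsigned int *mode_mbus_code)
{
    const struct camera_common_frmfmt *frmfmt = s_data->frmfmt;
    int i;

    for (i = 0; i < s_data->numfmts; i++) {
        if (mf->width == frmfmt[i].size.width &&
            mf->height == frmfmt[i].size.height &&
            mf->code == mode_mbus_code[i])
            return i; /* would become s_data->mode_prop_idx */
    }

    return -EINVAL;
}

If something like this already exists, or if the pixel format is meant to be matched elsewhere (e.g. via the color format table in camera_common.c), a pointer to the right place would be much appreciated.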
Thank you!