Hi @ShaneCCC,
Do you have any idea how to find and solve the problem?
The CHANSEL_FAULT reports PIXEL_RUNAWAY, which means NVCSI/VI receives more lines than the driver reports. It looks like more than 1080.
Yes, my sensor sends 2064 lines. I don't understand why I must reduce the number of lines.
Your driver reports 1080 lines but the sensor outputs more than that.
You can modify the sensor driver to report more than 1080 lines, like 1088, to try.
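For example, something like this in the sensor mode table of your device tree (just a sketch, the value is only something to try):

mode0 {
        /* ... */
        active_h = "1088"; /* report a few more lines than 1080 to see if the fault changes */
};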
My sensor outputs 2064 lines according to the datasheet.
I only changed the reported value from 2064 to 1080 in the driver; the sensor still outputs 2064 lines. I don't understand why I have to change this value.
Are you asking me to change the output resolution of the sensor?
You can try reporting a line count between 1080 and 2064 to narrow it down.
I don't understand your last sentence.
Why do you want me to test with fewer lines? Is the Orin NX not capable of handling 2064 lines at 67.2 FPS? The same sensor works with a Jetson Nano (with the same driver and the same parameters).
I made measurements on MIPI with the oscilloscope and everything seems good:
- Clock is at 67.2 Hz
- Frame is at 67.2 Hz
- Duration of one line is around 6.7 µs. I count roughly the right number of lines (2133, but the counting method is not exact; it should be 2128).
Lanes are CSI0-CLK, CSI0-D0, CSI0-D1, CSI1-D0, CSI1-D1.
I use the format MEDIA_BUS_FMT_SRGGB12_1X12.
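As a quick sanity check on these numbers (expected line time from the frame rate and the 2128 total lines, just a bc one-liner):

echo 'scale=9; 1/(67.2*2128)' | bc -l    # ~0.000006993 s, about 7.0 µs per line

which is close to the ~6.7 µs I measure on the scope.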
OK, it's my bad. PIXEL_RUNAWAY means NVCSI receives more pixels, not more lines. Please modify the driver to report more pixels (width) to try.
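For example (values only to try, not final; keep line_length >= active_w):

mode0 {
        /* ... */
        active_w = "2480";    /* report slightly more pixels per line than the sensor sends */
        line_length = "2480";
};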
Ok I will check with PIXEL_RUNAWAY.
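This is what I run to reproduce while the trace is enabled (bypass_mode=0 so v4l2-ctl really captures frames; width/height as in my mode table):

v4l2-ctl -d /dev/video0 --set-ctrl bypass_mode=0 \
        --set-fmt-video=width=2472,height=2064,pixelformat=RG12 \
        --stream-mmap --stream-count=10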
I found that:
CHANSEL_NOMATCH channel:0x01 frame:1 vi_tstamp:34908796448 data:0x0000000000000269
- no match
- CTYPE = LS (what is that?)
- DTYPE = 0x13 (19); it's not in the list, why?
I found the same issue in this post:
They had an error with v4l2, so I tried with gst-launch.
#!/bin/bash
echo "Video"

# Lock the VI / ISP / NVCSI clocks at their max rate
echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate

# Enable the RTCPU / camera trace events and clear the trace buffer
echo 1 > /sys/kernel/debug/tracing/tracing_on
echo 30720 > /sys/kernel/debug/tracing/buffer_size_kb
echo 1 > /sys/kernel/debug/tracing/events/tegra_rtcpu/enable
echo 1 > /sys/kernel/debug/tracing/events/freertos/enable
echo 2 > /sys/kernel/debug/camrtc/log-level
echo 1 > /sys/kernel/debug/tracing/events/camera_common/enable
echo > /sys/kernel/debug/tracing/trace

# Check the current format and the formats reported by the driver
v4l2-ctl -V
v4l2-ctl -d /dev/video0 --list-formats
v4l2-ctl -d /dev/video0 --list-formats-ext

# Capture through Argus and encode to a file
gst-launch-1.0 -v \
    nvarguscamerasrc do-timestamp=true awblock=true aelock=true sensor-id=0 \
    ! 'video/x-raw(memory:NVMM),width=2472,height=2064,format=NV12,framerate=672/10' \
    ! nvvidconv \
    ! nvv4l2h264enc \
    ! h264parse \
    ! qtmux \
    ! filesink location=test_color_1080p.mp4 -e # working

# Dump the trace after the capture
cat /sys/kernel/debug/tracing/trace
But I get an error saying that I don't have any camera:
root@ubuntu:/home/nvidia/test_scripts# ./video_gst.sh
Video
832000000
1011200000
642900000
Format Video Capture:
Width/Height : 2472/2064
Pixel Format : 'RG12' (12-bit Bayer RGRG/GBGB)
Field : None
Bytes per Line : 4944
Size Image : 10204416
Colorspace : sRGB
Transfer Function : Default (maps to sRGB)
YCbCr/HSV Encoding: Default (maps to ITU-R 601)
Quantization : Default (maps to Full Range)
Flags :
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'RG12' (12-bit Bayer RGRG/GBGB)
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'RG12' (12-bit Bayer RGRG/GBGB)
Size: Discrete 2472x2064
Interval: Discrete 14.179s (0.071 fps)
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:751
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
No cameras available
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay.GstPad:src: caps = video/x-raw(memory:NVMM, meta:GstVideoOverlayComposition), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM, meta:GstVideoOverlayComposition), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay.GstPad:video_sink: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2064, format=(string)NV12, framerate=(fraction)30/1
WARNING: from element /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: Pipeline construction is invalid, please add queues.
Additional debug info:
gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
Got EOS from element "pipeline0".
Execution ended after 0:00:00.171882215
Setting pipeline to NULL ...
Freeing pipeline ...
# tracer: nop
#
# entries-in-buffer/entries-written: 16/16 #P:4
#
# _-----=> irqs-off
# / _----=> need-resched
# | / _---=> hardirq/softirq
# || / _--=> preempt-depth
# ||| / delay
# TASK-PID CPU# |||| TIMESTAMP FUNCTION
# | | | |||| | |
v4l2-ctl-2248 [000] .... 431.633863: tegra_channel_open: vi-output, imx264 spi0.2
v4l2-ctl-2248 [000] .... 431.635256: tegra_channel_close: vi-output, imx264 spi0.2
v4l2-ctl-2249 [002] .... 431.637659: tegra_channel_open: vi-output, imx264 spi0.2
v4l2-ctl-2249 [002] .... 431.638116: tegra_channel_close: vi-output, imx264 spi0.2
v4l2-ctl-2250 [000] .... 431.640888: tegra_channel_open: vi-output, imx264 spi0.2
v4l2-ctl-2250 [000] .... 431.641621: tegra_channel_close: vi-output, imx264 spi0.2
nvargus-daemon-2253 [000] .... 431.706539: tegra_channel_open: vi-output, imx264 spi0.2
nvargus-daemon-2253 [000] .... 431.706631: tegra_channel_close: vi-output, imx264 spi0.2
nvargus-daemon-2253 [000] .... 431.706646: tegra_channel_open: vi-output, imx264 spi0.3
nvargus-daemon-2253 [000] .... 431.706693: tegra_channel_close: vi-output, imx264 spi0.3
nvargus-daemon-2253 [000] .... 431.707531: tegra_channel_open: vi-output, imx264 spi0.2
nvargus-daemon-2253 [000] .... 431.707578: tegra_channel_close: vi-output, imx264 spi0.2
nvargus-daemon-2253 [000] .... 431.707586: tegra_channel_open: vi-output, imx264 spi0.3
nvargus-daemon-2253 [000] .... 431.707630: tegra_channel_close: vi-output, imx264 spi0.3
kworker/3:3-142 [003] .... 431.949907: rtcpu_isp_falcon_task_start: tstamp:965216133 ch:0 task:HANDLE_EVENT
kworker/3:3-142 [003] .... 431.949911: rtcpu_isp_falcon_task_end: tstamp:965216174 task:HANDLE_EVENT
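Something I can also try before rerunning the script is restarting the Argus daemon, in case it is stuck in a bad state (assuming the standard JetPack service name):

sudo systemctl restart nvargus-daemon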
I have another question to be sure:
I use CSI0-CLK, CSI0-D0, CSI0-D1, CSI1-D0, CSI1-D1 for camera 1.
I use CSI2-CLK, CSI2-D0, CSI2-D1, CSI3-D0, CSI3-D1 for camera 2.
Is my device tree OK with this? I'm not sure about bus-width.
spi0: spi@3210000 {
status = "okay";
/* CS signals are ACTIVE_LOW on qcam5 */
num-cs = <4>;
cs-gpios =
<&tegra_main_gpio TEGRA234_MAIN_GPIO(I, 1) GPIO_ACTIVE_LOW>, /* FPGA 1 */
<&tegra_main_gpio TEGRA234_MAIN_GPIO(P, 0) GPIO_ACTIVE_LOW>, /* FPGA 2 */
<&tegra_main_gpio TEGRA234_MAIN_GPIO(H, 6) GPIO_ACTIVE_LOW>, /* Sensor 1 */
<&tegra_main_gpio TEGRA234_MAIN_GPIO(Q, 5) GPIO_ACTIVE_LOW>; /* Sensor 2 */
spi-max-frequency = <12000000>;
/delete-node/ prod-settings;
me2210_1: me2210@0 {
compatible = "tegra,me2210", "tegra-spidev";
#gpio-cells = <2>;
reg = <0>;
status = "okay";
spi-cpol;
spi-cpha;
spi-max-frequency = <10000000>;
vdd-supply = <&vdd_1v8>;
};
me2210_2: me2210@1 {
compatible = "tegra,me2210", "tegra-spidev";
#gpio-cells = <2>;
reg = <1>;
status = "okay";
spi-cpol;
spi-cpha;
spi-max-frequency = <10000000>;
vdd-supply = <&vdd_1v8>;
};
imx264_1: imx264@2 {
compatible = "sony,imx250_or_imx530";
reg = <2>;
status = "okay";
devnode = "video0";
spi-max-frequency = <12000000>;
sensor_model = "imx264";
post_crop_frame_drop = "0";
use_sensor_mode_id = "true";
pgood-gpios = <&gpio_exp 4 GPIO_ACTIVE_HIGH>; // Added to check if power good is ok before probing the camera
/* from tegra186-quill-camera-imx185-a00.dtsi */
/* Define any required hw resources needed by driver */
/* ie. clocks, io pins, power sources */
/*
* clocks = <&tegra_car TEGRA210_CLK_CLK_OUT_3>;
* clock-names = "clk_out_3";
* clock-frequency = <24000000>;
* mclk = "clk_out_3";
*/
#if 1
clocks = <&bpmp_clks TEGRA234_CLK_EXTPERIPH1>;
clock-names = "extperiph1", "pllp_grtba";
clock-frequency = <24000000>;
mclk = "extperiph1";
#endif
reset-gpios = <&gpio_exp 3 GPIO_ACTIVE_LOW>;
vdd-supply = <&cam0_supply>;
mode0 {
mclk_khz = "37125"; /* p1 */
num_lanes = "4"; /* schema */
tegra_sinterface = "serial_a"; /* schema */
phy_mode = "DPHY"; /* ???? */
discontinuous_clk = "false"; /* DevGuide: no */
dpcm_enable = "false"; /* DevGuide: disabled */
cil_settletime = "0"; /* DevGuide: auto calibration */
lane_polarity = "6"; /* from forum */
dynamic_pixel_bit_depth = "12"; /* p1 */
csi_pixel_bit_depth = "12"; /* p1 */
pixel_phase = "rggb";
pixel_t = "bayer_rggb12";
mode_type = "bayer"; /* p85: RGGB bayer */
readout_orientation = "0"; /* DevGuide: normal */
inherent_gain = "1"; /* DevGuide: Set to 1 */
/*mclk_multiplier = "40"; Deprecated */
line_length = "2472"; /* p85: 12+2448+12=2472 */
pix_clk_hz = "353499955"; /* DevGuide: pix_clk_hz = sensor output size x frame rate = 2472 * 2128 * 67.2 = 353499955.2 */
active_w = "2472"; /* p85: Active frame */
active_h = "1024"; /* p85: Active frame 2064 */
min_gain_val = "0";
max_gain_val = "48";
gain_step_pitch = "0.1";
min_hdr_ratio = "1";
max_hdr_ratio = "64";
min_framerate = "10";
max_framerate = "67.2";
default_framerate = "67.2"; /* p1: 67.2 frame/s in 12 bits */
min_exp_time = "16.165";
max_exp_time = "165770";
embedded_metadata_height = "4"; /* p85: Embedded Data Line (EBD) */
};
controller-data {
nvidia,tx-clk-tap-delay = <16>;
nvidia,rx-clk-tap-delay = <0>;
nvidia,cs-setup-clk-count = <16>;
nvidia,cs-hold-clk-count = <0>;
};
ports {
#address-cells = <1>;
#size-cells = <0>;
port@0 {
reg = <0>;
macq_qcam5_out0: endpoint {
/* !!! index of the csi input port */
port-index = <0>;
bus-width = <4>;
remote-endpoint = <&macq_qcam5_csi_in0>;
};
};
};
};
imx264_2: imx264@3 {
compatible = "sony,imx250_or_imx530";
reg = <3>;
status = "okay";
devnode = "video1";
spi-max-frequency = <12000000>;
sensor_model = "imx264";
post_crop_frame_drop = "0";
use_sensor_mode_id = "true";
pgood-gpios = <&gpio_exp 7 GPIO_ACTIVE_HIGH>; // Added to check if power good is ok before probing the camera
/* from tegra186-quill-camera-imx185-a00.dtsi */
/* Define any required hw resources needed by driver */
/* ie. clocks, io pins, power sources */
/*
* clocks = <&tegra_car TEGRA210_CLK_CLK_OUT_3>;
* clock-names = "clk_out_3";
* clock-frequency = <24000000>;
* mclk = "clk_out_3";
*/
#if 1
clocks = <&bpmp_clks TEGRA234_CLK_EXTPERIPH1>;
clock-names = "extperiph1", "pllp_grtba";
clock-frequency = <24000000>;
mclk = "extperiph1";
#endif
reset-gpios = <&gpio_exp 6 GPIO_ACTIVE_LOW>;
vdd-supply = <&cam1_supply>;
mode0 {
mclk_khz = "37125"; /* p1 */
num_lanes = "4"; /* schema */
tegra_sinterface = "serial_c"; /* schema */
phy_mode = "DPHY"; /* ???? */
discontinuous_clk = "false"; /* DevGuide: no */
dpcm_enable = "false"; /* DevGuide: disabled */
cil_settletime = "0"; /* DevGuide: auto calibration */
lane_polarity = "6"; /* from forum */
dynamic_pixel_bit_depth = "12"; /* p1 */
csi_pixel_bit_depth = "12"; /* p1 */
pixel_phase = "rggb";
pixel_t = "bayer_rggb12";
mode_type = "bayer"; /* p85: RGGB bayer */
readout_orientation = "0"; /* DevGuide: normal */
inherent_gain = "1"; /* DevGuide: Set to 1 */
/*mclk_multiplier = "40"; Deprecated */
line_length = "2472"; /* p85: 12+2448+12=2472 */
pix_clk_hz = "353499955"; /* DevGuide: pix_clk_hz = sensor output size x frame rate = 2472 * 2128 * 67.2 = 353499955.2 */
active_w = "2472"; /* p85: Active frame */
active_h = "1024"; /* p85: Active frame 2064 */
min_gain_val = "0";
max_gain_val = "48";
gain_step_pitch = "0.1";
min_hdr_ratio = "1";
max_hdr_ratio = "64";
min_framerate = "10";
max_framerate = "67.2";
default_framerate = "67.2"; /* p1: 67.2 frame/s in 12 bits */
min_exp_time = "16.165";
max_exp_time = "165770";
embedded_metadata_height = "4"; /* p85: Embedded Data Line (EBD) */
};
controller-data {
nvidia,tx-clk-tap-delay = <16>;
nvidia,rx-clk-tap-delay = <0>;
nvidia,cs-setup-clk-count = <16>;
nvidia,cs-hold-clk-count = <0>;
};
ports {
#address-cells = <1>;
#size-cells = <0>;
port@0 {
reg = <0>;
macq_qcam5_out1: endpoint {
/* !!! index of the csi input port */
port-index = <2>;
bus-width = <4>;
remote-endpoint = <&macq_qcam5_csi_in1>;
};
};
};
};
};
tegra-capture-vi {
num-channels = <2>;
ports {
#address-cells = <1>;
#size-cells = <0>;
port@0 {
reg = <0>;
macq_qcam5_vi_in0: endpoint {
port-index = <0>;
bus-width = <4>;
remote-endpoint = <&macq_qcam5_csi_out0>;
};
};
port@1 {
reg = <1>;
macq_qcam5_vi_in1: endpoint {
port-index = <2>;
bus-width = <4>;
remote-endpoint = <&macq_qcam5_csi_out1>;
};
};
};
};
host1x@13e00000 {
nvcsi@15a00000 {
num-channels = <2>;
#address-cells = <1>;
#size-cells = <0>;
channel@0 {
reg = <0>;
ports {
#address-cells = <1>;
#size-cells = <0>;
port@0 {
reg = <0>;
macq_qcam5_csi_in0: endpoint@0 {
port-index = <0>;
bus-width = <4>;
remote-endpoint = <&macq_qcam5_out0>;
};
};
port@1 {
reg = <1>;
macq_qcam5_csi_out0: endpoint@1 {
remote-endpoint = <&macq_qcam5_vi_in0>;
};
};
};
};
channel@1 {
reg = <1>;
ports {
#address-cells = <1>;
#size-cells = <0>;
port@0 {
reg = <0>;
macq_qcam5_csi_in1: endpoint@2 {
port-index = <2>;
bus-width = <4>;
remote-endpoint = <&macq_qcam5_out1>;
};
};
port@1 {
reg = <1>;
macq_qcam5_csi_out1: endpoint@3 {
remote-endpoint = <&macq_qcam5_vi_in1>;
};
};
};
};
};
};
tegra-camera-platform {
compatible = "nvidia, tegra-camera-platform";
/**
* Physical settings to calculate max ISO BW
*
* num_csi_lanes = <>;
* Total number of CSI lanes when all cameras are active
*
* max_lane_speed = <>;
* Max lane speed in Kbit/s
*
* min_bits_per_pixel = <>;
* Min bits per pixel
*
* vi_peak_byte_per_pixel = <>;
* Max byte per pixel for the VI ISO case
*
* vi_bw_margin_pct = <>;
* Vi bandwidth margin in percentage
*
* max_pixel_rate = <>;
* Max pixel rate in Kpixel/s for the ISP ISO case
*
* isp_peak_byte_per_pixel = <>;
* Max byte per pixel for the ISP ISO case
*
* isp_bw_margin_pct = <>;
* Isp bandwidth margin in percentage
*/
num_csi_lanes = <8>;
max_lane_speed = <5300000>;
min_bits_per_pixel = <12>;
vi_peak_byte_per_pixel = <2>;
vi_bw_margin_pct = <25>;
max_pixel_rate = <7500000>;
isp_peak_byte_per_pixel = <5>;
isp_bw_margin_pct = <25>;
/**
* The general guideline for naming badge_info contains 3 parts, and is as follows,
* The first part is the camera_board_id for the module; if the module is in a FFD
* platform, then use the platform name for this part.
* The second part contains the position of the module, ex. "rear" or "front".
* The third part contains the last 6 characters of a part number which is found
* in the module's specsheet from the vendor.
*/
modules {
module0 {
badge = "left_sensor_0";
position = "rear";
orientation = "0";
drivernode0 {
pcl_id = "v4l2_sensor";
devname = "imx264-left";
proc-device-tree = "/proc/device-tree/spi@3210000/imx264@2";
};
};
module1 {
badge = "right_sensor_0";
position = "front";
orientation = "1";
drivernode0 {
pcl_id = "v4l2_sensor";
devname = "imx264-right";
proc-device-tree = "/proc/device-tree/spi@3210000/imx264@3";
};
};
};
};
Hi @ShaneCCC, do you have any ideas?
Your dts is correct for a 4-lane configuration.
Good news. So, do you have any ideas about my problem?
We currently don't support the 0x13 embedded data type, only 0x12.
Yes, as said in the post Capturing from IMX568 via V4L2, the 0x13 embedded data type is sent by the Sony sensor and we can't change this. As I understood, this packet is dropped, but the image lines come after it. In that post, I saw that it does not work with v4l2 but it does work with gst-launch.
So, do you know why I get "No cameras available" when I run the gst-launch pipeline from that post?
The problem could be in the contents of tegra-camera-platform{}.
OK, and what could the problem be? I shared the tegra-camera-platform{} from my device tree above.
It could be on Jul 6, but it might be postponed.
Thanks
I don't understand your last response.
Oops, I replied to the wrong topic.
The devname in your dts is incorrect.
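From the trace above, the V4L2 subdevice registers as "imx264 spi0.2" / "imx264 spi0.3", and devname should match that subdevice name, so for module0 something like (a sketch, please verify against your own trace):

drivernode0 {
        pcl_id = "v4l2_sensor";
        devname = "imx264 spi0.2";
        proc-device-tree = "/proc/device-tree/spi@3210000/imx264@2";
};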