Thanks for your support. There is no change in the trace log; it is still showing the abnormal timestamp value, same as in post #27.
I will modify the VI driver.
For custom sensor access I am using the following command and the yavta C code, which I have attached with this post.
v4l2-ctl -d /dev/video0 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --stream-to=t.raw
After increasing chan->timeout, the abnormal timestamp value is gone, but it is not reading data and shows the same control error. I have attached my trace log.
I have a doubt: my custom sensor can send a maximum of 4 rows of data per second.
may I know the pixel formats reported by the v4l2 standard controls,
i.e. $ v4l2-ctl -d /dev/video0 --list-formats-ext
there are default settings for the Tegra video input device; please refer to the kernel sources below for the common minimum and maximum width and height settings.
for example, $L4T_Sources/r32.4.3/Linux_for_Tegra/source/public/kernel/nvidia/include/media/tegra_camera_core.h
you may also check the VI driver,
there's a function to clamp the active dimensions with those default settings.
for example, $L4T_Sources/r32.4.3/Linux_for_Tegra/source/public/kernel/nvidia/drivers/media/platform/tegra/camera/vi/channel.c
Thanks for your response. These are the pixel formats reported by the v4l2 standard controls:
v4l2-ctl -d /dev/video0 --list-formats-ext
Index : 0
Type : Video Capture
Pixel Format: ‘RGGB’
Name : 8-bit Bayer RGRG/GBGB
Size: Discrete 256x1
Interval: Discrete 0.250s (4.000 fps)
Jerry, I am getting two types of register errors along with the frame syncpt error. When I run my custom sensor with a continuous clock, the register reads a multi-bit SOT error (TEGRA_CSI_CILX_STATUS: 0x00020020) with a frame syncpt timeout. In the other scenario, when I run my custom sensor with a discontinuous clock, the register reads a control clock error (TEGRA_CSI_CILX_STATUS: 0x00000001) with a frame syncpt timeout.
Jerry, could you give me any idea where the issue might be?
Thanks for your response. There is no change in the error after reducing the minimum height to 1 in the VI driver.
Jerry, is it possible to read that error packet and print it in dmesg before it gets discarded? Is there any way to print that error packet?
I have updated the LP clock (cil_clk_mhz = 50) in the csi2_fops.c driver file to 50 MHz as per our sensor spec. After that, the SOT multi-bit error on data lane 0 was cleared, and now it is generating the error only on data lane 1. I have attached the driver and a dmesg screenshot.
My doubt is: when changing cil_clk_mhz = 50, do we need to change csi_clk_mhz to 50 MHz as well or not?
If so, how is csi_clk_mhz calculated?
After changing my custom sensor's data rate to 1.3 Gbps and setting chan->timeout = msecs_to_jiffies(1000) in the driver, I am able to read my first packet. Thanks for your support.
Now the issue is that I can read only 1 fps: my sensor sends 4 packets per second, but I can read only the first packet. My doubt is why my sensor needs chan->timeout = msecs_to_jiffies(1000) while chan->timeout = msecs_to_jiffies(200) is enough for the imx219.
Could you please give me any idea how to debug this? Thanks a lot.
I have attached my dmesg.
there are frame_start and mw_ack_done hardware sync-points for waiting on the sensor's frame-start and frame-end signaling.
the hardware sync-points use different indexes to indicate this, for example, chan->syncpt[index] for frame_start; chan->syncpt[index] for mw_ack_done.
it'll only wait for frame_start if you're using the single-thread approach for frame captures,
static int tegra_channel_capture_frame(struct tegra_channel *chan,
				struct tegra_channel_buffer *buf)
{
	int ret = 0;

	if (chan->low_latency)
		ret = tegra_channel_capture_frame_multi_thread(chan, buf);
	else
		ret = tegra_channel_capture_frame_single_thread(chan, buf);

	return ret;
}
that MW_ACK_DONE sync-point timeout error should be caused by a wait timeout on the last frame when terminating the capture process.
you might try increasing the timeout limit for verification.
Thanks for the information. I have tried the multi-thread approach as well, but the result is the same: it reads only 1 fps. Is there any other solution for reading 4 fps? Also, why is it reading only one frame, and do I need to enable any register for interlaced support? Thanks.
all the reference camera drivers we currently have are based on continuous mode.
the camera driver development is the same as for a continuous-mode camera; only the frame input is different.
is it possible to configure it as continuous mode for verification? may I know the use-case of interlaced mode?
Thanks for your response. I will try to configure my sensor in continuous mode. It is the continuous mode that I was referring to as interlaced mode. My sensor sends 4 packets per second, but I found that the driver takes one second to process even the first packet, which is why it misses all the other packets and always reads only the first one.
Could you give me any idea why it takes one second (chan->timeout = 250) to read a packet? Thanks for your time.
The data contains just hex bytes. For example, packet 1 contains 0xFF (256 bytes), packet 2 contains 0x00 (256 bytes), packet 3 contains 0xCC (256 bytes), and packet 4 contains 0xDD (256 bytes). When my sensor sends these packets to the Jetson Nano, I can read only 0xFF (256 bytes). Could you please suggest any idea to debug this?
Thanks for your support.