The camera's frame interval is usually 33322 us or 33323 us, but longer or shorter intervals also occur, e.g. 33327 us or 33318 us.
The offset that accumulates over time has a particularly large impact on our fusion algorithm, because we do not know the exact time at which the camera's SOF was received.
Please take a look; we are in a hurry.
Hello, I checked this in the .c file you mentioned:
wake_up_interruptible(&chan->start_wait);

/* Read SOF from capture descriptor */
ts = ns_to_timespec64((s64)descr->status.sof_timestamp);
trace_tegra_channel_capture_frame("sof", &ts);
vb->vb2_buf.timestamp = descr->status.sof_timestamp;

if (frame_err)
	buf->vb2_state = VB2_BUF_STATE_ERROR;
else
	buf->vb2_state = VB2_BUF_STATE_DONE;

/* Read EOF from capture descriptor */
ts = ns_to_timespec64((s64)descr->status.eof_timestamp);
trace_tegra_channel_capture_frame("eof", &ts);
Using an oscilloscope, I have ruled out instability on the camera side.
With the camera outputting 10 fps, the per-frame timing deviation is about ±15 us; at 30 fps it is about ±5 us. These deviations are random.
So which part of the code records the SOF and EOF timestamps? Can we change it to the TSC timestamp (our system will be calibrated via gPTP)? Secondly, I have another doubt: the interrupt priority of the MIPI-related modules seems insufficient. Can we, as users, pin the interrupt to a core or raise its priority ourselves?
Since the v4l2 timestamp is in microseconds, the recorded timestamp loses precision. To avoid that, multiply the timestamp by 1000; the purpose is to end up with a ns timestamp.
wake_up_interruptible(&chan->start_wait);
/* Read SOF from capture descriptor */
ts = ns_to_timespec64((s64)descr->status.sof_timestamp);
trace_tegra_channel_capture_frame("sof", &ts);
vb->vb2_buf.timestamp = (descr->status.sof_timestamp) * 1000;
I'm currently testing 3 cameras (different models, all at 30 fps) at the same time. Their timestamps stay aligned with each other, but the clock appears to run fast; you can clearly see the frame interval of all 3 cameras change around 2079532561748 ns.
Please take a look at my latest comment. I think the timestamp is inaccurate for some reason inside the RTCPU. Secondly, the camera cannot embed time information; it can only embed parameters such as exposure and frame number. So the SOF and EOF timestamps are very important to me.
Try getting the time interval after wait_for_completion_timeout() in vi_capture_status(), where the RTCPU reports that it received the SOF from the sensor, instead of checking the timestamp.
int vi_capture_status(
	struct tegra_vi_channel *chan,
	int32_t timeout_ms)
{
	struct vi_capture *capture = chan->capture_data;
	int ret = 0;

	nv_camera_log(chan->ndev,
		__arch_counter_get_cntvct(),
		NVHOST_CAMERA_VI_CAPTURE_STATUS);

	if (capture == NULL) {
		dev_err(chan->dev,
			"%s: vi capture uninitialized\n", __func__);
		return -ENODEV;
	}

	if (capture->channel_id == CAPTURE_CHANNEL_INVALID_ID) {
		dev_err(chan->dev,
			"%s: setup channel first\n", __func__);
		return -ENODEV;
	}

	dev_dbg(chan->dev, "%s: waiting for status, timeout:%d ms\n",
		__func__, timeout_ms);

	if (timeout_ms < 0) {
		wait_for_completion(&capture->capture_resp);
	} else {
		ret = wait_for_completion_timeout(
			&capture->capture_resp,
			msecs_to_jiffies(timeout_ms));
		if (ret == 0) {
			dev_dbg(chan->dev,
				"capture status timed out\n");
			return -ETIMEDOUT;
		}
	}
Are you referring to this code? If we record the timestamp here, it will be affected by system scheduling and the jitter will be huge.
@ShaneCCC
I found a problem while calculating the timestamps; I'll take the attached log as an example.
Frame start
[2024-11-19 15:11:36.181] [video6dbg] [debug] 81720766354
Frame end
[2024-11-19 15:13:15.319] [video6dbg] [debug] 81819912945
The first column is the application-layer (wall-clock) timestamp, so we can compute the interval:
15:13:15.319 - 15:11:36.181 = 1:39.138 = 99.138 s
The v4l2 timestamps (in us) give:
81819912945 - 81720766354 = 99146591 us = 99.146591 s
The v4l2 interval is about 8 ms longer than the wall-clock interval (i.e. the v4l2 clock runs fast), which I cannot explain, so I captured a much longer span of nearly two hours.
While the program runs, only one 640x512 camera is working; no other programs are running.
In user mode, I have a program that reads camera data via v4l2. It records the SOF timestamp from the v4l2 buffer of the current image (i.e. the RCE SOF timestamp) and also prints a PTP-synchronized system timestamp.
The program runs for two hours and prints the head and tail of the log.
Frame start: UTC time --------------------------------------------------- v4l2 timestamp
[2024-11-19 15:00:00.017] [video6dbg] [debug] 81024525215
…
Frame end: UTC time --------------------------------------------------- v4l2 timestamp
[2024-11-19 17:00:00.020] [video6dbg] [debug] 88224690900

UTC end time - UTC start time:
17:00:00.020 - 15:00:00.017 = 2:00:00.003 = 7200.003 s

v4l2 end time - v4l2 start time:
88224690900 - 81024525215 = 7200165685 us = 7200.165685 s