Computing the delta time between subsequent GStreamer buffers (PTS timestamps) reveals that the timestamps lack precision: at 60 Hz the delta varies between 15 ms and 18 ms.
On an i.MX6 developer board we evaluated the PTS timestamps using GStreamer and a v4l2src. There, the delta between subsequent buffers stays within [16.7 ms, 16.8 ms], deviating by at most ~0.13 ms from the ideal delta of 16.67 ms.
Where exactly does nvcamerasrc get its timestamps from (directly in kernel space when a hardware interrupt is triggered, or does the camera stack query a timestamp in user space?), and why are they so imprecise?
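For illustration, a minimal sketch of how such deltas can be measured (not our exact code; the pipeline string and names are placeholders): a buffer pad probe on the sink diffs the PTS of subsequent buffers.

// Minimal sketch: measure PTS deltas between subsequent buffers with a pad probe.
// Pipeline string is a placeholder; error handling omitted for brevity.
#include <gst/gst.h>

static GstPadProbeReturn
probe_cb (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  static GstClockTime last_pts = GST_CLOCK_TIME_NONE;
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  if (buf && GST_BUFFER_PTS_IS_VALID (buf)) {
    if (last_pts != GST_CLOCK_TIME_NONE)
      g_print ("delta: %.3f ms\n",
          (double) (GST_BUFFER_PTS (buf) - last_pts) / GST_MSECOND);
    last_pts = GST_BUFFER_PTS (buf);
  }
  return GST_PAD_PROBE_OK;
}

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);
  GstElement *pipeline = gst_parse_launch ("nvcamerasrc ! fakesink name=sink", NULL);
  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");
  GstPad *pad = gst_element_get_static_pad (sink, "sink");
  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, probe_cb, NULL, NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);
  return 0;
}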
The timestamps are captured from the sensor hardware interrupt at frame start.
You may refer to the kernel sources below and check how the timestamps are reported from the low-level driver to user space.
Thanks for answering. For the function you cited above, it would be important to know where the timestamp (const struct timespec *ts) is generated, i.e. who calls this function.
The usage can be distinguished by whether the Tegra ISP is involved: there are two modes to access camera sensors, VI-bypass (with Tegra ISP) and VI (without Tegra ISP). You may also refer to the Camera Architecture Stack to understand the difference.
VI mode.
$ gst-launch-1.0 v4l2src …
In this mode, applications use IOCTL calls to access the V4L2 functionality of the low-level kernel drivers directly. You should check the VI drivers to understand the pipeline.
FYI, TX1 uses the VI2 driver (vi2_fops.c) and TX2 is based on VI4 (vi4_fops.c).
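For a quick check from user space, a sketch of reading the driver-reported timestamps directly via the V4L2 IOCTLs (device path and buffer count are illustrative; format setup via VIDIOC_S_FMT and error handling are omitted for brevity):

// Sketch: read driver-reported frame timestamps via V4L2 MMAP streaming.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>
#include <cstdio>

int main()
{
    int fd = open("/dev/video0", O_RDWR);

    v4l2_requestbuffers req = {};
    req.count = 4;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    ioctl(fd, VIDIOC_REQBUFS, &req);

    for (unsigned i = 0; i < req.count; ++i) {
        v4l2_buffer buf = {};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        ioctl(fd, VIDIOC_QUERYBUF, &buf);
        // Map so the driver can fill the buffer; the image content is ignored here.
        mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, fd, buf.m.offset);
        ioctl(fd, VIDIOC_QBUF, &buf);
    }

    int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ioctl(fd, VIDIOC_STREAMON, &type);

    for (int n = 0; n < 100; ++n) {
        v4l2_buffer buf = {};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_DQBUF, &buf);
        // buf.timestamp is filled in by the VI driver at frame start.
        printf("frame %u: %ld.%06ld s\n", buf.sequence,
               (long)buf.timestamp.tv_sec, (long)buf.timestamp.tv_usec);
        ioctl(fd, VIDIOC_QBUF, &buf);
    }
    ioctl(fd, VIDIOC_STREAMOFF, &type);
    return 0;
}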
VI-bypass mode.
$ gst-launch-1.0 nvcamerasrc …
The CameraCore library also captures sensor timestamps from the sensor hardware interrupt at frame start; CaptureMetadata stores the captured frame information and delivers it to user space.
We don't publish the sources of the CameraCore library, but you can check the timestamps with the Argus sample code.
Please also refer to the L4T Multimedia API Reference for more details.
thanks
I attached plots showing the PTS timestamps on the x-axis and the delta between subsequent buffers on the y-axis. I created three plots for each mode, VI and VI-bypass.
I used two Leopard Imaging IMX577 CSI cameras to create the images.
The Jetson board was set up using JetPack 3.3.
Code for recording timestamps in VI-bypass mode (*), framerate 30 Hz:
After recording, I extracted the PTS timestamps from ts.txt and plotted them as described at the beginning of this post.
VI-bypass:
Using () and leaving the Jetson with default power management
Using () and leaving the Jetson with default power management, plus a parallel pipeline () to generate system load
Using () and setting the Jetson to max power (running ./jetson_clocks.sh), plus a parallel pipeline () to generate system load

VI mode:
Using () and leaving the Jetson with default power management
Using () and leaving the Jetson with default power management, plus a parallel pipeline () to generate system load
Using () and setting the Jetson to max power (running ./jetson_clocks.sh), plus a parallel pipeline () to generate system load

Parallel pipeline ():
Thanks for sharing your testing results.
According to the Camera Architecture Stack, there are two different modes to access camera sensors.
However, your timestamp evaluation is not quite accurate: you should take the timestamp from the frame metadata for comparison.
Please refer to the L4T Multimedia API Reference for Argus::ICaptureMetadata; you may check Argus::ICaptureMetadata::getSensorTimestamp as an example.
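For example, a condensed sketch in the style of the MMAPI samples, showing only the metadata-handling step; the event would come from an EventQueue registered for EVENT_TYPE_CAPTURE_COMPLETE, and the session/request/stream setup is omitted (see the Argus samples for the full flow):

// Condensed sketch: pull the sensor timestamp out of a CAPTURE_COMPLETE event.
#include <Argus/Argus.h>
#include <cstdio>

static void printSensorTimestamp(const Argus::Event *event)
{
    const Argus::IEventCaptureComplete *iComplete =
        Argus::interface_cast<const Argus::IEventCaptureComplete>(event);
    if (!iComplete)
        return;

    const Argus::CaptureMetadata *metadata = iComplete->getMetadata();
    const Argus::ICaptureMetadata *iMetadata =
        Argus::interface_cast<const Argus::ICaptureMetadata>(metadata);
    if (!iMetadata)
        return;

    // getSensorTimestamp() reports the frame-start time in nanoseconds.
    printf("sensor timestamp: %llu ns\n",
           (unsigned long long)iMetadata->getSensorTimestamp());
}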
thanks
Thank you for the hint, I'll look into this. We actually need the GStreamer plugin to provide the timestamps for us, and writing our own plugin using the Argus library is not something we really want to do.
Have you tried setting the properties aeLock=true and auto-exposure=1 on nvcamerasrc? Changes in exposure during capture can affect the framerate and the timestamps.
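For example (rest of the pipeline elided, as in the commands above):
$ gst-launch-1.0 nvcamerasrc aeLock=true auto-exposure=1 …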
Reviving this thread since I am having the EXACT same issue that marvintx had. My buf.pts timestamps have a lot of jitter (and there seems to be some ramp-up time too that I'm not sure is nvcamerasrc- or GStreamer-pipeline-related; by "ramp-up time" I mean that the first few frames fluctuate wildly before settling at ~30 fps).
My pipeline is a bit different in that I’m using the “identity” plugin to tap the pipeline and extract the buf.pts in my callback handler.
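Roughly like this (simplified sketch; my real pipeline and handler are longer):

// Simplified tap: an "identity" element whose handoff callback logs each buffer's PTS.
#include <gst/gst.h>

static void
handoff_cb (GstElement *identity, GstBuffer *buf, gpointer user_data)
{
  if (GST_BUFFER_PTS_IS_VALID (buf))
    g_print ("buf.pts = %" GST_TIME_FORMAT "\n",
        GST_TIME_ARGS (GST_BUFFER_PTS (buf)));
}

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);
  GstElement *pipeline = gst_parse_launch (
      "nvcamerasrc do-timestamp=true ! identity name=tap ! fakesink", NULL);
  GstElement *tap = gst_bin_get_by_name (GST_BIN (pipeline), "tap");
  // identity emits "handoff" for every buffer when signal-handoffs is enabled.
  g_object_set (tap, "signal-handoffs", TRUE, NULL);
  g_signal_connect (tap, "handoff", G_CALLBACK (handoff_cb), NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);
  return 0;
}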
However, after reading the libargus documentation I have a question:
Is the ICaptureMetadata propagated within GStreamer as metadata too? I.e., can I get at the ICaptureMetadata timestamp from within GStreamer using nvcamerasrc, or do I have to use libargus explicitly (which I can't)?
A more direct question: how do I get access to the NVIDIA-generated timestamp (nvcamerasrc do-timestamp=true) from within GStreamer?
Most frames arrive at about 30 fps. However, I notice two things:
There is some kind of ramp-up/frame drop at the very beginning which causes the "delta" values (time since the last frame) to be way off.
In my pipeline I use max-file-duration to roll over capture files. I notice that if I set it to 10 seconds, I don't get 300 frames per file; it fluctuates wildly (235 frames in one, 280 in another, occasionally 300, etc.).
I know my wall clock is not that great, but it should be accurate to second-level precision.
If nvcamerasrc timestamps the frames separately and NVIDIA wants me to use that, I need to know how; using just libargus is out of the question for this project.
But given that nvcamerasrc has a "do-timestamp" option, I would assume it is implemented as a GstMeta on a per-GstBuffer basis, but I don't see it.
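For what it's worth, this is how I looked for per-buffer metadata (sketch; called from the handoff callback above):

// Sketch: list every GstMeta attached to a buffer, to see whether
// nvcamerasrc attaches anything beyond the PTS.
#include <gst/gst.h>

static void
list_metas (GstBuffer *buf)
{
  gpointer state = NULL;
  GstMeta *meta;
  while ((meta = gst_buffer_iterate_meta (buf, &state)) != NULL)
    g_print ("meta api: %s\n", g_type_name (meta->info->api));
  // If nvcamerasrc exposed its timestamp as a GstMeta, it would show up here.
}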
Please check the developer site; we have updated the software stack to fix the timestamp precision issue. JetPack 4.2.1 (L4T r32.2) will include the fix.
This discussion thread might be too old for tracking; we suggest downloading the latest JetPack release for testing and opening a new topic if you need further support.
thanks