Computing the delta time between subsequent GStreamer buffers (PTS timestamps) reveals that the timestamps lack precision: at 60 Hz the delta varies between 15 and 18 ms.
On an i.MX6 developer board we evaluated the PTS timestamps using GStreamer with a v4l2src. In that case, the delta between subsequent buffers deviates by only a few microseconds [16.7 ms - 16.8 ms] from the ideal delta of 16.67 ms.
Where exactly does nvcamerasrc get its timestamps (directly in kernel space when a hardware interrupt is triggered, or does the camera query a timestamp in user space), and why are they so imprecise?
We should first distinguish whether the Tegra ISP is used: there are two modes to access camera sensors, VI-bypass mode (with Tegra ISP) and VI mode (without Tegra ISP). You may also refer to the Camera Architecture Stack to understand the difference.
$ gst-launch-1.0 v4l2src …
Low-level kernel drivers use IOCTL calls to access V4L2 functionality directly; you should check the VI drivers to understand this pipeline.
FYI, TX1 uses the VI2 driver (vi2_fops.c), and TX2 is based on the VI4 driver (vi4_fops.c).
$ gst-launch-1.0 nvcamerasrc …
The CameraCore library also captures sensor timestamps from the sensor hardware interrupt at frame start; there is a CaptureMetadata structure that stores captured-frame information and delivers it to user space.
We don't publish the sources of the CameraCore library, but you can check the timestamps with the Argus sample code.
Please also refer to the L4T Multimedia API Reference for more details.
Thanks for sharing your testing results.
According to the Camera Architecture Stack, there are two different modes to access camera sensors.
However, your timestamp evaluation is not quite accurate: you should get the timestamp from the frame metadata for comparison.
Please refer to the L4T Multimedia API Reference for Argus::ICaptureMetadata;
you may check Argus::ICaptureMetadata::getSensorTimestamp as an example.
Thank you for the hint; I'll look into this. We actually need the GStreamer plugin to provide the timestamps for us, and writing our own plugin using the Argus library is not something we really want to do.
Reviving this thread since I am having the EXACT same issue that marvintx had. My buf.pts timestamps have a lot of jitter (and there seems to be some ramp-up time too that I'm not sure is nvcamerasrc- or GStreamer-pipeline-related; by "ramp-up time" I mean that the first few frames fluctuate wildly before settling at ~30 fps).
My pipeline is a bit different in that I'm using the "identity" plugin to tap the pipeline and extract buf.pts in my callback handler.
However, having looked at the libargus docs, I have a question:
[b]Is the ICaptureMetadata propagated within GStreamer as metadata too? I.e., can I get to the ICaptureMetadata timestamp from within GStreamer using nvcamerasrc, or do I have to use libargus explicitly (which I can't)?
I guess a more direct question: how do I get access to the NVIDIA-generated timestamp (nvcamerasrc do-timestamp=true) from within GStreamer?[/b]
Most frames arrive at about 30 fps. However, I notice two things:
There is some kind of ramp-up/frame drop at the very beginning which causes the "delta" values (time since the last frame) to be way off.
In my pipeline I use max-file-duration to roll over capture files. I notice that if I set it to 10 seconds, I don't get 300 frames per file; the count fluctuates wildly (235 frames in one, 280 in another, occasionally 300, etc.).
I know my wall clock is not that great, but it should be relatively accurate at second-level precision.
If nvcamerasrc is timestamping frames separately and NVIDIA wants me to use that, I need to know how; using libargus alone is out of the question for this project.
But given that nvcamerasrc has a "do-timestamp" option, I would assume it attaches the timestamp as a GstMeta on a per-GstBuffer basis, yet I don't see it.