nvcamerasrc timestamps - lack of precision


To evaluate the timestamps of the nvcamerasrc on a TX2 I used the following pipeline with a Leopard IMX334 CSI camera:

gst-launch-1.0 nvcamerasrc fpsRange="60.0 60.0" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)60/1' ! fakesink -e

Computing the delta time between subsequent GStreamer buffers (PTS timestamps) reveals that the timestamps lack precision: at 60 Hz the delta varies between 15 ms and 18 ms.
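For reference, the delta computation described above can be sketched as follows (a minimal sketch; the PTS values below are made up for illustration, not measurements from the pipeline):

```python
# Sketch: jitter evaluation over a list of buffer PTS values (nanoseconds).
# The values are illustrative; in practice they come from the pipeline's
# buffer timestamps at a nominal 60 Hz.
pts_ns = [0, 16_700_000, 33_300_000, 51_300_000, 66_400_000]

# delta between each pair of subsequent buffers, in milliseconds
deltas_ms = [(b - a) / 1e6 for a, b in zip(pts_ns, pts_ns[1:])]

ideal_ms = 1000.0 / 60.0  # ~16.67 ms at 60 Hz
jitter_ms = [d - ideal_ms for d in deltas_ms]

print(f"deltas: {deltas_ms} ms")
print(f"worst jitter: {max(abs(j) for j in jitter_ms):.2f} ms")
```

With these example values the deltas range from 15.1 ms to 18.0 ms, i.e. the kind of spread reported above.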

On an i.MX6 developer board we evaluated the PTS timestamps using GStreamer and a v4l2src. In that case, the delta between subsequent buffers deviates by only a few microseconds (16.7 ms to 16.8 ms) from the ideal delta of 16.67 ms.

Where exactly does nvcamerasrc get its timestamps from (directly in kernel space when a hardware interrupt is triggered, or does the camera stack query a timestamp in user space), and why are they so imprecise?

Thank you

hello marvintx,

Timestamps are captured from the sensor hardware interrupt at frame start.
You may refer to the kernel sources below to check how timestamps are reported from the low-level driver to user space.


void set_timestamp(struct tegra_channel_buffer *buf,
                   const struct timespec *ts)
{
        /* copy the frame-start timestamp into the v4l2 buffer */
        buf->buf.timestamp.tv_sec = ts->tv_sec;
        buf->buf.timestamp.tv_usec = ts->tv_nsec / NSEC_PER_USEC;
}

Hi JerryChang,

thanks for answering. Regarding the function you cited above, it would be important to know where the timestamp (const struct timespec *ts) is generated, i.e. who calls this function?

hello marvintx,

The usage can be distinguished by whether the Tegra ISP is involved: there are two modes to access camera sensors, VI-bypass mode (with Tegra ISP) and VI mode (without Tegra ISP). You may also refer to the Camera Architecture Stack documentation to understand the difference.

VI mode.
$ gst-launch-1.0 v4l2src
Low-level kernel drivers use IOCTL calls to access V4L2 functionality directly. You should check the VI drivers to understand the pipeline.
FYI, TX1 uses the VI2 driver (vi2_fops.c), and TX2 is based on the VI4 driver (vi4_fops.c).


VI-bypass mode.
$ gst-launch-1.0 nvcamerasrc
The CameraCore library also captures sensor timestamps from the sensor hardware interrupt at frame start; there is a CaptureMetadata structure that stores captured-frame information and delivers it to user space.
We don't publish the sources of the CameraCore library, but you can check timestamps with the Argus sample code.
Please also refer to the L4T Multimedia API Reference for more details.

Hello JerryChang,

thank you for the suggestion. I checked the system in VI-mode.

It appears that the timestamps in VI-mode are more precise than in VI-bypass mode, especially if the TX2 is not running at its maximum clock rate.

hello marvintx,

It appears that the timestamps in VI-mode are more precise than in VI-bypass mode.
Could you please share the side-by-side comparison results?

Please also share your environment settings; we'll try to investigate this internally.
For example,

  1. which JetPack release you're working with.
  2. which sensor you used for testing; please also share the resolution and frame rate.
  3. the commands used in both VI mode and VI-bypass mode to access the camera sensors.

Hello Jerry Chang,

I attached plots showing the PTS timestamps on the x-axis and the delta between subsequent buffers on the y-axis. I created three plots for each mode (VI and VI-bypass).

I used two Leopard Imaging IMX577 CSI cameras to create the images.

The Jetson board was set up using JetPack 3.3.

Command for recording timestamps in VI-bypass mode (*) -> frame rate 30 Hz

$time gst-launch-1.0 -ve nvcamerasrc num-buffers=1000 fpsRange="30.0 30.0" sensor-id=1 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! fakesink silent=false 2>&1 > ts.txt

Command for recording timestamps in VI mode (**) -> frame rate around 43 Hz

$time gst-launch-1.0 -ve v4l2src do-timestamp=true num-buffers=1500 device=/dev/video1 ! 'video/x-bayer, width=(int)1920, height=(int)1080' ! fakesink silent=false 2>&1 > ts.txt

After recording, I extracted the PTS timestamps from ts.txt and plotted them as explained at the beginning of this post.
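The extraction step can be sketched like this (a minimal sketch; it assumes fakesink's last-message lines carry the buffer timestamp in h:mm:ss.nnnnnnnnn form, and the exact log format varies by GStreamer version, so the sample lines below are illustrative rather than actual output):

```python
import re

# Two illustrative fakesink last-message lines, approximating ts.txt content.
sample_log = """\
last-message = chain (fakesink0:sink) (3110400 bytes, timestamp:0:00:00.000000000, duration:0:00:00.033333333)
last-message = chain (fakesink0:sink) (3110400 bytes, timestamp:0:00:00.033333333, duration:0:00:00.033333333)
"""

# Match only the PTS field, not the duration field.
TS_RE = re.compile(r"timestamp:(\d+):(\d+):(\d+)\.(\d+)")

def pts_ns(line):
    """Convert an h:mm:ss.nnnnnnnnn timestamp in a log line to nanoseconds."""
    m = TS_RE.search(line)
    if not m:
        return None
    h, mnt, s, frac = m.groups()
    ns = (int(h) * 3600 + int(mnt) * 60 + int(s)) * 1_000_000_000
    return ns + int(frac.ljust(9, "0"))

pts = [p for p in (pts_ns(l) for l in sample_log.splitlines()) if p is not None]
deltas = [b - a for a, b in zip(pts, pts[1:])]
```

The resulting deltas (in nanoseconds) are what the plots show on the y-axis.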

VI-bypass mode
Using (*) and leaving the Jetson with default power management

Using (*) and leaving the Jetson with default power management, plus a parallel pipeline to generate system load

Using (*) and setting the Jetson to max power (running ./jetson_clocks.sh), plus a parallel pipeline to generate system load

VI mode
Using (**) and leaving the Jetson with default power management

Using (**) and leaving the Jetson with default power management, plus a parallel pipeline to generate system load

Using (**) and setting the Jetson to max power (running ./jetson_clocks.sh), plus a parallel pipeline to generate system load

Parallel Pipe (

hello marvintx,

Thanks for sharing your testing results.
According to the Camera Architecture Stack, there are two different modes to access camera sensors.

However, your timestamp evaluation is not quite accurate; you should get the timestamp from the frame metadata for comparison.
Please refer to the L4T Multimedia API Reference for Argus::ICaptureMetadata;
you may check Argus::ICaptureMetadata::getSensorTimestamp as an example.

Hi JerryChang,

thank you for the hint. I'll investigate this. We actually need the GStreamer plugin to provide the timestamps for us, and writing our own plugin using the Argus library is not something we really want to do.

Is there sample code that addresses the timestamp issue among the Argus camera samples?

Hi marvintx,

Have you tried setting the properties aeLock=true auto-exposure=1 on nvcamerasrc? Changes in exposure during capture can affect the frame rate and timestamps.

Reviving this thread since I am having the EXACT same issue that marvintx had. My buf.pts timestamps have a lot of jitter (and there seems to be some ramp-up time too, which I'm not sure is nvcamerasrc- or GStreamer-pipeline-related; by "ramp-up" time I mean that the first few frames fluctuate wildly before settling at ~30 fps).

My pipeline is a bit different in that I’m using the “identity” plugin to tap the pipeline and extract the buf.pts in my callback handler.

However, I see the libargus doc and I have a question:

Is the ICaptureMetadata propagated within GStreamer too as metadata? I.e., can I get to the ICaptureMetadata timestamp from within GStreamer using nvcamerasrc, or do I have to use libargus explicitly (which I can't)?

I guess a more direct question: how do I get access to the NVIDIA-generated timestamp (nvcamerasrc do-timestamp=true) from within GStreamer?

Here is what I speak of:

nvcamerasrc do-timestamp=True enable-meta=True autoexposure=1 aeLock=True ! identity ! …

Now per frame callback from the identity plugin (“handoff”):

“delta”: 0,
“frame”: 0,
“timestamp”: 710794680
“delta”: 31127306,
“frame”: 1,
“timestamp”: 741921986
“delta”: 8828365,
“frame”: 2,
“timestamp”: 750750351
“delta”: 21636578,
“frame”: 3,
“timestamp”: 772386929
“delta”: 32609730,
“frame”: 4,
“timestamp”: 804996659
“delta”: 33260030,
“frame”: 5,
“timestamp”: 838256689
“delta”: 32897792,
“frame”: 6,
“timestamp”: 871154481
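As a sanity check, the steady-state rate can be computed from the numbers above (a minimal sketch; frames 0-3 are treated as ramp-up and excluded from the average):

```python
# Timestamps (ns) from the handoff callback output above.
timestamps = [710794680, 741921986, 750750351, 772386929,
              804996659, 838256689, 871154481]

# delta between each pair of subsequent frames, in nanoseconds
deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]

# The last three deltas look steady; average them to estimate the frame rate.
steady = deltas[3:]
avg_ns = sum(steady) / len(steady)
fps = 1e9 / avg_ns  # about 30.4 fps after the ramp-up
```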

Most frames are about 30fps. However, I notice two things:

  1. There is some kind of ramp up/frame drop at the very beginning which causes the “delta” values (time from last frame) to be way off.

  2. In my pipeline I use max-file-duration to roll over capture files. I notice that if I set it to 10 seconds, I don’t get 300 frames per file but it fluctuates wildly (235 frames in one, 280 in another, I might get 300 once, etc.).

I know my wall clock is not that great but it should be relatively accurate at second level precision.

If nvcamerasrc is timestamping them separately and NV wants me to use that, I need to know how - using just libargus is out of the question for this project.

But given that nvcamerasrc has a "do-timestamp" option, I would assume it is implemented as a GstMeta on a per-GstBuffer basis, but I don't see it.

hello alex.sack,


  1. According to the Multimedia API documentation, usage of the legacy nvcamerasrc is deprecated.
  2. Please check the Camera Software Development Solution chapter; you should work with the nvarguscamerasrc plugin instead.
  3. Please check the developer site; we have updated the software stack to fix the timestamp precision issue. JetPack 4.2.1 (L4T r32.2) is expected to include the fix.
  4. This discussion thread might be too old for tracking; we suggest you download the latest JetPack release for testing, and open another new topic if you need further support.

  1. Uh, that's not what I read. I read this: "Usage of legacy nvcamerasrc with the nvgstcapture application is deprecated." That's not what I'm doing.

  2. It doesn’t offer autoexposure, fpsRange, and a slew of other options I need in my GST pipeline.

  3. Where on this page?

  4. I can’t because the camera is not supported on it. But I will open a new bug if the above patch doesn’t fix it.