How to Retrieve Accurate Timestamps from Buffers Without Inference Metadata in GStreamer HEVC Branch

• Hardware Platform (Jetson / GPU): NVIDIA Jetson AGX Orin
• DeepStream Version: 7.1
• JetPack Version (valid for Jetson only): 6.1
• TensorRT Version: 8.6.2.3
• Issue Type (questions, new requirements, bugs): question

Hello,

I’m encountering a challenge while saving H.265 (HEVC) video fragments in a specific branch of my DeepStream pipeline. In my pipeline, I have a branch that looks like this:

This branch is responsible for saving .hevc files every 60 frames. However, there is no inference happening here, so frame_meta.ntp_timestamp is unavailable.

To approximate a global time reference, I currently:

  1. Set a pipeline_start_time before running the pipeline:
self.logger.info("----- Starting pipeline -----")
# Start pipeline
self.pipeline.set_state(Gst.State.PLAYING)
self.pipeline_start_time = int(time.time_ns() / 1_000_000)
self.loop.run()
  2. Later, when saving files, I approximate the timestamp as pipeline_start_time + buffer PTS (see the sketch just below):
self.pipeline_start_time + gst_buffer.pts // 1_000_000  # PTS is in ns; convert to ms
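
For reference, written out end to end with the units made explicit (PTS is a GstClockTime in nanoseconds, while pipeline_start_time above is in milliseconds), the workaround looks roughly like this; pipeline and gst_buffer are assumed to exist as in the snippets above:

import time

import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst

# 1. Record the wall-clock time (ms) right after starting the pipeline.
pipeline.set_state(Gst.State.PLAYING)
pipeline_start_ms = time.time_ns() // 1_000_000

# 2. Per buffer: PTS is in nanoseconds, so convert to ms before adding.
approx_unix_ms = pipeline_start_ms + gst_buffer.pts // 1_000_000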

Issue:

This workaround is inaccurate.

There’s a non-deterministic delay (around 20–30 ms) between setting pipeline_start_time and actual playback starting, causing a timestamp shift of about 2–3 frames.

Thus, the timestamps of the saved HEVC files are not precisely correct.

Question:

  • Is there a way to retrieve a real UNIX timestamp directly from the GstBuffer without needing inference metadata?
  • Is there a hidden or standard property (besides pts/dts) that provides real-time info at the buffer level?
  • Or is there a better method to “synchronize” the system clock with the buffer timestamps accurately?

I want to avoid relying on a global pipeline start time and make the saved file timestamps truly accurate, similar to how ntp_timestamp is available in inference frame_meta.

Current appsink Callback Code:

Here’s my current implementation for appsink:

import logging

import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst


def on_new_hevc_sample(
    appsink: Gst.Element,
    hevc_probe_data: dict,
    logger: logging.Logger,
    fps_stats: dict | None = None,
) -> Gst.FlowReturn:
    """
    Callback function for appsink new-sample signal.
    Handles HEVC samples and saves them to files.
    """
    # Get the sample from appsink
    sample = appsink.emit("pull-sample")
    if not sample:
        logger.error("Unable to get sample from hevc appsink")
        return Gst.FlowReturn.ERROR

    # Get buffer from sample
    gst_buffer = sample.get_buffer()
    if not gst_buffer:
        logger.error("Unable to get GstBuffer from hevc sample")
        return Gst.FlowReturn.ERROR

    # Increment frame counter
    hevc_probe_data["hevc_frame_counter"] += 1

    # Store PTS of first frame in the chunk if this is the first frame after reset
    if hevc_probe_data["hevc_frame_counter"] % 60 == 1:
        hevc_probe_data["first_frame_pts"] = gst_buffer.pts

    # Get buffer data
    buffer_data = gst_buffer.extract_dup(0, gst_buffer.get_size())

    # Add buffer data to memory chunks
    hevc_probe_data["memory_chunks"].append(buffer_data)

    # Check if we need to write to file (every 60 frames)
    if hevc_probe_data["hevc_frame_counter"] % 60 == 0:
        # Construct filename
        filename = f"{hevc_probe_data['dirs']['temp']}/v_temp{hevc_probe_data['hevc_file_index']:06d}.hevc"

        # Write all chunks at once
        try:
            with open(filename, "wb") as f:
                # Write all accumulated chunks in one operation
                for chunk in hevc_probe_data["memory_chunks"]:
                    f.write(chunk)

            # Create a structure with file information
            structure = Gst.Structure.new_empty("custom-h265-fragment-closed")
            structure.set_value("location", filename)
            # Use the first frame's PTS instead of the last frame
            structure.set_value(
                "hevc-pts", str(int(hevc_probe_data["first_frame_pts"] / 1_000_000))
            )

            # Create and post the message
            appsink.post_message(Gst.Message.new_element(appsink, structure))

            # Reset for next file
            hevc_probe_data["hevc_file_index"] += 1
            hevc_probe_data["memory_chunks"] = []
            hevc_probe_data["first_frame_pts"] = None

        except IOError as e:
            logger.error(f"Failed to write video buffer to {filename}: {e}")

    return Gst.FlowReturn.OK
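
For context, a sketch of how this callback would be wired up; the element name, the initial dict values, and the temp directory are assumptions:

appsink = pipeline.get_by_name("hevc_appsink")  # assumed element name
appsink.set_property("emit-signals", True)      # required for the new-sample signal
hevc_probe_data = {
    "hevc_frame_counter": 0,
    "first_frame_pts": None,
    "memory_chunks": [],
    "hevc_file_index": 0,
    "dirs": {"temp": "/tmp/hevc"},  # assumed temp directory
}
appsink.connect("new-sample", on_new_hevc_sample, hevc_probe_data, logger)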

Additional Thoughts:

  • I’ve reviewed the buffer metadata but found only pts, dts, and the buffer duration; none of them maps directly to an absolute UNIX timestamp.
  • The HEVC branch does not interact with nvinfer or similar components, so DeepStream’s ntp_timestamp is not available.
  • I considered using the pipeline clock (Gst.Clock), but I’m unsure how reliable it is compared with the buffer timestamps (see the sketch after this list).
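
For reference, one way to bridge the two: a buffer is scheduled on the pipeline clock at base_time + running_time, so its UNIX time can be approximated by offsetting the current wall-clock time by how far the clock's "now" is past that point. A minimal sketch, assuming the default monotonic system clock and a simple live segment where running time equals PTS; buffer_unix_ns is a hypothetical helper name:

import time

import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst


def buffer_unix_ns(pipeline: Gst.Pipeline, gst_buffer: Gst.Buffer) -> int:
    """Approximate the UNIX timestamp (ns) corresponding to a buffer's PTS."""
    clock = pipeline.get_pipeline_clock()  # usually the monotonic system clock
    clock_now = clock.get_time()           # pipeline clock "now" (ns)
    unix_now = time.time_ns()              # wall clock "now" (ns)
    # Clock time at which this buffer is scheduled; assumes running time == PTS
    # (typical for a live source with no segment offset):
    buffer_clock_time = pipeline.get_base_time() + gst_buffer.pts
    # Shift wall-clock "now" back by how far "now" is past the buffer's clock time:
    return unix_now - (clock_now - buffer_clock_time)

The residual error is then just the jitter between reading the two clocks back to back (microseconds), rather than the 20–30 ms start-up offset, and unix_ns // 1_000_000_000 gives a filename-friendly epoch value such as 1745917072.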

Final Request:

Any tips, suggestions, or pointers on:

  • How to accurately align system time with buffer PTS,
  • Extracting real-time timestamps at the buffer level,
  • Handling timestamp correction in pure GStreamer pipelines without inference?

Thanks a lot for your help and for reading!

gst_buffer.pts is the exact, accurate timestamp. Why do you need an absolute UNIX time?

@Fiona.Chen thank you for your response!

My goal is to save each .hevc file with an absolute UNIX timestamp in its filename — for example, 1745917072.hevc instead of 000001.hevc.

Is there a way to extract a real UNIX timestamp directly from the GstBuffer? Or do I need to compute it manually each time, for example using:

int(time.time_ns() / 1_000_000)

I’m trying to avoid mismatches between buffer timestamps and system time if possible.

Any guidance would be appreciated!

If you are using nvarguscamerasrc, the GstBuffer PTS is the exact time. The format is GstClockTime. You can google how to convert a GstClockTime to the format you need.
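
For reference, a GstClockTime is just a 64-bit nanosecond count, so the conversion itself is simple arithmetic; a small sketch using a PTS value that appears later in this thread (this formats a duration, it does not by itself yield a UNIX epoch):

import datetime

pts_ns = 483456650  # example GstClockTime (ns) from later in this thread
print(f"{pts_ns / 1_000_000_000:.9f} s")               # 0.483456650 s
print(datetime.timedelta(microseconds=pts_ns / 1000))  # 0:00:00.483457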

@Fiona.Chen thank you for your reply. So you are telling me that these DTS values can be converted to a UNIX timestamp like 1745917072? How is that possible?

Moreover, is there a Python binding that will enable me to do this? I have my DeepStream application written in Python.

No. PTS is what you need.

GstBuffer and GstClockTime are both defined by the GStreamer project. Please google for GStreamer's Python bindings yourself.

@Fiona.Chen I still do not understand how this resolves my issue. In my case, PTS and DTS are indistinguishable; they are the same:

These values represent the presentation time of the frame relative to the start of the pipeline, right? Is your suggestion to extract a UNIX timestamp from the GstClockTime, add the PTS to it, and treat the result as my UNIX timestamp?

In my case, with the nvarguscamerasrc element, the PTS does not look like a UNIX timestamp at all.

EDIT:
I added these lines:

clock = Gst.SystemClock.obtain()
current_time = clock.get_time()  # GstClockTime: a 64-bit nanosecond value
print(current_time)

Once I print this value, it is completely wrong: 28044592353422. Typing date in a terminal shows the correct system date, yet the Gst.SystemClock value seems to come out of nowhere.
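
For context: Gst.SystemClock defaults to the monotonic clock, i.e. nanoseconds since boot (28044592353422 ns is roughly 7.8 hours of uptime), which is why it does not match the calendar date. A hedged sketch of requesting a realtime clock instead, using the clock-type property:

clock = Gst.SystemClock.obtain()
# Note: this reconfigures the shared singleton clock, so do it before the
# pipeline starts using it.
clock.set_property("clock-type", Gst.ClockType.REALTIME)
print(clock.get_time())  # ns since the UNIX epoch, e.g. 1745917072...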

No. The GstBuffer PTS from nvarguscamerasrc is the system time.

The GstClockTime is a 64-bit integer. Please google how to convert a GstClockTime to a human-readable format.

@Fiona.Chen I have implemented the following probe function, which I attached to the src pad of the nvarguscamerasrc element:

def camera_pts_probe(
    pad: Gst.Pad, info: Gst.PadProbeInfo, user_data: dict
) -> Gst.PadProbeReturn:
    """Extract PTS from nvarguscamerasrc element"""
    logger = user_data.get("logger")

    # Get the buffer from the probe info
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        logger.error("Unable to get GstBuffer from probe")
        return Gst.PadProbeReturn.OK

    # Extract the PTS value
    pts = gst_buffer.pts

    # Log or process the PTS value
    logger.info(f"Camera PTS: {pts} ns, {pts / Gst.SECOND:.6f} s")

    return Gst.PadProbeReturn.OK
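
For completeness, a sketch of how such a probe is typically attached; the variable names are assumptions:

src_pad = nvarguscamerasrc.get_static_pad("src")
src_pad.add_probe(Gst.PadProbeType.BUFFER, camera_pts_probe, {"logger": logger})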

The PTS values I get are as follows:

I put the value 483456650 into an epoch converter. This is what I get:
GMT: Saturday, April 27, 1985 1:30:50 PM
How can this timestamp represent the system time? This PTS value is the time elapsed from 0 (pipeline start) until the frame was obtained. Converting it from nanoseconds to seconds gives 0.483456650 s; the next value is 498853762 ns, i.e. 0.498853762 s. Subtracting, 498853762 - 483456650 = 15,397,112 ns (about 15.4 ms), which is the time difference between frames.

Moreover, a current UNIX timestamp starts with 1745, not 4834 or 6452.

I am not sure whether you understood my question.

My actual question:

How can I retrieve the real-time UNIX timestamp (e.g., wall-clock time) corresponding to a frame’s arrival or decoding in the pipeline?

In other words, how do I obtain the actual system time (e.g., a time_t or gettimeofday() value) at which a frame was received or processed, instead of the relative PTS? The PTS does not represent the system time.
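
One direct approximation matching this question: capture time.time_ns() inside a pad probe at the moment the buffer arrives and store it next to the PTS. A minimal sketch, assuming probe overhead is negligible relative to the frame interval:

import time

import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst


def wallclock_probe(
    pad: Gst.Pad, info: Gst.PadProbeInfo, user_data: dict
) -> Gst.PadProbeReturn:
    """Record the wall-clock time at which each buffer passes this pad."""
    buf = info.get_buffer()
    if buf is not None:
        user_data["arrival_unix_ns"] = time.time_ns()  # system time at arrival
        user_data["arrival_pts"] = buf.pts             # matching relative PTS
    return Gst.PadProbeReturn.OK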

Please see also Topic 159220 for how to obtain timestamps.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.