How to set timestamps in NvVideoEncoder?

Hello everyone,

I’m developing an application that records from 2 cameras synchronously and encodes the frames to 2 files, combining the syncSensor and 01_video_encode examples. It works, except the output is missing timestamps, so I cannot mux the encoded file (.h264) with GStreamer, which expects PTS in the buffers. Muxing the files with ffmpeg works, but then the FPS calculation is wrong because I have to set the FPS manually to 30 while it should be a value like 29.xx. I tried to add timestamps to the buffers, but it didn’t work. I have uploaded my main.cpp file. Can someone please help me?
main.cpp (32.5 KB)

In a raw H.264/H.265 stream, there is no timestamp information for each buffer; the frame rate is only stored as fps in the VUI. So to have an accurate timestamp for each frame, you would need to mux the compressed frames into a container such as MP4 or MKV, like this GStreamer command:

$ gst-launch-1.0 -e nvarguscamerasrc ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4

You can use existing plugins in GStreamer to construct the pipeline. For jetson_multimedia_api, you would need to implement the muxing yourself.

Hi DaneLLL,

Thanks for your reply. I’m using 2 cameras and the GStreamer pipeline outputs are not synchronized, so I have to use libargus and NvVideoEncoder. Adding timestamps should normally be possible, because NvVideoEncoder uses v4l2_buffer and that structure has a timestamp field.

As you can see in my main.cpp file, I’m setting the timestamp values of the buffers (though I’m still not sure my timestamp calculation is correct).

However, the PTS values in the resulting file are always none.

If you know a better way to record 2 cameras synchronously with precise framerates, please let me know.

There’s a demonstration in the 01_video_encode sample. Please check the sample and apply it to your code.

Hi DaneLLL,
I’m already using that sample in my main.cpp file, as I mentioned in the first post:

Please add this print to the 01_video_encode sample and check if the timestamps are as expected:
About the timestamp of video encoder - #4 by DaneLLL

If you get the expected result when running the 01 sample, it should work the same once you apply it to your code.

Hi DaneLLL,

That print is already there, and it is printing automatically generated values calculated as:
timestamp = (1 / framerate) * frame_number * (time unit)

I don’t know where that value is calculated, but it should instead be set from the timestamps coming from the cameras. Even if I set the buffer timestamp in that callback function, the result doesn’t change.

It looks wrong to set timestamps in that callback, since it handles the capture plane (encoded frames). You should set the timestamp in the buffer at the output plane (YUV data).

You would still need to implement muxing code to generate the MP4 directly. If you save the H.264 stream and mux it with ffmpeg afterwards, there is no per-frame timestamp available.

Hi DaneLLL,

Setting those timestamps did not work. I solved the issue by adding a GStreamer pipeline after the encoder: I push the encoded packets into the pipeline with appsrc and add timestamps to those packets. The topic can be closed now. Thanks.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.