I’m developing an application that records from 2 cameras synchronously and encodes the frames to 2 files, by combining the syncSensor and 01_video_encode examples. It is working, except that timestamps are missing, so I cannot mux the encoded file (.h264) with gstreamer because it expects PTS in the buffers. Muxing the files with ffmpeg works, but then the FPS calculation is wrong because I have to set the FPS manually to 30 while it should be a number like 29.xx. I tried to add timestamps to the buffers, but it didn’t work. I uploaded the main.cpp file. Can someone please help me? main.cpp (32.5 KB)
Hi,
In a raw h264/h265 stream there is no timestamp information for each buffer; only the framerate is carried, as fps in the VUI. So to have an accurate timestamp for each frame, you would need to mux the compressed frames into a container such as mp4 or mkv, for example with a gstreamer muxing pipeline.
Thanks for your reply. I’m using 2 cameras and the gstreamer pipeline outputs are not synchronized, so I have to use libargus and NvVideoEncoder. Adding timestamps should normally be possible, because NvVideoEncoder uses v4l2_buffer and there is a timestamp field in that structure.
As you can see in my main.cpp file, I’m setting the timestamp values of the buffers (I’m still not sure whether my timestamp calculation is correct).
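Roughly, what I’m trying is something like the copy_timestamp path of the 01_video_encode sample. This is only a simplified sketch with placeholder names (`queueFrameWithTimestamp`, `bufferIndex`, `sensorTimestampNs`), not the exact code from main.cpp:

```cpp
// Simplified sketch: queue a captured frame to the encoder output plane with
// an explicit timestamp, following the copy_timestamp pattern of 01_video_encode.
#include <cstdint>
#include <cstring>
#include <linux/videodev2.h>
#include "NvVideoEncoder.h"

// `sensorTimestampNs` would come from the Argus capture metadata
// (ICaptureMetadata::getSensorTimestamp()); `bufferIndex` is the index of the
// NvBuffer holding the captured frame.
static bool queueFrameWithTimestamp(NvVideoEncoder *enc,
                                    uint32_t bufferIndex,
                                    uint64_t sensorTimestampNs)
{
    struct v4l2_buffer v4l2_buf;
    struct v4l2_plane planes[MAX_PLANES];

    memset(&v4l2_buf, 0, sizeof(v4l2_buf));
    memset(planes, 0, sizeof(planes));
    v4l2_buf.index = bufferIndex;
    v4l2_buf.m.planes = planes;

    // TIMESTAMP_COPY asks the encoder to propagate this output-plane timestamp
    // to the corresponding encoded capture-plane buffer.
    v4l2_buf.flags |= V4L2_BUF_FLAG_TIMESTAMP_COPY;
    v4l2_buf.timestamp.tv_sec  = sensorTimestampNs / 1000000000ULL;
    v4l2_buf.timestamp.tv_usec = (sensorTimestampNs % 1000000000ULL) / 1000ULL;

    return enc->output_plane.qBuffer(v4l2_buf, NULL) == 0;
}
```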
That print is already there, and it is printing automatically generated values, calculated roughly as
timestamp = (1 / framerate) * frame_number * (second unit)
I don’t know where that is calculated, but it should be set from the timestamps coming from the cameras. Even if I set the buffer timestamp in that callback function, the result doesn’t change.
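In other words, the printed values look like they come from a fixed-increment generator rather than from the sensor. A tiny sketch of that kind of calculation (names are my guesses, not taken from the actual source):

```cpp
// Fixed-increment timestamp of the form the callback seems to print:
// frame_number * (1 / framerate), scaled to microseconds.
#include <cstdint>

constexpr uint64_t MICROSECOND_UNIT = 1000000;

uint64_t generatedTimestampUs(uint64_t frameNumber, uint32_t framerate)
{
    // 1/framerate * number of frames * (second unit)
    return frameNumber * (MICROSECOND_UNIT / framerate);
}
```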
You would still need to implement muxing code to generate an mp4 directly. If you save the h264 stream and mux it through ffmpeg, there is no timestamp for each encoded frame.
Setting those timestamps did not work. I solved the issue by adding a gstreamer pipeline after the encoder: I push the encoded packets into the pipeline with appsrc and add timestamps to those packets. So the topic can be closed now. Thanks.
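In case it helps someone else, this is roughly the shape of the appsrc approach, as a simplified sketch (the pipeline string, output filename, and function names are illustrative, not my exact code):

```cpp
// Push encoded H.264 packets from NvVideoEncoder into a gstreamer pipeline
// that timestamps them and muxes them into mp4.
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

static GstElement *pipeline = nullptr;
static GstElement *appsrc   = nullptr;

static void setupPipeline()
{
    gst_init(nullptr, nullptr);
    // appsrc feeds raw h264 bytes; h264parse recovers frame boundaries and
    // qtmux writes the PTS into the mp4 container.
    pipeline = gst_parse_launch(
        "appsrc name=src ! h264parse ! qtmux ! filesink location=out.mp4",
        nullptr);
    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "src");

    GstCaps *caps = gst_caps_new_simple("video/x-h264",
                                        "stream-format", G_TYPE_STRING, "byte-stream",
                                        "alignment", G_TYPE_STRING, "au",
                                        nullptr);
    // Timestamps pushed on the buffers are in GstClockTime (nanoseconds).
    g_object_set(G_OBJECT(appsrc), "caps", caps, "format", GST_FORMAT_TIME, nullptr);
    gst_caps_unref(caps);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}

// Called for every encoded packet dequeued from the encoder capture plane.
// `data`/`size` is the bitstream, `ptsNs` the per-frame timestamp in ns
// (e.g. derived from the Argus sensor timestamp).
static void pushEncodedPacket(const void *data, gsize size, guint64 ptsNs)
{
    GstBuffer *buf = gst_buffer_new_allocate(nullptr, size, nullptr);
    gst_buffer_fill(buf, 0, data, size);
    GST_BUFFER_PTS(buf) = ptsNs;                        // the PTS the muxer needs
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);  // takes ownership of buf
}

static void finishPipeline()
{
    gst_app_src_end_of_stream(GST_APP_SRC(appsrc));     // lets qtmux finalize the file
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(appsrc);
    gst_object_unref(pipeline);
}
```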