Hello everyone,
I have spent hours and hours trying to find a solution to my problem, and I hope you can give me a hint to finally solve it.
I have the following configuration:
- NVIDIA AGX Orin Developer Kit (64 GB)
- Deserializer board with 6x Full HD cameras via GMSL2
- All cameras are synchronized to 30 FPS
- All cameras are accessible via /dev/video*
- Jetpack 5.1.1
Problem
I need to get the image timestamp in epoch time for each frame: ideally the timestamp at which the image was captured, or at least the earliest timestamp available.
I have tried the following approaches:
Approach 1
Using OpenCV with GStreamer's appsink: OpenCV handles the video writing, and I simply read the epoch time right before writing each frame. To get the data into a cv::Mat, I mainly use "v4l2src device=$(arg DEVICE) ! video/x-raw ! nvvidconv flip-method=0 ! video/x-raw,width=(int)$(arg IMG_WIDTH),height=(int)$(arg IMG_HEIGHT),format=(string)BGRx ! videoconvert ! appsink drop=1".
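Roughly, the capture loop looks like this (a minimal sketch; the device, resolution, and output file names are hard-coded stand-ins for the launch-file arguments above):

```cpp
#include <opencv2/opencv.hpp>
#include <chrono>
#include <fstream>
#include <string>

int main() {
    // Stand-ins for $(arg DEVICE) / $(arg IMG_WIDTH) / $(arg IMG_HEIGHT)
    const std::string pipeline =
        "v4l2src device=/dev/video0 ! video/x-raw ! nvvidconv flip-method=0 ! "
        "video/x-raw,width=(int)1920,height=(int)1080,format=(string)BGRx ! "
        "videoconvert ! appsink drop=1";

    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
    cv::VideoWriter writer("video0.avi",
                           cv::VideoWriter::fourcc('M', 'J', 'P', 'G'),
                           30.0, cv::Size(1920, 1080));
    std::ofstream timestamps("timestamps0.txt");

    cv::Mat frame;
    while (cap.read(frame)) {
        // The epoch time is sampled here, after the frame has already
        // traveled through the whole pipeline, so it lags the real
        // capture time and drifts when the camera is not at exactly 30 FPS.
        const auto now = std::chrono::system_clock::now().time_since_epoch();
        timestamps << std::chrono::duration_cast<std::chrono::microseconds>(now).count()
                   << '\n';
        writer.write(frame);
    }
    return 0;
}
```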
However, this solution is not accurate enough, since the framerate of the cameras fluctuates over time between 29.x and 30.x FPS. As a result, I have identified a mismatch between the number of timestamps written to the text file and the number of frames in the video. A similar issue is described in this Stack Overflow topic.
Approach 2
Using FFmpeg, with the idea of writing the epoch time of the recording start as metadata and recording frame timestamps relative to that start. At the end, I merge everything together.
I record the video data with: ffmpeg -f v4l2 -use_wallclock_as_timestamps 1 -fflags +genpts -i /dev/video0 -c:v libx264 -preset ultrafast -metadata date="${EPOCHREALTIME/[^0-9]/}" video0.mp4 > /dev/null (note that -use_wallclock_as_timestamps and -fflags are input options and must come before -i to take effect).
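The merging step is essentially "start epoch + per-frame PTS". A rough sketch of what I do (file names are illustrative; the date tag holds EPOCHREALTIME in microseconds, and I assume it survives the MP4 muxing as written):

```bash
# Read the recording start time (epoch microseconds) back from the metadata
start=$(ffprobe -v error -show_entries format_tags=date -of default=nw=1:nk=1 video0.mp4)

# Add each frame's PTS (seconds since start) to obtain absolute epoch times
ffprobe -v error -select_streams v:0 -show_entries frame=pts_time -of csv=p=0 video0.mp4 \
  | awk -v s="$start" '{ printf "%.6f\n", s / 1e6 + $1 }' > timestamps0.txt
```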
One solution can be found in my Repo.
This solution works OK. However, the FFmpeg build in use does not support HW acceleration, so the encoding runs entirely on the CPU via libx264.
Therefore, my specific question:
Is it possible to record the video from /dev/video0 … /dev/video5 using GStreamer with HW acceleration while keeping the timing information of every image?
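For context, the kind of HW-accelerated recording pipeline I have in mind looks roughly like this (a sketch only; I am assuming nvv4l2h264enc from the Jetson multimedia stack, and the caps may need adjusting for my cameras):

```bash
gst-launch-1.0 v4l2src device=/dev/video0 do-timestamp=true \
  ! video/x-raw ! nvvidconv \
  ! 'video/x-raw(memory:NVMM),format=(string)NV12' \
  ! nvv4l2h264enc ! h264parse ! matroskamux \
  ! filesink location=video0.mkv
```

With such a pipeline the encoding runs on the hardware encoder, but I do not see how to export an absolute epoch timestamp for every buffer, which is exactly the part I am missing.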