Hi, I’d like to be able to retrieve accurate timestamps (ideally in nanoseconds) for each frame that is captured from the camera and consumed by my application. This would significantly help with certain computer vision tasks. Currently I am capturing frames from the camera using an OpenCV VideoCapture object with the GStreamer pipeline shown below.
```
nvarguscamerasrc do-timestamp=true silent=true sensor-id=0 sensor-mode=0 wbmode=1 saturation=0.0 tnr-mode=1 tnr-strength=0.25 gainrange="1.0 1.0" ispdigitalgainrange="1.0 1.0" ee-mode=0 ee-strength=0.0 ! video/x-raw(memory:NVMM), format=(string)NV12, width=4056, height=3040, framerate=15/1 ! queue leaky=2 max-size-buffers=1 ! nvvidconv ! video/x-raw(memory:NVMM), format=(string)NV12, width=2028, height=1520, framerate=15/1 ! nvvidconv ! video/x-raw, format=(string)BGRx ! appsink drop=true
```
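For reference, this is roughly how I am consuming the pipeline, as a minimal sketch assuming an OpenCV build with GStreamer support:

```python
import cv2

# The launch string from above, split for readability; OpenCV's
# CAP_GSTREAMER backend parses it the same way gst-launch-1.0 would.
pipeline = (
    "nvarguscamerasrc do-timestamp=true silent=true sensor-id=0 sensor-mode=0 "
    "wbmode=1 saturation=0.0 tnr-mode=1 tnr-strength=0.25 "
    'gainrange="1.0 1.0" ispdigitalgainrange="1.0 1.0" '
    "ee-mode=0 ee-strength=0.0 "
    "! video/x-raw(memory:NVMM), format=(string)NV12, width=4056, height=3040, framerate=15/1 "
    "! queue leaky=2 max-size-buffers=1 "
    "! nvvidconv "
    "! video/x-raw(memory:NVMM), format=(string)NV12, width=2028, height=1520, framerate=15/1 "
    "! nvvidconv "
    "! video/x-raw, format=(string)BGRx "
    "! appsink drop=true"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("failed to open GStreamer pipeline")

ret, frame = cap.read()  # frame is a BGRx numpy array pulled from the appsink
```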
I am not expecting my application to consume every frame from the camera; the algorithms I am using won’t run that quickly on the Nano, so dropping frames is totally OK (hence the leaky queue and appsink drop=true parameters). However, I do need to know precisely when each consumed frame was captured.
As far as I have been able to tell, querying CAP_PROP_POS_MSEC is the only way to get a timestamp from this pipeline. However, inspecting the returned timestamps shows they are very inaccurate, with a lot of variability. Digging deeper into what this property returns, it appears to be not the actual frame capture timestamp but the time at which the frame was added to the appsink buffer.
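This is roughly how I am reading that property now (a sketch assuming the `cap` object from the snippet above; note the value is in milliseconds, so nanosecond precision is already lost at this interface):

```python
ret, frame = cap.read()
if ret:
    # Timestamp OpenCV associates with the frame just read, in milliseconds.
    # From what I can tell, this reflects when the buffer reached the
    # appsink, not when the sensor actually exposed the frame.
    ts_ms = cap.get(cv2.CAP_PROP_POS_MSEC)
    print(f"frame timestamp: {ts_ms:.3f} ms")
```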
Is there a way that I can retrieve the timestamp of when the frame was actually exposed?
Thanks!