Hi,
I have a pipeline that, in simplified form, can be represented as:
nvarguscamerasrc → capsfilter → nvvidconv → nvjpegenc → framerate → capsfilter → multifilesink
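For reference, it is roughly equivalent to the following (the caps, rates, and file name pattern here are simplified placeholders, and I've written the framerate-reduction step as videorate; in reality I build the elements in code):

/* Inside my setup code; gst_init() has already been called. */
GError *err = NULL;
GstElement *pipeline = gst_parse_launch(
    "nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
    "nvvidconv ! nvjpegenc ! videorate ! image/jpeg,framerate=5/1 ! "
    "multifilesink name=sink post-messages=true location=frame-%05d.jpg",
    &err);
if (pipeline == NULL)
    g_printerr("Failed to build pipeline: %s\n", err ? err->message : "unknown");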
What I want is the kernel time at which each image was captured, so that I can synchronize the camera with other sensors running independently.
Currently, to timestamp the photos, I register a bus watch with gst_bus_add_watch(bus, bus_call, this). When the bus callback receives a message of type GST_MESSAGE_ELEMENT whose structure is named GstMultiFileSink, I get the kernel time with:
struct timeval tp;
gettimeofday(&tp, NULL);
double systemMilliseconds = tp.tv_sec * 1000.0 + tp.tv_usec / 1000.0; /* wall-clock time in ms */
But this only gives me the kernel time after the image has already been saved, plus some extra latency.
I also call
gst_structure_get_clock_time(s, "timestamp", &timestamp);
gst_structure_get_clock_time(s, "stream-time", &stream_time);
but I believe those values are relative to the pipeline's clock (running time / stream time) rather than wall-clock time, so I still need to correlate them with the kernel time at capture.
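For context, the relevant part of my bus callback looks roughly like this (simplified; in reality it is set up as a member function, but the calls are the same):

#include <gst/gst.h>
#include <sys/time.h>

static gboolean bus_call(GstBus *bus, GstMessage *msg, gpointer data)
{
    if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_ELEMENT) {
        const GstStructure *s = gst_message_get_structure(msg);
        if (s != NULL && gst_structure_has_name(s, "GstMultiFileSink")) {
            /* Kernel time now, i.e. after the file has already been written. */
            struct timeval tp;
            gettimeofday(&tp, NULL);
            double systemMilliseconds = tp.tv_sec * 1000.0 + tp.tv_usec / 1000.0;

            /* Buffer times carried in the element message (pipeline-relative). */
            GstClockTime timestamp = GST_CLOCK_TIME_NONE;
            GstClockTime stream_time = GST_CLOCK_TIME_NONE;
            gst_structure_get_clock_time(s, "timestamp", &timestamp);
            gst_structure_get_clock_time(s, "stream-time", &stream_time);

            g_print("saved at %.3f ms, timestamp=%" GST_TIME_FORMAT
                    ", stream-time=%" GST_TIME_FORMAT "\n",
                    systemMilliseconds, GST_TIME_ARGS(timestamp), GST_TIME_ARGS(stream_time));
        }
    }
    return TRUE;
}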
I see there are a number of posts related to timing (such as probing the source pad's buffers or attaching a metadata quark), but I haven't figured out how to apply them to my scenario, especially since I reduce the framerate partway through the pipeline.
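For example, my rough understanding of the source-pad-probe idea is the sketch below: record the wall-clock time next to each buffer's PTS as it leaves the camera, then match that PTS against the multifilesink message later. The function and variable names are placeholders, and I'm not sure the matching still works once the framerate has been reduced:

#include <gst/gst.h>
#include <time.h>

/* Buffer probe on the camera's src pad: for every captured frame, record the
 * buffer PTS together with the current kernel (wall-clock) time, as close to
 * capture as I can get from the application side. */
static GstPadProbeReturn src_buffer_probe(GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
    GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER(info);
    if (buf != NULL) {
        struct timespec ts;
        clock_gettime(CLOCK_REALTIME, &ts);      /* kernel wall-clock time */
        GstClockTime pts = GST_BUFFER_PTS(buf);  /* buffer timestamp (pipeline-relative) */
        g_print("pts=%" GST_TIME_FORMAT " realtime=%ld.%09ld\n",
                GST_TIME_ARGS(pts), (long)ts.tv_sec, ts.tv_nsec);
        /* Presumably I'd store the (pts -> realtime) pair here and look it up
         * when the GstMultiFileSink message with the matching timestamp arrives. */
    }
    return GST_PAD_PROBE_OK;
}

/* "camera_src" is my nvarguscamerasrc element (placeholder name). */
static void attach_capture_time_probe(GstElement *camera_src)
{
    GstPad *srcpad = gst_element_get_static_pad(camera_src, "src");
    gst_pad_add_probe(srcpad, GST_PAD_PROBE_TYPE_BUFFER, src_buffer_probe, NULL, NULL);
    gst_object_unref(srcpad);
}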
Can you explain how I can get the kernel time at which the frames saved by multifilesink were captured by nvarguscamerasrc? Thanks,