Nvarguscamerasrc source time of JPEG saved with multifilesink

Hi,
I have a pipeline that can be (simplified) represented as
nvarguscamerasrc → capsfilter → nvvidconv → nvjpegenc → framerate → capsfilter → multifilesink

What I want is the kernel time when the image was captured, in order to synchronize with other sensors running independently.

Currently, to timestamp the photos, I install a bus watch with gst_bus_add_watch(bus, bus_call, this);. When bus_call receives a message of type GST_MESSAGE_ELEMENT with the name GstMultiFileSink, I get the kernel time with

struct timeval tp;
gettimeofday(&tp, NULL);
/* wall-clock time in milliseconds (integer arithmetic truncates sub-millisecond precision) */
double systemMilliseconds = tp.tv_sec * 1000 + tp.tv_usec / 1000;

But this is just telling me the kernel time after the image has already been saved, with some extra latency.
I also call

gst_structure_get_clock_time(s, "timestamp", &timestamp);
gst_structure_get_clock_time(s, "stream-time", &stream_time);

but I think the values from these are relative times, so I still need to correlate them to kernel time at image capture.
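
For completeness, the watch itself is installed roughly like this (a minimal sketch; the bus_call body is elided, and user_data stands in for the this pointer I pass):

#include <gst/gst.h>

/* bus_call must match the GstBusFunc signature. */
static gboolean bus_call (GstBus *bus, GstMessage *msg, gpointer user_data);

static void install_bus_watch (GstElement *pipeline, gpointer user_data)
{
  GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  gst_bus_add_watch (bus, bus_call, user_data);
  gst_object_unref (bus);
}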

I see there are a number of posts related to timing (such as probing the source-pad buffer and creating a metadata quark); however, I haven't figured out how to apply these to my scenario, especially given specifics like the reduced framerate.

Can you help explain what I can do to get the kernel time when the frames saved by multifilesink were captured by nvarguscamerasrc? Thanks,

Hi,
You may refer to these posts to get the sensor timestamp of each frame:
Nvarguscamerasrc Buffer Metadata is missing - #29 by DaneLLL
Time stamping the image - #5 by DaneLLL

One possible solution is to overwrite buf_pts with the sensor timestamp and rebuild nvarguscamerasrc, so that you can get the information in multifilesink.
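
For example (only an illustrative fragment; the local variable names are assumptions, and the actual nvarguscamerasrc source differs between releases), the consumer thread would set the buffer PTS from the Argus capture metadata before pushing the buffer downstream:

/* Illustrative fragment only -- 'captureMetadata' and 'gstBuffer' are
 * placeholder names. Argus::ICaptureMetadata::getSensorTimestamp() returns
 * nanoseconds, the same unit as GST_BUFFER_PTS. */
Argus::ICaptureMetadata *iMetadata =
    Argus::interface_cast<Argus::ICaptureMetadata>(captureMetadata);
if (iMetadata)
  GST_BUFFER_PTS (gstBuffer) = iMetadata->getSensorTimestamp ();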

Hi,
I had already looked at the first link. However, it looked like it was getting metadata for all data coming out of the source, and I'm not sure how to correlate that with the frames being saved by the multifilesink. Also, I found that when I use the probe as written in that post, the process dies. I haven't looked into why in detail.

In my latest attempt, after creating the pipeline element, I do the following:

pipeline_ = gst_pipeline_new ("camerapipe");
GstClock *systemclock = gst_system_clock_obtain();
g_object_set (G_OBJECT (systemclock), "clock-type", GST_CLOCK_TYPE_REALTIME, NULL);
gst_pipeline_use_clock((GstPipeline*)pipeline_, systemclock);

Later,

base_time = gst_element_get_base_time(pipeline_);
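
(For the base time to be meaningful, it has to be read after the pipeline has reached PLAYING; a sketch of the intended ordering:)

/* The base time is only selected when the pipeline goes to PLAYING,
 * so wait for the state change to complete before reading it. */
gst_element_set_state (pipeline_, GST_STATE_PLAYING);
gst_element_get_state (pipeline_, NULL, NULL, GST_CLOCK_TIME_NONE);
base_time = gst_element_get_base_time (pipeline_);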

Then, when monitoring the bus,

switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ELEMENT: {
const GstStructure* s = gst_message_get_structure(msg);
if (gst_message_has_name(msg, "GstMultiFileSink")) {
GstClockTime timestamp;
gst_structure_get_clock_time(s, "timestamp", &timestamp);
struct timeval tp;
gettimeofday(&tp, NULL);
long int systemMicroseconds = tp.tv_sec * 1000000 + tp.tv_usec;

Then I print the systemMicroseconds as well as collectedtime = timestamp + base_time (see the sketch below). The collectedtime is generally 20-100 ms lower than systemMicroseconds, which seems plausible (the data was captured 20-100 ms before the bus watch receives the message from multifilesink). Does this method appear correct to you, or do you see errors in my understanding of the clocks, etc.?
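
A minimal sketch of that computation, continuing the handler above (GstClockTime values are in nanoseconds):

/* Convert the reported running time to an absolute time on the (realtime)
 * pipeline clock and compare it with gettimeofday(). */
GstClockTime collected_ns = timestamp + base_time;
long int collectedMicroseconds = (long int) (collected_ns / 1000);
g_print ("bus time %ld us, collected %ld us, delta %ld us\n",
         systemMicroseconds, collectedMicroseconds,
         systemMicroseconds - collectedMicroseconds);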

I'll look at the nvarguscamerasrc code, but I'm nervous about modifying the internals of elements (it also becomes more difficult to maintain).

Hi,
In the GStreamer framework, timestamps are generated in user space, so they may not be applicable to your use case. We would suggest checking the nvarguscamerasrc plugin and doing the customization there.

You may also try the jetson_multimedia_api. We have a sample for Argus + JPEG encoding:

/usr/src/jetson_multimedia_api/samples/09_camera_jpeg_capture

Hi,
I realized why the probe attempt earlier caused my process to die - I hadn't copied your newly compiled .so file to the right place. Now that I've done that, I'm able to use the probe and print out the Frame# and Timestamp.

Can you tell me what the units / reference point of the timestamp value are? (I believe it's nanoseconds, but that means a timestamp such as 10308457294000 equals 2 hr 51 m 48.45 s... what is time 0?)

I found that having the probe on the nvarguscamerasrc src pad allows me to obtain the metadata quark, but after the framerate element it is no longer available. I think I can deal with this, though, perhaps by putting the timestamp into a GstMeta added to the buffer (see the sketch below), or by correlating with other data that persists from the start of the pipeline to the end.
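
For illustration, a probe along these lines could read the quark data and re-attach the capture timestamp as a GstReferenceTimestampMeta (the AuxData layout and quark name here are assumptions based on the patched plugin from the referenced post, and whether the downstream elements preserve the meta would still need to be verified):

#include <gst/gst.h>

/* Assumed layout of the per-buffer data the patched nvarguscamerasrc attaches
 * via gst_mini_object_set_qdata() -- check the actual patch. */
typedef struct {
  gint64 frame_num;
  gint64 timestamp;   /* sensor timestamp, ns */
} AuxData;

static GstPadProbeReturn
src_pad_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buffer = GST_PAD_PROBE_INFO_BUFFER (info);
  GQuark quark = g_quark_from_static_string ("GstBufferMetaData");  /* assumed quark name */
  AuxData *aux = (AuxData *) gst_mini_object_get_qdata (GST_MINI_OBJECT (buffer), quark);

  if (aux) {
    /* Make the buffer writable before attaching a meta, then re-attach the
     * capture timestamp so it can (potentially) be read further downstream. */
    buffer = gst_buffer_make_writable (buffer);
    GST_PAD_PROBE_INFO_DATA (info) = buffer;
    GstCaps *ref = gst_caps_new_empty_simple ("timestamp/x-sensor");  /* arbitrary label */
    gst_buffer_add_reference_timestamp_meta (buffer, ref,
        (GstClockTime) aux->timestamp, GST_CLOCK_TIME_NONE);
    gst_caps_unref (ref);
  }
  return GST_PAD_PROBE_OK;
}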

We’re using 32.2.1 and the nvarguscamerasrc is not open source for that version, so I don’t think we can customize it at this time.

P.S. The jetson_multimedia_api example looks good and we'll probably explore it in the future; however, I think it will be a steep learning curve compared to the GStreamer pipeline, if we can make that work.

I was able to propagate the quark meta capture timestamp to the multifilesink.

From what I’ve seen, this timestamp is based on an RTCPU clock, and zero is simply when the system booted. But I’m still unsure how to relate this to real time. There was mention of /sys/devices/system/clocksource/clocksource0/offset_ns but it was unclear how this helps.

So, how to convert this capture timestamp to realtime is the remaining question…
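
(If the capture timestamp were on the same boot-relative clock as CLOCK_MONOTONIC, one crude approach would be to sample the realtime/monotonic offset and add it, something like the sketch below; but I'm not sure the RTCPU clock is actually aligned with CLOCK_MONOTONIC.)

#include <time.h>
#include <stdint.h>

/* Crude sketch, assuming the capture timestamp is nanoseconds on a
 * boot-relative clock that tracks CLOCK_MONOTONIC (unverified for the
 * RTCPU clock): estimate realtime = capture_ts + (REALTIME - MONOTONIC). */
static int64_t capture_ns_to_realtime_ns (int64_t capture_ns)
{
  struct timespec rt, mono;
  clock_gettime (CLOCK_REALTIME, &rt);
  clock_gettime (CLOCK_MONOTONIC, &mono);
  int64_t offset = ((int64_t) rt.tv_sec * 1000000000LL + rt.tv_nsec)
                 - ((int64_t) mono.tv_sec * 1000000000LL + mono.tv_nsec);
  return capture_ns + offset;
}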

Hi,
Please refer to the topic below for transferring the timestamp to kernel time; you will need to map it to real time yourself.

So the simple answer is that it cannot be done within the GStreamer framework; it must be done by using the Argus library directly, correct?

Yes, that’s correct.