I am using DeepStream for detection and tracking. I can extract the required metadata, such as object coordinates, but I am not able to extract the timestamp associated with each frame. I tried buf_pts, but that field shows a garbage value. Does my input video require some specific encoding setting? When I test it with the ffprobe command, the video has a creation time plus PTS and DTS values.
Please add the code below and check whether you get a valid PTS:

static GstClockTime prevPts = 0;
g_print ("PTS %" G_GUINT64_FORMAT " diff %" G_GUINT64_FORMAT " (in nanoseconds)\n", buf->pts, buf->pts - prevPts);
prevPts = buf->pts;
We added it in tracking_done_buf_prob() in deepstream_app.c and can see valid values (a diff of ~33 ms when playing a 30 fps video).
Could you please post the full source code for this? I’m working on a multi-IP-camera solution and need to sync the frames. It would be great if I could extract the timestamp associated with every frame. It is not clear to me how to use the above code.
tracking_done_buf_prob() is renamed to analytics_done_buf_prob() in DS4.0.1.
It simply works if you add the lines into the source code and rebuild deepstream-app.
Thanks @DaneLLL. I managed to get it to print the timestamp. However I noticed it prints a combined timestamp from all camera sources (I’m using a modified version of the test3 app). I’d like to do it for individual sources and sync the cameras using the timestamps. Do you by any chance have experience with anything like that?
For multiple sources, you should use frame_meta->buf_pts. I added the print in tiler_src_pad_buffer_probe() in test3:

g_print ("Frame Number = %d, pts %" G_GUINT64_FORMAT ", Number of objects = %d Vehicle Count = %d Person Count = %d\n", frame_meta->frame_num, frame_meta->buf_pts, num_rects, vehicle_count, person_count);
and can see correct values. However, garbage values were reported in #1. Could you check whether you see correct values in your setup?
Hey @DaneLLL, got it! It’s working perfectly, showing the timestamps for individual frames. Now I just need to write a sync function to extract the right frame into a buffer. I also managed to set up an NTP server that the cameras ping, which seems to be helping. I’ll update here if I need more information. Have you by any chance written code to sync frames?
We don’t have sample code for syncing frames.
Hi @DaneLLL .
I work with USB GenICam cameras and want to get the correct timestamp of each frame. But when I compute the difference in buf->pts between two successive frames, I always get 33 ms, which I know is impossible: it would mean we never lose a frame, and my project cannot actually run at 30 frames per second. I also think industrial cameras should be able to report the correct capture time of each frame.
How can I get the correct timestamp for each frame from GenICam cameras in DeepStream?
Please open a new topic for your issue. Thanks