Can the time-stamp be saved for each frame of the video?
I use this pipeline (video.sh):
gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,format=UYVY,width=1920,height=1080,framerate=15/1" ! nvvidconv ! "video/x-raw(memory:NVMM),format=NV12" ! nvoverlaysink sync=false
The timestamps are stored in the pts field of each GstBuffer. If you need the data, you can link the pipeline to an appsink and read the pts of every frame.
Can the Xavier NX system time be saved for each frame of the video? Thanks
Here is a sample of using appsink:
Gstreamer decode live video stream with the delay difference between gst-launch-1.0 command and appsink callback - #6 by DaneLLL
You may apply your pipeline and give it a try. Generally the timestamps are generated within the GStreamer framework. If you need timestamps from the kernel, you can try jetson_multimedia_api.
12_camera_v4l2_cuda is the sample for capturing frames through v4l2.
Can the timestamps be converted into system time? I need the system time to be saved for each frame of the video. Thanks
For information, do you use a YUV sensor connecting to CSI port on Xavier NX? Or a USB camera?
We use a YUV sensor connecting to CSI port on Xavier NX
Please refer to the topic below for converting the timestamp to kernel time; you can then convert it to system time yourself.
Hi, Team: In vi5_fops.c, I modified line 422 like this: /* Read SOF from capture descriptor */ ts = ns_to_timespec((s64)descr->status.sof_timestamp); get_monotonic_boottime(&tsa); printk("sec: %ld,nsec: %ld\n camerta sec: %ld,nsec:...