Smart Record timestamp

Please provide complete information as applicable to your setup.

• Hardware Platform: Jetson
• DeepStream Version: 6.3
• JetPack Version: 5.1.2
• TensorRT Version: 8.5.2
• NVIDIA GPU Driver Version: R35

I have this pipeline:

nvmultiurisrcbin smart-record=1 ! nvinfer ! nvmsgconv ! nvmsgbroker

My application detects objects and sends the detections to the cloud.
For recording videos of the detections, I also use the Smart Record mechanism.
In order to synchronise the detection messages with the video frames, I need to build an application.

With Smart Record, I don’t know what the exact start time is (in milliseconds, not seconds).
For example, the filename Smart_Record_1_00000_20241001-080218_1.mp4 does not contain milliseconds.

Even if I know the exact first-frame timestamp, how can I determine the timestamps of the following frames? Can I assume that the Smart Record output has a constant frame rate?

Thanks

Refer to this API documentation:

https://docs.nvidia.com/metropolis/deepstream/dev-guide/sdk-api/group__gstreamer__nvdssr.html#gace3f9e2eb2800da76f6ffa37e6ee25a6

Smart recording dumps the input stream directly, which means that the frame rate and timestamp of the output file are determined by the input.

In the API documentation, the start time parameter is in seconds only, but I need to know the exact start time of the smart record (in milliseconds).

Smart recording cannot provide such a precise start time.

If you need the start time in milliseconds, you need to record the PTS of each GstBuffer.
Can you share the reason for doing this?

For each frame, I detect specific objects and save the detections (along with the ntp_timestamp) in my database.
For example: {ts:1730043469930115778, rect:{x1:1, x2:1, y1:2, y2:2}}

Additionally, I save the Smart Record video of the detections (starting when I detect the object for the first time).

In order to check the accuracy of the detection model, I have to be able to show my detections on each frame, so I take the video and try to find the correct ntp_timestamp for each frame.

Using the Smart Record video, it’s impossible for me to determine the exact timestamp of each frame.

It would be possible to save a video with a filesink or similar, but then I would have two streams to add with the REST API, and handling them would be complex.

I think the following method can work. I use nvurisrcbin as the source element.

1. Configure the pipeline clock to be an NTP clock:

import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstNet", "1.0")
from gi.repository import GLib, Gst, GstNet

Gst.init(None)

pipeline = Gst.Pipeline()

# Create the NTP clock (server time1.google.com, port 123, base time 0)
clock = GstNet.NtpClock.new("ntp-clock", "time1.google.com", 123, 0)

# Set the pipeline to use the NTP clock instead of the default system clock
pipeline.use_clock(clock)

Now the PTS of each GstBuffer is an ntp_timestamp.
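With the pipeline slaved to an NTP clock, a buffer PTS counts nanoseconds from the NTP epoch (1900-01-01), so it can be converted to wall-clock time; a minimal sketch (the sample PTS is hypothetical):

```python
from datetime import datetime, timedelta, timezone

# NTP timestamps count from 1900-01-01, not the Unix epoch (1970-01-01)
NTP_EPOCH = datetime(1900, 1, 1, tzinfo=timezone.utc)

def ntp_pts_to_datetime(pts_ns):
    """Convert a buffer PTS (nanoseconds on the NTP timescale) to UTC."""
    return NTP_EPOCH + timedelta(microseconds=pts_ns / 1000)

# A hypothetical PTS on that timescale
print(ntp_pts_to_datetime(3939093403585000000))
```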

2. Use MKV as the saving container:

uri_decode_bin.set_property("smart-rec-container", 1)

In MKV, the timestamps of the RTP stream can be preserved, for example:

 <packet codec_type="video" stream_index="0" pts="3939093403552" pts_time="3939093403.552000" dts="3939093403552" dts_time="3939093403.552000" duration="66" duration_time="0.066000" size="51850" pos="517" flags="K_"/>
 <packet codec_type="video" stream_index="0" pts="3939093403585" pts_time="3939093403.585000" dts="3939093403585" dts_time="3939093403.585000" duration="66" duration_time="0.066000" size="21775" pos="52375" flags="__"/>

3. Find the desired frame in the MKV file.
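To pick the frame for a given detection, the detection’s ntp_timestamp can be matched against the per-frame pts_time values; a minimal sketch, assuming the detection timestamps use the Unix epoch in nanoseconds while the MKV PTS uses the NTP epoch in seconds (the frame list and detection timestamp are hypothetical):

```python
# Offset between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01)
NTP_UNIX_OFFSET_S = 2208988800

def closest_frame(pts_times_s, ntp_ts_ns):
    """Return (index, pts_time) of the frame nearest a detection timestamp.

    pts_times_s: per-frame pts_time values in seconds (NTP timescale)
    ntp_ts_ns:   detection ntp_timestamp in nanoseconds (Unix epoch)
    """
    ts_s = ntp_ts_ns / 1e9 + NTP_UNIX_OFFSET_S  # move onto the NTP timescale
    return min(enumerate(pts_times_s), key=lambda f: abs(f[1] - ts_s))

# Hypothetical pts_time values of three consecutive frames
frames = [3939093403.552, 3939093403.585, 3939093403.618]
detection_ts = int((3939093403.590 - NTP_UNIX_OFFSET_S) * 1e9)
idx, pts = closest_frame(frames, detection_ts)
print(idx, pts)  # expected: frame index 1
```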

That solved my problem.
Thanks
