How to get the timestamp of each video/stream

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 6.2
• TensorRT Version 8.5.2.2
• NVIDIA GPU Driver Version (valid for GPU only) 530.30.02

Hello,

I have developed a DeepStream pipeline (Python) that accepts multiple sources (streams/videos). However, each video/stream has a different frame rate, and I need to know the real timestamp of each source. How can I get the real timestamp of each video/stream?

For example:
If I load multiple videos (video1: 10 FPS, video2: 25 FPS, video3: 59 FPS), the DeepStream pipeline runs all the videos at roughly the highest FPS, which is 59, but not exactly the same for all of them: video1 at 58.7, video2 at 58.4, video3 at 59. On the other hand, if I load multiple streams (stream1: 5 FPS, stream2: 2 FPS, stream3: 30 FPS), the pipeline runs all the streams at roughly the lowest FPS, which is 2 FPS, again not exactly the same, so the timestamp I'm getting is not correct. How can I get the timestamp of each video/stream given the frame rate it actually runs at in DeepStream?

The frame rate is influenced by many factors. Please refer to the link below first to set the appropriate frame-rate-related parameters; a sketch of such a config file follows the link.
https://forums.developer.nvidia.com/t/deepstream-sdk-faq/80236/34
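
For reference, the new nvstreammux reads these settings from the file passed via its config-file-path property. A minimal sketch of the frame-rate-related keys (key names per the Gst-nvstreammux documentation; the values here are purely illustrative, not a recommendation):

```
[property]
adaptive-batching=1
max-fps-control=1
overall-max-fps-n=30
overall-max-fps-d=1
overall-min-fps-n=5
overall-min-fps-d=1
max-same-source-frames=1
```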

Thanks @yuweiw, but I have already configured the new streammux for maximum performance, and that all works. My question is how to get the actual timestamp of each stream or video, not how to configure the streammux.

OK. There is a field in the NvDsFrameMeta structure: ntp_timestamp. You can try reading that field and see whether it meets your needs.
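
A minimal sketch of reading the field from a Python pad probe, assuming the standard pyds bindings (which pad you attach to, e.g. the tiler sink pad, is up to you):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def timestamp_probe(pad, info, user_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        # ntp_timestamp is in nanoseconds; buf_pts is the frame's own PTS.
        print(f"source {frame_meta.source_id}: "
              f"ntp_timestamp={frame_meta.ntp_timestamp} "
              f"buf_pts={frame_meta.buf_pts}")
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK

# Attach it, for example, to the tiler's sink pad:
# tiler.get_static_pad("sink").add_probe(Gst.PadProbeType.BUFFER, timestamp_probe, None)
```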

I already tried ntp_timestamp, but the problem is that it returns the same value for every stream even though each stream has a different FPS. I also tried using my system time (attach-sys-ts), but since the streams run at different frame rates I still get the same timestamp, even though it shouldn't be the same when the FPS differs.

In which plugin did you read this value? Which specific timestamp do you want: the original timestamp of each frame, or the timestamp set by nvstreammux? nvstreammux can attach a timestamp to the frame: https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvstreammux.html
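
As a sketch of the second option: with attach-sys-ts=True (the default), nvstreammux stamps each frame with the host system time at batching, so ntp_timestamp reflects when the frame was batched rather than when the camera captured it:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
# Host system time is attached when the frame enters the batch; all frames
# in the same batch therefore get nearly identical timestamps.
streammux.set_property("attach-sys-ts", True)
```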

We are getting the value at the tiler level. Ideally, we would like to get the original timestamp of each frame. If we can’t have that value and we use the nvstreammux timestamp (attach-sys-ts = True), then we have an issue when the sources have different FPS.

Suppose that Source 1 (live camera) records at 10 FPS and Source 2 (live camera) records at 30 FPS. The DeepStream pipeline runs at the lowest frame rate, so it runs at 10 FPS. If we read the timestamp, we see that both sources have the same value, but Source 2 is not being processed in real time since it has a higher FPS than Source 1, so the timestamp we receive for it is not "accurate", while for Source 1 it is correct, as it is being processed in real time. It may be worth mentioning that we have queues between our plugins, which might not be ideal for this kind of scenario.

In case we can't get the original timestamp of each source, how could we get the same information using the streammux if the sources have different FPS? Is there a way to process both at their original frame rates (we are using the new nvstreammux)?
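
For RTSP sources specifically, one possible direction (a sketch based on the DeepStream NTP-timestamp documentation, not something verified in this thread): disable the host-time stamping so ntp_timestamp is not overwritten at batching, and let the per-frame NTP time be derived from the cameras' RTCP Sender Reports via the configure_source_for_ntp_sync() C API. Check whether your pyds version exposes a binding for that call; USB cameras do not send RTCP, so this path does not apply to them.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
# Do not stamp frames with host time at batching, leaving room for a real
# per-source capture timestamp (e.g. NTP derived from RTCP Sender Reports).
streammux.set_property("attach-sys-ts", False)
```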

The frames have been merged into a single image by the tiler plugin, so the timestamp is the same.

No. Gst-nvstreammux forms a batch of frames from multiple input sources, so it has to wait for frames from the different sources before it can emit a batch.
Could you provide a runnable minimized demo to illustrate your problem? We can run that in our environment and analyze this scenario.
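
Independently of the batching question, one hedged workaround: NvDsFrameMeta.buf_pts preserves the PTS each frame carried into nvstreammux, so it advances at each source's own frame rate even after frames are batched together. Under the assumption of a realtime-based pipeline clock, and ignoring pipeline latency, a per-source wall-clock estimate could be derived as below (per_source_time_ns is an illustrative helper, not a DeepStream API):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

def per_source_time_ns(pipeline, frame_meta):
    # GStreamer clock time = base time + running time; for live sources the
    # buffer PTS approximates the running time (ignoring latency), so
    # base_time + buf_pts approximates the capture clock time per source.
    return pipeline.get_base_time() + frame_meta.buf_pts
```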

Okay, to simplify, please look at this video. I have connected two USB cameras directly to my DS app. The left part is a timer, and the middle (50 FPS camera) and right (30 FPS camera) screens show the cameras' output. As you can see, the time I'm getting is almost the same, but not exactly: there are differences of milliseconds. For example, the timer shows 00:09:41:345, but what I get is 00:09:41:875 for the middle camera and 00:09:41:932 for the right one. In my application it is essential to get the original timestamp, which I can do, but it is not as accurate as it has to be. My question is: how do I get this original timestamp?

OK. So the middle (50 FPS camera) and right (30 FPS camera) screens show the cameras' original timestamps, and the left screen shows the output timestamp? Could you attach your minimized source code? We can check the whole pipeline and how you get the timestamp.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks
