DeepStream 6.0: large latency from image capture to muxer

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Xavier AGX
• DeepStream Version 6.0
• JetPack Version (valid for Jetson only) 4.6
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

I’m running the standard DeepStream 6.0 YOLOv3 sample (plus patch), in which I save the frames to the output folder, but I’m using a Logitech C920 USB camera at 30 fps as input. DS6 reports the fps as 30, as expected.
The camera is focused on the Xavier’s screen, which shows a terminal displaying only the system timestamp. (I’m using an AMOLED display for minimal on-screen latency.)
I’m comparing the timestamp visible in the saved image with the ‘muxer in’ timestamp reported by NV_ENABLE_COMPONENT_LATENCY=1.
This shows a huge latency (around 200 ms) for image capture and decoding. I’ve already applied the usual speed-ups: jetson_clocks, max power mode, etc.
Any idea why the decode latency is so high? Or is it a problem with the transfer from the camera into the buffer?

Sorry for the late response. Is this still an issue you need support with? Thanks

Yes, if you could help that would be great.

Could you share the log?


It’s an empirical observation: the difference between the timestamp displayed in the saved image and the muxer timestamp from NV_ENABLE_COMPONENT_LATENCY=1.

If there’s a way to get the image-capture timestamp directly into the DS6 code, that would be very useful!

Which plugin do you use to capture the camera data? I think it’s the v4l2src plugin, but please double-check.
If so: v4l2src is a third-party plugin and does not support attaching the capture timestamp to the captured data, but you can refer to DeepStream SDK FAQ - #12 by bcao to add a probe on the v4l2src sink pad and set its timestamp.
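The probe idea from the FAQ can be sketched roughly as below. This is a hedged illustration, not the FAQ’s actual code: the helper names (`record_capture`, `capture_to_mux_latency_ms`) and the frame-number bookkeeping are assumptions. In a real pipeline the callback would be registered with `Gst.Pad.add_probe` and must return `Gst.PadProbeReturn.OK`; only the timestamp arithmetic is shown here.

```python
# Hypothetical helper: record a wall-clock capture time per frame from a
# pad probe, then compare it with the muxer's in_system_timestamp (ms).
import time

capture_ts_ms = {}  # frame_num -> wall-clock time in milliseconds


def record_capture(frame_num, now_s=None):
    """Body of a pad-probe callback: store the buffer's arrival time.
    In a real GStreamer probe this runs once per buffer, and the probe
    callback must also return Gst.PadProbeReturn.OK."""
    if now_s is None:
        now_s = time.time()
    capture_ts_ms[frame_num] = now_s * 1000.0


def capture_to_mux_latency_ms(frame_num, mux_in_ts_ms):
    """Muxer in_system_timestamp (ms) minus the recorded capture time (ms)."""
    return mux_in_ts_ms - capture_ts_ms[frame_num]
```

With the numbers quoted later in this thread (capture at 1643203026.449980706 s, muxer in at 1643203026630.641113 ms), this yields roughly 180.66 ms.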

Also, could you share the log and explain how you reached this conclusion?

Thanks. Can you tell me exactly what to do? I didn’t understand the link… What would take you 15 minutes will probably take me 3 days.

OK, before putting effort into implementing the code, let’s first go back to the description above and roughly check the latency log from NV_ENABLE_COMPONENT_LATENCY=1. Can you please share the log and explain how you reached this conclusion? Since you already have this, I don’t think sharing it will take much of your time.

Please see below the component latencies for frame number 400 (as an example):

BATCH-NUM = 400
Comp name = nvstreammux-src_bin_muxer source_id = 0 pad_index = 0 frame_num = 0 in_system_timestamp = 1643203026630.641113 out_system_timestamp = 1643203026630.898926 component_latency = 0.257812
Comp name = primary_gie in_system_timestamp = 1643203026630.968018 out_system_timestamp = 1643203026689.865967 component latency= 58.897949
Comp name = tiled_display_tiler in_system_timestamp = 1643203026696.002930 out_system_timestamp = 1643203026698.981934 component latency= 2.979004
Comp name = osd_conv in_system_timestamp = 1643203026699.482910 out_system_timestamp = 1643203026700.868896 component latency= 1.385986
Comp name = nvosd0 in_system_timestamp = 1643203026700.947998 out_system_timestamp = 1643203026700.960938 component latency= 0.012939
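As a sanity check, the per-component numbers in a log like the one above can be extracted and summed with a few lines of script (a sketch; the regex simply matches the latency fields, which the log prints with slightly inconsistent spelling, both `component_latency =` and `component latency=`):

```python
# Sum the per-component latencies (in ms) from an
# NV_ENABLE_COMPONENT_LATENCY-style log excerpt. Timestamps elided.
import re

log = """
Comp name = nvstreammux-src_bin_muxer ... component_latency = 0.257812
Comp name = primary_gie ... component latency= 58.897949
Comp name = tiled_display_tiler ... component latency= 2.979004
Comp name = osd_conv ... component latency= 1.385986
Comp name = nvosd0 ... component latency= 0.012939
"""

# Match both "component_latency =" and "component latency=" spellings.
latencies = [float(v)
             for v in re.findall(r"component[ _]latency\s*=\s*([\d.]+)", log)]
total_ms = sum(latencies)
print(f"components: {len(latencies)}, total latency: {total_ms:.6f} ms")
```

This totals about 63.5 ms inside the pipeline, which is consistent with the point made below: the missing ~180 ms sits before the muxer, not in the measured components.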

The actual frame captured by the camera is below. As I explained, the camera is just photographing the system timestamp.

(Attached image: image1643203026.692585836_400a)

Assuming zero screen latency, the camera shows the image was taken at timestamp 1643203026.449980706 (the last figure in the image). The results above show the muxer-in timestamp to be in_system_timestamp = 1643203026630.641113, so there’s a latency of 0.180660407 s (1643203026.630641113 - 1643203026.449980706), or about 181 ms, between image capture and entry into the muxer.

Is there a) a way to reduce this latency? (I’ve already done the usual things, e.g. jetson_clocks.)
b) a way to get the timestamp at which the camera takes the image into the DS6 code, so that I can calculate this (~180 ms) latency for each frame?


Hi, any update on this so far?

We discussed your issue internally.
One suggestion: could you remove “primary_gie” from the pipeline and check the latency again?
We suspect the long processing time after nvstreammux blocks nvstreammux from returning/querying the frame, causing the long delay.

And, what’s the pipeline before nvstreammux ?
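For reference, the no-inference test above could use a pipeline string along these lines. This is only a sketch: the device path (`/dev/video0`), resolution, and caps are assumptions, and the exact element set depends on the actual app configuration.

```python
# Hypothetical DS6 pipeline string with primary_gie (nvinfer) removed,
# to check whether the capture-to-muxer latency drops without inference.
# /dev/video0, 1280x720 and the caps strings are assumptions.
PIPELINE_NO_GIE = (
    "v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1 ! "
    "nvvideoconvert ! video/x-raw(memory:NVMM) ! mux.sink_0 "
    "nvstreammux name=mux batch-size=1 width=1280 height=720 live-source=1 ! "
    "nvmultistreamtiler ! nvvideoconvert ! nvdsosd ! nveglglessink sync=false"
)
# In a live run this would be built with Gst.parse_launch(PIPELINE_NO_GIE)
# after Gst.init(None), or the equivalent gst-launch-1.0 command line.
```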

It’s just v4l2src, as I’m using a USB camera. I’ll investigate the latency as you suggested. I really need to get the timestamp of when DeepStream requests the image. Any ideas?

v4l2src is a GStreamer open-source plugin; per the call stack below, I think you can check the timestamp there.


→ gst_v4l2_allocator_dqbuf()

→ a timestamp is attached to the GstBuffer in gst_v4l2_buffer_pool_dqbuf() (here)
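Once v4l2src has stamped the buffer, one way to turn that PTS back into an absolute clock reading comparable with the latency log is sketched below. This assumes a live pipeline whose segment starts at zero, so the buffer PTS equals the running time; then clock time = pipeline base time + running time.

```python
# Map a GStreamer buffer PTS back to an absolute pipeline-clock time.
# Assumption: live source with a segment starting at 0, so that
# PTS == running time. All values are in nanoseconds.
def pts_to_clock_time_ns(base_time_ns, pts_ns):
    """clock time = pipeline base time + buffer running time (PTS)."""
    return base_time_ns + pts_ns


def ns_to_ms(t_ns):
    """Convert nanoseconds to milliseconds, matching the latency log's units."""
    return t_ns / 1e6
```

In a pad probe, `base_time_ns` would come from the pipeline’s `get_base_time()` and `pts_ns` from the buffer’s `pts` field; the converted value could then be compared with the muxer’s in_system_timestamp per frame.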

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.