Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): Xavier AGX
• DeepStream Version: 6.0
• JetPack Version (valid for Jetson only): 4.6
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only):
• Issue Type (questions, new requirements, bugs):
• How to reproduce the issue? (For bugs: include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (For new requirements: include the module name, i.e. which plugin or which sample application, and the function description.)
Hi
I’m running the standard DeepStream 6.0 YOLOv3 sample (plus patch), in which I save the frames to an output folder, but I’m using a Logitech C920 USB camera at 30 fps as input. DS6 reports the fps as 30, as expected.
The camera is focused on the Xavier screen, which has a terminal showing just the system timestamp. (I’m using an AMOLED display for minimal on-screen latency.)
I’m comparing the timestamp seen in the saved image to that of the ‘muxer in’ timestamp reported from NV_ENABLE_COMPONENT_LATENCY=1.
This shows a huge latency (around 200 ms) for image capture and decoding. I’ve done all the usual speed-up steps: jetson_clocks, max power mode, etc.
Any ideas why the decode latency is so high, or is it a problem with the transfer into the buffer from the camera?
Thanks
Paul
It’s an empirical observation: the difference between the timestamp displayed in the image and the muxer timestamp from NV_ENABLE_COMPONENT_LATENCY=1.
If there’s a way to get the image taken timestamp directly into the DS6 code, then that would be very useful!
What plugin do you use to capture the camera data? I think it’s the v4l2src plugin, but please double-check.
If so: v4l2src is a third-party plugin and does not support attaching the capture timestamp to the captured data, but you can refer to DeepStream SDK FAQ - #12 by bcao and add a probe on the v4l2src src pad to set its timestamp.
Could you share the log and explain how you reached this conclusion?
OK, before putting effort into implementing the code, let’s first go back to the description above and roughly check the latency log from NV_ENABLE_COMPONENT_LATENCY=1. Can you please share the log and how you reached this conclusion? Since you already have this, I don’t think it will take much of your time to share it.
The actual frame captured by the camera is below. As I explained the camera is just taking an image of the system timestamp.
Assuming zero screen latency, the camera shows the image was taken at timestamp 1643203026.449980706 (the last figure in the image). The results above show the muxer-in timestamp to be in_system_timestamp = 1643203026630.641113 (in ms), so there is a latency of 0.180660407 s (1643203026.630641113 - 1643203026.449980706), i.e. about 181 ms, between image capture and entering the muxer.
Is there
a) a way to reduce this latency? (I’ve done all the usual things, e.g. jetson_clocks)
b) a way to get the timestamp at which the camera takes the image into the DS6 code somehow, so that I can calculate this (~180 ms) latency for each frame?
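For reference, the latency arithmetic above can be checked with integer nanoseconds (a sketch; the two values are copied from this thread, with in_system_timestamp converted from milliseconds):

```python
# Timestamps as integer nanoseconds, to avoid float rounding at ~1.6e9 s.
capture_ns = 1643203026_449980706  # timestamp shown on screen in the frame
mux_in_ns = 1643203026_630641113   # in_system_timestamp (ms) converted to ns

latency_ms = (mux_in_ns - capture_ns) / 1e6
# latency_ms is ~180.66 ms, matching the ~181 ms figure quoted above
```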
We have discussed your issue internally.
One suggestion: could you remove “primary_gie” from the pipeline and check the latency again?
We suspect that the long processing time after nvstreammux blocks nvstreammux from returning/querying the frame and causes the long delay.
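If you are running through deepstream-app, one quick way to try this suggestion (a sketch assuming the standard deepstream-app config-file format; adjust to your actual config) is to disable the primary GIE group and re-run with NV_ENABLE_COMPONENT_LATENCY=1:

```ini
# Hypothetical fragment of a deepstream-app config file: disabling the
# primary inference engine to isolate capture/decode/mux latency.
[primary-gie]
enable=0
```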
It’s just v4l2src, as I’m using a USB camera. I’ll investigate the latency as you suggested. I really need to get the timestamp at which DeepStream requests the image. Any ideas?