• Hardware Platform (Jetson / GPU): Jetson Xavier 8GB, Xavier NX, and TX2
• DeepStream Version: 5.0 DP
• JetPack Version (valid for Jetson only): 4.4 DP
• TensorRT Version: 7.1
I am looking for ways to achieve the lowest latency possible. I am currently at 110-130 ms on all platforms (Xavier, NX, TX2) with a USB webcam, and the numbers are nearly identical with a CSI camera. This is using the sample apps.
Using DeepStream with the Python bindings, the latency is generally 170-190 ms.
I would like to know where the latency is coming from so I can try to reduce it. I found this thread: Latency measurement issue
If this is the best method, it would be great to have more detailed steps for getting that information. Also, how would I get the same information when using DeepStream with the Python bindings?
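For reference, here is a minimal sketch of how I understand per-buffer latency could be inspected from Python. The environment variable names are my assumption based on the DeepStream latency-measurement docs (please verify against your 5.0 DP install), and the commented pad-probe fragment is a hypothetical sketch that would additionally require the `gi`/GStreamer Python bindings:

```python
import os

# Assumption: these env var names come from the DeepStream latency-measurement
# documentation; they must be set before the pipeline is created.
os.environ["NVDS_ENABLE_LATENCY_MEASUREMENT"] = "1"
os.environ["NVDS_ENABLE_COMPONENT_LATENCY_MEASUREMENT"] = "1"

def buffer_latency_ms(capture_ns, now_ns):
    """Latency in ms between a buffer's capture timestamp and the current
    pipeline running time, both expressed in nanoseconds."""
    return (now_ns - capture_ns) / 1e6

# In a live pipeline, one way to use this is a pad probe on the sink element,
# comparing the buffer PTS against the pipeline clock's running time
# (hypothetical sketch, needs `from gi.repository import Gst`):
#
#   def probe_cb(pad, info):
#       buf = info.get_buffer()
#       now = pipeline.get_clock().get_time() - pipeline.get_base_time()
#       print("latency: %.1f ms" % buffer_latency_ms(buf.pts, now))
#       return Gst.PadProbeReturn.OK
#
#   sinkpad.add_probe(Gst.PadProbeType.BUFFER, probe_cb)

# Self-check with synthetic timestamps: 120 ms expressed in ns.
print(buffer_latency_ms(0, 120_000_000))
```

Note this only measures latency inside the GStreamer pipeline; capture and display latency outside the pipeline would not be included.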
Thanks!
I am measuring latency with two methods: (1) pointing the camera at a running timer and photographing both screens together, and (2) recording the camera and display with a high-speed camera (1000 fps), then counting the frames it takes for movement of the camera to appear on the screen.
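The frame-counting arithmetic for the second method can be written down as a tiny helper (the function name and example frame counts are mine, for illustration only):

```python
def frames_to_latency_ms(frame_count, fps):
    """Convert a frame count from a high-speed recording into milliseconds.
    E.g. 120 frames at 1000 fps corresponds to 120 ms of latency."""
    return frame_count * 1000.0 / fps

print(frames_to_latency_ms(120, 1000))  # 120 frames at 1000 fps -> 120.0 ms
```

At 1000 fps each frame is 1 ms, so the frame count reads directly as milliseconds; the helper just generalizes that to other recording rates.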