Streaming video latency


We’re porting an existing system from a Snapdragon board to the Jetson and running into issues with video latency. I’m essentially taking the Argus sample 10 (10_camera_recording) that writes an H.264 stream to a file, and instead of sending the stream to a file, I’m packaging it up and sending it over RTP. I did change the encoder in the sample to use the baseline profile and level 3 (our stream is 640x480), so I’m pretty confident there are no B-frames involved. The encoder is reporting latency of around 35 ms on average, and each NAL gets sent in about 500 microseconds in the dequeue thread, so I’m at a bit of a loss as to where the latency is coming from. The transfer of video memory to user space, perhaps? The end-to-end latency with my code is 530 ms. Using GStreamer as the source but running the same decoder at the other end, the latency is 330 ms, but we’d like to use Argus, for a variety of reasons, and we’d like to get closer to 200 ms.
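For reference, the packaging step is just standard single-NAL-unit RTP per RFC 6184. A minimal sketch of what I mean (the constants `kPayloadType` and `kSsrc` are illustrative, and a real sender also needs FU-A fragmentation for NALs larger than the MTU):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Illustrative values, normally negotiated via SDP / chosen at random.
static const uint8_t  kPayloadType = 96;         // dynamic payload type
static const uint32_t kSsrc        = 0x12345678; // stream identifier

// Wrap one H.264 NAL unit (without its Annex B start code) in a
// single-NAL-unit RTP packet: 12-byte RTP header + NAL payload.
std::vector<uint8_t> make_rtp_packet(const uint8_t* nal, size_t nal_len,
                                     uint16_t seq, uint32_t ts_90khz,
                                     bool marker /* last packet of frame */) {
    std::vector<uint8_t> pkt(12 + nal_len);
    pkt[0] = 0x80;                                   // V=2, no pad/ext/CSRC
    pkt[1] = (marker ? 0x80 : 0x00) | kPayloadType;  // M bit + payload type
    pkt[2] = seq >> 8;        pkt[3] = seq & 0xFF;   // sequence number
    pkt[4] = ts_90khz >> 24;  pkt[5] = (ts_90khz >> 16) & 0xFF;
    pkt[6] = (ts_90khz >> 8) & 0xFF;  pkt[7] = ts_90khz & 0xFF;
    pkt[8] = kSsrc >> 24;     pkt[9] = (kSsrc >> 16) & 0xFF;
    pkt[10] = (kSsrc >> 8) & 0xFF;    pkt[11] = kSsrc & 0xFF;
    std::memcpy(pkt.data() + 12, nal, nal_len);      // NAL payload
    return pkt;
}
```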

Any ideas?

A related question: What is the starting time for the timestamp field in the v4l_buf? It would be helpful to know when the frame was captured relative to the system clock.


Hi r.jameson,
Please share a patch on 10_camera_recording so that we can check how you profile the latency and do further check.

Hi DaneLL,

Sorry for the slow response. In answer to your question, I’m looking at the whole latency from the camera, through the encoder, through a network connection, to the decoder and display on a remote client. I’m taking a picture of a screen showing a stop watch application along side video of the same to get a rough estimate of total latency. I have a setup that uses a laser diode, photo detector, and oscilloscope to do that measurement, but it’s a bit of a pain to set up.

After some more characterization, I figured out that the time from capture to network transmission was generally under 100 ms, which, given one frame delay for capture and one frame delay for compression, isn’t too terrible. Something in the H.264 encoder settings was causing my decoder to buffer frames. The solution for me was enabling VUI insertion (insertVUI) on the video encoder. This brings my latency down to between 240 and 330 ms (it appears to be extremely variable).
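For anyone hitting the same thing, the encoder configuration in question looks roughly like this. This is a sketch using the L4T Multimedia API’s NvVideoEncoder; the setter names are from the public headers but may differ by release, and the values are illustrative rather than a tested recipe:

```cpp
// Hedged checklist of low-latency knobs on the Jetson hardware encoder.
NvVideoEncoder *enc = NvVideoEncoder::createVideoEncoder("enc0");
enc->setProfile(V4L2_MPEG_VIDEO_H264_PROFILE_BASELINE); // baseline: no B-frames
enc->setLevel(V4L2_MPEG_VIDEO_H264_LEVEL_3_0);          // 640x480 fits level 3
enc->setNumBFrames(0);                  // belt and braces: no reordering delay
enc->setInsertVuiEnabled(true);         // VUI hints the decoder not to buffer
enc->setInsertSpsPpsAtIdrEnabled(true); // late joiners can decode at next IDR
enc->setIDRInterval(30);                // frequent IDRs bound recovery time
enc->setMaxPerfMode(1);                 // clock the encoder up for throughput
```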

Two follow-on questions if you have time:

  1. Is there any way to have the frame time of the encoded video frame reflect the capture time of the frame? The timestamps of the encoded frames seem to start at the moment the encoder was started; and
  2. Are there recommendations for lowest-latency H264 live streaming using the NVIDIA encoder? I think I've covered the basics (baseline profile, insertVUI), but the total latency is still about 100 ms longer than I'd like.

Hi r.jameson,

Please refer to 01_video_encode in r28.2. You can configure the timestamp of input frames via ‘--copy-timestamp’
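For example, a small helper (hypothetical, not from the samples) that converts a capture timestamp in nanoseconds, such as the one Argus reports via ICaptureMetadata::getSensorTimestamp(), into the struct timeval that v4l2_buffer.timestamp expects, so encoded frames carry capture time rather than encoder start time:

```cpp
#include <cstdint>
#include <sys/time.h>

// Convert a nanosecond capture timestamp to the struct timeval used by
// v4l2_buffer.timestamp (seconds + microseconds).
struct timeval ns_to_timeval(uint64_t ts_ns) {
    struct timeval tv;
    tv.tv_sec  = ts_ns / 1000000000ULL;
    tv.tv_usec = (ts_ns % 1000000000ULL) / 1000ULL;
    return tv;
}
```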

We have checked capture latency, but not capture + encoding. Please share a patch on 10_camera_recording so that we can check it with the default camera ov5693.

Hi DaneLLL,

Having solved the problem to some degree, I hadn’t found time until now to create the patch you asked for. Here it is: !AjqS0Pe9SwR2hnQR97zvusO5x9JM I’m seeing latency from camera to encoder output of 40 ms, which is really good. So my latency is happening somewhere between the encoder and the screen (payloader, sender IP stack, receiver IP stack, depayloader, decoder, display).

Hi r.jameson,

Glad to hear this. I saw in your first post that the resolution is 640x480; did you test a higher resolution at 60 fps?

Thanks !