Which video streaming method over Wi-Fi has the best video quality?

In my application, I am using OpenCV VideoCapture to launch the GStreamer pipeline below, which streams the video over Wi-Fi to a Windows PC and does image processing on the Jetson Nano frame by frame:

v4l2src device=/dev/video0 \
    ! video/x-raw, format=UYVY, width=(int)2592, height=(int)1944, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)24/1 \
    ! nvvidconv ! video/x-raw(memory:NVMM), format=I420 \
    ! tee name=camNVMM ! queue ! nvv4l2h264enc maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=10000000 \
    ! tee name=h264_stream ! queue ! h264parse ! rtph264pay ! udpsink clients= \
    h264_stream. ! queue ! h264parse ! matroskamux ! filesink location=test_h264.mkv \
    camNVMM. ! queue ! nvvidconv ! video/x-raw, format=GRAY8, width=640,height=480 ! appsink
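For reference, the OpenCV side that launches such a pipeline via `cv2.VideoCapture` might look roughly as follows. This is a minimal sketch of only the GRAY8 640 x 480 appsink branch above; the full tee layout with the encoder and filesink branches is omitted, and the device path and caps are assumptions copied from the pipeline:

```python
def build_appsink_pipeline(device="/dev/video0"):
    """Build a simplified GStreamer pipeline string ending in appsink,
    mirroring only the GRAY8 640x480 processing branch (sketch only)."""
    return (
        f"v4l2src device={device} "
        "! video/x-raw, format=UYVY, width=2592, height=1944, framerate=24/1 "
        "! nvvidconv ! video/x-raw(memory:NVMM), format=I420 "
        "! nvvidconv ! video/x-raw, format=GRAY8, width=640, height=480 "
        "! appsink drop=true max-buffers=2"
    )

pipeline = build_appsink_pipeline()

# On the Jetson this would be opened with the GStreamer backend, e.g.:
# import cv2
# cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
# ok, frame = cap.read()   # frame: 480x640 single-channel (GRAY8)
```

Setting `drop=true max-buffers=2` on the appsink keeps the processing branch from backing up the tee if the OpenCV loop falls behind.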

The above pipeline works well most of the time, but the frame data received on the PC is sometimes corrupted during transmission over UDP.
I tried the TCP protocol instead; it has no corrupted frame data, but I noticed that frames are sometimes dropped and the latency is more than 1.5 seconds.
None of these cases (corrupted frame data, dropped frames, high latency) is acceptable.
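One common mitigation for UDP corruption is to let the receiver absorb reordered and late RTP packets with an `rtpjitterbuffer`, at the cost of a bounded extra delay. A hedged sketch of a receiver pipeline string for the PC side (the port number, jitter window, and software decoder are assumptions, not taken from the original setup):

```python
def build_receiver_pipeline(port=5000, jitter_ms=200):
    """GStreamer receiver with an RTP jitter buffer (sketch).
    jitter_ms trades latency for resilience to packet reordering."""
    return (
        f"udpsrc port={port} caps=\"application/x-rtp, media=video, "
        "encoding-name=H264, payload=96\" "
        f"! rtpjitterbuffer latency={jitter_ms} "
        "! rtph264depay ! h264parse ! avdec_h264 "
        "! videoconvert ! autovideosink sync=false"
    )

# Run on the PC, e.g.:
# gst-launch-1.0 <the string returned above>
```

This cannot recover packets that are actually lost on the Wi-Fi link, but it removes the corruption caused by late or out-of-order delivery while keeping latency well below the TCP case.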

Therefore, I am looking for another method that meets my application requirements:

  1. The Jetson Nano does image processing frame by frame. The frame size is 640 x 480.
  2. Along with image processing, the Jetson Nano streams the video to the Windows PC over Wi-Fi at full frame size (2592 x 1944), 24 fps, without corrupted or dropped frames.
  3. The latency is less than 500 ms.

I don’t know which method can do the job:

  1. DeepStream?
  2. WebRTC?
  3. or something else?

Would you please give me your suggestions?
Thank you very much.


Please refer to the threads below to see if you can gain some ideas:
Jetson nano + deepstream + 4 .I.P cams + 2 Webcams + RTSP streaming - Jetson & Embedded Systems / Jetson Nano - NVIDIA Developer Forums
RTSP Streaming via OpenCV - Jetson & Embedded Systems / Jetson Nano - NVIDIA Developer Forums
Trying to get ultra low live-streaming latency(<100ms) on the drone using nano - Jetson & Embedded Systems / Jetson Nano - NVIDIA Developer Forums

For video quality, please run in CBR mode and set the virtual buffer size. This should strike a balance between quality and bitrate. Please check this example of setting the properties:
Random blockiness in the picture RTSP server-client -Jetson TX2 - #5 by DaneLLL

Thank you for your message. I am going to try it.

I tested using CBR at 10 Mbps with vbv-size set to 625000 and MeasureEncoderLatency=1.
This is my test script:

v4l2src device=/dev/video0 ! video/x-raw, format=UYVY, width=(int)2592, height=(int)1944, framerate=(fraction)24/1 ! nvvidconv ! video/x-raw(memory:NVMM), format=I420  ! nvv4l2h264enc preset-level=1 MeasureEncoderLatency=1 maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=10000000 control-rate=1 vbv-size=625000 ! h264parse ! rtph264pay ! udpsink clients=

This is the output encode latency:

I don’t mind the first few frames taking more than 400 ms to encode. But I need to know why encoding a single frame takes more than 300 ms for some later frames (frames 1092 to 1094, and frames 4463 to 4465), and how to resolve it.

Below is the log data for the frames whose encode latency exceeds 300 ms:

KPI: v4l2: frameNumber= 1090 encoder= 12 ms pts= 46947206264
KPI: v4l2: frameNumber= 1091 encoder= 17 ms pts= 46988884785
KPI: v4l2: frameNumber= 1092 encoder= 403 ms pts= 47030550421
KPI: v4l2: frameNumber= 1093 encoder= 368 ms pts= 47072215420
KPI: v4l2: frameNumber= 1094 encoder= 323 ms pts= 47113893889
KPI: v4l2: frameNumber= 1095 encoder= 11 ms pts= 47155555525
KPI: v4l2: frameNumber= 1096 encoder= 15 ms pts= 47197221785
KPI: v4l2: frameNumber= 1097 encoder= 12 ms pts= 47238895369

KPI: v4l2: frameNumber= 4461 encoder= 11 ms pts= 187621020785
KPI: v4l2: frameNumber= 4462 encoder= 18 ms pts= 187662694785
KPI: v4l2: frameNumber= 4463 encoder= 596 ms pts= 187704361785
KPI: v4l2: frameNumber= 4464 encoder= 560 ms pts= 187746031212
KPI: v4l2: frameNumber= 4465 encoder= 518 ms pts= 187787696265
KPI: v4l2: frameNumber= 4466 encoder= 17 ms pts= 187829374577
KPI: v4l2: frameNumber= 4467 encoder= 13 ms pts= 187871030525
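Spikes like these can be picked out of the KPI log automatically rather than by eye. A small parser sketch, assuming the `KPI: v4l2:` line format shown in the log above:

```python
import re

# Matches lines like:
# KPI: v4l2: frameNumber= 1092 encoder= 403 ms pts= 47030550421
KPI_RE = re.compile(
    r"KPI: v4l2: frameNumber=\s*(\d+)\s+encoder=\s*(\d+)\s+ms\s+pts=\s*(\d+)"
)

def find_latency_spikes(log_lines, threshold_ms=300):
    """Return (frameNumber, encoder_ms) pairs whose encode latency
    exceeds threshold_ms."""
    spikes = []
    for line in log_lines:
        m = KPI_RE.search(line)
        if m:
            frame, ms = int(m.group(1)), int(m.group(2))
            if ms > threshold_ms:
                spikes.append((frame, ms))
    return spikes
```

Running it over the log above with the 300 ms threshold flags exactly the frame ranges 1092-1094 and 4463-4465.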

The pipeline looks optimal. You may run with sudo nvpmodel -m 0 and sudo jetson_clocks. If the issue is still present, it looks to be a constraint of the Jetson Nano. It can achieve 30 fps, but occasionally the system is too busy, leading to the delay.

I just ran with sudo nvpmodel -m 0 and sudo jetson_clocks and re-ran my test. It still has some frames with an encoding latency of more than 550 ms. Please refer to the log below:

KPI: v4l2: frameNumber= 1728 encoder= 13 ms pts= 72444785621
KPI: v4l2: frameNumber= 1729 encoder= 10 ms pts= 72486445350
KPI: v4l2: frameNumber= 1730 encoder= 641 ms pts= 72528114141
KPI: v4l2: frameNumber= 1731 encoder= 603 ms pts= 72569791193
KPI: v4l2: frameNumber= 1732 encoder= 556 ms pts= 72611448194
KPI: v4l2: frameNumber= 1733 encoder= 16 ms pts= 72653118037
KPI: v4l2: frameNumber= 1734 encoder= 11 ms pts= 72694798715

Does this mean that it is a known issue and can’t be sorted out?


Yes. It is the best performance we can achieve on the Jetson Nano.

I did another test with a static target in front of the camera.
I just ran the video stream from the Jetson Nano without any other operations except the Ubuntu OS related processes. Spikes in encoding latency (169 ms) were still detected. Can I say that nvv4l2h264enc is not a purely hardware encoder, i.e., it involves some CPU-side operations, so the Ubuntu OS processes may affect the encoder latency?

The encoding is done on the hardware engine, but some stacks run on the CPU, such as sending frames to the hardware encoder, getting the encoded H.264 stream from the encoder, and passing it to the upper application layer. So if you run encoding at high fps or with multiple encoding instances, there is certain CPU usage.

Compared to jetson_multimedia_api, GStreamer has more layers and therefore more latency.

Does this mean that jetson_multimedia_api can stream the video with the same video quality and less latency than GStreamer?

If yes, can jetson_multimedia_api do the same job as my script above and provide grey 640 x 480 video frames for image processing in OpenCV?


jetson_multimedia_api is a set of low-level APIs, and there are samples demonstrating the hardware functions. For high-level software stacks such as muxing into MP4 or MKV, or UDP/RTSP streaming, you would need to do the implementation yourself. It is a tradeoff between jetson_multimedia_api and GStreamer: you can construct a GStreamer pipeline quickly from existing plugins, whereas with jetson_multimedia_api you can reduce latency but need to implement the software stack from head to tail.

Thank you very much.

Is jetson_multimedia_api included in the SDK? I tried a quick Google search but didn’t see much information about this API. What is the header file? I’d like to take a look and see if my application could use it.

Please check the document:
Jetson Linux API Reference: Main Page

The samples are installed to

