Hi,
I have a Jetson Nano and a Raspberry Pi Camera Module v2. I have built a balancing robot and want to mount the Jetson Nano on it to live-stream video wirelessly from the robot to other computers, so that I can control it remotely by looking through its camera.
I used GStreamer and first tried encoding the video as JPEG. The latency was virtually zero, but at settings above 1280x720 at 30 fps the pipeline simply could not keep up (unless I lowered the compression quality). Below are the pipelines I used:
Jetson Nano:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720, framerate=30/1' ! nvjpegenc ! tcpserversink host=0.0.0.0 port=5000
Laptop:
gst-launch-1.0 -e -vvvv tcpclientsrc host=yinon-jetson port=5000 ! jpegdec ! autovideosink
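One thing I have not tried systematically on the JPEG path is lowering the quality on the encoder instead of dropping the resolution. As far as I can tell nvjpegenc exposes a quality property just like the software jpegenc does (I have not double-checked the exact property name on my L4T release; gst-inspect-1.0 nvjpegenc should confirm it), so the sender would look roughly like this:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720, framerate=30/1' ! nvjpegenc quality=60 ! tcpserversink host=0.0.0.0 port=5000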
I understand that JPEG encoding is intra-frame only (each frame is compressed independently, with no reference to other frames), so it compresses video poorly compared to an inter-frame codec. I therefore tried encoding the video as H.265. It did indeed handle settings such as 1280x720 at 60 fps and 1920x1080 at 30 fps, but there was always a latency of about 300 ms (which may be fine for many applications, but for controlling a robot remotely it is a royal pain in the a$$). Even when I lowered the settings to 352x240 at 60 fps, the latency persisted. Below are the pipelines I used:
Jetson Nano:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720, framerate=30/1' ! omxh265enc ! mpegtsmux ! tcpserversink host=0.0.0.0 port=5000
Laptop:
gst-launch-1.0 -e -vvvv tcpclientsrc host=yinon-jetson port=5000 ! queue ! tsdemux ! h265parse ! libde265dec ! autovideosink
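One variant I am considering is to drop mpegtsmux entirely and send the H.265 stream as RTP over UDP, with sync=false on the sink so the receiver does not wait on timestamps. A rough sketch (untested on my robot; I am assuming rtph265pay/rtph265depay are available from the standard GStreamer plugin packages, and <LAPTOP_IP> is a placeholder for the laptop's address):
Jetson Nano:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720, framerate=30/1' ! omxh265enc ! h265parse ! rtph265pay config-interval=1 pt=96 ! udpsink host=<LAPTOP_IP> port=5000
Laptop:
gst-launch-1.0 -e -vvvv udpsrc port=5000 caps='application/x-rtp, media=(string)video, encoding-name=(string)H265, clock-rate=(int)90000, payload=(int)96' ! rtph265depay ! h265parse ! avdec_h265 ! autovideosink sync=false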
I tried tweaking some of the parameters of omxh265enc, but none of them removed this annoying latency.
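For reference, the sort of tweaks I tried were along these lines (I am reconstructing this from memory, and the property names and values may differ between L4T releases, so please check gst-inspect-1.0 omxh265enc before copying):
gst-inspect-1.0 omxh265enc
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720, framerate=30/1' ! omxh265enc control-rate=2 bitrate=8000000 iframeinterval=30 ! mpegtsmux ! tcpserversink host=0.0.0.0 port=5000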
Therefore I want to ask the following:
- Is there any way to tune the H.265 encoder to reduce latency?
- If not, is there a more suitable encoder/decoder?
Thanks,
Yinon