Real-Time Video Streaming & Remote Control for Person-Following Robot

I’m building an autonomous person-following robot where the Jetson Nano acts as a host, transmitting video and sensor data to a PC, which handles image processing (OpenCV + PyTorch) and SLAM. The PC then sends movement commands back to the robot.

Setup:

  • Jetson Nano + STM32 motor controller
  • Orbbec Astra Pro Plus (depth camera), 180° LiDAR (Slamtec)
  • PC for real-time processing & control

Challenges:

  1. Low-latency video streaming from Nano to PC (best method: GStreamer, WebRTC, or ROS image transport?).
  2. Reliable bidirectional communication (ROS, WebSockets, MQTT?).
  3. SLAM on PC vs. Nano – will network delays affect real-time navigation?
  4. Optimizations for minimal lag and smooth control.

Would love insights from those experienced in real-time robot control & SLAM over networks!

Hi,
We would suggest using GStreamer to construct the use case. You can quickly set up UDP or RTSP streaming through GStreamer. Please refer to the examples in the FAQ:
Jetson Nano FAQ
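For reference, here is a minimal sketch of the UDP/RTP path driven from Python, since you are already using OpenCV. It assumes the Astra's RGB stream enumerates as /dev/video0, that OpenCV on both machines is built with GStreamer support (the stock Jetson packages are), and that the PC is reachable at 192.168.1.100 on port 5000; adjust these for your setup. On the Nano, frames are pushed through the hardware H.264 encoder:

```python
import cv2

# Sender (Jetson Nano): capture RGB frames from the Astra (assumed /dev/video0)
# and stream them as RTP/H.264 over UDP using the Nano's hardware encoder.
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

send_pipeline = (
    "appsrc ! video/x-raw,format=BGR ! videoconvert ! "
    "video/x-raw,format=BGRx ! nvvidconv ! video/x-raw(memory:NVMM) ! "
    "nvv4l2h264enc insert-sps-pps=true iframeinterval=15 bitrate=4000000 ! "
    "h264parse ! rtph264pay config-interval=1 pt=96 ! "
    "udpsink host=192.168.1.100 port=5000 sync=false"  # placeholder PC address
)
out = cv2.VideoWriter(send_pipeline, cv2.CAP_GSTREAMER, 0, 30.0, (640, 480))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    out.write(frame)
```

On the PC, the matching receiver hands decoded frames straight to your OpenCV/PyTorch processing; drop=true and sync=false on the appsink keep latency low by always serving the newest frame instead of buffering:

```python
import cv2

# Receiver (PC): depayload and decode the RTP/H.264 stream from the Nano.
recv_pipeline = (
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'encoding-name=H264,payload=96" ! '
    "rtph264depay ! h264parse ! avdec_h264 ! "
    "videoconvert ! video/x-raw,format=BGR ! "
    "appsink drop=true sync=false"
)
cap = cv2.VideoCapture(recv_pipeline, cv2.CAP_GSTREAMER)

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    # frame is a plain BGR numpy array, ready for detection/tracking
    cv2.imshow("nano-stream", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
```

Start the receiver first so the decoder can sync on the SPS/PPS inserted with each IDR frame. If you would rather have the PC connect on demand, an RTSP server (e.g. gst-rtsp-server) can reuse the same encoder branch.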