I’m building an autonomous person-following robot where the Jetson Nano acts as a host, transmitting video and sensor data to a PC, which handles image processing (OpenCV + PyTorch) and SLAM. The PC then sends movement commands back to the robot.
Setup:
- Jetson Nano + STM32 motor controller
- Orbbec Astra Pro Plus (depth camera), 180° LiDAR (SLAMTEC)
- PC for real-time processing & control
Challenges:
- Low-latency video streaming from the Nano to the PC (best method: GStreamer, WebRTC, or ROS image transport? See the streaming sketch after this list.)
- Reliable bidirectional communication (ROS, WebSockets, MQTT? See the command-channel sketch below.)
- SLAM on the PC vs. on the Nano – will network delays affect real-time navigation?
- Optimizations for minimal lag and smooth control.
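For context, here's the rough direction I'm leaning toward for the video link: hardware-encoded H.264 over RTP/UDP with GStreamer, received straight into OpenCV on the PC. This is only a sketch, not a tested setup – the PC IP (192.168.1.100), port, resolution, and bitrate are placeholders, the Nano-side pipeline formats may need adjusting for the Astra's UVC output, and it assumes OpenCV on the PC was built with GStreamer support.

```python
import cv2

# Nano side (run in a shell; nvv4l2h264enc is the Nano's hardware encoder,
# formats/caps may need tweaking for the Astra Pro Plus RGB stream):
# gst-launch-1.0 v4l2src device=/dev/video0 \
#   ! video/x-raw,width=640,height=480,framerate=30/1 \
#   ! nvvidconv ! nvv4l2h264enc insert-sps-pps=true iframeinterval=15 bitrate=2000000 \
#   ! rtph264pay config-interval=1 pt=96 \
#   ! udpsink host=192.168.1.100 port=5000 sync=false

# PC side: pull the RTP/H.264 stream into OpenCV (requires OpenCV built with
# GStreamer support - check cv2.getBuildInformation()).
RECEIVE_PIPELINE = (
    'udpsrc port=5000 caps="application/x-rtp, media=(string)video, '
    'clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" '
    "! rtpjitterbuffer latency=50 "            # small jitter buffer; raise if Wi-Fi is noisy
    "! rtph264depay ! avdec_h264 "
    "! videoconvert ! video/x-raw,format=BGR "
    "! appsink drop=true max-buffers=1 sync=false"  # keep only the newest frame
)

cap = cv2.VideoCapture(RECEIVE_PIPELINE, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Failed to open pipeline - is OpenCV built with GStreamer?")

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    # frame is a BGR numpy array, ready for the OpenCV/PyTorch person-detection step
    cv2.imshow("nano_stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```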
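And for the command channel, a minimal WebSocket sketch of what I have in mind on the Nano side, using Python's `websockets` library. The JSON fields (`linear`, `angular`) and the UART line format are placeholders for whatever the STM32 firmware actually expects; the ping settings are just a cheap dead-link check so the robot can stop if the PC drops off the network.

```python
import asyncio
import json

import websockets  # pip install websockets
# import serial    # pyserial, for forwarding commands to the STM32 over UART


async def handle_commands(websocket, path=None):
    # path is only passed by older websockets versions; newer ones call with one arg.
    async for message in websocket:
        cmd = json.loads(message)
        linear = float(cmd.get("linear", 0.0))    # m/s   (placeholder field name)
        angular = float(cmd.get("angular", 0.0))  # rad/s (placeholder field name)
        # Forward to the motor controller, e.g. over UART:
        # stm32.write(f"V {linear:.3f} {angular:.3f}\n".encode())
        print(f"cmd: linear={linear:.2f} angular={angular:.2f}")


async def main():
    # ping_interval/ping_timeout detect a dead PC link so the robot can
    # zero the motors instead of driving on a stale command.
    async with websockets.serve(handle_commands, "0.0.0.0", 8765,
                                ping_interval=1, ping_timeout=3):
        await asyncio.Future()  # run forever


if __name__ == "__main__":
    asyncio.run(main())
```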
Would love insights from those experienced in real-time robot control & SLAM over networks!