Quadrotor flies with only three rotors using onboard vision sensors and computing on a Jetson TX2

In this work, we developed a flight controller and a vision-based state estimator for controlling a quadrotor drone after it loses one motor. The state estimator (Visual-Inertial Odometry) uses the FAST feature detector and the KLT feature tracker as the frontend and OKVIS as the backend. We demonstrate that, despite fast yaw spinning at 20 rad/s after the motor failure, the vision-based state estimator remains reliable. We also test an event-based camera as the visual input, and show that it outperforms a standard global-shutter camera, especially in low-light conditions. The entire algorithm runs on a Jetson TX2 without using CUDA.

Please check the code:

The video:

The paper: