Robottle, an Autonomous Driving Robot equipped with a LIDAR (for SLAM) and an RPi camera to detect, collect and retrieve plastic bottles in a random environment

As part of a 4-month semester project at EPFL, our team of 3 students designed and developed from scratch an autonomous driving robot featuring state-of-the-art technologies to travel through a random environment with one goal: collect as many plastic bottles as possible in a fixed amount of time, and bring them back to a “recycling area”. Feel free to watch this YouTube presentation of the project: https://youtu.be/XJpJSuhSZN4 and to explore the GitHub repository, which has full documentation and many figures that explain it better than words: GitHub - arthurBricq/ros_robottle: Repo with the ROS code of the Jetson Nano (there is even a report explaining every single detail of the robot).

Robottle used a LIDAR to iteratively build a map of its environment with a very powerful Bayesian algorithm: SLAM (Simultaneous Localization and Mapping). Although SLAM is computationally expensive, we were able to run an open-source implementation written in C, with only a few modifications to the original library, and obtained great mapping results.
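To give an idea of what the mapping half of SLAM does (this is *not* the C library we used, just a minimal illustrative sketch): each LIDAR beam makes the cells it passes through more likely to be free and the cell at the measured range more likely to be occupied, accumulated in a log-odds occupancy grid. The parameter values below are arbitrary choices for the example.

```python
import math

def update_beam(grid, x0, y0, angle, dist, cell_size=0.05,
                l_free=-0.4, l_occ=0.85):
    """Update a log-odds occupancy grid along a single LIDAR beam.

    grid: dict mapping (cx, cy) cell indices to log-odds values.
    (x0, y0): sensor position in metres; angle: beam direction in
    radians; dist: measured range in metres.
    """
    # Cells traversed by the beam become more likely free
    steps = int(dist / cell_size)
    for i in range(steps):
        cx = int((x0 + i * cell_size * math.cos(angle)) / cell_size)
        cy = int((y0 + i * cell_size * math.sin(angle)) / cell_size)
        grid[(cx, cy)] = grid.get((cx, cy), 0.0) + l_free
    # The cell at the measured range becomes more likely occupied
    ex = int((x0 + dist * math.cos(angle)) / cell_size)
    ey = int((y0 + dist * math.sin(angle)) / cell_size)
    grid[(ex, ey)] = grid.get((ex, ey), 0.0) + l_occ
    return grid
```

A full SLAM system repeats this for every beam of every scan while simultaneously estimating the robot pose (the "localization" half), which is where the computational cost comes from.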

On top of this, the robot used a Raspberry Pi camera with a deep neural network (coco-bottles) running on the Jetson's GPU to detect bottles near the robot (the extensive documentation of the jetson-inference project was very helpful, and for this part too we barely touched the original code). From the bounding box we could extrapolate the angle at which a bottle was located, and time-of-flight sensors at the front of the robot told us when a bottle was really within “arm range”.
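The angle extrapolation boils down to a pinhole-camera model: the horizontal offset of the bounding-box centre from the image centre maps to a bearing through the camera's field of view. A minimal sketch (the image width and the ~62.2° horizontal FOV are assumptions for a Raspberry Pi Camera v2, not values from our code):

```python
import math

def bottle_bearing(bbox_cx, image_width=1280, hfov_deg=62.2):
    """Estimate the horizontal bearing (degrees) to a detected bottle
    from the x-coordinate of its bounding-box centre, assuming a
    pinhole camera. Positive means right of the optical axis."""
    # Focal length in pixels, derived from the horizontal FOV
    f_px = (image_width / 2) / math.tan(math.radians(hfov_deg) / 2)
    # Bearing via the tangent model
    return math.degrees(math.atan((bbox_cx - image_width / 2) / f_px))
```

A bottle centred in the image gives a bearing of 0°; one at the image edge gives roughly half the horizontal FOV, which is what the robot would turn by before driving forward.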

As seen in the video, bottles were picked up with a sticky arm, which turned out to be quite effective (about 4 out of 5 bottles collected).

All this information was merged using ROS2, which worked perfectly well on the Jetson Nano! The high-level logic was written in Python as several ROS nodes working together, while the low-level logic (control of the 4 motors) ran on an Arduino Mega connected to the Jetson over UART.
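Our actual Jetson–Arduino protocol is documented in the repo; as a generic illustration of the UART link, here is a hypothetical way to frame a differential-drive command into a small packet (start byte, two signed speed bytes, checksum) that the Arduino can validate before driving the motors:

```python
def frame_motor_command(left, right):
    """Pack left/right motor speeds (-127..127) into a framed packet:
    start byte 0xA5, two payload bytes (two's complement), and a
    one-byte additive checksum. Hypothetical framing, for illustration."""
    if not (-127 <= left <= 127 and -127 <= right <= 127):
        raise ValueError("speed out of range")
    payload = bytes([left & 0xFF, right & 0xFF])
    checksum = (payload[0] + payload[1]) & 0xFF
    return bytes([0xA5]) + payload + bytes([checksum])
```

On the Jetson side such a packet would typically be written to the UART with pyserial (e.g. opening `/dev/ttyTHS1` at a fixed baud rate); the start byte and checksum let the Arduino resynchronise if a byte is dropped.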

At the end of the 4-5 months, our robot proved to be very effective! That said, there is still room to improve its robustness: it was hard to keep the robot autonomous for more than 10-20 minutes.

Though this project is definitely not easy to reproduce, I still thought it was worth sharing!

Any feedback is welcome :)
