A Personal Robot Assistant Controlled with Eye-Tracking

Inspiration

Since the beginning of the pandemic, it has been clear that older people face the most serious consequences of COVID-19. Yet these same people often need caretakers to help with day-to-day activities, and every in-person visit carries a risk of exposure. For this reason, I created a personal robot assistant that can be controlled simply by moving the eyes.

What it does

This robot can take over many of a caretaker’s responsibilities while keeping the person in its care safe. For example, it can pick up and deliver medicine, food, and water; sanitize the user’s surroundings; and keep a constant check on the user’s wellbeing. It can drive in any direction, rotate its crane, raise its arm over high surfaces or lower it under low ones, and grasp objects. All of this is controllable by the user’s gaze alone!

How I built it

Robot: The robot has four main controls. The first is drive, powered by two large servo motors, which moves the robot in all four directions. The second is rotate, powered by one medium motor, which rotates the crane. The third is raise/lower, powered by one DC gearbox motor, which raises or lowers the arm. The fourth is open/close, powered by another DC gearbox motor, which opens or closes the claw. An EV3 brick controls the servo motors, and a Jetson Nano (with the help of an L298N motor driver) controls the DC gearbox motors. A Raspberry Pi Camera Module V2 connected to the Jetson Nano relays what the robot sees back to the user. The main frame of the robot was built from Lego Technic pieces.
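As an illustration of the Jetson Nano side, a DC gearbox motor can be driven through the L298N with two GPIO direction pins. The sketch below is a minimal assumption of how that could look in code; the pin numbers and timings are placeholders, not the build’s actual values.

```python
# Minimal sketch: driving one DC gearbox motor through an L298N
# from the Jetson Nano. Pin numbers and timings are placeholders.
import time

import Jetson.GPIO as GPIO

IN1, IN2 = 35, 37  # L298N direction inputs (hypothetical BOARD pins)

GPIO.setmode(GPIO.BOARD)
GPIO.setup([IN1, IN2], GPIO.OUT, initial=GPIO.LOW)

def stop():
    """Cut power to the motor by pulling both direction pins low."""
    GPIO.output(IN1, GPIO.LOW)
    GPIO.output(IN2, GPIO.LOW)

def raise_arm(seconds):
    """Spin the motor one way to wind the string and raise the arm."""
    GPIO.output(IN1, GPIO.HIGH)
    GPIO.output(IN2, GPIO.LOW)
    time.sleep(seconds)
    stop()

def lower_arm(seconds):
    """Reverse the polarity to unwind the string and lower the arm."""
    GPIO.output(IN1, GPIO.LOW)
    GPIO.output(IN2, GPIO.HIGH)
    time.sleep(seconds)
    stop()

try:
    raise_arm(1.5)
finally:
    GPIO.cleanup()
```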

Code: All of the code was written in Python. I used the dlib machine learning library to locate the major facial landmarks. From the landmarks that surround the eyes, I produced a mask of the eye area and wrote an algorithm to extract the pupil of each eye. The positions of the eye landmarks also let me detect when the user blinked.
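In condensed form, the pipeline looks roughly like the sketch below. It assumes dlib’s standard 68-point predictor file; the threshold constant and the single-frame setup are illustrative simplifications (blink detection, which compares the vertical spread of the eye landmarks, is omitted for brevity).

```python
# Sketch of the eye-tracking pipeline: dlib landmarks -> eye mask -> pupil.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

LEFT_EYE = list(range(36, 42))  # indices of the left-eye landmarks

def pupil_center(gray, landmarks, eye_indices):
    """Crop one eye region and return the pupil's (x, y) center."""
    points = np.array([(landmarks.part(i).x, landmarks.part(i).y)
                       for i in eye_indices])
    x, y, w, h = cv2.boundingRect(points)
    eye = gray[y:y + h, x:x + w]
    # The pupil is the darkest blob in the eye region.
    _, thresh = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
    moments = cv2.moments(thresh)
    if moments["m00"] == 0:
        return None
    return (x + int(moments["m10"] / moments["m00"]),
            y + int(moments["m01"] / moments["m00"]))

frame = cv2.imread("frame.jpg")  # one camera frame, for illustration
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
for face in detector(gray):
    landmarks = predictor(gray, face)
    print(pupil_center(gray, landmarks, LEFT_EYE))
```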

Using the coordinates of each pupil as input, I calibrated the user’s eye movement to directions that could be used to control the motors. During calibration, the user follows a plus sign around the screen while their eye movements are recorded and stored; these recordings are then used to identify which direction any later eye movement is pointing.
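One simple way to realize this calibration, shown here as an assumption rather than the project’s exact algorithm, is to average the pupil coordinates recorded while the target sits at each screen position, then classify live gaze by its nearest calibrated mean:

```python
# Sketch of gaze calibration: average the pupil coordinates recorded for
# each direction, then classify live gaze by the nearest calibrated mean.
import numpy as np

calibration = {}  # direction -> mean pupil position (x, y)

def calibrate(direction, samples):
    """samples: list of (x, y) pupil coordinates recorded for a direction."""
    calibration[direction] = np.mean(samples, axis=0)

def classify(pupil):
    """Return the calibrated direction closest to the current pupil position."""
    return min(calibration,
               key=lambda d: np.linalg.norm(np.array(pupil) - calibration[d]))

calibrate("left",  [(102, 90), (98, 92), (100, 91)])
calibrate("right", [(140, 90), (143, 89), (141, 92)])
print(classify((99, 91)))  # -> "left"
```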

Blinking lets the user switch between modes (e.g., from drive to rotate). Once the mode is set and a direction is determined, the command is sent to a Flask server. The server forwards it to the appropriate device (the EV3 brick or the Jetson Nano), which in turn drives the motors needed to carry out the action.
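As a rough sketch of what such an endpoint could look like (the route name, payload fields, and dispatch helpers here are placeholders, not the project’s actual code):

```python
# Stripped-down sketch of the command server. Route name, payload fields,
# and the dispatch helpers are hypothetical placeholders.
from flask import Flask, jsonify, request

app = Flask(__name__)

def send_to_ev3(direction):
    """Forward a drive/rotate command to the EV3 brick (hypothetical)."""
    ...

def send_to_jetson(direction):
    """Forward an arm/claw command to the Jetson Nano (hypothetical)."""
    ...

@app.route("/command", methods=["POST"])
def command():
    data = request.get_json()
    mode, direction = data["mode"], data["direction"]
    # Drive and rotate use the servo motors on the EV3 brick; the arm
    # and claw use the DC gearbox motors on the Jetson Nano.
    if mode in ("drive", "rotate"):
        send_to_ev3(direction)
    else:
        send_to_jetson(direction)
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```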

Additionally, a streaming server was set up to relay camera data back to the user, so they can see what the robot is looking at.
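A common way to do this with Flask and OpenCV is an MJPEG stream. The sketch below is a generic version of that pattern, not necessarily the exact setup used; on a Jetson with a CSI camera, cv2.VideoCapture may need a GStreamer pipeline string instead of a device index.

```python
# Generic MJPEG streaming sketch: relay camera frames over HTTP so the
# user's browser can show the robot's point of view.
import cv2
from flask import Flask, Response

app = Flask(__name__)
camera = cv2.VideoCapture(0)  # placeholder device index

def frames():
    """Yield JPEG-encoded frames as a multipart byte stream."""
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        _, jpeg = cv2.imencode(".jpg", frame)
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")

@app.route("/stream")
def stream():
    return Response(frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```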

Challenges I ran into

I faced many challenges throughout the project. At first, I wanted to use pneumatics to raise and lower the arm and to open and close the claw. However, it proved too difficult to build and fit a linear actuator that could convert the motor’s rotary motion into the linear motion needed to drive the syringe, so I used strings for these two tasks instead. Getting the calibration to work was also challenging and required a lot of logic and testing, but in the end it was successful. Finally, bringing everything together was a big challenge of its own, but using a Flask server helped simplify the process.

Accomplishments that I’m proud of

I am really proud of being able to control a robot with nothing but eye tracking and gaze detection!

What I learned

I learned a lot more about the hardware side of things through this hack. I also learned a lot about Flask servers and using them to connect IoT devices.

What’s next for A Personal Robot Assistant Controlled with Eye-Tracking

There are many next steps! For one, I could add button controls to the application for users who are blind or otherwise unable to operate the robot with their eyes, so that a caretaker could still care for them without exposing them to any risk. I could also add a classifier that detects objects around the room; the user would then choose an object with their eyes, and the robot would automatically retrieve it. Finally, the robot could be programmed to remember previous actions so that it can replicate or reverse them when necessary.

YouTube Link: A Personal Robot Assistant Controlled with Eye-Tracking - YouTube
Github Link: GitHub - saranggoel/PersonalRobotAssistant
More Projects: Sarang Goel’s (saranggoel06) software portfolio | Devpost