Robotic Arm Control Based on Human Gestures

Hi All,

Glad to join you all in this awesome space, full of incredible and exciting projects!!
I would like to bring my contribution to the Forum by sharing my project: a controller for a general-purpose robotic arm, based on human gesture recognition (in particular, hand/finger recognition).

As described in the GitHub repository, the basic idea is to use a Deep Neural Network trained to process 2D images acquired by a USB camera, recognizing human fingers and their gestures (e.g., a “swipe” action).
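
To make the idea concrete, here is a minimal sketch of the camera-to-gesture loop. This is not the actual code from the repository: the classifier stub, the 224x224 input size, and the label names are illustrative assumptions only.

```python
# Minimal sketch of the camera-to-gesture pipeline described above.
# The classifier stub, input size, and label set are assumptions for
# illustration, not the actual code from the repository.
import cv2
import numpy as np

GESTURE_LABELS = ["none", "swipe_left", "swipe_right", "open_hand"]  # hypothetical

def classify_gesture(frame: np.ndarray) -> str:
    """Placeholder for the trained DNN; returns one of GESTURE_LABELS."""
    # The real application runs inference on the Jetson (e.g. via a
    # TensorRT engine); here we only mimic the preprocessing step and
    # return a dummy label.
    _ = cv2.resize(frame, (224, 224))  # assumed network input size
    return "none"

def main() -> None:
    cap = cv2.VideoCapture(0)  # USB camera
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            label = classify_gesture(frame)
            if label != "none":
                print(f"Detected gesture: {label}")
    finally:
        cap.release()

if __name__ == "__main__":
    main()
```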

The output of the neural network is used to command the robotic arm to move to pre-stored positions (in joint space), as sketched below.
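
Here is a hedged sketch of how a recognized gesture could be dispatched to one of the pre-stored joint-space positions. The label names, joint values, and the `send_joint_command` helper are hypothetical placeholders, not the repository's actual values.

```python
# Hedged sketch: mapping a recognized gesture label to a pre-stored
# joint-space pose. Labels, joint values, and the transport helper are
# illustrative assumptions, not the repository's actual values.
from typing import Sequence

# Pre-stored target positions in joint space (radians), one per gesture.
JOINT_TARGETS = {
    "swipe_left":  [0.0, -0.5, 1.2, 0.0, 0.8, 0.0],
    "swipe_right": [0.6, -0.5, 1.2, 0.0, 0.8, 0.0],
    "open_hand":   [0.0,  0.0, 0.0, 0.0, 0.0, 0.0],  # home pose
}

def send_joint_command(joints: Sequence[float]) -> None:
    """Hypothetical transport layer: in the real application the target
    would be forwarded to the arm through the Isaac SDK graph."""
    print(f"Commanding joints: {joints}")

def handle_gesture(label: str) -> None:
    """Move the arm only for labels that have a stored target pose."""
    target = JOINT_TARGETS.get(label)
    if target is not None:
        send_joint_command(target)

handle_gesture("swipe_left")  # -> Commanding joints: [0.0, -0.5, ...]
```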
The robot control software has been developed using the NVIDIA Isaac SDK environment.
It has been deployed on a Jetson Nano platform, connected both to a simulated environment (based on the NVIDIA Isaac Sim module) and to a real robot (a Cobotta robotic arm from DENSO WAVE).
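
For reference, launching an Isaac SDK application graph from Python looks roughly like the snippet below. I am assuming the `engine.pyalice` API of the 2019/2020 SDK releases here, and the app JSON path is a placeholder, so check both against your own SDK version.

```python
# Hedged sketch of starting an Isaac SDK application graph from Python.
# Module path and constructor arguments follow the 2019/2020 SDK releases
# and may differ in other versions; the JSON path is a placeholder.
from engine.pyalice import Application

app = Application(app_filename="apps/my_gesture_arm/my_gesture_arm.app.json")
app.run()  # blocks while the application graph executes
```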
Here you can find a short video describing the whole application.

Depending on your requirements, you can also use a Jetson TX2 (the suggested option when connecting to a real robot, thanks to its more powerful computational resources).

Any comments and/or suggestions for improvement are more than welcome!!
Thank you very much!!
Luca

Useful links:
