Nerve interface and deep learning allow amputees to control individual fingers and more

Hi everyone,

In this project, we design neural interface chips to create two-way communication with the peripheral nerve via microelectrode implants. We then use AI models based on recurrent neural networks (RNNs) to read and decode the amputee’s intent to move individual fingers. The real-time neural decoders run on an NVIDIA Jetson Nano attached to the hand. This system allows amputees to control individual fingers of a prosthetic hand with a dexterity and intuitiveness that are not possible with commercial systems.
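
For anyone curious what such a decoder looks like in code, below is a minimal sketch of an RNN regressor in that spirit (PyTorch). The channel count, window length, and layer sizes are illustrative placeholders, not the exact architecture from the papers.

```python
# Minimal sketch of an RNN finger-intent decoder (PyTorch).
# Channel count, window length, and layer sizes are illustrative
# placeholders, not the exact published architecture.
import torch
import torch.nn as nn

class FingerDecoder(nn.Module):
    def __init__(self, n_channels=16, hidden=128, n_fingers=5):
        super().__init__()
        # GRU reads a window of multi-channel nerve data, step by step.
        self.rnn = nn.GRU(input_size=n_channels, hidden_size=hidden,
                          num_layers=2, batch_first=True)
        # Linear head maps the final hidden state to per-finger flexion values.
        self.head = nn.Linear(hidden, n_fingers)

    def forward(self, x):  # x: (batch, time, channels)
        out, _ = self.rnn(x)
        return torch.sigmoid(self.head(out[:, -1]))  # (batch, n_fingers), values in [0, 1]

# Example: decode one 100-sample window of 16-channel nerve data.
model = FingerDecoder()
window = torch.randn(1, 100, 16)
flexion = model(window)  # predicted flexion per finger
```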

Here are photos of the amputees during model training. The nerve dataset is specific to each person. The amputee sits through training sessions where they flex each finger with the able hand while imagining doing the same movements with the injured/phantom hand. Input nerve data are acquired by the Scorpius, while ground-truth data are collected with a data glove. The model is then trained on a desktop computer with a TITAN Xp or a GTX 1080 Ti.

image
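
To give a rough idea of the per-subject training step, here is a sketch that assumes the nerve recordings have already been segmented into fixed-length windows and paired with the data-glove flexion values. The shapes, module name, and hyperparameters are placeholders, not the published training recipe.

```python
# Sketch of per-subject training: nerve windows regressed onto
# data-glove finger flexion. Shapes, module name, and hyperparameters
# are placeholders, not the published training recipe.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

from finger_decoder import FingerDecoder  # hypothetical module holding the decoder sketched above

device = "cuda" if torch.cuda.is_available() else "cpu"  # e.g. TITAN Xp / GTX 1080 Ti

# Placeholder arrays: X = (n_windows, time, channels), y = (n_windows, 5 fingers)
X = torch.randn(2048, 100, 16)
y = torch.rand(2048, 5)
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = FingerDecoder().to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # regression against the glove ground truth

for epoch in range(20):
    for xb, yb in loader:
        xb, yb = xb.to(device), yb.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()
```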

image

Photo of an amputee wearing the prosthetic hand. The system is completely portable and self-contained: there is no wired or wireless connection to any remote computer, and the Jetson Nano runs only the deep learning inference.
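
Conceptually, the on-board loop is just: read a window of nerve data, run the decoder, send finger commands to the hand. Below is a sketch of that loop; read_window() and send_to_hand() are hypothetical stand-ins for the actual data stream and hand interface, and the checkpoint name is made up.

```python
# Sketch of the on-board inference loop on the Jetson Nano.
# read_window() and send_to_hand() are hypothetical stand-ins for the
# actual nerve-data stream and the prosthetic hand's command interface.
import torch

from finger_decoder import FingerDecoder  # hypothetical module holding the decoder sketched above

device = "cuda" if torch.cuda.is_available() else "cpu"
model = FingerDecoder().to(device)
model.load_state_dict(torch.load("subject_decoder.pt", map_location=device))  # per-subject weights (made-up name)
model.eval()

def read_window():
    """Placeholder: return the latest (1, time, channels) window of nerve data."""
    return torch.randn(1, 100, 16)

def send_to_hand(flexion):
    """Placeholder: forward per-finger flexion commands to the prosthesis."""
    print(flexion)

with torch.no_grad():
    while True:
        window = read_window().to(device)
        flexion = model(window)[0].cpu().tolist()  # five values in [0, 1]
        send_to_hand(flexion)
```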

Furthermore, we can map the hand output to keystrokes on a computer so the amputee can play video games. An example key binding might be: flexing the thumb = move forward, flexing the index finger = strafe left, making a fist = shoot… We had one amputee playing Raiden IV and Far Cry 5 with his thoughts (sorry for the potato pics). Imagine controlling drones, robots, virtual reality, and much more with your thoughts; the possibilities are endless.
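
The key mapping itself can be as simple as thresholding the decoded flexion values and pressing or releasing keys. Here is a minimal sketch using the pynput library; the 0.5 threshold and the two bindings are just the example ones above, not an actual configuration.

```python
# Sketch of mapping decoded finger flexion to keystrokes with pynput.
# Bindings follow the example above (thumb = move forward, index = strafe
# left); the 0.5 threshold and key choices are assumptions.
from pynput.keyboard import Controller

keyboard = Controller()
BINDINGS = {0: "w",   # thumb flexed -> move forward
            1: "a"}   # index flexed -> strafe left
pressed = set()

def update_keys(flexion, threshold=0.5):
    """flexion: list of five per-finger values in [0, 1] from the decoder."""
    for finger, key in BINDINGS.items():
        if flexion[finger] > threshold and key not in pressed:
            keyboard.press(key)
            pressed.add(key)
        elif flexion[finger] <= threshold and key in pressed:
            keyboard.release(key)
            pressed.discard(key)
```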

image

More photos and videos:

  1. Amputee practicing with the AI-powered neuroprosthetic hand - YouTube (prosthetic hand practice)
  2. Amputee's remarks on the nerve interface and AI neural decoders - YouTube (amputee’s remark)
  3. Jules' Homepage - Gallery (my homepage, extra photos)

Paper series:

  1. [arxiv] A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control (latest paper, Jetson Nano implementation)
  2. [biorxiv] Deep Learning-Based Approaches for Decoding Motor Intent… (optimize deep learning model)
  3. [JSSC] Redundant Crossfire: A Technique to Achieve Super-Resolution… (stimulator chip and touch sensation)
  4. [biorxiv] A bioelectric neural interface towards intuitive prosthetic control for amputees (nerve interface, Neuronix chip and early deep learning model)
  5. A High-Precision Bioelectric Neural Interface Toward Human-Machine Symbiosis (my PhD dissertation, complete narrative)

The team:

  1. Prof. Zhi Yang’s lab (University of Minnesota, my lab): neural interface chip, system integration, motor decoding experiments.
  2. Prof. Qi Zhao’s lab (University of Minnesota): deep learning architecture.
  3. Dr. Edward Keefer (Nerves Inc.): microelectrode design, clinical trial.
  4. Dr. Jonathan Cheng (UT Southwestern): implant surgery.

On behalf of the team:
-Jules Anh Tuan Nguyen
Email: nguy2833@umn.edu

This is definitely one of the more interesting and remarkable uses!

Wow, I just discovered this post and went through your paper a bit. Curious about “the Scorpius system”: is there any specific reason you chose to implant the microelectrodes surgically instead of implementing them as wearables?
