AI Is Revolutionizing Prosthetic Arm Control

Advances in the fine motion possible with prosthetic arms have not been matched by users' ability to control them. AI systems are changing that.


Prosthetic hands and arms have dramatically improved in recent years, thanks to advances that allow independently moving fingers, control over multiple joints, personalized 3D printing and so on.

Despite these breakthroughs, most users find prosthetic arms difficult to control. The most common control mechanism records the electrical activity in the arm muscles, a technique known as myoelectric sensing, and then uses these signals to actuate the prosthetic.
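
As a rough illustration of how conventional myoelectric control works (the names and thresholds below are illustrative, not the scheme of any particular device), the raw muscle signal is rectified and smoothed into an envelope, and a grip command fires when that envelope crosses a threshold:

```python
import numpy as np

def emg_envelope(samples: np.ndarray, window: int = 200) -> np.ndarray:
    """Rectify the raw EMG trace and smooth it with a moving average."""
    rectified = np.abs(samples)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def grip_command(envelope: np.ndarray, threshold: float = 0.2) -> bool:
    """Fire the grip when smoothed muscle activity crosses the threshold."""
    return bool(envelope[-1] > threshold)

# Simulated recording: quiet baseline followed by a strong contraction.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 0.05, 1000), rng.normal(0, 0.8, 300)])
print(grip_command(emg_envelope(signal)))  # True once the burst is detected
```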

The problem is that users usually have to contract their muscles in specific combinations to generate each hand or wrist motion. Learning these patterns is often counterintuitive, time consuming and frustrating.

At the heart of this problem is that neuroscientists do not know how to accurately decode the signals the brain sends through the nerves to control muscles, so prosthetic systems cannot reliably interpret a user's intent from them.

What’s needed is a way to measure and decode nerve signals so that they can be used to intuitively control prosthetic arm, hand and finger movement.

Translating Intention

Now Diu Khue Luu and Anh Tuan Nguyen at the University of Minnesota, together with colleagues, have found a way to do this using an AI decoder that learns the user’s intention from the nerve signals it senses in the arm. “We present a neuroprosthetic system to demonstrate that principle by employing an artificial intelligence (AI) agent to translate the amputee’s movement intent,” they say.

One key difference between this and electromyographic systems is that the team record the nerve signals via an implanted electrode called a peripheral nerve interface. That requires minor surgery and comes with obvious risks of infection, tissue damage and so on.

The advantage is a much cleaner signal. “These interfaces aim to enable intuitive prosthesis control purely by thoughts and achieve a natural user experience,” say the researchers. With this in place, the team record amputees’ nerve signals as they attempt specific hand movements. They have tested the approach with three subjects, each of whom received the microelectronic implant for up to 16 months.

To train the AI system, the user wears a data glove on the intact hand and then repeatedly practices a hand movement with both this hand and the phantom hand. During this process, the data glove records the intended movement while the implanted electrodes record the nerve signals in the amputated arm.
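
In machine learning terms, this training step is supervised regression: windows of nerve-signal features are the inputs and the glove's joint angles are the targets. The PyTorch sketch below shows the idea with assumed shapes and random stand-in data; the team's actual feature extraction and decoder architecture are described in their paper and are more sophisticated than this toy version.

```python
import torch
from torch import nn

# Assumed dimensions, purely for illustration.
N_FEATURES = 64   # nerve-signal features per time window
N_JOINTS = 15     # joint angles reported by the data glove

decoder = nn.Sequential(
    nn.Linear(N_FEATURES, 128),
    nn.ReLU(),
    nn.Linear(128, N_JOINTS),   # predicted joint angles for the phantom hand
)

optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Paired training data, recorded while the user mirrors each movement:
# nerve features from the implanted electrodes, joint angles from the glove.
nerve_features = torch.randn(1024, N_FEATURES)   # stand-in for real recordings
glove_angles = torch.randn(1024, N_JOINTS)       # stand-in for glove output

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(decoder(nerve_features), glove_angles)
    loss.backward()                 # learn the nerve-signal-to-movement mapping
    optimizer.step()
```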

In this way, the AI system learns to correlate patterns of nerve signals with specific hand movements. In particular, it can decode several movements at the same time, such as pinching, which requires the thumb and forefinger to flex while the other fingers remain still. The AI then controls the prosthetic arm over a Bluetooth connection.
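
At run time, the decoder then sits in a streaming loop: each new window of nerve features is decoded into simultaneous joint commands, which is what lets a pinch move two digits at once, and the result is forwarded to the hand. In the hypothetical sketch below, `send_to_prosthesis` merely stands in for the device's real Bluetooth transport:

```python
import torch
from torch import nn

# Stand-in decoder with the same assumed shapes as the training sketch above.
decoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 15))

@torch.no_grad()
def control_step(feature_window: torch.Tensor) -> list[float]:
    """Decode one window of nerve features into simultaneous joint commands."""
    angles = decoder(feature_window)   # all joints predicted in a single pass
    return angles.squeeze(0).tolist()

def send_to_prosthesis(joint_angles: list[float]) -> None:
    """Placeholder for the Bluetooth link; the real transport is device-specific."""
    print("command:", [round(a, 2) for a in joint_angles])

# Each incoming window of live nerve data becomes one motor command.
window = torch.randn(1, 64)            # stand-in for a live feature window
send_to_prosthesis(control_step(window))
```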

The result is a remarkable degree of dexterity. In tests, the subjects successfully achieve the intended action 99.2% of the time with a median reaction time of 0.81 seconds. Crucially, each movement is intuitive, with the AI system matching the intended motion. “The AI agent allows amputees to control prosthetic upper limbs with their thoughts by decoding true motor intent,” say Luu, Nguyen and co.

Tactile Sensation

The team say this system is part of their broader work on signals that flow in both directions, from the user to control the arm, and then back to the brain for tactile sensation. They point to earlier work in which users reported tactile sensation and proprioception (the body’s ability to sense motion, action and position in 3D space). “This and our previous works form the foundation to materialize a complete closed-loop human-machine bidirectional communication,” they say.

The approach also allows users to connect directly to a virtual environment, to control a computer game for example, without using a prosthetic hand. “The proposed nerve interface with an AI neural decoder allows people to manipulate remote objects using only their thoughts in an actual “telekinesis” manner,” say the team.

Of course, the system isn’t perfect. The researchers point to various kinds of fine motor control that are still difficult to achieve, such as wrist extension, wrist supination and applied force estimation. Some of these movements will require additional implanted electrode sensors. A shorter reaction time would be appreciated by most users. And implanted electrodes always run the risk of being disturbed and requiring re-implantation.

For now, this approach is a useful step forward in the technology for amputees and an exciting avenue for future work. As Luu, Nguyen and co conclude: “These results promise a future generation of prosthetic hands that can provide a natural user experience just like real hands.”


Ref: Artificial Intelligence Enables Real-Time and Intuitive Control of Prostheses via Nerve Interface: arxiv.org/abs/2203.08648
