#57

Mind Reading at a Higher Level

Improved neural prosthetic focuses on the goal of the movement instead of the individual steps.

By Lacy Schley|Monday, November 30, 2015
Erik Sorto enlists a brain-controlled robotic arm to help himself to a drink.
Spencer Kellis and Christian Klaes/Caltech

People with paralysis or an amputation can already use their minds to control robotic limbs, helping to restore their sense of independence, but the motions are often clumsy and unnatural. Researchers announced in May that they had created a neural prosthetic that gives users of artificial limbs finer, smoother movements.

Standard neural prosthetics ferry signals from the brain’s motion control center, the motor cortex, to a cable connected to a computer controlling the limb. These signals break down a physical task into individual movements — like listing the steps involved in grabbing your coffee mug. But this team went further upstream in the brain’s signaling chain and used signals from a patient’s posterior parietal cortex (PPC).

Researchers use an fMRI scan to place a pair of small electrode arrays in the brain. Each electrode in the array (above) records the activity of a single neuron. A system of computers processes the signals, decoding the person’s intent.
Caltech
The PPC is where your brain determines “the goal of the movement,” says principal investigator Richard Andersen of Caltech. In other words: “I want to grab my coffee.”

After surgeons implanted the prosthetic in a quadriplegic patient, he could use a robotic arm to shake someone’s hand and even hold a glass steady enough to drink from it on his own.

Next up: Andersen plans to integrate touch and position sensations.
