John Donoghue, the director of the Institute for Brain Science at Brown University, could not contain his excitement. For years he had been working on a revolutionary method to pick up brain signals from paralyzed patients and translate them into commands to move mechanical limbs. If all went well in this experiment, Cathy Hutchinson, a 58-year-old woman who lost the use of her limbs in a stroke, would control a robotic arm and hand and use them to lift a bottle of coffee to her mouth—just by thinking. "Guys," Donoghue told his collaborators, "buy the most expensive camera we can afford and shoot this in high-definition. This is a historic moment."
And so it was. Hutchinson sipped coffee from a bottle, the first time she had served herself in 14 years and the first time a person had ever guided a robotic limb with thought alone. The achievement was reported in a May 2012 Nature article.
In a sense Donoghue, 63, has been building up to this moment all his life. As a child, he suffered from Legg-Calvé-Perthes disease, which prevented him from walking for two years. In his first job after college, he worked at the Walter E. Fernald State School, an institution for the mentally handicapped. "I was looking at brains in the lab and then looking out the window at people who had brain diseases that completely took away their humanity, their ability to interact," he says. "I've been trying to understand what the brain is doing because to me the brain is the organ of our humanity. It gives us our mental life, and that makes us what we are." DISCOVER senior editor Kevin Berger spoke with Donoghue in his Brown University office.
You have found a way to help paralyzed people by converting brain signals into computer code to maneuver a robotic arm. How does that work?
Neurons in the brain create electric signals. When the neurons are sufficiently tickled by inputs, they fire electrical impulses called spikes. We have a simple tool to record those spikes, the microelectrode, which has been around since the 1930s. EEG [electroencephalography] electrodes record the neurons' activity from outside the head or on top of the cortex, but the resolution is blurry. It's sort of like trying to listen to a crowd in a sports stadium from the Goodyear blimp. You need to drop the microphone right next to people's mouths to really hear them.
So you created that neural microphone—a silicon chip with 100 electrodes implanted in the brain—and connected it with wires to a computer. Then what do you do?
Once I have the microelectrode array in your motor cortex, the brain's command center for movement, you watch a cursor moving left and right on a video screen. I then say, "Imagine you're doing that by moving your hand on a mouse." As you imagine doing this and the cursor moves to the right, I record the number of spikes, and it's five. When the cursor moves to the left, it's two. So now I have a coding model: five means right, two means left.
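The coding model Donoghue describes can be illustrated with a small sketch. The numbers and trial data below are hypothetical, chosen only to echo his five-spikes-means-right, two-spikes-means-left example; the idea is simply to calibrate a mean spike count per imagined direction and then classify new counts by whichever mean they fall closest to.

```python
import numpy as np

# Hypothetical spike counts from one neuron, recorded while the subject
# imagines moving a mouse right or left (one count per calibration trial).
right_trials = np.array([5, 6, 5, 4, 5])
left_trials = np.array([2, 1, 2, 3, 2])

# The "coding model": the neuron's average firing for each imagined direction.
mean_right = right_trials.mean()
mean_left = left_trials.mean()

def decode(spike_count):
    """Classify a new spike count as 'right' or 'left' by the nearest mean."""
    if abs(spike_count - mean_right) < abs(spike_count - mean_left):
        return "right"
    return "left"

print(decode(5))  # "right"
print(decode(2))  # "left"
```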
You can tell what I want to do by recording a single neuron? But we have 100 billion neurons in our brain!
That's what's so remarkable. The brain operates over broad networks. There's a tendency to think you've got one neuron that's saying "left." But if that one cell dies, it doesn't make any difference, because the message is distributed over many neurons. Of the many millions of neurons in the motor cortex, most of them have some kind of information about leftness. Now, the code for a single neuron is not so simple. Sometimes imagining left might produce two spikes, sometimes four. It's variable. So we average a set of neurons together. With [my patient] Cathy, we were using a few dozen neurons, and the computer decoded the likelihood that they were signaling "go left" or "go right."
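One simplified way to picture that likelihood decoding is below. This is not the actual decoder Donoghue's team used; it is a sketch that assumes each neuron's spike count follows Poisson statistics around a direction-specific mean rate, with the rates and trial data invented for illustration. The point it shows is the one he makes: any single neuron is noisy and variable, but pooling the evidence from a few dozen neurons and asking which direction makes the whole population's activity most likely gives a reliable answer.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 30  # "a few dozen neurons", as in the interview

# Hypothetical tuning: each neuron's mean spike count for each imagined
# direction. In practice these would be estimated from calibration trials;
# here they are simply drawn at random.
rate_left = rng.uniform(1.0, 4.0, n_neurons)
rate_right = rng.uniform(1.0, 4.0, n_neurons)

def log_likelihood(counts, rates):
    """Poisson log-likelihood of the population spike counts, dropping the
    count-factorial term, which is identical for both directions."""
    return np.sum(counts * np.log(rates) - rates)

def decode(counts):
    """Choose the direction whose rates make the observed activity most likely."""
    if log_likelihood(counts, rate_left) > log_likelihood(counts, rate_right):
        return "left"
    return "right"

# Simulate one imagined-"right" trial: every neuron's count is noisy and no
# single cell is decisive, but the population vote is usually correct.
trial_counts = rng.poisson(rate_right)
print(decode(trial_counts))
```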