Although physicist Neil Gershenfeld heads the Massachusetts Institute of Technology's new Center for Bits and Atoms, where he is plotting the future of the computer, he is no fan of today's gadgets. In his book When Things Start to Think, Gershenfeld imagines a technology-laden yet much simpler world where unobtrusive, helpful computers are incorporated into every aspect of our lives—clothes, furniture, even shoes. He describes his vision to Discover associate editor Kathy A. Svitil.
What's the mission of the Center for Bits and Atoms? The digital revolution has been all about computers, in that the world is now split into hardware and software, and different companies and people do each. Intellectually, there is a split between physical science and computer science. But many of the hardest and most interesting problems lie right at that boundary, where you can't separate hardware from software, physical science from computer science. CBA is addressing that boundary.
I'm interested in moving computing off the desktop and out into the world where we live. For instance, the Center for Bits and Atoms is developing tabletop processes that print mechanical structures: displays, actuators, and sensors. That could lead to a personal fabricator, analogous to a computer printer, that would let people take the malleability they've gotten used to in the world of computers and use it to shape the technology they want. A personal fabricator could make almost anything—a telephone, a computer, a refrigerator, a clock radio with the controls on the left side instead of on the right.
Where else might personal fabricators be useful? There is a surprising need for emerging technologies in many of the least developed places on the planet. While our needs might be fairly well met, there are billions of people on the planet whose needs are not. Their problems don't call for incremental tweaks to current technology but for a revolution.
We just deployed a field fabrication laboratory in rural India. There, the technologies coming out of conventional engineering aren't relevant to local problems, and people can't afford them anyway. For example, in a rural village there are a lot of diesel engines, such as in tractors, but people don't have a way to tune them up efficiently, the kind of thing you could get done at your corner gas station. So they asked us if we could use the technology to make a device that sets the timing of a diesel engine. It turns out to be very easy to make. In another project, we took a very simple imager, like a webcam, and turned it into a device to chemically analyze a water sample. We also developed a similar device to characterize the chemical quality of milk.
When will I be able to buy a personal fabricator of my own? Early versions look like they are going to happen very quickly. Putting all the capabilities into a box like a printer—in other words, a personal fabricator that connects to a PC—may take five years.
Couldn't these personal fabricators put industry and manufacturing companies out of business? Nobody knows. But it is reasonable to project from the history of personal computing. There is no doubt that personal computing completely shook up the computing industry at the time. On the other hand, the existence of personal computing led to much larger economic growth than the original mainframe computing industry. The same thing could happen with personal fabrication.
You often talk about how machines could enhance our world. What does that mean? We have made prototypes of a computer that fits in a shoe and is designed to recover energy from walking, so it wouldn't need batteries. By inducing tiny voltage changes that receivers at different parts of the body can detect, we can turn the body into a network capable of transmitting data. So you could shake someone's hand, transmit data from your shoe computer, and exchange electronic business cards. Or take a more general example: One of our researchers is developing a paintable computer. He's put tiny chips in a liquid medium that you can literally paint on, so you can pour out a pound or an inch of computer power wherever you need it. If it doesn't work well enough, you add another coat of paint. Your whole house could become a computer.
I'm not particularly interested in virtual reality or creating alternate realities. Our physical reality is compelling enough. I think a much more interesting question is how to move computing off the desktop and infuse it into the world, not to replace but to augment the environment we live in. Little embedded instruments that can tell you what is in your water or milk and what your engines are doing—that is all computing. It is not a big desktop computer, and it is not something that you strap all your senses into. It is something that adds value to what you are doing.
Two years ago, we did a project with the Museum of Modern Art in New York City. Once a decade they do a show on architecture, on the kinds of spaces people are living in. This time they wanted to include a lot of supporting information, but the curators had forbidden the use of computers. You don't go to MoMA to hunch over a computer keyboard. They wanted us to make the furniture, the physical environment itself, be the interface. So we instrumented a great big table in the middle of the gallery that allowed visitors to navigate through information—without apparent computing—by picking up what looked like little drink coasters.
The table was mobbed at the opening. At one point, an elderly museum benefactor came shuffling up, beaming. She pounded the table and said, "This is great! I love this! Because I hate computers and there are no computers here." Inside the table there were literally about 400 microcontrollers and about 20 Internet nodes. It was more computing than she had ever been near in her whole life. But in a sense she was absolutely right: When you bring that much computing close to people, it does go away. There are so many little devices in the environment that the environment itself becomes the interface to the information.
Can all this new technology be equitably distributed? What is important to understand is that the digital divide itself is not, in many ways, an economic divide. If anything it is a knowledge divide. The same inner-city homes in the United States that can't afford computers have very sophisticated computing in the form of video-game machines.
There are plenty of examples of advanced technology being deployed across these divides. Rural India has a terrible phone system, but almost every village has cable TV, because the cable was deployed by micro-entrepreneurs who ran tiny businesses using home-brewed satellite receivers.
Once we have computers built into our shoes and our homes, is the next step having computers built into our bodies? All the work on computer implants keeps coming back to the same conclusions: It costs a lot of money, it is very difficult to do, and it doesn't work that well. I don't see a compelling reason for it. Plus, I don't trust us to do it well, and it is a much bigger deal if a computer inside you crashes than if it happens to one outside you.
Even if they are on the outside, might smart machines eventually blur the line between what is animate and what is inanimate? This isn't going to suddenly change who we are. Some of my colleagues at the Center for Bits and Atoms recently showed that molecules in cells can be programmed to change their shape in response to radio signals. That promises to let us take control of the cellular machinery and make it produce the things we want. On the one hand, real-time control of biological systems completely blurs the boundary between living things and machines. But people will still be people, and machines will still be machines.
Will this lead to machines that can actually think? I'll pass on that discussion. Whether or not anything is thinking is a very loaded question.
There is a strong historical parallel here: It used to be interesting to have steam engines race people. When they got faster than people, it was interesting to have them race horses. Then the engines got faster still. Now we have supersonic jets, but that doesn't mean we have gotten rid of horses. It just means that the question of which is faster ceases to be interesting. In the same sense, reasonable people might differ about whether these machines are thinking. I could make a strong argument for it, but I think the question becomes irrelevant.
Do you have a favorite technology? It used to be a pencil and a piece of paper. Only this year have I stopped using a piece of paper as my PDA. It's now possible to carry around a little Linux computer—with a wireless network, a stylus, and a drawing surface—that gets close enough to the specs of a piece of paper to be worth using.
What technology could you most easily live without? The obvious one is e-mail, which everyone is completely inundated with. We are almost at the point where sending an e-mail message that doesn't need to be sent is considered as inappropriate as stepping on someone's toes or yelling in his face.