Horatio "Doc" Beardsley sits in a small, windowless room in the Entertainment Technology Center at Pittsburgh's Carnegie Mellon University, chatting away while he awaits a minor checkup. In a slightly blustery voice, he discusses his life experiences, describes his inventions, and answers questions, all with a corny sense of humor. "How old are you?" I ask. "I'm somewhere between dentures and death. More toward the death side," he answers.
With his big blue eyes, bushy gray beard and mustache, and creaky conversational style, he looks and sounds like an eccentric old scientist—exactly as his creators intended.
Doc is a fake, a robot programmed to respond to spoken keywords with canned lines. He will start talking spontaneously after 6.5 seconds of silence, feign forgetfulness if he cannot match input to output, and generally bluff his way through the art of conversation.
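Those rules (canned replies keyed to spoken words, spontaneous chatter after 6.5 seconds of silence, a feigned lapse of memory when nothing matches) amount to a simple event loop. Here is a minimal sketch in Python; the functions listen, match_response, and speak, and the two queues of prerecorded lines, are hypothetical stand-ins for illustration, not details of the actual system.

```python
import time

SILENCE_LIMIT = 6.5    # seconds of quiet before Doc volunteers a remark
MATCH_THRESHOLD = 0.4  # below this, Doc plays forgetful instead of guessing

def run_dialogue_loop(listen, match_response, speak, small_talk, forgetful_lines):
    """Crude event loop: answer keyword matches, fill silences, bluff on misses."""
    last_heard = time.time()
    while True:
        utterance = listen(timeout=0.5)           # hypothetical: returns None if nothing was said
        if utterance is None:
            if time.time() - last_heard > SILENCE_LIMIT:
                speak(next(small_talk))           # break the silence spontaneously
                last_heard = time.time()
            continue
        last_heard = time.time()
        reply, score = match_response(utterance)  # best canned line and how well it matched
        if score < MATCH_THRESHOLD:
            speak(next(forgetful_lines))          # e.g. "Eh? My memory isn't what it used to be."
        else:
            speak(reply)
```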
Not long ago, computer scientists aspired to create silicon brains that could mimic the workings of the human mind. Doc Beardsley falls far short of that ambition, but his clever mix of animatronics, theater, speech recognition, and storytelling is remarkably effective at making visitors feel as if they are dealing with a conscious being.
Long before anyone develops true artificial intelligence, pseudo-smart robots may be taking orders in restaurants, helping handicapped people perform daily chores, baby-sitting kids, and keeping us from boredom and loneliness. Todd Camill, a research engineer at Carnegie Mellon's Robotics Institute, says robots endowed with this sort of synthetic intelligence could soon make their public debut as animatronic characters in theme parks and museums.
During a series of upgrades, Doc Beardsley got upper eyelids and more servo motors (left), along with realistic latex skin (right) to help audiences accept him as a living character.
Photographs courtesy of Bill Mitas/Carnegie Mellon University
Doc Beardsley's life began about two years ago when Camill and Tim Eck, a recent graduate of the Entertainment Technology Center program, joined forces. They wanted to show that a robot could be a captivating performer even if it did not have massive brainpower. Their key goal was merely to create a convincing illusion of intelligence. "We'll do whatever it takes to make the audience enjoy the show. We're not above anything that others might consider dirty tricks," Camill says.
One trick is using off-the-shelf technology. Doc Beardsley has hidden wires connecting him to a pair of microphones, which allows him to gaze attentively in the direction of a speaker. A publicly available speech-recognition program called Sphinx, developed at Carnegie Mellon, enables the robot to recognize several thousand spoken words. Doc's software compares the words with a stored list of questions a person is likely to ask and selects a response that scores the closest match.
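The matching step itself can be as plain as counting keyword overlap between what was heard and each scripted question. The sketch below illustrates that idea only; the script dictionary, the sound-file names, and the scoring rule are invented for the example rather than taken from the project's code.

```python
def score(heard_words, question_keywords):
    """Fraction of a stored question's keywords that appear in what was heard."""
    heard, keys = set(heard_words), set(question_keywords)
    return len(heard & keys) / len(keys) if keys else 0.0

def pick_response(heard_words, script):
    """script maps a tuple of expected keywords to a prerecorded reply."""
    best_reply, best_score = None, 0.0
    for keywords, reply in script.items():
        s = score(heard_words, keywords)
        if s > best_score:
            best_reply, best_score = reply, s
    return best_reply, best_score

# Hypothetical script entries: one stray keyword can cue a whole prerecorded line.
script = {
    ("how", "old", "are", "you"): "doc_age_joke.wav",
    ("tea", "jam", "bread"): "sound_of_music_lyric.wav",
}
print(pick_response(["do", "you", "like", "bread"], script))
```

With only "bread" in common, the second entry still scores highest, which is roughly how a single overheard word can launch Doc into a song.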
This method is a modern elaboration of Eliza, a computer-therapist program that enthralled students at the Massachusetts Institute of Technology more than three decades ago. The Carnegie Mellon researchers also use relatively low-tech, prerecorded sounds rather than voice synthesizers, because synthesized speech cannot produce the fine inflections needed for Doc's comic banter.
Another helpful shortcut is to use the audience as an ally. "We readily attribute intelligence to lesser machinery such as cars. The real issue is how we can engage humans in the process to establish believability," says Ronald Arkin, director of the Mobile Robot Laboratory at the Georgia Institute of Technology. He has literally written the book on this kind of benevolent deception: Behavior-Based Robotics. Camill and company are expanding on Arkin's lessons by studying how small test audiences—high school kids, visiting college students, or local volunteers—respond to Doc's antics. "We learned a lot about how people want to, and do not want to, interact with these characters," says Eck.
The first hurdle is getting people to react at all. "What do you ask a conversational animatronic character standing in front of you?" Eck says. He addressed the problem by stealing techniques from movies, TV, and theater, which use fade-outs and story arcs to let audiences know what to expect.
In a performance space at the Entertainment Technology Center, Doc hosts a game show called Big Time. The robot divides the audience into two teams, asks silly questions, and awards points based on the responses he hears. Once a man in the audience mentioned bread. That keyword called up a song lyric from Doc's memory file: "Tea, a drink with jam and bread," from The Sound of Music. Doc began singing, and suddenly the interaction clicked. "The guy responded, 'Oh, The Sound of Music! I love that movie,' " remembers Ron Weaver, another recent graduate who worked on the project. The movie reference then triggered a sentence about Austrian mountains, leaving the man feeling as if the robot had responded to him personally.
Stephen Jacobsen, chairman of Sarcos, is playing a lead role in bringing such theatrical robotics to the public. His company, located in Salt Lake City, builds animatronic machines for Universal Studios and for several of the Walt Disney theme parks. Unlike Doc Beardsley, the machines are not yet interactive, but audiences respond to them in a visceral way—perhaps because science fiction has conditioned people to expect a lot from robots.
His company's hyperrealistic animatronics play on those expectations and help reinforce them. Like Carnegie Mellon's researchers, Sarcos's engineers rely heavily on sleight of hand. A seemingly autonomous character might rely on a concealed 700-pound hydraulic supply for power, Jacobsen says.
Despite a surge of computer-generated characters, filmmakers continue to rely on animatronics to create many special effects. These physical characters raise the level of performance by a movie's human stars, says Stan Winston, a four-time Oscar-winning creature-effects director and creator who built the original Terminator, the aliens in Aliens, and the Teddy robot in AI: Artificial Intelligence. "Fifty percent of acting is reacting. An actor will give you a better performance when the actor he's acting with creates a better performance," Winston says. He is looking to improve that interplay by tapping eye-tracking software from MIT to keep the machines' eyes lined up with those of the actors.
Cynthia Breazeal, an assistant professor of media arts and sciences at MIT's Media Lab, says advances in robots built for entertainment will lead to parallel advances in sociable robots for many other applications. She notes that RoboCup, a playful international robot soccer tournament, has fostered research into multi-robot cooperation.
Compared with her colleagues at Carnegie Mellon, Breazeal is focused less on near-term applications than on fundamentals of robotic behavior: how to make machines behave realistically in social situations and evoke normal human responses during activities such as leading a discussion or reading to a group of children. Her best-known project is an expressive robot called Kismet (see "The Robot That Loves People," Discover, October 1999). She and Winston are now collaborating to extend that work. Winston's studio is building a small, furry robot, and Breazeal's team is writing adaptive software to give the robot meaningful social skills.
Ultimately, researchers are probably taking different paths to the same destination. The more lifelike robots become, the more fun and useful they can be. Sony's robotic dog, Aibo, shows the potential of an interactive mechanical pet. By the end of the year, the company could begin selling a 23-inch-tall humanoid robot that engages in rudimentary conversation and responds personally to different users. A joint Carnegie Mellon-University of Pittsburgh team is developing Nursebot, a personable machine to aid the elderly. Someday the line between fake and genuine intelligence may begin to blur for real. "I'd love to see a robot that could improvise with you, or that has a sense of humor, or an intent," Camill says.
The Interactive Animatronics Initiative has an informative site that includes photos, videos, and sound clips: www.etc.cmu.edu/projects/iai.
Designing Sociable Robots by Cynthia L. Breazeal (MIT Press, 2002) is a somewhat technical introduction to her work. Breazeal also has a Web site devoted to her Kismet project: www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html.