AIBO as Research Tool

Sony's AIBO robotic dog is a peek at a bizarre future when you won't know if a dog—or the pretty girl walking it—is flesh and blood or plastic and memory chips

By Christine Kenneally|Saturday, March 01, 2003



A complex array of joint-specific motors and gears gives AIBO a remarkable fluidity of motion that researchers describe in terms of "degrees of freedom." A train, for example, has one degree of freedom: forward and backward. AIBO has 20: one in the mouth, three in the neck, three in each leg, one in each ear, and two in the tail. The result is a robot that can reproduce 1,000 of the movements a real dog makes.
Photograph by Gusto.
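The 20 degrees of freedom described in the caption can be tallied joint by joint. This is just an illustrative bookkeeping sketch; the grouping and names are mine, not Sony's specification.

```python
# Tally of AIBO's 20 degrees of freedom, as enumerated in the caption.
# The dict layout is an illustrative way to organize the counts.
DEGREES_OF_FREEDOM = {
    "mouth": 1,
    "neck": 3,
    "legs": 3 * 4,   # three joints in each of four legs
    "ears": 1 * 2,   # one joint in each ear
    "tail": 2,
}

total = sum(DEGREES_OF_FREEDOM.values())
print(total)  # 20
```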

AIBO sits patiently on Frédéric Kaplan's office floor. It remains still when Kaplan leans over to switch it on, but it beeps to reassure us it's awake. Then it stirs. Raising its head, wagging its tail, and with all the poise of a Romanian gymnast finishing off a floor routine, it lifts, straightens, and stretches all four legs in unison.
    Visually, AIBO is balletic. Aurally, it's arthritic. It creaks and grinds as it raises itself to its feet. But the sound is inconsequential; AIBO is charming. It pads confidently across the floor, and when Kaplan, a young researcher at the Sony Computer Science Laboratory Paris, rolls a red ball before it, AIBO tracks the movement with its head. Good, AIBO, good! It wags its tail and— attaboy, AIBO!— pushes the ball playfully with a paw.
    At some level for the human observer, this robot is a dog. That's partly because AIBO— short for Artificially Intelligent roBOt— is a very clever piece of machinery, equipped with a 384 MHz computer processor that coordinates 1,000 different doggy moves, like extending its front paws and luxuriously stretching its back. But AIBO also looks like a dog because I'm human, and humans will believe pretty much anything.
    We are wired to see life where it isn't, to impose intelligence where there is none, and to have a wide range of emotional responses to our misguided perceptions. From our ability to bypass disbelief comes anthropomorphism as well as art. Think of all the talking animals in Aesop's fables. Think Miss Piggy. Cognitive scientists are expert at pinpointing all the ways we can make symbolic interpretations or be completely fooled. But what about the perceptions of other animals? What do dogs think of a dogbot?
    Recently scientists have been asking themselves just that. When Ádám Miklósi, a Hungarian ethologist who works with dogs and tamed wolves, first came across AIBO, he was struck by the possibilities. He and graduate student Enikő Kubinyi, both at Eötvös Loránd University in Budapest, contacted Kaplan, and from this meeting arose a series of AIBO-dog experiments with the dual goals of using the real dogs to help them discover better ways to program the robot and using the robot to help them explore the dogs' species recognition: What makes a dog recognize another dog as a dog?

Sony researcher Frédéric Kaplan trains AIBOs from two generations (the one on the left is an older model) to recognize an unfamiliar ball by waving it in front of them. "Each robot develops capabilities unique to itself based on its perceptual and social history," he says.
Photograph by Darin Mickey.

    Even before Kaplan and Miklósi introduced AIBO to Fido, a handful of animal researchers in Europe, the United States, and Japan were busy conducting experiments with their own animal robots, including a flirtatious bowerbird and a furry white ratbot. Animal behavior researchers have used a variety of dummy animals before, ranging from simple red balls of cotton wool designed as stand-ins for robins to elaborate stuffed lions that spooked their real counterparts on the Serengeti last year. But the latest animal robots are more than decoys. They are sophisticated tools that provide a new scientific perspective on the animal kingdom.
    When a real animal interacts with an animalbot, and that bot is controlled by a human, it's as though the human has donned a scale-size animal suit and disguised himself as a bird or a dog or even a bee. Because of this, biorobotics behavior experiments promise to reveal intricacies of animal relationships and perceptions that we could only guess at before. The AIBO-dog investigations are especially interesting because of the intimate relationship that exists between humans and dogs. The two species have lived together for more than 14,000 years, and in that time domestic dogs have changed themselves in fundamental ways in order to get along better with us. Dogs aren't just friendly wolves; they think differently, they see the world differently, and they have a new kind of mind because they hang with humans. Exploring this domesticated mind with robots like AIBO allows us to bypass some of our innate human biases. And getting at the real dog's-eye view may end up telling us as much about ourselves as about them.

The startled yelps captured on film at the end of one of the AIBO experiments are those of the researchers, not the dogs. At the beginning of this film, AIBO stands and then plods guilelessly toward a red plate piled with meat. It is the color of the plate that attracts the dogbot; AIBO is programmed to follow moving red objects with its gaze and to move toward stationary red objects. But it is the food, not the dinner service, that has drawn the Belgian shepherd already crouching on the other side of the plate. AIBO plods, and the shepherd growls. Any other dog would pause at this warning, but the robot just walks ahead, and the shepherd jumps over the plate to attack. Before it could be bitten too hard or suffer any serious damage, Kaplan and his colleagues rushed in to pull AIBO from harm's way. The shepherd got the meat.
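The behavior rule described above, track moving red objects with the gaze and walk toward stationary ones, can be sketched as a simple decision function. The `Percept` structure and action names here are hypothetical illustrations, not Sony's actual AIBO programming interface.

```python
# A minimal sketch of the red-object rule attributed to AIBO above.
# Percept and the action labels are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Percept:
    is_red: bool
    is_moving: bool

def choose_action(p: Percept) -> str:
    if not p.is_red:
        return "idle"             # non-red objects are ignored
    if p.is_moving:
        return "track_with_head"  # follow moving red objects with the gaze
    return "walk_toward"          # approach stationary red objects

# The red plate of meat is stationary, so AIBO plods toward it:
print(choose_action(Percept(is_red=True, is_moving=False)))  # walk_toward
```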
ROBOT MENAGERIE

RATBOT: When the clump of fur (left) covers a ratbot (center) that has been programmed to head to a food source, a real rat (right) will chase it. The ratbot has two arms that allow it to imitate a real rat's standing posture and an FM receiver for communication with a host computer.
Photograph courtesy of Atsuo Takanishi/Waseda University.

BEEBOT: Although the beebot towers over real bees, they follow its "waggle dance" in quest of nectar. The center rod is connected to an X-Y plotter that makes a figure-eight path. The razor blade mimics a vibrating wing, and the plastic sleeve delivers scented sugar water to the bees.
Photograph courtesy of Axel Michelsen.

BIRDBOT: The movements of a robotic bird placed in the bower, or courtship arena, of a male satin bowerbird are remotely controlled by a researcher in a nearby blind. The female bot can be shifted from an upright to a fully crouched position (shown above), signaling readiness to mate.
Photograph courtesy of Gail L. Patricelli.

    AIBO was attacked outright a couple of times in the experiments. (It was also chewed on, bitten, and pushed over.) While this was not the exact response the researchers were after, it was the type of response they were glad to see. Such a reaction, says Miklósi, indicates that some of the dogs see AIBO as more than just a machine.
    Identifying another animal as alive, and beyond that, recognizing another dog as a dog, or another human as a human, may appear to be wholly instinctive. In fact, it's a complicated combination of identifying certain sights, smells, and behaviors. For each species, the relative contribution of each of these will be somewhat different. The fundamental question driving the AIBO experiments was: Would the dogs recognize AIBO as a living creature? If the answer was yes, the scientists would be one step closer to working out which elements are most important for species recognition.
    Sorting out the tangle of physical senses and psychological expectations in species recognition will be the work of decades, but Kaplan and Miklósi made a start by looking at what it took for their test dogs to make a few basic distinctions: Is this object alive? Does it look like a dog? Is it worth approaching like a dog?
    The scientists set up the experiment using 40 pet dogs (the advantage of which, says Miklósi, is that their owners feed and groom them and also take them home at the end of the day). The test subjects included 24 adult dogs and 16 juveniles. The team put each dog in a room (with its owner but no other dogs), and once ready, AIBO was sent in to see how the dog would react.
    The scientists also sent in a real puppy and a toy car, one at a time. The puppy provided an obvious control. The car was used to see if the dogs would react differently to a machine that looks like a dog and a machine that doesn't resemble any living thing but moves at dog speed. In another trial, Kaplan and Miklósi introduced AIBO to the dogs while it was covered in fur that had been stored in a real puppy's sleeping box so it would smell more like a dog.
    All four test partners (AIBO, real puppy, toy car, furry AIBO) drew the dogs' interest, but there was a clear ranking of responses. The puppy usually got the most attention. The adult dogs were quicker to approach and investigate both the furry bot and the real puppy than either the non-furry robot or the car. And although the dogs still preferred the non-furry AIBO to the car, the addition of fur evoked the strongest response of all the artificial stimuli.
    The team also measured when the dogs growled at their guests. Again, the dogs responded to the real puppy and the furry AIBO in a similar way, growling at both in most situations, but not as much at the other objects.
    Introductions between canine and machine took place in two basic experimental settings, one with food and one without. The food seemed to make the identity stakes higher. With meat around, the dogs growled a bit at the furry robot but much more at the real puppy, perhaps recognizing that one posed more of a threat to their meal than the other. Nevertheless, the adult dogs still showed more interest in both the real puppy and the furry AIBO, investigating them sooner and for a longer time than the other objects.
    The strongest distinction between AIBO and the toy car was made by the younger dogs when there wasn't any food around. From an objective standpoint, AIBO has far more in common with the car than it does with another dog. Still, the young dogs growled at the robot dog (whether or not it had fur) and also at the puppy but not at the car.
    Did the dogs see AIBO as another dog? The short answer is yes. They approached it more the way they did the puppy than the way they approached the car. Although the dogs worked out pretty quickly that AIBO wasn't another dog (it didn't move fast enough), at the outset its basic body shape, the way it moved, and especially the fur piqued their interest.
    So much for the romantic idea that canines have primitive smarts that humans, in all our civilized glory, have lost. We assume that dogs have the natural ability to smell the real from the fake, but it's beginning to look as if they can be fooled in the same kinds of ways that we can.
    And it's not just dogs. Other scientists are disguising themselves, like wolves in sheep's clothing, and learning what it takes to cheat all sorts of animals. Unsurprisingly, the less like us the animal is, the less similarity there is between what it takes to pull the wool over our eyes and what it takes to pull the wool over theirs, if they have any.


Kaplan describes AIBO as "a computer on legs that thinks more with its body than its head." A central processing unit in the body is the robot's brain, which sorts through input from pressure sensors on the head, back, chin, and paws, as well as from a gyrometer that helps AIBO maintain its equilibrium. A smaller circuit board in the head handles rapid image preprocessing and is fed by a color camera and an infrared distance sensor.
Photograph by Gusto.

In the late 1980s and early 1990s, Axel Michelsen at Odense University in Denmark ran a series of experiments using a robot bee. The beebot was made of a brass body and some wires covered in beeswax and connected to some motors. It was 13 mm long (the same as a worker bee) and 5 mm wide (a little broader). A cheap razor blade broken in half stood in for the wings.
    It didn't look like a bee; it didn't feel like a bee; and before it was put in a hive, it didn't smell like a bee. From a human perspective, the beebot was little more than a small switch with wire sticking out of it. But it worked pretty well.
    The beebot was placed on the hive's dance floor and programmed to simulate a complicated "waggle dance." Bees use this dance to map the location of food for one another. The overall pattern of the dance is two adjacent ovals connected by a straight line called a waggle run, during which a female bee swings her abdomen back and forth like a pendulum, 13 to 15 times a second. The upward angle of the line represents the direction of the nectar in relation to the sun, and the duration of the waggle run indicates distance.
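The dance geometry above amounts to a simple decoding rule: the run's angle maps to a bearing relative to the sun, and its duration maps to distance. A rough sketch follows; the distance calibration constant is a made-up placeholder, not a measured value from Michelsen's work.

```python
# A rough decoder for the waggle-dance geometry described above.
# METERS_PER_SECOND_OF_WAGGLE is a hypothetical calibration, chosen
# only to make the example concrete.
METERS_PER_SECOND_OF_WAGGLE = 1000.0

def decode_waggle(run_angle_deg: float, run_duration_s: float,
                  sun_azimuth_deg: float) -> tuple[float, float]:
    """Return (compass bearing to the food in degrees, distance in meters)."""
    bearing = (sun_azimuth_deg + run_angle_deg) % 360.0
    distance = run_duration_s * METERS_PER_SECOND_OF_WAGGLE
    return bearing, distance

# A run angled 30 degrees from vertical, lasting half a second,
# with the sun at azimuth 180 degrees:
print(decode_waggle(30.0, 0.5, 180.0))  # (210.0, 500.0)
```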
    This odd-looking beebot was acceptable to the hive's inhabitants for a number of reasons. The inside of a hive is completely dark; bees don't see the dance at all. Bees don't hear the dance either. "There's no indication that bees can hear in the traditional sense of the word," Michelsen says. "They don't have a pressure-sensitive ear." So for the residents of the hive, the beebot was more bee than bot because of its dance moves.
    Other researchers have argued that bees decipher the honeycomb vibrations generated by the dance to determine the location of nectar. But Michelsen's experiment suggests the bees are responding to oscillations of air created by the dance moves. In any case, bees decoded his waggling robot's dance correctly and found the scented baits that he had planted.
    Did the bees think the bot was another bee? For all practical purposes, they accepted it. This gave Michelsen an opportunity to confirm what prior research about the nature of the dance could only suggest: Some moves, like the upward wagging run, conveyed more information than others. And in the same way that the dogs responded differently in different situations— growling more at their metal test partners when food was at stake— Michelsen believes that acceptance of the beebot depended on the context. He attributes some of the experiment's success to the fact that it was low season for nectar. In high season, when there's a lot of dancing to compete with, he says, the real bees would not have paid much attention to the fake bee.
    Barbara Webb, at the Center for Cognitive and Computational Neuroscience at Stirling University in Scotland, recently published the first major overview of biorobotics in animal behavior. After surveying the field, she concluded that robotics can offer unprecedented insights into both the physiological and behavioral workings of an animal.
    Robot builders have used biology for inspiration and ideas for a long time. But now the exchange is becoming more two way. The new systems being built, says Webb, "answer questions for biologists as well as engineers." So great is the potential of robotics to answer questions in animal behavior research, she says, it may amount to an entirely new methodology that promises to be incredibly useful because robots must operate in the real world, responding to challenges that real animals respond to (walking, seeing, being bitten). When you build a bot and repeatedly test it out, Webb says, "you learn something about the real environment that you can overlook very easily."
    In addition to exposing the intricacies of species recognition and communication, robots can be used to investigate predator-prey relationships, as well as attack responses and mating habits in mammals, insects, and birds. For her Ph.D. at the University of Maryland, Gail Patricelli took a birdbot into the Australian rain forest to analyze the sexual selection of bowerbirds. Like Michelsen's beebot, Patricelli's robot was built from scratch. She sat down with a mechanical engineer and watched videotapes of real bowerbirds. They broke down the movements into different planes that could be re-created with servomotors and then built a metal frame with a computer chip that responded to a remote control.
    Patricelli and her research team spent a month and a half in a little shack in the middle of the rain forest trying to find ways to drape the skin of a real female bowerbird over her metal creation. "I worked on every little feather for a long time," she says. She adjusted the mechanics, trimmed down the metal, and moved around the motors. Finally, she wove craft wire through a plastic mesh typically used for Christmas ornaments and used it to attach the skin to the metal.
    She then set her battery-powered bot in a bower and controlled its movement from a blind yards away. The female birdbot was able to fluff up its feathers, move its head, and crouch by rising up and tilting forward; crouching signals to a male bowerbird that a female is ready to mate. Because the robot allowed Patricelli to simulate female behavior, she was able to measure the male response.
    Patricelli's robot showed with its movements that female bowerbirds control the intensity of male mating displays. If the female indicates, with fluffing and crouching, that she's comfortable and won't be scared away, the male increases the intensity of his mating behavior (puffing of feathers, extending the wings, running, and making buzzing noises). If the female reduces her crouching, the male reduces the intensity of his display in response. The males who modulated the intensity of their displays in response to the birdbot also tended to be the most successful in mating with females in the wild.
    Gerald Borgia, Patricelli's adviser and a coauthor of a bowerbird paper in Nature, said that in these intricate courtships "you don't know if the males are driving the females or the females are driving the males. But if you can take one end of it and control it, you can get a better idea of who's driving what." Borgia and Patricelli wanted to use a male robot as well, but male bowerbird behaviors are too complex to reproduce with today's robot technology.
    What does it say about bowerbird perception that the males observed in the experiment thought the robot was realistic enough to mate with? "Sperm is cheap," Borgia says. Patricelli, now a postdoc at Cornell University in New York, laughs at the question. "I consider it a compliment, actually." She adds, "My cat didn't fall for it. She just stared."
    This split between different species' perceptions is one of the most interesting parts of the robot behavior experiments. It reveals, among other things, what different animals consider significant and what's beneath their notice. Sometimes, two similar responses are inspired by completely different perceptions. Female bowerbirds are very important to male bowerbirds, and the birdbot met enough of the males' mating criteria, so they considered it real. A female bowerbird is of far less importance to the casual human observer, and it is perhaps because of this indifference that the robot also fooled a number of humans. On the other hand, the cat, whose intentions are predatory, may need to see different qualities and different behaviors to bother expending energy on the robot.

While taking a stroll in the laboratory, AIBO gradually senses the presence of a ball it has seen before. After briefly attempting to push the ball with its paws and head, AIBO is distracted and decides to lie down and shut itself off.
Photographs by Darin Mickey.

Until recently, scientists have had limited stimuli, like videos or dummies, to assess what matters to dogs. Of course, you don't take a species out for its daily walk for 14,000 years without being able to claim some empathic understanding. But that sort of observation and introspection often gets us into trouble. Do we see what we think we see because it's really there or because we can only understand the motivations of other animals through the muddy filters that make us human?
    For that matter, we don't really know how it works in reverse. When dogs observe us, do they understand us in the same way we understand ourselves, or do they make "caninopomorphic" assumptions about the behavior and motives of humans?
    By using AIBO, Kaplan and Miklósi removed some of these innate biases typically brought to understanding dog thought. In their efforts to measure dogs more scientifically, they are at the forefront of a trend. Marc Hauser, a Harvard University biologist and the author of Wild Minds, says, "A lot of the work that's been done on dog cognition has not really been careful science until maybe the last five to 10 years."
    It used to be that dogs were ignored scientifically because no one could untangle their relationship with humans. Now this bond is precisely why they're considered interesting. "Dogs are back!" says Miklósi. "They were really abandoned for the last 20 years because people thought that interesting animals were in the wild and that you have to go to Africa to do exciting stuff. Now people are finding out that dogs are very important and worth investigating."
    In their AIBO experiments, Kaplan, Miklósi, and Kubinyi learned that species recognition is not completely instinctive but something dogs acquire with experience. The puppies in the AIBO experiments showed more interest in all the test partners, whereas the older dogs were more discriminating between live test partners and nonliving partners.
    The crucial factor in the dogs' eventual loss of interest was AIBO's lack of speed. The dogs would approach, make a play gesture, and wait a second. Because AIBO didn't respond, they gave up straightaway. If AIBO started to move again, the dogs showed interest again, but the same thing would happen. The bot just didn't interact. The lesson is this: If you want to be accepted as a dog, you have to move at a dog's pace and respond to dog cues, such as play behavior.
    The scientists also confirmed that the dogs' species recognition was not an automatic judgment. Dogs use several senses, including vision, hearing, and smell, to identify other dogs as dogs. They approach each other with a particular orientation and then use their senses hierarchically, switching from one to another in greeting. The visual, like basic body shape and movement, piques their interest, then the sniffing commences. First a sniff at the rear end: Are you a dog? Then a sniff at the front end: Are you a dog? And on it goes.
    Of the senses, vision mattered more in the AIBO experiments than most people might expect. "Dogs are not 100 percent smell, in contrast to what people believe," says Miklósi. He argues that while dogs have excellent olfactory abilities, they compromised some of these natural talents in order to live with humans. Basically, he says, "humans are a dog's environment." Although they still smell as sharply as their wolf forebears, dogs have also developed the ability to interpret visual cues from species other than their own. This is one of the legacies of their evolution in a domestic setting; it's more useful to observe communicative signals from humans than it is to smell them.
    These findings are clues to how much we can learn about ourselves from studying dogs. Because we are the dogs' environment, their evolutionary changes emphasize just how important visual information is to us.
    It may be that some qualities of our behavior or environment are as fundamental to us as vision, but we may not have sufficient perspective to know that. If we can examine ourselves from a distance, in this case from a canine point of view, such traits may come to light and surprise us.

The dusty remains of Jean-Jacques Rousseau lie in a crypt two blocks from Sony's labs in Paris. "Rousseau!" Kaplan exclaims, shaking his head at the legacy of the 18th-century philosopher. As the father of the romantic movement, Rousseau had a talent for glorifying nature. He advocated passion over reason and believed that humanity's truest and happiest state was a primitive one. Civilization was considered a corrupting force. "Romanticism was terrible for the machines," says Kaplan ruefully.
    But like its founder, the movement against machines is truly dead. Animal-like machines may unlock what Rousseau valued most— the mysteries of the natural world and the uncivilized mind. And although Miklósi was ultimately disappointed in AIBO's performance in the laboratory— a really effective dogbot will have to move faster— his experiments show the way to new horizons of scientific knowledge by giving us the chance to inhabit another animal's hide.




The Sony Computer Science Laboratories Paris Web site describes their AIBO research and features a video clip of the Belgian shepherd attacking AIBO: www.csl.sony.fr/Research/Experiments/DogAIBO/index.php.

At www.aibo.com, Sony's slick AIBO consumer site, view the full AIBO lineup along with the accessories and software that can be purchased to enhance the robodogs.

A paper by Barbara Webb explores the usefulness of robots in animal behavior studies. "Can Robots Make Good Models of Biological Behavior?" Behavioral and Brain Sciences 24 (6) 2001. An abstract is available at www.bbsonline.org.

Learn about Barbara Webb's current research at Stirling University in Scotland, including work using robots to model crickets: ganglion.stir.ac.uk.

NOVA Online interviewed Gerald Borgia about his bowerbird research, including the robotic bowerbird created by Gail Patricelli: www.pbs.org/wgbh/nova/bowerbirds/trail.html.

There are more photographs and information about the ratbot on Atsuo Takanishi's Web site. In addition to the ratbot, the lab has worked on a flutist robot and a dental robot. Check them all out at www.takanishi.mech.waseda.ac.jp.