Emerging Technology

Are you ready for computers that speed up the process of evolution and teach themselves to think?

By Steven Johnson|Friday, August 01, 2003
RELATED TAGS: COMPUTERS, GENETICS



Illustration by Leo Espinosa

On the screen, an animated figure takes a step forward and tries to walk. Instead it collapses immediately, falls on its back, and flails its legs helplessly. Then it reappears at the left of the screen, takes a few delicate baby steps, and falls again. Returning to the screen, it raises its knees, takes six or so confident strides, and drops on its side. After trying over and over again to walk, the figure finally marches successfully across the screen as though its motions had been captured directly from videos of a human walking.

This little film won't win an Oscar for Best Animated Short, but the software that generated it stands as a small miracle of computer programming. The figure was not taught how to walk by an offscreen animator; it evolved the capacity for walking on its own. The intelligence to do so came from some clever programming that tries to mimic nature's ability to pass along successful genes.

The idea is called a genetic algorithm. It creates a random population of potential solutions, then tests each one for success, selecting the best of the batch to pass on their "genes" to the next generation, including slight mutations to introduce variation. The process is repeated until the program evolves a workable solution. Originally developed in the 1960s by John Holland at the University of Michigan, genetic algorithms are increasingly being harnessed for real-world tasks such as designing more efficient refrigerators.
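The cycle described above — random population, fitness test, selection, mutation, repeat — can be sketched in a few lines of Python. This is a generic illustration, not Holland's or Reil's actual code; the toy fitness function (reward genomes whose values sum close to 5) and all the parameter values are stand-ins:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def evolve(fitness, genome_len=10, pop_size=50, generations=100,
           mutation_rate=0.1):
    """Minimal genetic algorithm: evolve a list of numbers that
    maximizes `fitness`. All parameter values are illustrative."""
    # Start with a random population of candidate solutions.
    pop = [[random.uniform(-1, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Test each candidate; the fittest half survives.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        # Refill the population with slightly mutated copies
        # of the survivors -- nature's trick, in miniature.
        children = [[g + random.gauss(0, mutation_rate) for g in parent]
                    for parent in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Toy problem: evolve a genome whose values sum as close to 5 as possible.
best = evolve(lambda g: -abs(sum(g) - 5))
```

After a hundred generations the best genome's sum lands within a hair of the target — a workable solution no one spelled out in advance.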

Genetic algorithms make it possible for computers to do something profound, something that looks an awful lot like thinking. And that little animated figure learning how to walk showcases some design developments that permit computers to make their own decisions—without guidance from humans.

The payoff is immediate and obvious for creators of popular entertainment. Most big-budget Hollywood movies or action-oriented video games are teeming with walking (and running and jumping) computer-rendered figures. For these characters to seem believable, they have to move in convincing ways, which means that somehow they have to be taught how to walk. Until recently, filmmakers either had to instruct each limb to move in a particular way or they had to map in three dimensions a real person's movements and apply that information to a virtual character. You can see the approach in the way the character Gollum moves in Lord of the Rings: The Two Towers. That laborious approach creates convincing results, but they're notoriously inflexible. If animators record someone walking downhill for one scene, and then decide later that the character needs to trip over a rock along the way, they have to go back and choreograph the whole sequence all over again.

Instead, Torsten Reil, an Oxford researcher turned animation entrepreneur, decided to borrow a page from nature and use the power of evolution to solve the problem of making a digitized character move convincingly. "First, we created a simple stick figure: It's got gravity; it's got joints," he explains. "Then we put virtual muscles in and a neural network that controlled the muscles. The problem is: How do you get the network to do what you want it to do? If you just have a randomly assembled neural network, it will send quite complex signals to the muscles, but that's usually not walking—it's more like some random twitches." The muscles all work, and they're wired up to the central nervous system, but the character still doesn't know anything about walking.

The character's body plan involved 700 distinct parameters that needed to be optimized to teach it how to walk like a human. "If you look at that system with your human eyes, there's no way you can do it on your own, because the system is just too complex," Reil says. "That's where evolution comes in."

Reil and his team created a genetic algorithm to explore the potential ways that the figure's control system could be refined. The ingredients of a genetic algorithm are actually relatively simple: a population of "organisms," each with a distinct set of "genes"; rules for the mutation and recombination of those genes; and a "fitness function" to evaluate which organisms are the most promising in each generation. In this case, the fitness function was "distance traveled from origin without falling over."

The algorithm generated 100 animated characters, each with a randomly assembled neural network controlling its muscles. Then the algorithm let them all try walking. Predictably enough, the first generation was almost completely inept. But a few figures were slightly better than the rest—they took one hesitant step before crumbling to the ground. By the standards of the fitness function, they became the winners of round one. The software made 20 copies of their neural networks, introduced subtle mutations in each of them, added 80 new participants with randomly wired networks, and started the next generation walking.
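One generation of that scheme — rank 100 walkers by the fitness function, copy the winner 20 times with subtle mutations, and top the population back up with 80 fresh random networks — might be sketched like this. The function names, the mutation rate, and the stand-in fitness function are all assumptions for illustration; only the population numbers come from the article:

```python
import random

POP_SIZE = 100      # characters per generation (from the article)
ELITE_COPIES = 20   # mutated copies of the best walker
FRESH = POP_SIZE - ELITE_COPIES  # 80 newly randomized networks

def random_network(n_weights=700):
    # Stand-in for a randomly wired neural controller; the article
    # says Reil's model had some 700 parameters to optimize.
    return [random.uniform(-1, 1) for _ in range(n_weights)]

def mutate(network, rate=0.05):
    # Introduce subtle variation into a copied network.
    return [w + random.gauss(0, rate) for w in network]

def next_generation(population, fitness):
    # Rank this generation's walkers (fitness = distance traveled
    # without falling over, simulated elsewhere).
    ranked = sorted(population, key=fitness, reverse=True)
    best = ranked[0]
    # 20 mutated copies of the winner plus 80 fresh random networks.
    return ([mutate(best) for _ in range(ELITE_COPIES)] +
            [random_network(len(best)) for _ in range(FRESH)])
```

The fresh random networks keep the population from settling too early on one mediocre gait, while the mutated copies refine whatever already works.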

Like organic life, genetic algorithms come in two primary flavors: those that feature sex and those that don't. Some algorithms "mate" fitness-function survivors, recombining genes in the process. Others clone the most successful solutions and introduce variation purely through mutations.
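The two flavors differ only in how a child genome is produced. A common way to implement each (single-point crossover for the sexual variant is one standard choice among several):

```python
import random

def crossover(mom, dad):
    """'Sexual' variant: recombine two parents' genes at a random
    cut point (single-point crossover)."""
    cut = random.randrange(1, len(mom))
    return mom[:cut] + dad[cut:]

def clone_and_mutate(parent, rate=0.1):
    """'Asexual' variant: copy one successful parent and rely on
    mutation alone to introduce variation."""
    return [g + random.gauss(0, rate) for g in parent]
```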

Genetic algorithms invariably have surprises. Reil's animations rapidly advanced in their ability to travel without falling, but they didn't always walk. "We got some creatures that didn't walk at all but had these very strange ways of moving forward: crawling or doing somersaults." The creatures were playing by the rules of the game, so Reil had to change the rules. "We had to put in a few exceptions: It's not just distance traveled, it's distance traveled without the center of mass going below a certain point."
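Reil's amended rule amounts to a guard clause on the fitness function — something like the following sketch, where the threshold value is a made-up placeholder:

```python
def fitness(distance, min_center_of_mass, threshold=0.5):
    """Amended fitness rule: distance traveled counts only if the
    center of mass never dips below a threshold, disqualifying
    crawlers and somersaulters. Threshold value is illustrative."""
    return distance if min_center_of_mass >= threshold else 0.0
```

Small changes to the rules of the game like this one steer evolution away from loopholes without dictating how the figure should actually move.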

Eventually, Reil optimized the procedure to take only 20 generations and a few minutes of computation time. The team created a short time-lapse video that shows sample clips from several generations along the way, including the best walker from generation one (the initial figure flailing on the ground) and ending with the successful striding figure in the 20th generation.

This is one of those situations in which reinventing the wheel is a good thing. Watching the time-lapse video clips, one can't help but marvel at how this virtual evolution roughly parallels the real-world evolution of our ancestors millions of years ago, when they first began to walk upright across the savannas of Africa. The stick figure strides convincingly not because someone engineered it to do so but because an evolutionary process allowed the figure itself to find its way to that distinct pattern of movement and muscular control.

The genetic algorithm doesn't make the computer self-aware in a HAL 9000 kind of way, but it does make the computer genuinely creative, capable of imaginative leaps and subtle connections that might elude the minds of human engineers. And the end result is a useful product, now incorporated into an animation software package called Endorphin.

Reil and his team are not alone in unleashing genetic algorithms on practical tasks. Bill Gross and his team of inventors at Idealab in Pasadena, California, are using genetic algorithms to develop a new solar energy device (see "Catch the Fire," page 52). Gross believes genetic algorithms have the potential to revolutionize engineering. Instead of using software as merely a visualization tool that helps draw a contraption, he envisions genetic algorithms that can handle the entire design process. You define your organism, your genes, and your fitness function and let the software do the hard work of actually figuring it out.

"I think this is the way engineering should be done: Instead of defining your part or your circuit board, define your objective and let the software evolve the answer. Let's say I want a table. Instead of drawing out a table, you say, My constraints are these: I want a plane at this height, with this sideways rigidity, and so on. And then you tell the software, OK, you've got bars, beams, screws, bolts—make the best thing you can at the lowest cost."

Genetic algorithm advocates often talk about their software in the language of ecosystems: predators and prey, species and resources. But Gross has another idea—less rain forest and more assembly line. "Let's say you give the software access to the entire McMaster-Carr industrial supply catalog. They have 400,000 parts in stock: screws, bolts, hinges, everything. So you've got the whole gene pool of those parts available." Somewhere in that mix is the machine you're dreaming of, and simulated evolution may well be the fastest way to find it.

"You state your objectives, let the thing evolve with the optimum combination of parts at the lowest price, and the machine will be there this afternoon," Gross says, his voice rising with excitement. "That's an extreme exaggeration—but not that extreme!"







Natural Motion is the company founded to develop Torsten Reil's Active Character Technology described in the article. Check out its site, where you'll find a brief description of its work and applications and a QuickTime demo of a figure learning to walk: www.naturalmotion.com/pages/technology_hiw.htm.

See for yourself one practical, real-world application of genetic algorithms—a more efficient refrigerator: frontier.enginsoft.it/applicazioni/english/frigorifero_eng.html.

Find out what Steven Johnson is up to at his Web site, where you'll also find links to some of his recent articles, including pieces for Discover: www.stevenberlinjohnson.com.