James Gee, a professor of learning sciences at the University of Wisconsin, was profoundly humbled when he first played a video game for preschool-age kids called Pajama Sam: No Need to Hide When It’s Dark Outside. Gee’s son Sam, then 6, had been clamoring to play the game, which features a little boy who dresses up like his favorite action hero, Pajama Man, and sets off on adventures in a virtual world ruled by the dastardly villain Darkness. So Gee brought Pajama Sam home and tried it himself. “I figured I could play it and finish it so I could help Sam,” says Gee. “Instead, I had to go and ask him to help me.”
Gee had so much fun playing Pajama Sam that he subsequently decided to try his hand at an adult video game he picked at random off a store shelf—an H. G. Wells–inspired sci-fi quest called The New Adventures of the Time Machine. “I was just blown away when I brought it home at how hard it was,” he says. “I thought, ‘You can’t tell me that people go to the store and pay fifty dollars and buy this!’ Then I found out that there are billions spent each year on these games.”
Gee’s scholarly interest was also piqued. He sensed instantly that something interesting was happening in his mind as he struggled to complete the puzzles of The Time Machine. “I hadn’t done that kind of new learning since graduate school. You know, as you get older, you kind of rest on your laurels: You learn certain patterns, you know your field, and you get a lot of experience. But this requires you to think in a new way. I saw that the excitement of this is the challenge and the difficulty and the new learning. That’s what makes it fun!”
Gee’s epiphany led him to the forefront of a wave of research into how video games affect cognition. Bolstered by the results of recent laboratory experiments, Gee and other researchers have dared to suggest that gaming might be mentally enriching. These scholars are the first to admit that games can be addictive, and indeed part of their research explores how games connect to the reward circuits of the human brain. But they are now beginning to recognize the cognitive benefits of playing video games: pattern recognition, system thinking, even patience. Lurking in this research is the idea that gaming can exercise the mind the way physical activity exercises the body: It may be addictive because it’s challenging.
All of this, of course, flies in the face of the classic stereotype of gamers as attention deficit–crazed stimulus junkies, easily distracted by flashy graphics and on-screen carnage. Instead, successful gamers must focus, have patience, develop a willingness to delay gratification, and prioritize scarce resources. In other words, they think.
One of the most popular video games ever created is called Tetris. It involves falling tile-like pieces called tetrominoes that a player must quickly maneuver so they fit into the open spaces at the bottom of the screen. In the early 1990s, Richard Haier, a professor of psychology at the University of California at Irvine, tracked cerebral glucose metabolic rates in the brains of Tetris players using PET scanners. The glucose rates show how much energy the brain is consuming, and thus serve as a rough estimate of how much work the brain is doing. Haier measured the glucose levels of novice Tetris players as their brains labored to usher the falling blocks into the correct locations. Then he took readings again after a month of regular play. Even though the test subjects had improved their game performance by a factor of seven, Haier found that their glucose levels had decreased. It appeared that the escalating difficulty of the game had trained the test subjects to mentally manipulate the Tetris blocks with such skill that they barely broke a cognitive sweat completing levels that would have utterly confounded them a month earlier.
Nearly a decade after Haier’s study, Gee hit upon an explanation. He found that even escapist fantasy games are embedded with one of the core principles of learning—students prosper when the subject matter challenges them right at the edge of their abilities. Make the lessons too difficult and the students get frustrated. Make them too easy and they get bored. Cognitive psychologists call this the “regime of competence” principle. Gee’s insight was to recognize that the principle is central to video games: As players progress, puzzles become more complex, enemies swifter and more numerous, underlying patterns more subtle. Most games don’t allow progress until you’ve reached a certain level of expertise.
This is exactly how Tetris works: When you first launch the game, the blocks fall at a leisurely pace, giving you plenty of time to rotate and place them so they form complete rows, which then vanish; misplace them and the stack of leftover blocks climbs toward the top of the screen. As you get better at manipulating the blocks, the game starts dropping them at ever-increasing speeds.
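For readers who think in code, here is a minimal sketch of that regime-of-competence ramp, written in Python. Everything in it, from the function name drop_interval to the constants, is an illustrative assumption rather than anything taken from a real Tetris implementation; the point is only that difficulty is keyed to demonstrated skill.

```python
# A toy difficulty ramp in the spirit of Tetris: the more lines a player
# has cleared (a rough proxy for skill), the shorter the interval between
# automatic block drops. All names and constants are illustrative
# assumptions, not drawn from any actual Tetris source code.

def drop_interval(lines_cleared: int,
                  base_seconds: float = 1.0,
                  lines_per_level: int = 10,
                  speedup: float = 0.85,
                  floor_seconds: float = 0.05) -> float:
    """Return the seconds between automatic drops for a given skill level."""
    level = lines_cleared // lines_per_level   # every 10 cleared lines bumps the level
    return max(floor_seconds, base_seconds * speedup ** level)

if __name__ == "__main__":
    for cleared in (0, 10, 50, 100):
        print(f"{cleared:>3} lines cleared -> drop every {drop_interval(cleared):.2f}s")
```

Note the design choice: the pace is tied to performance, not to elapsed time, so a struggling player keeps a gentle tempo while a skilled one is pushed immediately back to the edge of his or her abilities.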
To understand why games might be good for the mind, begin by shedding the cliché that they are about improving hand-eye coordination and firing virtual weapons. The majority of video games on the best-seller list contain no more bloodshed than a game of Risk. The most popular games are not simply difficult in the sense of challenging manual dexterity; they challenge mental dexterity as well. The best-selling game of all time, The Sims, involves almost no hand-eye coordination or quick reflexes. You manage a household of characters, each endowed with distinct drives and personality traits, each cycling through an endless series of short-term needs (companionship, say, or food), each enmeshed in a network of relationships with other characters. Playing the game is a nonstop balancing act: sending one character off to work, cleaning the kitchen with another, scanning the classifieds for a job with a third. Even a violent game like Grand Theft Auto involves networks of characters that the player must navigate and master, picking up clues and detecting patterns. The text walk-through for Grand Theft Auto III—a document that describes all the variables involved in playing the game through to the finish—is 53,000 words long, the length of a short novel. Yet despite the complexity of these environments, most gamers eschew manuals and walk-throughs altogether, preferring to feel their way through the game space.
Gee contends that the way gamers explore virtual worlds mirrors the way the brain processes multiple, but interconnected, streams of information in the real world. “Basically, how we think is through running perceptual simulations in our heads that prepare us for the actions we’re going to take,” he says. “By modeling those simulations, video games externalize how the mind works.”
Among all popular media today, video games are unique in their reliance on the regime of competence principle. Movies or television shows don’t start out with simple dialogue or narrative structures and steadily build in complexity depending on the aptitude of individual viewers. Books don’t pause midchapter to confirm that their readers’ vocabularies have progressed enough to move on to more complicated words. By contrast, the training structure of video games dates back to the very origins of the medium; even Pong got more challenging as a player’s skills improved. Moreover, only a fraction of today’s games involve explicit violence, and sexual content is a rarity. But the regime of competence is everywhere.
Even if Gee is right and games are learning machines, one question remains: Do the skills learned in the virtual world translate into the real one?
Inside the mind of a gamer
Complex video games require far more than simple hand-eye coordination. Splinter Cell: Chaos Theory, the latest installment in a popular Tom Clancy–inspired series, taxes stealth and navigational skills as the player explores huge virtual environments in the guise of an undercover federal agent. To complete the game, you need to think simultaneously on four distinct levels.
1. Manual interface To control the movements and actions of your on-screen character, you must memorize several dozen distinct button combinations on a video console handset or a PC keyboard. That’s a far cry from the simple jump-or-shoot interfaces of primitive arcade-style games.
2. Character view As the game progresses, you take in a shifting landscape of information about the virtual world, such as the sudden appearance of enemies, visual cues that suggest the existence of a puzzle to be solved, and overlaid interface elements that track your character’s health.
3. Internalized map Most games involve exploring vast worlds as you struggle to learn the rules. You must remember all the twists and turns you’ve made, or you’ll get hopelessly lost. Lose your bearings aboard one of the giant ships in Splinter Cell: Chaos Theory and your character may end up dead.
4. Balancing act Playing complex games involves juggling multiple objectives, choosing what to prioritize and what to defer. The goals affect decision making on other conceptual levels: which buttons to press, how you interact with other characters, and which areas you choose to explore.
In the spring of 2000, a research assistant in cognitive sciences at the University of Rochester named Shawn Green began helping cognitive science professor Daphne Bavelier with a project investigating visual perception. Contrary to conventional wisdom, Bavelier’s lab had found that people born deaf do not show better-than-average visual skills across the board; instead, they show very specific enhancements, including the ability to monitor their peripheral visual field. So Bavelier and Green began developing computerized tests to track these abilities. But a strange thing happened as they worked on the software. When Green took the tests himself, he scored off the charts. “Since I was an avid action video-game player,” he says, “we decided to test the hypothesis that experience with action video games was the origin of the observed differences.”
Green and Bavelier devised an experiment involving a series of quick visual-recognition tests, such as picking out the color of a letter or counting the number of objects on a screen. The study revealed dramatic perceptual differences between gamers and nongamers, far more pronounced than the differences between hearing and deaf individuals. When Green tweaked the tests to make them challenging enough that the gamers wouldn’t post perfect scores, the nongamers sometimes performed so poorly that their answers might as well have been random guesses. The researchers also ruled out the possibility that visually adept people are simply more likely to be attracted to video games in the first place: They had a group of nonplayers spend a week immersed in the World War II game Medal of Honor and found that the group’s skills on the visual tests improved as well. The evidence was overwhelming: Games were literally making people perceive the world more clearly.
Green did the initial research as part of his honors thesis. After graduation, he and Bavelier continued the study, and Nature published the results in 2003. “The learning induced by video-game playing occurs quite fast and generalizes outside the gaming experience,” Green says. “Our tests are quite dull and very unlike gaming itself. They require subjects to perform the same highly specialized task over and over on boring displays using geometrical shapes or letters. There is no character, no story, no goal, and no challenge to raise the stakes. But clearly, whatever it is that gamers learn transfers to situations that use different tasks and different stimuli.”
The premise that games teach generalized skills that apply in real-world situations has been corroborated by recent studies. James Rosser, director of the Advanced Medical Technology Institute at Beth Israel Medical Center in New York City, found that laparoscopic surgeons who played games for more than three hours a week made 37 percent fewer errors than their nongaming peers, thanks to improved hand-eye coordination and depth perception. A recent book published by the Harvard Business School Press looked at studies of three distinct groups of white-collar professionals: hard-core gamers, occasional gamers, and nongamers. The research the authors surveyed contradicts nearly all the received ideas about the impact of games. The gaming population turned out to be consistently more social, more confident, and more comfortable solving problems creatively. They also showed no evidence of reduced attention spans compared with nongamers.
The U.S. military has long supported the premise that learning through games can prepare soldiers for the complex, rapid-fire decision making of combat. In 2002 it released its own game, America’s Army, designed to provide a profile of a soldier’s occupational abilities. Recruits can now submit their game scores when they sign up for service, helping establish what Army enlistment brochures tout as “the best possible match between the attributes and interests of potential soldiers and the attributes of career fields and training opportunities.” A growing recognition that game skills carry over into real-world skills has also prompted the establishment of private research teams, including an MIT-sponsored group called the Education Arcade and an international consortium of scholars called the Serious Games Initiative, which are exploring how to incorporate the positive effects of gaming into traditional educational environments.
Steven Johnson’s top brain games
Like books or other forms of media, some video games are more cognitively challenging than others. While the relatively mindless shooter games attract a great deal of negative press, many of the most popular games in recent years offer stimulating mental exercise, even if their narratives sometimes leave a little to be desired. The best are enticing worlds of multidimensional complexity that test your cognitive agility by forcing you to make open-ended decisions on the fly. Here are a few current games that will give your brain a strenuous workout.
Black & White Games like The Sims or SimCity are often called god games because the player controls the world from above, shaping the destinies of multiple on-screen characters. Black & White takes that idea literally, inviting you to imagine yourself as a powerful god struggling with various other deities to win the devotion of an island population. As the title suggests, the game has a moral dimension: You can be a wrathful deity or a beneficent one.
World of Warcraft Don’t be put off by the folkloric Dungeons & Dragons veneer of the latest massively multiplayer game, in which thousands of players connect to shared virtual environments via the Internet. This is an immensely complicated world, capable of sustaining vast in-game economies and sprawling city developments.
The Sims 2 You thought home economics taught you about grown-up life? Try playing The Sims for a few hours—juggling two jobs and starting a kitchen fire while cooking for the kids—and you’ll get an amazing and amazingly addictive simulation of the challenges involved in running a modern household.
Katamari Damacy This bizarre but captivating Japanese import challenges your grasp of three-dimensional geometry, with physics added for extra fun: You roll a giant and strangely sticky ball through a cartoon world, collecting objects that you roll over, which in turn alter the ball’s trajectory. It may be the most original video game of the past five years.
In the fall of 2003, two media researchers at the University of Southern California set up a study to look at the patterns of brain activity triggered by violent video games. Peter Vorderer and René Weber booked time on an fMRI machine, loaded a popular game called Tactical Ops on an adjoining computer console, and watched one test subject after another pretend to be part of a special forces team trying to prevent a terrorist attack. Each test subject inserted his head three feet into the cavity at the center of the fMRI, where a small mirror positioned directly above his eyes made it possible to view the computer screen. During the course of the game, the scanner tracked the blood flow to different parts of the brain, creating a map of neural activity.
Before Vorderer and Weber even looked at any of the brain scans, they were surprised by the behavior of the dozen or so adults who volunteered for the test. Participating in an fMRI study involves lying for extended periods in an extremely confined and loud space. Even a mildly claustrophobic person can find the experience intolerable, and most people need a break after 20 minutes. But most of the Tactical Ops players happily stayed in the machine for at least an hour, oblivious to the discomfort and noise because they were so entranced by the game.
The blithe reaction of the Tactical Ops players to being entombed in a cacophonous scanner may prompt some people to jump to a predictable conclusion: Video games are dangerously addictive. But if games challenge the mind as much as this new research suggests, why do people in search of escapist entertainment find them so captivating?
The answer may have to do with the neurotransmitter dopamine. A number of studies have revealed that game playing triggers dopamine release in the brain, a finding that makes sense given the instrumental role that dopamine plays in the way the brain handles both reward and exploration. Jaak Panksepp, a neuroscientist at the Falk Center for Molecular Therapeutics at Northwestern University, calls the dopamine system the brain’s “seeking” circuitry, which propels us to explore new avenues for reward in our environment. The game world is teeming with objects that deliver clearly articulated rewards: more life, access to new levels, new equipment, new spells. Most of the crucial work in game interface design revolves around keeping players notified of the potential rewards available to them and how badly those rewards are needed.
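As a rough illustration of that interface principle, consider the following Python sketch, which ranks nearby rewards by how badly the player currently needs them. The Reward class, the stat names, and the ranking rule are all hypothetical, invented for this example rather than taken from any real game’s code.

```python
# A hypothetical sketch of reward-notification logic: surface nearby pickups
# and rank them by the player's current deficit in whatever stat each one
# restores. None of these names come from an actual game.

from dataclasses import dataclass

@dataclass
class Reward:
    name: str
    restores: str   # which player stat this pickup tops up
    amount: int

def rank_rewards(rewards: list[Reward],
                 stats: dict[str, int],
                 caps: dict[str, int]) -> list[Reward]:
    """Order rewards so the most urgently needed ones appear first."""
    def deficit(r: Reward) -> int:
        return caps[r.restores] - stats[r.restores]
    return sorted(rewards, key=deficit, reverse=True)

if __name__ == "__main__":
    stats = {"health": 30, "mana": 90}    # current values
    caps = {"health": 100, "mana": 100}   # maximum values
    nearby = [Reward("health potion", "health", 25),
              Reward("mana crystal", "mana", 40)]
    for r in rank_rewards(nearby, stats, caps):
        print(f"HUD alert: {r.name} (+{r.amount} {r.restores})")
```

A wounded player sees the health potion flagged first; a healthy one would see the mana crystal instead. That constant, legible accounting of what you need and where to find it is precisely what keeps the seeking circuitry engaged.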
In a sense, neuroscience has offered up a prediction, one that games obligingly confirm. If you create a system in which rewards are both clearly defined and achieved by exploring an environment, you’ll find human brains drawn to those systems, even if they’re made up of virtual characters and simulated sidewalks. It’s likely that those Tactical Ops players were able to tolerate the physical discomfort of the fMRI machine because the game environment so powerfully stimulated the mind’s dopamine system.
Of course, dopamine is also involved in the addictiveness of drugs. “The thing to remember about dopamine is that it’s not at all the same thing as pleasure,” says Gregory Berns, a neuroscientist at Emory University School of Medicine in Atlanta, who looks at dopamine in a cultural context in his forthcoming book, Satisfaction, due out in September. “Dopamine is not the reward; it’s what lets you go out and explore in the first place. Without dopamine, you wouldn’t be able to learn properly.”
The video game cocktail of sleek technology, dopamine-friendly environments, and sensationalist narratives means that some players end up getting too attached to their joysticks. There’s no denying that some games place far too much emphasis on gratuitous violence, and others on absurd, watered-down Tolkien fantasy. If there is a health problem associated with gaming, it’s likely to be the lack of physical exertion that comes from sitting in front of a monitor all day. “The biggest problem we run into, especially from the media, is that everyone wants an answer as to whether a game is good or bad for you just as it stands there on the shelf,” says Gee. “For little kids, Pokémon is a great cognitive developer, if it’s being scaffolded by the parents and they’re getting their kids to talk about it. But if it’s just a passive babysitter, then it’s no good for you.”
I ask Gee what kind of cognitive skills we should expect to find in the Pokémon generation. Not surprisingly, he’s got a list. “They’re going to think well about systems; they’re going to be good at exploring; they’re going to be good at re-conceptualizing their goals based on their experience; they’re not going to judge people’s intelligence just by how fast and efficient they are; and they’re going to think nonlaterally. In our current world with its complex systems that are quite dangerous, those are damn good ways to think.”
Gee’s remarks remind me of an experience I had a few years ago, introducing my 7-year-old nephew to SimCity 2000, the best-selling urban simulator that lets you create a virtual metropolis on your computer, building highways and bridges, zoning areas for development, raising or lowering taxes. Based on the player’s decisions, neighborhoods thrive or decline into ghettos, streets get overrun with traffic or remain wastelands, criminals prosper or disappear. When I walked my nephew through the game, I gave him only the most cursory overview of the rules; I was mostly just giving him a tour of the city I’d built. But he was absorbing the rules nonetheless. At one point, I showed him a block of rusted, crime-ridden factories that lay abandoned and explained that I’d had difficulty getting this part of my city to come back to life. He turned to me and said, “I think you need to lower your industrial tax rates.” He said it as calmly and as confidently as if he were saying, “I think we need to shoot the bad guy.”
In a 20-minute tour of SimCity, my nephew had learned a fundamental principle of urban economics: Development in an area zoned for a specific use can falter if taxes on that use are too high. Of course, if you sat my 7-year-old nephew down in an urban studies classroom, he would be asleep in 10 seconds. But just like those Tactical Ops players happily trapped for an hour in an fMRI, something in the game world had pulled at him. He was learning in spite of himself.