
How Our Ancient Brains Are Coping in the Age of Digital Distraction

Our species invented the internet. Can we handle the consequences?

By Kenneth Miller
Apr 20, 2020
(Credit: Pathdoc/Shutterstock)


This story appeared in the May 2020 issue of Discover magazine.


At our very foundation, says cognitive neuroscientist Adam Gazzaley, “humans are information-seeking creatures.” 

And that may be the problem.

While the internet and smart devices give us unprecedented access to the data we covet, we seem clueless about coping with the deluge these technologies have unleashed.

According to a recent survey by the Nielsen market-research group, the average American spends nearly four hours a day on computers and mobile devices — and nearly a quarter of that time on social media. While the upsides of all this pixel-gazing are plentiful, the downsides can be scary. In the public arena, online filters generate bubbles that reinforce our preconceptions and amplify our anger. Brandishing tweets like pitchforks, we’re swept into virtual mobs; some of us move on to violence IRL. Our digitally enhanced tribalism upends political norms and sways elections.

On the homefront, the sound of thumbs tapping screens has replaced dinnertime conversation. Professors face classrooms full of Snapchatting zombies. A 2017 study found that on-the-job smartphone time cost companies $15 billion a week in lost productivity. Texting while driving causes more than 300,000 crashes each year. Hundreds of us are hospitalized annually for walking into things while texting. As our devices grow smarter, more efficient and more connected, they often appear to be making us dumber, more distracted and more divided.

A growing body of research suggests that this conundrum arises from a feature etched into our DNA: our unparalleled hunger to know stuff. “This is an ancient drive that leads to all sorts of complexities in how we interact with the world around us,” says Adam Gazzaley, a neuroscientist at the University of California, San Francisco, and co-author of The Distracted Mind: Ancient Brains in a High-Tech World.

Our current predicament, Gazzaley and other experts suggest, involves the gap between our vast appetite for information and our limited capacity for attention. To grasp how we wound up here — and, perhaps, to find a way out — it’s crucial to understand how we got our brains.

(Credit: Dusan Petkovic/Shutterstock)

The Computer in Our Heads

Neuroscientist Christof Koch of Seattle’s Allen Institute for Brain Science has called the human brain “the most complex object in the known universe.” The computer in our heads contains some 86 billion processing units, known as neurons, woven into a distributed network with hundreds of trillions of connections, or synapses. Over a lifetime, it can store an estimated billion gigabytes of data: 50,000 times the information in the Library of Congress. It can compose novels and symphonies, figure out how to send spacecraft beyond the solar system, and invent electronic brains whose powers, in some ways, exceed its own.

Yet this wonder’s origins were strikingly humble. About 7 million years ago, hominins — our branch of the primate family tree — began the long transition to walking upright. Bipedalism freed our hands for making and manipulating tools. It also allowed us to walk longer distances, key to our spread beyond Africa’s forests and savannas. “If you look at nonhuman primates, it’s like they have another set of hands down there,” notes Dean Falk, a professor of anthropology at Florida State University and senior scholar at Santa Fe’s School for Advanced Research, who specializes in brain evolution. “When our feet became weight-bearing instruments, that kicked everything off — no pun intended.”

Not that the effects were immediate. More than 3 million years ago, the braincase of Australopithecus afarensis, likely the first fully bipedal hominin, was only slightly larger than a chimpanzee’s. But by the time Homo sapiens emerged at least 300,000 years ago, brain volume had tripled. Our brains are roughly six times as large as would be expected for a mammal of our body size, and our cerebral cortex (the brain’s outer layer, responsible for cognition) packs in more neurons than that of any other creature on Earth.

In recent years, scientists have identified about two dozen genetic changes that might have helped make our brains not only bigger but incomparably capable. “It’s not just one quantum leap,” says University of Wisconsin-Madison paleoanthropologist John Hawks. “A lot of adaptations are at play, from metabolic regulation to neuron formation to timing of development.” A stretch of gene-regulating DNA called HARE5, for example, differs slightly between chimps and humans; when a team at Duke University introduced both versions into mouse embryos, the ones that got the human type developed brains that were 12 percent larger. Meanwhile, human-specific genes in the NOTCH2NL family increase our production of neural stem cells and delay their maturation into cortical neurons, which may be part of the reason our brains keep growing far longer than those of other primates. The FOXP2 gene, crucial for verbal communication in many species, differs by just two amino acids between humans and our nearest living ape relatives. Our version may help explain why we can talk and chimps can’t.

Our brains were also shaped by external forces, which increased the odds of smarter hominins passing on their genes. Experts debate which factors mattered most. Falk, for one, hypothesizes that the loss of grasping feet was crucial: When infants could no longer cling to their mothers, as nonhuman primates do, the need to soothe them from a distance led to the development of language, which revolutionized our neural organization. Other researchers believe that dietary shifts, such as eating meat or cooking food in general, enabled us to get by with a shorter digestive tract, which freed up more energy for a calorie-hogging brain. Still others credit our cerebral evolution to growing social complexity or intensifying environmental challenges.

What’s clear is that our neural hardware took shape under conditions radically different from those it must contend with today. For millennia, we had to be on the alert for dangerous predators, hostile clans, potential sources of food and shelter — and that was about it. As McGill University neuroscientist Daniel J. Levitin put it in his book The Organized Mind: “Our brains evolved to focus on one thing at a time.”

Our digital devices, by design, make that almost impossible.

Tech vs. Brain

The part of the brain that enables us to make elaborate plans and carry them through — the part, arguably, that makes us most human — is the prefrontal cortex. This region is only slightly larger in H. sapiens than in chimps or gorillas, but its connections with other brain regions are more extensive and intricate. Despite this advanced network, our planning ability is far stronger than our ability to remain focused on a given task. 

One reason is that, like all animals, we evolved to switch attention instantly when we sense danger: the snapping twig that might signal an approaching predator, the shadow that could indicate an enemy behind a tree. Our goal-directed, or top-down, mental activities stand little chance against these bottom-up forces of novelty and saliency — stimuli that are unexpected, sudden or dramatic, or that evoke memories of important experiences.

(Credit: Rawpixel.com/Shutterstock)

“Many technological devices use bottom-up stimuli to draw our attention from our goals, like buzzes and vibrations and flashes of light,” Gazzaley says. Even when they’re in silent mode, moreover, our devices tempt us with the promise of limitless, immediately available information. The data on tap may be newsy (our least-favorite politician’s latest gaffe), factual (our favorite actor’s filmography), social (the number of upvotes our selfie scored) or just plain fun (that video of the aardvark on a bobsled). But all of it stimulates our hardwired eagerness to be in the know.

This urge isn’t entirely unique to us. In higher primates, brain scans show that neural circuitry originally developed for foraging also governs higher-order cognitive behaviors. Even macaque monkeys respond to new information as they do to primitive rewards like fruit or water. When the animal finds a ripe mango in the jungle — or solves a problem in the lab — brain cells in what’s called the dopaminergic system light up, creating a sensation of pleasure. These cells also build durable connections with the brain circuits that helped earn the reward. By triggering positive feelings whenever these circuits are activated, the system promotes learning. 

Humans, of course, forage for data more voraciously than any other animal. And, like most foragers, we follow instinctive strategies for optimizing our search. Behavioral ecologists who study animals seeking nourishment have developed various models to predict their likely course of action. One of these, the marginal value theorem (MVT), applies to foragers in areas where food is found in patches, with resource-poor areas in between. The MVT can predict, for example, when a squirrel will quit gathering acorns in one tree and move on to the next, based on a formula assessing the costs and benefits of staying put — the number of nuts acquired per minute versus the time required for travel, and so on. Gazzaley sees the digital landscape as a similar environment, in which the patches are sources of information — a website, a smartphone, an email program. He believes an MVT-like formula may govern our online foraging: Each data patch provides diminishing returns over time as we use up information available there, or as we start to worry that better data might be available elsewhere.
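Gazzaley’s analogy can be made concrete. Under the MVT, a forager does best by leaving a patch once its moment-to-moment rate of return falls to the average rate achievable across the whole environment, travel time included. The short Python sketch below is a minimal illustration of that rule, using an invented diminishing-returns gain curve and made-up travel times rather than any figures from the article or from Gazzaley’s work; it simply searches for the patch-leaving time that maximizes overall intake rate.

```python
# Minimal sketch of the marginal value theorem (MVT). The gain curve and all
# numbers here are illustrative assumptions, not data from the article.
import numpy as np

def gain(t, a=100.0, r=0.5):
    """Cumulative payoff (acorns, or new information) after t minutes in a
    patch, with diminishing returns: fast at first, then leveling off."""
    return a * (1.0 - np.exp(-r * t))

def optimal_leave_time(travel_time, t_max=30.0, steps=10_000):
    """Grid-search the residence time t that maximizes the overall intake
    rate gain(t) / (t + travel_time)."""
    t = np.linspace(1e-3, t_max, steps)
    rate = gain(t) / (t + travel_time)
    return t[np.argmax(rate)]

for travel in (0.5, 2.0, 8.0):
    t_star = optimal_leave_time(travel)
    print(f"travel time {travel:4.1f} min -> leave patch after {t_star:5.2f} min")
```

The output follows Gazzaley’s intuition: when the next patch is costly to reach, it pays to stay put longer, but when switching is nearly free, as it is between browser tabs and apps, the optimal strategy is to hop constantly.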

The call of the next data patch may keep us hopping from Facebook to Twitter to Google to YouTube; it can also interfere with the fulfillment of goals — meeting a work deadline, paying attention in class, connecting face-to-face with a loved one. It does this, Gazzaley says, in two basic ways. One is distraction, which he defines as “pieces of goal-irrelevant information that we either encounter in our external surroundings or generate internally within our own minds.” We try to ignore our phone’s pings and buzzes (or our fear of missing out on the data they signify), only to find our focus undermined by the effort.

The other goal-killer is interruption: We take a break from top-down activity to feed our information munchies. The common term for this is multitasking, which sounds as if we’re accomplishing several things at once — working on the quarterly report, answering client emails, staying on top of the politician’s gaffe count, taking a peek at that aardvark. In truth, it means we’re doing nothing well.

“There’s a conflict between what we want to do and what we’re actually capable of doing,” Gazzaley says. “With each switch [of our attention from one task to another], there’s a cost.” For example, one study found that it took 25 minutes, on average, for IT workers to resume a project after being interrupted. Besides putting a major crimp in efficiency, such juggling can lead to high levels of stress, frustration and fatigue. 

It also wreaks havoc on working memory, the function that allows us to hold a few key bits of data in our heads just long enough to apply them to a task. Multiple studies have shown that “media multitasking” (the scientific term for toggling between digital data sources) overloads this mental compartment, making us less focused and more prone to mistakes. In 2012, for instance, Canadian researchers found that multitasking on a laptop hindered classroom learning not only for the user but for students sitting nearby. Heavy media multitasking has been associated with diminished cognitive control, higher levels of impulsivity and reduced volume in the anterior cingulate cortex, a brain region linked with error detection and emotional regulation.

Us vs. Them

Emotional regulation is central to another of tech’s disruptive effects on our ancient brains: exacerbation of tribal tendencies. Our distant ancestors lived in small nomadic bands, the basic social unit for most of human history. “Groups that were competing for resources and space didn’t always do so peacefully,” says paleoanthropologist Hawks. “We’re a product of that process.”

These days, many analysts see tribalism asserting itself in the resurgence of nationalist movements worldwide and the sharp rise in political polarization in the U.S., with both trends playing out prominently online. A study published in the American Journal of Political Science in 2015 found that party affiliation had become a basic component of identity for Republicans and Democrats. Social media, which spurs us to publicly declare our passions and convictions, helps fuel what the authors call “the gradual encroachment of party preference into nonpolitical and hitherto personal domains.”

And we’re hardwired to excel at telling “us” from “them.” When we interact with in-group members, a release of dopamine gives us a rush of pleasure, while out-group members may trigger a negative response. Getting online “likes” only intensifies the experience.

(Credit: Monster Ztudio/Shutterstock)

Our retreat into tribal mode may also be a reaction to the data explosion that the web has ignited. In 2018, in the journal Perspectives on Psychological Science, psychologist Thomas T. Hills reviewed an array of earlier studies on the proliferation of information. He found that the upsurge in digitally mediated extremism and polarization may be a response to cognitive overload. Amid the onslaught, he suggested, we rely on ingrained biases to decide which data deserve our attention (see “Tribal Tech” sidebar). The result: herd thinking, echo chambers and conspiracy theories. “Finding information that’s consistent with what I already believe makes me a better member of my in-group,” Hills says. “I can go to my allies and say, ‘Look, here’s the evidence that we’re right!’ ”

In some cases, a bias in favor of one’s own tribe can spur a desire to see another tribe suffer. “Not all out-groups are equivalent,” says Harvard University psychologist Mina Cikara, who studies the factors that make one group take pleasure in another’s pain, a response known as schadenfreude. “Americans don’t react to Canadians, say, the way they do to people from Iran.” The factors driving this type of ill will, she explains, are “a sense that the group is against us, and that they’re capable of carrying out a threat.” For example, when Red Sox and Yankees fans watch their rival team fail to score, even against a third team, they show heightened activity in the ventral striatum, a brain region associated with reward response.

It’s surely no coincidence that during the 2016 presidential election, Russian hackers focused largely on convincing various groups of Americans that another group was out to get them. But foreign agents are hardly the top promoters of tribalism online. As anyone who’s spent time on social media knows, there’s plenty of homegrown schadenfreude on the web.

Present vs. Future

Don’t expect Silicon Valley honchos to redesign their profitable products to be less exploitative of our old-school neural wiring. “The genie is out of the bottle,” says Gazzaley. “Putting it back is not a realistic plan.” 

We can, however, evolve. The surest way to combat digital tribalism, Hills suggests, is to be wary of bias, embrace critical thinking and encourage others to do the same. Gazzaley, for his part, offers a variety of strategies for making our brains less vulnerable to distraction and interruption, and for modifying our behavior to tune out tech’s temptations (see “Taming Our Tech” sidebar). “By building healthier habits, we can change our relationship with technology for the better,” he says. “We’re a very adaptive species. I think we’ll be OK.”  



Tribal Tech

(Credit: Sam Wordley/Shutterstock)

Faced with tech’s cognitive overload, humans determine what’s worthy of attention by relying on biases shaped by evolution, says Thomas T. Hills, a professor of psychology at England’s University of Warwick. Those tendencies may have helped our ancestors survive, but they’re not always in our best interests today, Hills says. He identifies four types of “cognitive selection” that fuel digital tribalism.

Selection for belief-consistent information. Also called confirmation bias, it inclines us to prefer data that align with what we already think. In prehistoric times, this might have led people to see a rainstorm as proof of a shaman’s power over the weather — an interpretation that strengthened social cohesion, even if it was wrong. Today, confirmation bias can lead to more consequential errors, such as seeing a cold snap as proof that climate change is a hoax.

Selection for negative information. This tendency, also known as negativity bias, primed our ancestors’ brains to prioritize alertness for predators over other, less threatening types of attention. Today, it can lead us to privilege bad news over good — for example, by taking a single horrific crime by an out-group member more seriously than data showing that the group as a whole is law-abiding.

Selection for predictive information. Pattern-recognition bias, as it’s often called, helps us discern order in chaos. Noticing that large prey animals tended to arrive in the savanna after the first summer rains would have given early humans an evolutionary advantage. Today, however, a predilection for patterns can lead us to detect conspiracies where none exist.

Selection for social information. This “herd bias” prompts us, in uncertain environments, to follow the crowd. Back in the day, “if everyone else in your tribe was running toward the river, they probably had a good reason,” says Hills. But if everyone in your Reddit community says a famous politician is running a child-sex ring from the basement of a pizzeria, well, it would be wise to visit a fact-checking website before making up your mind.



Taming Our Tech

(Credit: Evgeny Atamanenko/Shutterstock)

Neuroscientist Adam Gazzaley suggests two basic approaches to protect our brains from tech’s downsides: enhancing how our neural circuitry functions, and changing our everyday behavior. While some tactics can be mastered by anyone, others remain experimental.

Resisting the Siren Call

These methods aim to improve our brains’ ability to ignore distractions and recover from interruptions.

  • Education. Researchers are developing a variety of classroom curricula designed to strengthen cognitive control — the capacity to stay on task, even under challenging conditions.

  • Neurofeedback. Introduced in the 1960s, this technique teaches practitioners to control their brainwaves with the help of a brain-computer interface. It has been used with some success to treat disorders such as ADHD and anxiety, and a few small studies have linked it to improvements in attention and working memory.

  • Nature. A growing body of research suggests that getting outside can help reset weary brains.

  • Cognitive exercises. Clinical trials indicate that some mental exercises, including specially designed video games, can improve focus and resistance to distraction. Evidence for the efficacy of commercially available “brain games,” however, remains sketchy.

  • Meditation. Multiple studies suggest that meditation can enhance attention, memory and processing speed.

  • Physical exercise. A large body of research shows that aerobic activity bolsters the brain’s agility and resilience.

Everyday Evolution

These evidence-based behavior modifications lessen the temptations of tech by limiting its easy appeal and accessibility.

  • While driving, talk to a passenger, listen to an audiobook or enjoy music (all less distracting than phone conversations or texting). Set expectations with friends, family and colleagues that you will not use your phone while on the road, except in true emergencies.

  • While working, limit yourself to a single screen, and clear your desk of nonessential materials. Decide which programs or apps you need to complete a task, and close all others. Avoid using tabs; when you’re finished with a website, shut it down. Shut down email, too, and check electronic correspondence and social media only at designated times. A variety of apps can block access to distracting sites to keep you from cheating. Silence your smartphone; if you still feel the pull, move it to another room. Take frequent breaks to reboot your brain; go for a walk or just stare into space and daydream.

  • While hanging out with friends or family, ask everyone present to turn off their phones. If that’s too much, try using “tech breaks,” allowing each person to check their phone briefly every 15 minutes. Make certain areas device-free zones — especially the dinner table and the bedroom. But watching TV or playing video games together, Gazzaley says, can actually build closeness.

Adapted from The Distracted Mind: Ancient Brains in a High-Tech World, by Adam Gazzaley and Larry D. Rosen. The MIT Press, 2016.


