When physicist Neil Johnson moved to Maryland in 2018 to take a job at George Washington University (GW), he looked online to see what medical requirements his son would have to meet before entering high school. He quickly stumbled upon many parents having online conversations about tactics they could use to get out of vaccinations — something that had never crossed his mind.
A year later, the U.S. experienced its biggest measles outbreak in almost three decades, with the majority of cases involving people who hadn’t been vaccinated. Johnson — who had, since 2014, been applying tools from physics and math to study the online behavior of terrorist and hate groups — suspected that something interesting, and unsettling, was happening on the web to shape attitudes toward vaccines and advice from the medical establishment in general.
He and his colleagues soon launched an investigation into the matter. In mid-December 2019, while Johnson and company were drafting a report of their findings for publication, they started hearing news of unusual cases of pneumonia breaking out in China. Within the next few weeks, they had broadened the scope of their study to include the debate over COVID-19 vaccinations.
The team’s findings were published today in the journal Nature. Their study focused on 100 million Facebook users who followed more than 1,000 pages that discussed vaccinations from varying perspectives. Johnson’s team created a map that identified all these pages, labeling them with either red, blue or green dots — red signifying an anti-vax message; blue conveying a mainstream, pro-vaccination theme; and green representing curious people who weren’t clearly aligned with either faction.
Creating a Battlefield Map for the Vaccine Fight
Johnson compares the map, which reveals the links between the colored dots (or “clusters,” representing Facebook pages), to a battlefield map. “You never win a battle without a map of the battlefield,” he says, and he believes a battle is now being waged “for the hearts and minds of the undecideds.” And for those who place their faith in reason and the scientific method, the fight is not going well.
“Before we drew the map, we expected to see the Blues — the Centers for Disease Control [and Prevention], the [Bill & Melinda] Gates Foundation, et cetera — at the center of things,” Johnson says, “with the Reds, who represent the ideological fringe, buzzing around the edges.”
But that’s not happening, according to the map. Although the Reds (anti-vaxxers) are a numerical minority, they have formed many more clusters, which in turn forge many more links with the Greens than do the Blues. “The insurgent Reds are completely embedded with the Greens,” Johnson adds, “while the Blues are off on their own, fighting the battle in the wrong place.”
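The map Johnson describes can be thought of as a graph whose nodes are Facebook pages tagged by faction and whose edges are the links between them. A minimal sketch in Python — using an invented toy network, not the study’s actual data or page names — shows how one might quantify the pattern the team found, by counting cross-faction links:

```python
# Toy model of the "battlefield map": pages are nodes tagged by faction,
# and links between pages are undirected edges. All page names and links
# here are hypothetical, for illustration only.

faction = {
    "cdc": "blue", "gates_fdn": "blue",
    "antivax_1": "red", "antivax_2": "red", "antivax_3": "red",
    "parents_a": "green", "parents_b": "green", "parents_c": "green",
}

links = {
    ("antivax_1", "parents_a"), ("antivax_1", "parents_b"),
    ("antivax_2", "parents_b"), ("antivax_3", "parents_c"),
    ("cdc", "gates_fdn"), ("cdc", "parents_a"),
}

def cross_links(color_a, color_b):
    """Count edges that join a page of one faction to a page of another."""
    return sum(
        1 for u, v in links
        if {faction[u], faction[v]} == {color_a, color_b}
    )

print("red-green links: ", cross_links("red", "green"))   # 4
print("blue-green links:", cross_links("blue", "green"))  # 1
```

In this toy network, as in the study’s finding, the Reds are a numerical minority yet hold far more links into the Greens than the Blues do.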
The Reds are making inroads, not only because they have more pages and more connections to the Greens, but also because their pages, which do not focus solely on vaccines, provide “a greater diversity of narratives,” Johnson says. The message from a blue page, such as that coming from the CDC, tends to be “like vanilla, always the same. But red has all these strange flavors we don’t even have a name for. People who are still looking can find what they want — or what they think they need.”
Stemming the Tide of Misinformation
The theoretical model developed by Johnson and his collaborators predicts that anti-vax views will dominate within a decade. In fact, a poll carried out last week found that 19 percent of Americans said they would refuse a COVID-19 vaccine, while 26 percent were undecided — a situation that could amplify outbreaks of the disease, as happened with measles in 2019. Meanwhile, some protesters in the U.S. and Europe are spreading delusional fantasies, claiming, for instance, that Bill Gates plans to use coronavirus vaccines to inject microchips into the world’s population.
Facebook cannot simply shut down all the “infectious” pages, because the company is obliged to support freedom of speech so long as people are not inciting violence or criminal acts. But Facebook could, Johnson suggests, accord low priority to links that distribute misinformation, which would force people to scroll down a very long way to find them.
The problem, of course, is not limited to Facebook. There is a growing number of social media platforms out there today — thanks, in part, to open-source software that makes it easy for people to set up their own platforms, which may not be moderated at all. Establishing cooperation among all social media platforms is not realistic, Johnson says, but if pernicious information is passing between a few sites, it might be possible to reach an agreement to inhibit that flow.
He’s currently working with researchers at GW and Google, trying to come up with strategies for impeding the transmission of malicious content within a single platform or from one platform to another. “Now that we have a detailed map that shows all the connections, we can do what-if scenarios,” Johnson says. “If I blocked off this link, what would happen to the flow?”
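The what-if scenario Johnson describes can be sketched as an edge-removal experiment on the same kind of page graph: delete a link, then check — via breadth-first search — which clusters a misinformation source can still reach. The network and page names below are invented for illustration:

```python
from collections import deque

# Hypothetical page network, as an adjacency list of undirected links.
# A single "bridge" page connects an anti-vax hub to undecided clusters.
graph = {
    "antivax_hub": {"bridge_page"},
    "bridge_page": {"antivax_hub", "parents_a"},
    "parents_a": {"bridge_page", "parents_b"},
    "parents_b": {"parents_a"},
}

def reachable(start, blocked=frozenset()):
    """Pages reachable from `start` once the `blocked` edges are removed (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in seen and frozenset((node, nbr)) not in blocked:
                seen.add(nbr)
                queue.append(nbr)
    return seen

# Baseline: the hub reaches every undecided cluster.
print(reachable("antivax_hub"))
# What-if: block the one bridge link, and the flow is cut off.
cut = {frozenset(("bridge_page", "parents_a"))}
print(reachable("antivax_hub", blocked=cut))
```

Running such experiments over a full map would let researchers rank links by how much their removal shrinks the audience a given cluster can reach — the kind of "if I blocked off this link, what would happen to the flow?" question Johnson poses.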
Ever since he shifted his gaze from traditional problems in physics to studying the online dissemination of violent, racist and otherwise dangerous and distorted views, Johnson has been engaged in a never-ending battle. The fact that his business is booming, he acknowledges, is not great news for the rest of the world.