September 2, 1859, was a terrible day to be working in the information industry. Telegraph machines around the world behaved as if possessed. They spat out electric shocks and set telegraph paper on fire. Some of the machines continued to send and receive messages even after they were disconnected from the batteries that powered them.
One man saw how it all began. The day before, Richard Carrington, a British brewer’s son who rose to be the foremost solar astronomer of the time, had observed an extraordinary event. He was examining an 11-inch-wide projected image of the sun, part of his routine monitoring of the solar surface, when he noted the eruption of “two patches of intensely bright and white light ... the brilliancy ... fully equal to that of direct sun-light.” Carrington knew he was witnessing an enormous explosion, nearly as bright as all the rest of the sun put together. It was the first time anyone had observed a solar flare, and the first time anyone had seen a solar event produce such tangible consequences on Earth.
Fortunately, those consequences were modest, since the telegraph pretty much defined the beginning and the end of high tech in the middle of the 19th century. If the same event happened today, the story would be drastically different. Flares and the broader solar eruptions associated with them unleash storms of charged particles, emit flashes of energetic X-rays, and temporarily mangle our planet’s magnetic field. Even for regular-strength flares, those effects can fry electronics in space and overload power transformers on the ground.
But the eruption that Carrington witnessed—now known as the Carrington Event—was hardly ordinary. It is now recognized as the most powerful solar storm ever documented. Such superflares seem to occur once every few centuries. Half-Carringtons (which are still terrifyingly potent) strike every half century or so. The most recent one happened in 1960.
A National Research Council panel recently examined the likely impact of a present-day solar superstorm. GPS signals and radio transmissions would be disrupted by the radiation blasting Earth’s upper atmosphere. Communications satellites would malfunction. Most unnerving, the electrical grid in the U.S. (and potentially much of the world) could collapse as transformers overload, leading to a wide-scale blackout that could take 10 years to repair in full. Health care, sanitation, and transportation would be crippled. The Council’s estimated price tag: up to $2 trillion during the first year alone. Or in the words of Ephraim Fischbach, a physicist at Purdue University, “The damage from a solar storm would vastly overwhelm the damage from Hurricane Sandy. Literally millions of people could die.”
The best scientists can do right now is watch the sun for signs of trouble and monitor space weather—the flow of particles and fields—between the sun and Earth. Fischbach has a better idea. He may have figured out a way to forecast the next Carrington Event far enough in advance to allow meaningful action: putting satellites in standby mode, reconfiguring critical services that rely on GPS, shutting down or decoupling key parts of the grid—things that could make the difference between short-term inconvenience and long-term disaster.
Unfortunately, Fischbach’s plan puts him squarely at odds with much of the scientific community.
A Seasonal Link?
Fischbach is the epitome of the curious soul, constantly poking around at the edges of known physics in search of something that other people overlooked: a fifth force of nature, for instance, or a flaw in Einstein’s theory of relativity. About a decade ago he noticed a juicy oddity buried in a pair of overlooked papers, one from Brookhaven National Laboratory, the other from a German measurement institute. The two teams were watching the decay of certain radioactive elements. This is a routine bookkeeping style of research: According to known physics, this type of radioactive decay is a fundamental process that unfolds at an unchanging rate, and all the researchers were aiming to do was to measure that rate and record it for reference. Instead, both teams got a rude surprise. The rate of decay in their samples was not steady but varied according to the season. That fluctuation was small, about 0.3 percent, but it was consistent and—Fischbach noted happily—very, very weird.
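The size of the anomaly the two labs reported can be pictured with a toy model (all names and numbers here are illustrative, not taken from either paper): a nominal count rate carrying an annual modulation on the order of the 0.3 percent fluctuation described above, peaking in winter and bottoming out in summer.

```python
import math

# Toy model (illustrative only): counts per second from a source whose decay
# constant is nominally fixed but carries a small annual modulation, roughly
# the size of the seasonal fluctuation the article describes (~0.3 percent).
NOMINAL_RATE = 1000.0   # hypothetical count rate, counts per second
AMPLITUDE = 0.003       # 0.3 percent modulation amplitude (assumed)

def measured_rate(day_of_year: int) -> float:
    """Hypothetical seasonally varying count rate."""
    # Assume the peak falls in early January and the trough in early July.
    phase = 2 * math.pi * (day_of_year - 3) / 365.25
    return NOMINAL_RATE * (1 + AMPLITUDE * math.cos(phase))

january = measured_rate(3)    # near the annual maximum
july = measured_rate(186)     # near the annual minimum
print(f"January: {january:.1f} counts/s, July: {july:.1f} counts/s")
```

A swing this small is easy to dismiss as drift in the detector or the electronics, which is why the consistency between two independent labs on two continents was what caught Fischbach's attention.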
Most scientists dismissed the two papers as flukes and moved on. Fischbach decided to take the results at face value. “We had two experiments on two different continents seeing essentially the same thing,” he says. What effect, he wondered, could make a lump of radioactive silicon decay faster or slower?
In conjunction with his Purdue colleague Jere Jenkins, Fischbach realized that the seasonal nature of the variation might provide the crucial clue. Earth follows a slightly oval path around the sun, closest in January and farthest in July. The changes in radioactive decay rate tracked that pattern, rising and falling on cue over the course of the year. It seemed as if the sun itself was affecting the way atoms decayed on Earth, 93 million miles away.
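The orbital geometry behind that clue is easy to quantify. The sketch below uses standard orbital values (not figures from the article) to show how far Earth's distance from the sun swings between January and July, and how much a hypothetical solar influence falling off with the square of distance would vary as a result.

```python
# Sketch using standard orbital parameters; the 1/r^2 influence is a
# hypothetical for illustration, not a claim from Fischbach's work.
AU_KM = 149_597_871      # mean Earth-sun distance, km (~93 million miles)
ECCENTRICITY = 0.0167    # eccentricity of Earth's orbit

perihelion = AU_KM * (1 - ECCENTRICITY)  # early January, closest approach
aphelion = AU_KM * (1 + ECCENTRICITY)    # early July, farthest point

# If some solar influence weakened with the square of distance, its
# strength at Earth would swing between these two extremes each year.
swing_percent = ((aphelion / perihelion) ** 2 - 1) * 100

print(f"Perihelion: {perihelion / 1e6:.1f} million km")
print(f"Aphelion:   {aphelion / 1e6:.1f} million km")
print(f"Annual swing of a 1/r^2 influence: {swing_percent:.1f}%")
```

Note that a simple inverse-square flux would vary by several percent over the year, considerably more than the roughly 0.3 percent fluctuation the labs reported, so any mechanism linking the sun to decay rates would have to be more subtle than raw distance alone.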