Let’s start with stinkbugs. On August 24, 2003, a fortnight after the temperature in London had climbed above 100 degrees Fahrenheit for the first time in recorded history, D. E. Maggs of Kingswood Avenue, Queens Park, walked into the Natural History Museum in London carrying a small glass jar. It contained two specimens of a curious insect she had collected on her tomato plants. She presented them to beetle curator Max Barclay, who identified them as Nezara viridula, the southern green stinkbug. He noted that they were nymphs, meaning they had hatched in London. “I thought she was having me on,” Barclay recalls. Stinkbugs are widespread in warmer climes, he explained to Maggs, and had long been known to cross the Channel in crates of Italian produce. But until now they couldn’t reproduce in the tepid English summers. Apparently that has changed: Barclay says a new generation of stinkbugs has popped up in gardens around London.
When our grandchildren write the history of global warming—how we discovered and debated it, and what we finally did about it—the stinkbugs that ate Maggs’s tomatoes may not loom large. Nor will the blue mussels that showed up this past year off Spitsbergen, Norway, at 78 degrees north latitude. Nor even the catastrophic failure of Scottish seabirds to breed, which some researchers attributed to a dearth of plankton in the warming waters of the North Sea. But our descendants may well decide that it was the long string of such close-to-home observations—the early springs, the shifting ranges of plants and animals, the mortal heat waves—that, more than any climatological data, convinced people that something needed to be done about global warming. And maybe, just maybe, those future historians will decide that 2004 was the turning point.
If that sounds optimistic, consider a few of the year’s headlines. The biggest was certainly the decision by the Russian government to endorse the Kyoto Protocol, thus allowing the treaty to take effect and leaving the United States and Australia alone among industrial nations in their refusal to accept limits on greenhouse-gas emissions. Yet even in the United States there was a palpable change in mood—and it was not just because Hollywood made climate disaster into a motion picture. Starting from the same kernel of scientific truth as did The Day After Tomorrow—that global warming could disrupt ocean currents in the North Atlantic—a study commissioned by the Pentagon, of all organizations, concluded that the “risk of abrupt climate change . . . should be elevated beyond a scientific debate to a U.S. national security concern.” A cover story in Business Week, of all publications, urged the need to “get serious about global warming” and remarked pointedly on “the leadership vacuum left by Washington.” And California’s Republican governor, of all people, a notorious Humvee aficionado, vowed to defend his state’s own pioneering limits on carbon dioxide emissions against the girlie men in the automobile industry.
Meanwhile, the tide of scientific evidence continued to roll in. Swiss researchers, looking at everything from ice cores and tree rings to weather records, reported in March that the summer of 2003, which killed tens of thousands of people, was by far the hottest summer in Europe since 1500, and that the 20th century as a whole was the hottest in that span. Computer models can’t explain that trend without factoring in a man-made greenhouse effect, but skeptics have long argued that the models also can’t explain why the lower atmosphere has apparently warmed less than Earth’s surface. That argument took a knock in 2004. Reanalyzing the satellite temperature measurements, Qiang Fu of the University of Washington and his colleagues concluded that a cooling in the upper atmosphere had been masking what is in fact a large warming of the lower atmosphere.
A sillier argument was also laid to rest: the one that says global warming might be a good thing because it would protect us from the next ice age. The advance and retreat of ice sheets is paced by cyclical changes in the shape of Earth’s orbit. More than 400,000 years and four glaciations ago, the orbit was about as round as it is now, and the planet was in an interglacial period much like today’s. Last summer a team of European researchers reported the first precise record of that distant time and of the past 740,000 years of climate history. They obtained it by drilling the oldest ice core ever, almost two miles into a godforsaken spot called Dome C—600 miles inland from the Antarctic coast and a little more than 1,000 miles from the South Pole. The result: If that earlier interglacial period is a guide to this one, we have another 15,000 years or so before the ice sheets should start to grow again. Accepting global warming now to forestall global cooling 15 millennia from now, says Eric Wolff of the British Antarctic Survey, with classic understatement, “is not a good bet.”
Another thing that ice core showed, as others have before, is that the great swing in temperature between glacial and interglacial periods was invariably accompanied by great swings in the amount of greenhouse gases in the atmosphere: When the greenhouse goes up, the ice sheets go down. Today we are pushing the carbon dioxide level to a height it last reached 24 million years ago, when there was a lot less ice on Earth and the climate was very different. All over the world, from the Arctic to the Antarctic and from Alaska to the Andes, ice is melting and flowing into the sea. The Intergovernmental Panel on Climate Change projected in 2001 that the sea level will rise by no more than three feet in this century—but that projection assumes the major ice sheets will remain intact.
That’s why the news from the Antarctic this fall was so disquieting. Two years ago, on the east side of the long peninsula that juts up toward South America—where average air temperatures have risen between 3.6 and 7.2 degrees Fahrenheit over the past 50 years—a 1,200-square-mile floating ice shelf called Larsen B suddenly collapsed and drifted out to sea. Last September two teams of American researchers, using data from two different satellites, reported that land-bound glaciers on the peninsula have since slid rapidly toward the coast—because the ice shelf is no longer there to hold them back.
A similar process may be under way in West Antarctica. The ice sheet there—750,000 cubic miles of ice, which, if melted, would raise sea levels more than 16 feet, drowning south Florida—is bound not to land but to the seafloor. Its bottom in most places is well below sea level. That makes it vulnerable to collapse, because seawater can flow in underneath it and transform its edge into a floating ice shelf like Larsen B, which might then break up, freeing the ice behind it. An early sign of this process would be accelerated thinning of glaciers along the coast. In September a team of American and Chilean researchers led by Robert Thomas of NASA found that glaciers in the Amundsen Sea region of West Antarctica had thinned by as much as 100 feet in five years. It’s still unlikely we’ll lose Miami before the century is out, but Floridians would do well to follow the news from Antarctica.
Californians, on the other hand, should be watching the snowpack in the Sierra Nevada. Water that falls on the Sierra in winter supplies Southern California in summer; the snowpack stores half as much freshwater as all the man-made reservoirs in the state. But because spring now comes sooner, says Daniel Cayan of the Scripps Institution of Oceanography in La Jolla, the snow is already melting days to weeks earlier—and could start running off uselessly into the sea instead of being available when the state needs it most.
Global warming is going to make California’s water problem much worse, Cayan and a team of researchers reported this past year. They used two different climate models, each with a different sensitivity to carbon dioxide, to project California’s future under two scenarios: an optimistic one, in which we only double the level of carbon dioxide in the atmosphere—since the 19th century we’ve already increased it by about a third—and a pessimistic scenario, in which we more than triple CO2.
Even in the optimistic scenario, according to the models, summers in California will be 4 to 9 degrees Fahrenheit hotter by the end of the century than they are now. In the pessimistic scenario they would be anywhere from 7 to 15 degrees hotter, and in Los Angeles, to say nothing of Fresno, there would be months’ worth of heat waves, defined as at least three consecutive days with highs in the 90s. As for the snowpack, the models show it shrinking by at least 30 percent. Without aggressive action to reduce CO2 emissions, snow could all but vanish from the Sierra this century.
It’s a gloomy forecast, but its most important implication is that human choices now can still make a big difference later. The catch is how much later. “That’s one of the tyrannies of climate change,” says Cayan. “It always seems like it’s 20 or 40 years away. So why should I worry?”
In September Cayan and three other researchers testified before a Senate committee chaired by Republican John McCain of Arizona. McCain has cosponsored a bill, so far rejected by his colleagues, that would set up a national system of tradable emissions permits for greenhouse gases and would require U.S. emissions in 2010 to be no higher than those in 2000—not quite Kyoto, which sets targets 7 percent below 1990 levels, but a start. “Now the challenge is to update the policy positions to be consistent with the science,” McCain said in opening the hearing. After the scientists had testified, another Republican, Olympia Snowe of Maine, told them: “It always takes the immediacy of the problem to get any reaction here in this institution. We’re not exactly visionary, if you hadn’t noticed.”
Still, it’s only a matter of time before the rising tide of evidence washes over the last islands of resistance in Washington. Stinkbugs have already advanced as far north as Virginia. Pretty soon they should be in the Rose Garden.