How We'll Predict the Next Natural Disaster

Advances in natural hazard forecasting could help keep more people out of harm’s way.

By April Reese | Thursday, July 28, 2016
Seismic instruments are tested in the underground Albuquerque Seismological Laboratory — shown here in a 360-degree panorama — where they can be isolated from wind, temperature and magnetism.
Kelly Holcomb/Incorporated Research Institutions for Seismology

On the far southeastern edge of Albuquerque, N.M., where the Sandia Mountains rise from the Rio Grande Valley, a red door marks the entrance to an underground chamber carved into the mountainside. The door, just uphill from the Albuquerque Seismological Laboratory, leads into a cave that looks like the villain’s lair in a James Bond film: the uneven stone walls painted white, an array of shiny instruments strewn about. Some are embedded in the floor; boxes of other devices sit stacked on a metal table, ready to dispatch to hot spots throughout the world. These are the geologic stethoscopes seismologists use to detect the planet’s shakes and tremors.

“Our equipment has really evolved,” says seismologist David Wilson, who oversees the Albuquerque lab. “It’s pretty high-tech now.” So much so that last year, Ukraine was wary of installing the instruments for fear that “we’re not just recording earthquakes — that we’re keeping tabs on them somehow,” he says.

These instruments are part of the Global Seismographic Network, an international system of earthquake sensors. The network is the world’s ear to the ground, designed to pick up at least some of the more than 500,000 earthquakes that occur each year. The lab, which consists of the Albuquerque facility and a monitoring center in Golden, Colo., oversees about 200 stations in 80 countries.

Natural hazard forecasting is a complex science, but whether the target is an earthquake, landslide, hurricane, tornado or flood, the goal is simple: to figure out where and when the next one will hit. Researchers analyze a mind-warping array of data that constantly stream from the sky, ocean and earth, captured by everything from satellites to drones. The advent of Doppler radar in the 1950s gave meteorologists new powers to “read” the air and transformed how they forecast tornadoes and hurricanes. Today, better simulation models and instruments and new research into geophysical and atmospheric dynamics are ushering in a new era of natural hazard forecasting.

“The advances have been tremendous,” says Roger Edwards, a forecaster with the National Weather Service, who has tracked both tornadoes and hurricanes. “They’ve saved thousands and thousands of lives over the years.”

They come none too soon. As the global population grows, far more people are concentrated in at-risk areas than at any time in Earth’s history.

“Today, not only are more people in harm’s way than there were 50 years ago, but building in flood plains, earthquake zones and other high-risk areas has increased the likelihood that a routine natural hazard will become a major catastrophe,” warns a 2015 report from the Centre for Research on the Epidemiology of Disasters (CRED), which maintains an international disaster database. Between 1994 and 2013, over 6,870 natural disasters claimed nearly 1.4 million lives, according to the report. Death rates from natural disasters rose over that period, reaching an average of more than 99,700 deaths per year, the study found.

Every area of forecasting has its blind spots, and it will probably never be a perfect science, given the sheer complexity of the geosphere. Researchers still don’t fully understand the small but important shifts in storm dynamics that trigger a tornado or hurricane, for instance, and they can’t forecast a hurricane’s intensity. But aided by ever-improving prediction tools and fresh insights into the workings of Earth’s moving parts, natural hazard scientists are closer than ever to demystifying some of the most complex, destructive forces on the planet.



Hundreds of thousands of people died in Haiti’s 2010 earthquake. Port-au-Prince, shown here, was among the hardest hit regions.
Tommy E. Trenchard/Alamy Stock Photo

Earthquakes are the deadliest of natural disasters. Between 1994 and 2013, temblors killed almost 750,000 people — more than all other disasters put together. (That includes fatalities from tsunamis caused by undersea earthquakes.) Seismologists have made great strides in understanding earthquake dynamics and monitoring Earth’s trembles, but they still have much to learn.

WHAT CAUSES THEM: Where tectonic plates meet deep in the earth, friction locks them in place while stress builds. When a section suddenly slips, the released energy radiates upward as seismic waves, triggering upheaval at the surface.

ADVANCES: Recent updates to the Global Seismographic Network give seismologists a clearer read on activity below Earth’s surface. Instruments placed directly on a fault provide real-time monitoring; in some places, such as Los Angeles, they’re just a few meters apart.

“There have been incredible advances in the development of instruments and the deployment of instruments on active fault zones, which has enabled a very fine-grained, high-resolution study of where earthquakes occur,” says Arthur Lerner-Lam, deputy director of Columbia University’s Lamont-Doherty Earth Observatory.

Scientists now have a much better understanding of the entire earthquake cycle, he adds: “The earth relaxing afterward, the strain building up again — that whole sequence is being torn apart by new instruments.”

The Albuquerque Seismological Laboratory operates the Global Seismographic Network’s 150 monitoring stations spread over 80 countries. The sensor network is so sensitive that it can even detect Earth’s response to the motions of the sun and moon. And this level of precision allows geologists all over the world to keep an eye on our planet’s vibrations, informing cutting-edge research and earthquake monitoring.
Rick Johnson

CHALLENGES: Instrumentation may have advanced, but there are still dead zones, such as the ocean floor. Developing sensors that can beam back data from the deep sea in real time has proved difficult, says Wilson, of the Albuquerque Seismological Laboratory.

And where scientists do closely track seismic activity, they can’t pinpoint exactly when an earthquake will happen. That blind spot became all too clear with the magnitude 7.0 earthquake that wrenched Haiti in 2010, killing between 230,000 and 316,000 people. (Death tolls vary.)

But by studying previous quakes, seismologists can calculate the probability of a future earthquake in the same area. For instance, scientists with the U.S. Geological Survey figure a 63 percent chance of a major earthquake rocking the San Francisco Bay Area in the next 30 years.

Researchers also still don’t fully understand the forces that cause earthquakes, most notably what causes tectonic plates to move. One of the most surprising insights from the latest body of seismic research is that earthquakes can happen in the unlikeliest of places, far from a fault zone. The reasons are not yet clear, says Lerner-Lam.

In New Zealand, scientists are exploring why some faults are more prone to earthquakes than others by studying rocks extracted from deep within the wildly dynamic Alpine fault — the system that formed the mountains in the backdrop of The Lord of the Rings movies. That fault, which rocks and rolls approximately every three centuries, has about a 28 percent chance of quaking in the next 50 years or so.
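Probabilities like the 63 percent Bay Area figure and the 28 percent Alpine fault figure come from hazard models. As a rough illustration only — not the USGS’s or GNS Science’s actual methods — a memoryless Poisson model turns a fault’s average recurrence interval into the chance of at least one quake in a given window:

```python
import math

def quake_probability(mean_recurrence_years, window_years):
    """Chance of at least one earthquake in the window, assuming quakes
    arrive as a Poisson process (a common simplification; real hazard
    models also weigh the time elapsed since the last rupture)."""
    return 1.0 - math.exp(-window_years / mean_recurrence_years)

# Illustrative numbers: a fault that ruptures roughly every 300 years,
# evaluated over a 50-year window.
print(round(quake_probability(300, 50), 2))  # → 0.15
```

Note that this memoryless estimate (~15 percent) is lower than the 28 percent quoted for the Alpine fault: time-dependent models raise the odds when, as there, roughly a full recurrence interval has already passed since the last rupture.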



Officials were criticized for calling the landslide in Oso, Wash., “unforeseen” when it was revealed that weeks of rain had fallen on a slope with over 50 years of known activity.
Ted Warren/Associated Press
Landslides play a key role in shaping landscapes over time, but they can be deadly. A landslide in Oso, Wash., in March 2014 killed 43 people, making it the deadliest in U.S. history. While they tend to cause less damage than other natural hazards because of their relatively smaller reach, landslides occur in an instant, with little opportunity for people to get out of harm’s way.

WHAT CAUSES THEM: Landslides strike when the rock or soil on a slope weakens to the point where it can no longer resist gravity’s pull. They can be triggered by rainfall, erosion or other natural disturbances, such as earthquakes and volcanoes.

ADVANCES: Landslides are among the least understood of all natural hazards. Researchers are studying the site of the Oso landslide to determine how it happened and use that information to identify other at-risk areas. Computer models and landslide simulators — chutelike contraptions into which scientists unleash torrents of mud, water and debris — are yielding new clues about the complex factors that contribute to slope collapse.

Researchers at India’s Amrita University use a landslide simulator to help design early warning systems.
Courtesy of Amrita University Center for Wireless Networks and Applications

CHALLENGES: Uncertainties about landslide dynamics aside, there’s little information on which areas are most vulnerable to slides. Landslide hazard maps cover only about 1 percent of the world’s slopes, according to a recent study led by Fausto Guzzetti, a geologist with the Research Institute for Geo-Hydrological Protection in Perugia, Italy. But new remote-sensing techniques and improved analysis should help fill in those blank spots on the map, the study notes.



A towering ash cloud erupts from Mount Sinabung in Indonesia on Oct. 8, 2014.
Sutanta Aditya
People have lived in the shadow of volcanoes for thousands of years, drawn by their rich soils and picturesque slopes. Today, 800 million people live within 100 kilometers, or about 62 miles, of a volcano. At any given time, a dozen or more volcanoes are in an active state, ranging from minor gas-and-ash plumes to lava flows.

WHAT CAUSES THEM: When magma — molten rock — rises through a vent toward Earth’s surface, the pressure on it drops. Dissolved gases come out of solution and form bubbles, and pressure builds until the gases and magma are discharged, slowly building a mountain.

ADVANCES: Since volcano monitoring began a century ago, scientists have made significant strides in understanding volcanic behavior, especially in recent years. That’s largely because of advances in seismic sensing and new ways to detect volcanic activity, such as infrasound — low-frequency sound waves, below the range of human hearing, that volcanoes emit into the atmosphere. Jeff Johnson, a volcanologist at Boise State University in Idaho, is using this method to help read activity at the Villarrica volcano in Chile.

“It’s yelling at the top of its lungs, with a tone that you and I can’t hear,” he explains. “It’s got this infrasound it produces all the time.” He had placed special microphones around the vent to study how changes in its “voice” related to changes in the lava lake within the volcano. But the experiment was interrupted when Villarrica uncorked in the early hours of March 3, 2015. About 3,000 people in neighboring towns were evacuated.
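Tracking a volcano’s “voice” means watching how the dominant pitch of its infrasound shifts over time. As a toy sketch of the signal processing involved — with a synthetic tone standing in for real microphone data, and no claim that this is Johnson’s actual pipeline — a Fourier transform can pull out that pitch:

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency in a signal via FFT — a toy version
    of tracking shifts in a volcano's infrasound tone."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Synthetic 0.7 Hz tone, far below the ~20 Hz floor of human hearing:
# one minute of data sampled at 20 Hz.
t = np.arange(0, 60, 0.05)
tone = np.sin(2 * np.pi * 0.7 * t)
print(round(dominant_frequency(tone, 20), 2))  # → 0.7
```

A rising or falling dominant frequency in successive windows of real data would be the kind of change in “voice” researchers try to tie to the lava lake’s behavior.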

“I think what we saw at Villarrica is really eye-opening,” Johnson says. “We knew the volcano was in a state of heightened unrest and the dangers were much more elevated, but no one expected things to get out of hand so quickly. Each volcano has its own characteristic style. Learning each particular volcano and learning to understand the signals of that particular system is vital.”

In this case, Johnson theorizes that a buildup of gases destabilized the volcano, reinforcing the need to monitor multiple phenomena at once, including seismic activity, gas flow and heat. “I think enough lava was close to the edge that it sloshed over the rim, and it exposed more gas-charged regions down deeper,” he says.

Boise State University geologist Jeff Johnson gathers data over the edge of the inner crater of Villarrica, an active volcano in Chile. He uses special microphones to study changes in a volcano’s “voice” in hopes of better understanding its behavior.
Richard Sanderson via The National Science Foundation

CHALLENGES: While researchers have studied some volcanic fields for decades, others, such as one beneath Auckland, New Zealand, are poorly understood. Monitoring every volcano near populated areas is a tall order, and there’s no global monitoring system like there is for earthquakes.

”We haven’t really organized that well within the volcano community,” Johnson says. “It’s a bit shameful. A lot of observatories are a little possessive of their volcano.”

Of those that are monitored, 62 volcanoes are deemed high risk because of their proximity to large numbers of people and recent activity. Indonesia, the site of the biggest eruption ever recorded — Mount Tambora in 1815 — is most at risk, with about 78 historically active volcanoes.



A stunning tornado and its dusty funnel cloud move through Wray, Colo., in May.
Dave Crowl
On May 20, 2013, a massive twister ripped through Moore, Okla., killing 24 people and shredding 13,000 homes, schools, farms and businesses. It was one of more than 80 tornadoes in the state that year, and its swift blow was a cruel reminder of the difficulty of forecasting tornadoes, which form very quickly.

WHAT CAUSES THEM: Tornadoes occur when huge thunderstorms known as supercells are turbocharged with churning columns of air. When winds at different heights blow at different speeds or from different directions, they create wind shear, causing the column of air to rotate. If the rotating column is snagged in a supercell updraft, funnel-shaped clouds form.
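Forecasters quantify that shear as the vector difference between winds at two heights. A minimal sketch of the bulk-shear arithmetic (the function name and example numbers are illustrative, not from any operational system):

```python
import math

def wind_shear(speed_low, dir_low_deg, speed_high, dir_high_deg):
    """Magnitude of the vector difference between winds at two heights.
    Directions use the meteorological convention (degrees the wind
    blows FROM); units of speed are whatever you pass in."""
    def components(speed, direction_deg):
        rad = math.radians(direction_deg)
        return (-speed * math.sin(rad), -speed * math.cos(rad))
    u1, v1 = components(speed_low, dir_low_deg)
    u2, v2 = components(speed_high, dir_high_deg)
    return math.hypot(u2 - u1, v2 - v1)

# A 20-knot southerly surface wind under a 50-knot westerly wind aloft
# produces strong shear, even though neither wind alone is extreme.
print(round(wind_shear(20, 180, 50, 270), 1))  # → 53.9
```

Note that two winds of identical speed but different directions still produce shear — direction change with height matters as much as speed change.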

ADVANCES: Tornado prediction requires complex computer modeling that can take into account the small shifts in storms that can send one whirling into a tornado. But the data going into the model are limited. For instance, typical weather stations, which measure wind speeds, temperature and humidity, can be far apart and only cover so much territory.

Scientists with the University of Massachusetts Amherst Engineering Research Center for Collaborative Adaptive Sensing of the Atmosphere came up with an innovative solution: Why not install dense networks of small radar devices on rooftops and towers? Since they’re closer to the ground, these networks, which are still in the trial stage, can pick up weather shifts that other systems miss.

With distributed radar added to meteorologists’ toolbox, the average 16-minute warning time for a tornado could improve significantly.

Engineers make their final inspections on a radar device ahead of installation in tornado country near Fort Worth, Texas.
CASA Engineering Research Center/University of Massachusetts Amherst

CHALLENGES: Scientists have more data and better models, but the best forecasts still rely on getting that info to the public in a way that compels action. Many people don’t know the difference between a watch — where conditions favor tornadoes — and a warning — where one has been spotted or indicated by radar. Forecasters must now balance data overload with communicating threats across many platforms.



Nineteen named storms formed in the Atlantic in 2010. Twelve of them became hurricanes, tying it for the third most active season on record.
NOAA NESDIS Environmental Visualization Laboratory
Midwesterners can breathe a tornado-sized sigh of relief as twister season comes to an end, but that’s when Atlantic coastal communities brace for peak hurricane season, which lasts midsummer through late fall. Hurricanes are already among the most damaging natural hazards, but as global temperatures rise, they’re expected to become more intense.

WHAT CAUSES THEM: Hurricanes are storms bred over warm ocean water, with sustained wind speeds of at least 64 knots (about 74 mph).

ADVANCES: Meteorologists can now forecast a hurricane two to six days out, giving communities more time to evacuate. One of the biggest advances in recent years is the Coyote drone, a 7-pound unmanned aerial vehicle packed with sensors and a GPS device. Dropped from a plane, it slowly descends through the core of a storm, transmitting real-time data to the National Oceanic and Atmospheric Administration’s Hurricane Research Division. These data will help scientists figure out what’s going on in the center of a forming hurricane, which is poorly understood.

NOAA’s Joe Cione holds a Coyote drone, which his team deploys into storms from aircraft like the one behind him.

CHALLENGES: Forecasting where hurricanes will hit has improved, but meteorologists still can’t predict intensity with any real certainty.



The FLASH system predicted this St. Louis area intersection would see heavy flooding in late 2015.
AP Photo/Jeff Roberson
It’s one of the most common and costly natural hazards: The majority of disasters between 1994 and 2013 were floods, and they affected nearly 2.5 billion people. In the winter of 2015-16, flooding in the Mississippi River watershed killed roughly two dozen people and caused widespread power outages.

WHAT CAUSES THEM: Floods strike when rivers swell with heavy rain, when rising sea levels encroach on land, or when storm surges push seawater into coastal areas.

ADVANCES: Meteorologists can now detect precipitation changes at a smaller scale, making it much easier to forecast flash floods, says Jonathan Gourley, a research hydrologist at the National Severe Storms Laboratory in Norman, Okla.

Rainfall estimates generated by the Multi-Radar Multi-Sensor (MRMS) system are plugged into a system called FLASH, which pairs the MRMS estimates with information about soil type and vegetation. The system models where the water will go and produces updates every few minutes — a key advantage given that some areas can flood very quickly.

Gourley says, “I take the rainfall rates the radar is measuring in the sky, and take it down to the surface and measure what every raindrop is doing on the ground,” whether it meanders through the soil or flows across impervious roads and parking lots and into storm drains and waterways.
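FLASH’s hydrologic modeling is far more sophisticated, but the core idea Gourley describes — the same rain producing very different runoff depending on the surface it lands on — can be sketched with the classic rational method, Q = C × i × A (the coefficients and catchment numbers below are illustrative):

```python
def peak_runoff_m3s(runoff_coeff, rain_mm_per_hr, area_km2):
    """Rational-method estimate of peak discharge: Q = C * i * A.
    runoff_coeff ranges from ~0 (everything soaks in) to 1 (pavement)."""
    rain_m_per_s = rain_mm_per_hr / 1000.0 / 3600.0  # mm/hr -> m/s
    area_m2 = area_km2 * 1e6
    return runoff_coeff * rain_m_per_s * area_m2

# The same 50 mm/hr storm over the same 2 km^2 catchment:
print(round(peak_runoff_m3s(0.9, 50, 2.0), 1))  # paved    → 25.0 m^3/s
print(round(peak_runoff_m3s(0.2, 50, 2.0), 1))  # grassland → 5.6 m^3/s
```

A parking lot sheds more than four times the water of a grassy field under identical rain — which is why FLASH pairs radar rainfall with soil and land-cover data before routing the water downstream.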

This new system — the first to model flash floods in real time — is expected to graduate from demo to full operation soon. When put to the test during a storm in Houston in May 2015, the system demonstrated its worth. “The model highlighted the [at-risk] areas very well, six hours prior to the onset of the flooding,” Gourley says. It provided more accurate information and more lead time than traditional systems that rely on hypothetical forecasts based on things like the flood history of the area, he adds.

One of the most challenging types of flooding to forecast is the influx of water from storm surges. Alan Blumberg at the Stevens Institute of Technology in New Jersey is developing an integrated forecasting system built on the idea that many forecasts are better than one. Rather than relying solely on the National Weather Service’s reports, his system combines regional forecasts from meteorologists around the world.

“I’ll go to my colleagues at Penn State and Rutgers, others who do forecasting, and run those in my model,” he says. “So now I have 125 forecasts. We’re working on how to blend all 125.”
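The article doesn’t describe how Blumberg’s team weighs those 125 forecasts against one another, but a common ensemble heuristic — offered here purely as a sketch, with hypothetical numbers — is to average them with weights inversely proportional to each model’s recent error:

```python
def blend_forecasts(forecasts, recent_errors):
    """Weighted average of many forecasts, giving more weight to models
    that have been more accurate lately (inverse-error weighting)."""
    weights = [1.0 / e for e in recent_errors]
    total = sum(weights)
    return sum(w * f for w, f in zip(weights, forecasts)) / total

# Three hypothetical storm-surge forecasts (meters) and each model's
# recent mean absolute error (meters): the most accurate model (0.2 m
# error) pulls the blend toward its 2.1 m prediction.
print(round(blend_forecasts([2.1, 2.6, 1.8], [0.2, 0.5, 0.4]), 2))  # → 2.13
```

The payoff of blending is that independent models make partly independent errors, so the combination is usually steadier than any single forecast.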

In a six-year, $7 million effort, the Stevens Institute is also developing technology that can predict how storm surge flooding will affect any given street. Researchers there are working with Google to develop a tool that allows people to access that street-level information. “I want to have a picture of how much water is coming into your house,” Blumberg says, “and you can decide what you want to do.”

Colored triangles show stream flow rates in the Eastern United States. Yellow is slow, and purple is high. Zooming in on the St. Louis area (right, inside the rectangle), a cluster of purple triangles predicts flood conditions for the next day.
University of Oklahoma/National Severe Storms Laboratory/NOAA

CHALLENGES: Despite advances in flood forecasting, scientists still can’t join coastal and inland data to stitch together a big-picture assessment of a region, says Gourley. “We don’t have a system to handle inland flooding and coastal storms,” he says. “So if you get a coupled effect, that’s not modeled by anything we have.”

The National Weather Service tried to develop just such a holistic system, called CI Flow, which attempted to combine hydrological data from river basins with coastal storm surge models, but the data load proved too much for the agency’s computing capacity. The National Weather Service has been in discussions with the National Ocean Service to get the program going again, Gourley says. “I think it will be reinvigorated in the next couple of years.”

The European Centre for Medium-Range Weather Forecasts, which has better computer power and more sophisticated modeling than the U.S., shows the difference more processing power can make. The center knew of Superstorm Sandy’s approach before U.S. forecasters did. “When Sandy was coming up the coast, they predicted it seven days ahead — the general path — compared to our five days,” Blumberg says. “We can learn a lot from the European Centre.” But it seems unlikely that forecasters will ever outsmart Mother Nature completely, Blumberg says. “The atmosphere is chaotic.”
