New-Generation Earthquake Forecasting Swings into Operation in Italy


[Photo: Italian castle with quake damage]
The 12th-century Palazzo Ducale in Bugnara, Italy, suffered roof damage in the L'Aquila earthquake of 2009 (Susan Cardwell/Wikimedia CC)

Scientists are starting to roll out the next generation of earthquake forecasts, based on a smorgasbord of theoretical advances. While California has been using some features of this new approach, Italy is breaking new ground with a system that will issue routine seismic forecasts for the whole country—in technical terms, an operational system. Leaders in this effort explain and defend their approach in two articles in the September issue of the journal Seismological Research Letters (SRL).

The Italian system, now in beta testing, is described in an SRL article by three scientists from the Seismic Hazard Center of the National Institute of Geophysics and Volcanology. It will create products similar to the map below, showing a forecast for earthquakes of magnitude 4 or greater during the first week of 2014.

[Map: Italy quake forecast]
Probabilities of magnitude-4+ events during the week starting December 31, 2013. The island of Sardinia is grayed out because it isn't included for now; so is the highly active Etna volcano, in Sicily. Notice how small the odds are. (from Marzocchi et al., "The establishment of an operational earthquake forecasting system in Italy," SRL, doi: 10.1785/0220130219)

We're used to weather forecasts that give the odds of rain tomorrow. The Italian operational earthquake forecasts will work essentially the same way. The difference with earthquakes is that on any given day, or even in any given month or year, the odds of one happening are quite small. Seismologists know that, and the public will have to learn it as well. Once they do, they should be less susceptible to alarmists, cranks and frauds. That will be a good thing.

Let's take a dramatic example. We all know that big earthquakes have aftershocks. For a few days, earthquakes become hundreds, even thousands of times more likely! But sizeable aftershocks, within one magnitude unit of the mainshock, have odds of roughly 1 percent, and even that's only in the first two or three days afterward.
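The arithmetic behind that paragraph is worth seeing in the open: a hundredfold jump in probability still leaves the absolute odds small when the starting point is tiny. The sketch below is purely illustrative, not the INGV forecasting model; both the baseline probability and the gain factor are made-up round numbers chosen to echo the article's figures.

```python
# Illustrative only: a large probability "gain" after a mainshock
# still yields small absolute odds when the baseline is tiny.
# Both values below are hypothetical round numbers, not INGV data.

baseline_weekly_prob = 0.0001  # assumed background chance of a sizeable quake
gain = 100                     # assumed aftershock-driven increase in likelihood

# Cap at 1.0 so a probability never exceeds certainty.
elevated_prob = min(1.0, baseline_weekly_prob * gain)

print(f"Background odds: {baseline_weekly_prob:.4%}")  # 0.0100%
print(f"Elevated odds:   {elevated_prob:.4%}")         # 1.0000%
```

A hundredfold increase sounds dramatic, yet the result here is still only about a 1-in-100 chance, which matches the scale of the aftershock odds quoted above.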

That doesn't sound like much, and it isn't, but that level of information is still powerful. Consider this: would people buy a lottery ticket if the state temporarily raised the odds of winning a hundredfold? They probably would; it's essentially what they already do whenever the jackpot grows very large.


But while people notice this kind of thing, they don't necessarily overreact, either. Public panic thrives in the absence of official information. The Italian system will need a lot of fine-tuning and review before it's tested on the public. (For instance, the colors on the map look more alarming than they should.) Underlying the operational system are the best earthquake-forecasting models we have, fully open to all users and under continual testing by the Collaboratory for the Study of Earthquake Predictability. In short, it is the best science available.

Earthquake scientists have been slow to share this kind of information before. The best research-grade prediction schemes are improving, but they still work only a few times better than chance. But the deadly L'Aquila, Italy, earthquake of April 2009 forced scientists' hands, when members of the official panel of earthquake experts, who had issued neither a prediction nor a warning at the time, were convicted of manslaughter and sentenced to six years in jail. (The case is being retried.)

Is it better to share imperfect knowledge with the public, or to avoid panic and misinterpretation by keeping it confidential? The message of L'Aquila was that for better or worse, the science must be shared. An expert panel, the International Commission on Earthquake Forecasting, prepared a comprehensive report on earthquake-prediction science for the Italian government in 2011 that led to the birth of the new Italian system. In a second SRL article, members of that panel led by UCLA's Thomas Jordan argue, "Models that are uncertain and cannot explain everything can still be very useful."

Operational earthquake forecasting puts this statement into action with two essential principles. The first is a transparency principle: "authoritative scientific information about future earthquake activity should not be withheld from the public." One of the most pernicious myths among earthquake paranoids is that the government knows the truth but is hiding it from us. This myth holds enough power to have dragged the Italian seismologists into a manslaughter trial.

The second principle is that "authoritative scientific information about future earthquake activity should be developed independently of its applications to risk assessment and mitigation." Seismologists will be the first to admit that they aren't competent to issue alarms, order evacuations, improve building codes, enforce zoning ordinances, or do any of the other useful things that scientific knowledge can inform. You might call this a firewall principle, because it frees scientists from the threat of prosecution. But I prefer to think of it as an inclusion principle: for society to work best, science must have an equal place at the table along with other authorities.

To scientists, earthquake forecasting is still in its infancy. But operational systems are being built because the rest of us still want the information, even if scientists think it's rudimentary. I think we can learn to handle it, just as we handle weather forecasts. And this way the public can grow in knowledge at the same time as scientists learn.