A milestone for forecasting earthquake hazards

Researchers replicate latest California earthquake projections with a physics-based model.


Researchers from Columbia University's Lamont-Doherty Earth Observatory, the University of Southern California, the University of California, Riverside, and the U.S. Geological Survey have developed a physics-based model that marks a milestone in earthquake forecasting.

Earthquakes represent a significant threat to people and cities worldwide, but with the right hazard-mitigation efforts, from stricter building requirements to careful zoning, the potential for catastrophic collapses of roads and buildings and the loss of human life can be limited.

Bruce Shaw, a geophysicist at Lamont-Doherty, said, “Whether a big earthquake happens next week or 10 years from now, engineers need to build for the long run. We now have a physical model that tells us what the long-term hazards are.”

By simulating nearly 500,000 years of California earthquakes on a supercomputer, the researchers were able to match hazard estimates from the state’s leading statistical model, which is based on a hundred years of instrumental data.
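The idea of deriving long-term hazard numbers from a very long simulated catalog can be illustrated with a short sketch. The catalog below is purely synthetic (magnitudes drawn from a Gutenberg-Richter distribution with an assumed b-value of 1.0 and an assumed event count); it is not RSQSim output or its physics, only a toy example of how a 500,000-year record yields annual exceedance rates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic catalog: magnitudes above M5 drawn from a
# Gutenberg-Richter distribution (assumed b-value = 1.0) over an
# assumed 500,000-year simulated history with 100,000 events.
years = 500_000
b_value = 1.0
m_min = 5.0
n_events = 100_000
# Exponential magnitudes reproduce the Gutenberg-Richter falloff:
# P(M >= m) = 10 ** (-b * (m - m_min))
magnitudes = m_min + rng.exponential(1.0 / (b_value * np.log(10)), size=n_events)

def annual_exceedance_rate(catalog_mags, threshold, catalog_years):
    """Events per year at or above a magnitude threshold."""
    return np.sum(catalog_mags >= threshold) / catalog_years

for m in (6.0, 7.0, 8.0):
    rate = annual_exceedance_rate(magnitudes, m, years)
    print(f"M >= {m}: {rate:.5f} events/year")
```

With a long enough catalog, these rates stabilize, which is why a multi-hundred-thousand-year simulation can be compared against a statistical model built from only a century of instrumental observations.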

The mutually validating results add support for California’s current hazard projections, which help to set insurance rates and building design standards across the state. The results also suggest a growing role for physics-based models in forecasting earthquake hazard and evaluating competing models in California and other earthquake-prone regions.

The earthquake simulator used in the study, RSQSim, simplifies California’s statistical model by eliminating many of the assumptions that go into estimating the likelihood of an earthquake of a given size hitting a specific region. The researchers were surprised when the simulator, programmed with relatively basic physics, was able to reproduce estimates from a model that has been refined steadily for decades.

Seismologists can now use RSQSim to test the statistical model’s region-specific forecasts. Accurate hazard estimates are especially important to government regulators in high-hazard cities like Los Angeles and San Francisco, who write and revise building codes to reflect the most recent science.

In a state with a severe housing shortage, regulators are under pressure to make buildings strong enough to withstand heavy shaking while keeping construction costs down. A second tool to confirm hazard estimates gives the numbers added credibility.

John Vidale, director of the Southern California Earthquake Center, which helped fund the study, said the new model has created a realistic 500,000-year history of earthquakes along California’s faults for researchers to explore. Vidale predicted the model would improve as computing power grows and more physics are added to the software. Details such as earthquakes in unexpected places, the evolution of earthquake faults over geological time, and the viscous flow deep under the tectonic plates are not yet built in.

The researchers plan to use the model to learn more about aftershocks and how they unfold on California’s faults, and to study other fault systems globally. They are also working on incorporating the simulator into a physics-based ground-motion model, called CyberShake, to see if it can reproduce shaking estimates from the current statistical model.

Their results appear in the new issue of Science Advances.
