Forecasting the next great San Francisco earthquake

The Virtual California approach to earthquake forecasting is similar to the computer models used for weather forecasting, said John Rundle, director of the UC Davis Computational Science and Engineering Center, who developed the model with colleagues from the Jet Propulsion Laboratory and other institutions. A previous forecast of earthquake hazards, by the Working Group on California Earthquake Probabilities, used records of past earthquakes to calculate the probability of future ones.

The Virtual California model includes 650 segments representing the major fault systems in California, including the San Andreas fault responsible for the 1906 San Francisco earthquake. The simulation takes into account the gradual movement of faults and how they interact with each other.

In this computer graphic from Rundle's research, colored fringes represent ground movement around faults. (Photo: UC Davis Computational Science and Engineering Center)
The researchers used the model to simulate 40,000 years of earthquakes in California. They found almost 400 major (magnitude 7 or above) earthquakes at an average interval of 101 years. The simulation data indicates a 25 percent chance of another such earthquake in the next 20 years, a 50 percent chance in the next 45 years and a 75 percent chance by 2086.
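The reasoning behind such figures can be illustrated with a simple calculation: given a catalog of simulated intervals between major earthquakes, one can estimate the conditional probability that the next event arrives within some horizon, given the time already elapsed since the last one. The sketch below is illustrative only; the synthetic intervals, the distribution used to generate them, and the `conditional_probability` helper are assumptions for demonstration, not the Virtual California model itself, which derives its intervals from simulated fault physics.

```python
import random

random.seed(42)

# Hypothetical stand-in for a simulated earthquake catalog: 400 major
# events with inter-event times averaging roughly 101 years, matching
# the figures reported in the study. A quasi-periodic (Gaussian)
# distribution is assumed here purely for illustration.
intervals = [max(1.0, random.gauss(101, 40)) for _ in range(400)]

def conditional_probability(intervals, elapsed, horizon):
    """Estimate P(next quake within `horizon` years), given that
    `elapsed` years have already passed since the last one, by
    counting over the simulated intervals."""
    survivors = [t for t in intervals if t > elapsed]
    if not survivors:
        return 1.0
    hits = [t for t in survivors if t <= elapsed + horizon]
    return len(hits) / len(survivors)

# Roughly 100 years had passed since the 1906 earthquake at the
# time of the study.
for horizon in (20, 45, 80):
    p = conditional_probability(intervals, elapsed=100, horizon=horizon)
    print(f"P(major quake within {horizon} yr): {p:.2f}")
```

The probability naturally rises with the horizon, which is why the reported chances climb from 25 percent over 20 years to 75 percent by 2086.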

2005 All rights reserved