Controversy About Earthquake Prediction
The scientific community continues to question and debate proposed methods of earthquake prediction, and even whether reliable prediction of individual earthquakes is possible at all. Recent views on the overall goal of earthquake prediction can be found at the following web sites:
- What Ever Happened to Earthquake Prediction? by Christopher Scholz, March 1997, Geotimes v. 17.
- Is the reliable prediction of individual earthquakes a realistic scientific goal? An online debate among scientists, hosted by the scientific journal Nature, on their website.
Early scientific efforts toward earthquake prediction in the U.S. were directed primarily toward the measurement of physical parameters in areas where earthquakes occur, including seismicity, crustal structure, heat flow, geomagnetism, electrical potential and conductivity, and gas chemistry. Central to these efforts was the concept that a precursor might be observed in one or more of these measurements. However, the connection between any proposed precursor and an ensuing earthquake was often speculative and uncertain; a coherent physical model was lacking.
A model on which a scientific prediction could be based began to be developed in the late 1970's and early 1980's, and is described in three seminal papers. In 1978, Allan Lindh of the USGS proposed a multi-year, integrated observation program at Parkfield, combining seismic, geodetic, creep, strain, tilt, and magnetic measurements with theoretical models of fault mechanics.
This ornamental water tower advertises both the Parkfield Cafe and the town's cottage-industry slogan. Photo by Jennifer Adleman, USGS.
In 1979, W.H. Bakun (USGS) and T.V. McEvilly (University of California at Berkeley), in "Earthquakes Near Parkfield, California: Comparing 1934 and 1966 Sequences", developed a model of a "characteristic Parkfield earthquake", which postulates a nearly regular occurrence of earthquakes of similar size that rupture the same part of the fault. In support of this model, they demonstrated a remarkable similarity between seismograms recorded in the 1934 and 1966 earthquakes at Parkfield, and argued that the largest earthquakes at Parkfield since 1857 are consistent with a regular occurrence of one every 22 years. This paper laid the groundwork for the formal prediction that would be made six years later.
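The regularity argument can be illustrated with a quick back-of-the-envelope calculation. The event years below are the commonly cited moderate Parkfield earthquakes since 1857; this is an illustrative sketch of the arithmetic, not the statistical method actually used by Bakun and McEvilly:

```python
# Commonly cited moderate (M ~6) Parkfield earthquakes since 1857.
events = [1857, 1881, 1901, 1922, 1934, 1966]

# Inter-event times between successive earthquakes.
intervals = [later - earlier for earlier, later in zip(events, events[1:])]
mean_interval = sum(intervals) / len(intervals)

print(intervals)                    # [24, 20, 21, 12, 32]
print(mean_interval)                # 21.8 -- the "~22 year" recurrence
print(events[-1] + mean_interval)   # 1987.8 -- next event expected near 1988
```

The mean interval of about 22 years, projected forward from the 1966 shock, is what pointed toward a next event in the late 1980's and motivated the 1985-1993 prediction window described below.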
In the early 1980's, observations and models for Parkfield led to the development of a formalized method for predicting the next Parkfield earthquake. The method and underlying science are described in a USGS Open-File Report and were endorsed by the National Earthquake Prediction Evaluation Council in 1984.
In 1985, in "The Parkfield, California, Earthquake Prediction Experiment", Bakun and Lindh summarized the state of the art in the Parkfield Prediction Experiment and predicted that a moderate-size earthquake would occur at Parkfield between 1985 and 1993. Their prediction was unusual both in its precision (as to location, time, and magnitude) and in its high degree of confidence (95% within the 9-year window). Bakun and Lindh (1985) also suggested that the predicted earthquake could produce extended rupture of the San Andreas fault to the southeast, possibly growing to magnitude 6.5 to 7.0.