Last week, the media went big with a terrifying-sounding study from NASA's Jet Propulsion Lab, which put the probability of a magnitude-5.0 earthquake in the Los Angeles area before 2019 at 99.9%. Ninety-nine point nine!
That number alone is enough to raise the hackles of nearly all Earth scientists. Few things in this world are that certain — even next-day weather reports are rarely so exact — and earthquakes are notoriously unpredictable. Not in the sense of "tricky to predict" but in the sense of "we cannot predict them."
Did JPL scientists invent new technology, or some new method, that could justify that number? No, they did not. I wish they had — such a tool would have great value — but they didn't, and Californians should recognize this prediction for what it is: an extremely precise wild guess. The JPL study is best understood as an experiment that's not ready for prime time.
For their paper, the JPL scientists relied on GPS records and a specialized type of radar imagery, which over time can give a broad picture of how the Earth is deforming. Both of these data sets show movement not just along known surface faults but deformation of the ground from subsurface faults we may or may not know about. Many studies around the world have used such data for decades to try to tease out what faults are doing beneath our feet.
The broad picture can tell us what is moving where and how much energy may be accumulating in faults. What we don't know — and this is crucial — is how much energy can be stored and how much is too much. Looking at a fault in this way is a bit like watching a battery charge, except we don't know when we will get to 100% (the earthquake), or even whether 100% is the relevant figure (maybe 90% or even 80% is sufficient for an earthquake). Adding yet more complication, the Los Angeles basin has hundreds of interconnected faults, all charging at different rates.
Because we can't know where each fault is in its earthquake cycle (at least not yet), we have to fall back on history to try to forecast the probability of an earthquake in the future. From past behavior, we know that earthquakes are more probable in some regions than in others, much as we know that the chance of snow is low in August and much higher in December.
To forecast with real exactitude, though, a tremendous amount of data is necessary — far more data than is available for most faults, including the ones in the Los Angeles area. To fill in the gaps, as it were, the JPL scientists used a global, as opposed to site-specific, model of how earthquakes might occur over time, based on past earthquake sizes and how many of them have occurred around the world since seismographs were invented. The JPL paper claims the global model can be used to predict roughly how many earthquakes of a given size should occur in a given place.
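The style of reasoning behind such a rate-based model can be illustrated with a standard Poisson calculation: if a region has historically averaged some number of quakes of a given size per year, the chance of at least one in a future window follows directly. This is a hypothetical sketch of that generic approach, not the JPL model itself, and the rate used below is an assumed number for illustration only:

```python
import math

def chance_of_at_least_one(rate_per_year: float, years: float) -> float:
    """Probability of at least one event in `years`, assuming events
    occur independently at a constant average rate (a Poisson process).
    P = 1 - exp(-rate * years)"""
    return 1.0 - math.exp(-rate_per_year * years)

# Assumed (hypothetical) historical rate: 0.5 magnitude-5.0+ quakes
# per year in a region. Over a four-year window, that implies roughly
# an 86% chance of at least one such quake.
print(chance_of_at_least_one(0.5, 4))

# To reach 99.9% over four years, the assumed rate would have to be
# about ln(1000)/4, or roughly 1.7 such quakes per year.
print(math.log(1000) / 4)
```

The point of the sketch is how sensitive the headline probability is to the assumed rate — which is exactly why borrowing a global average rate for one specific basin is the weak link.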
Here's the crux of the problem: There is no solid reason to think that any particular fault should follow a global pattern. There are many spots around the world that don't fit the worldwide averages very well.
In studying earthquakes, scientists have to contend with uncertainties piled on top of uncertainties. It's one thing to say Los Angeles is overdue for an earthquake; that might very well be true. It's another thing entirely to say there's a 99.9% chance of a major seismic event in the next four years.
There's nothing wrong, of course, with testing theories, but yet another problem with the 99.9% prediction is that it's hard either to verify or to falsify.
If we have a big earthquake in the next four years, will that mean the JPL model is a good one? Not necessarily. If there isn't one, does that mean the model is worthless? Not necessarily.
Los Angeles has many faults and many earthquakes, and we just don't have enough information to say exactly what's going to happen. That said, preparing for the one that will come, sooner or later, is a 100% good idea.
Chris Goldfinger is director of the Active Tectonics and Seafloor Mapping Laboratory at Oregon State University.