Is seismicity operating at a critical point?

Seismicity and faulting within the Earth's crust are characterized by many scaling laws that are usually interpreted as evidence of underlying physical mechanisms associated with some kind of criticality in the sense of phase transitions. Using an augmented Epidemic-Type Aftershock Sequence (ETAS) model that accounts for the spatial variability of the background rates $μ(x,y)$, we present a direct quantitative test of criticality. We calibrate the model to the ANSS catalog of the entire globe, the region around California, and the Geonet catalog for the region around New Zealand using an extended Expectation-Maximization (EM) algorithm that includes the determination of $μ(x,y)$. We demonstrate that the criticality reported in previous studies is spurious and can be attributed to a systematic upward bias in the calibration of the branching ratio of the ETAS model when spatial variability is not correctly accounted for. We validate the version of the ETAS model that possesses a space-varying background rate $μ(x,y)$ by performing pseudo-prospective forecasting tests. The non-criticality of seismicity has major implications for the prediction of large events.


💡 Research Summary

The paper tackles a long‑standing hypothesis in seismology: that earthquake occurrence behaves like a critical phenomenon, analogous to a phase transition, and that the Earth’s crust operates near a branching ratio of unity in the Epidemic‑Type Aftershock Sequence (ETAS) model. To test this claim rigorously, the authors extend the standard ETAS framework by allowing the background seismicity rate μ to vary spatially, μ(x, y), rather than assuming a uniform or simply parameterised background. They develop an augmented Expectation‑Maximization (EM) algorithm that simultaneously estimates the spatially varying μ field and the ETAS branching ratio n (the average number of aftershocks triggered by a single event).
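For orientation, the conditional intensity of a spatial ETAS model of this kind is conventionally written as below. The paper's specific kernel choices are not reproduced in this summary, so the parameterisation shown here is the standard one and should be read as an assumption:

```latex
\lambda(t, x, y \mid \mathcal{H}_t) \;=\; \mu(x, y) \;+\;
\sum_{i:\, t_i < t} k\, e^{\alpha (m_i - m_0)}\,
\frac{(p-1)\, c^{\,p-1}}{(t - t_i + c)^{p}}\,
f(x - x_i,\, y - y_i)
```

where $\mathcal{H}_t$ is the event history, $k\,e^{\alpha(m_i - m_0)}$ is the productivity of an event of magnitude $m_i$, the middle factor is the normalised Omori-Utsu time kernel, and $f$ is a normalised spatial kernel. The branching ratio $n$ is the expectation of the productivity term over the Gutenberg-Richter magnitude distribution.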

Three independent catalogs are used for calibration and validation: the global ANSS catalog, a high‑resolution catalog for California, and the Geonet catalog for New Zealand. For each dataset the extended EM procedure iterates between (i) computing the expected number of triggered events given current parameter values (E‑step) and (ii) updating μ(x, y) via a non‑parametric kernel density estimate and re‑optimising the remaining ETAS parameters (M‑step). Convergence is achieved after several thousand iterations, and model fit is assessed with log‑likelihood, AIC, and BIC.
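The E-step/M-step alternation described above can be sketched compactly. The toy implementation below is a deliberate simplification and not the authors' algorithm: it fixes the Omori-Utsu parameters, assumes magnitude-independent productivity, and uses the same isotropic Gaussian kernel (bandwidth `bw`, an assumed parameter) for both spatial triggering and the weighted background KDE, but it reproduces the structure of the iteration:

```python
import numpy as np

def em_etas(t, x, y, n_iter=50, c=0.01, p=1.2, bw=0.5):
    """Toy EM for a simplified spatial ETAS model (illustrative only):
    fixed Omori-Utsu parameters (c, p), magnitude-independent productivity,
    isotropic Gaussian kernels of bandwidth bw for both triggering and the
    background KDE. Returns the branching-ratio estimate and the background
    rate evaluated at each event location."""
    N, T = len(t), t.max() - t.min()
    dt = t[None, :] - t[:, None]                      # dt[i, j] = t_j - t_i
    r2 = (x[None, :] - x[:, None])**2 + (y[None, :] - y[:, None])**2
    gauss = np.exp(-r2 / (2 * bw**2)) / (2 * np.pi * bw**2)
    g = np.zeros_like(dt)                             # Omori-Utsu kernel, causal pairs only
    causal = dt > 0
    g[causal] = (p - 1) * c**(p - 1) * (dt[causal] + c)**(-p)
    n_branch, mu = 0.5, np.full(N, N / T)             # crude initial guesses
    for _ in range(n_iter):
        # E-step: probability that event j was triggered by event i,
        # versus being a background event
        trig = n_branch * g * gauss
        total = mu + trig.sum(axis=0)
        p_bg = mu / total
        # M-step: branching ratio = mean triggered probability;
        # background rate = Gaussian KDE weighted by background probabilities
        n_branch = 1.0 - p_bg.mean()
        mu = (p_bg @ gauss) / T
    return n_branch, mu

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 300))
x, y = rng.uniform(0, 10, 300), rng.uniform(0, 10, 300)
n_hat, mu_hat = em_etas(t, x, y)
```

On spatially uniform Poisson data such as this synthetic catalog, the estimated branching ratio should come out well below 1; on a real catalog the converged `mu` field captures the spatial heterogeneity that, per the paper, must not be absorbed into the triggering term.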

The key findings are:

  1. When spatial variability of μ is accounted for, the estimated branching ratios are consistently below the critical threshold (n ≈ 0.78 for California, n ≈ 0.84 globally, n ≈ 0.80 for New Zealand). The 95 % confidence intervals never include 1, indicating a sub‑critical regime.

  2. By contrast, the conventional ETAS model with a uniform background systematically overestimates n (values between 1.12 and 1.35), producing an apparent critical state that is in fact an artefact of neglecting spatial heterogeneity.

  3. The spatially‑varying model outperforms the uniform‑μ version in pseudo‑prospective forecasting experiments. Over 6‑month to 1‑year horizons it yields higher likelihood scores and reduces the over‑prediction of large‑magnitude events (M ≥ 6.5) by roughly 15 %.
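The likelihood comparison underlying such pseudo-prospective experiments can be illustrated with a grid-based Poisson score of the kind used in CSEP-style forecast tests. The counts and rates below are invented toy values, not the paper's results:

```python
import numpy as np
from math import lgamma

def poisson_ll(forecast, observed):
    """Joint log-likelihood of observed per-cell counts under independent
    Poisson cells whose means are the forecast rates."""
    f = np.maximum(np.asarray(forecast, float), 1e-12)   # guard against log(0)
    o = np.asarray(observed, float)
    return float(np.sum(o * np.log(f) - f)
                 - sum(lgamma(k + 1) for k in o.ravel()))

observed = np.array([[5, 0], [0, 0]])           # toy test-period counts
uniform  = np.full((2, 2), 1.25)                # flat forecast, same total rate
spatial  = np.array([[4.5, 0.2], [0.2, 0.1]])   # rate concentrated where events occur
```

With equal total forecast rates, the spatially resolved forecast scores a higher log-likelihood than the uniform one because it concentrates probability where the events actually fall.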

These results demonstrate that the previously reported “criticality” of seismicity is spurious, arising from a methodological bias rather than a genuine physical property. Earthquake clustering is better described as a sub‑critical branching process, where the cascade of aftershocks remains finite and does not approach the divergent behaviour characteristic of true critical systems.
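The finiteness of sub-critical cascades can be checked with a generic Galton-Watson simulation (a textbook sketch, not taken from the paper): when every event triggers a Poisson(n) number of direct aftershocks and n < 1, the expected total cascade size per background event is 1/(1 − n), so n ≈ 0.8 gives about 5 events per cascade, while the expectation diverges as n → 1.

```python
import numpy as np

def cascade_size(n, rng, max_events=10**6):
    """Total events in one aftershock cascade when every event triggers
    Poisson(n) direct aftershocks (a Galton-Watson branching process).
    Subcritical cascades (n < 1) are almost surely finite, with mean
    total size 1 / (1 - n) including the initiating background event."""
    total = alive = 1
    while alive and total < max_events:
        alive = rng.poisson(n * alive)   # offspring of the current generation
        total += alive
    return total

rng = np.random.default_rng(1)
mean_size = np.mean([cascade_size(0.8, rng) for _ in range(20000)])
# mean_size should be close to 1 / (1 - 0.8) = 5
```

At the branching ratios reported above (n ≈ 0.78-0.84), aftershock sequences therefore remain finite on average, in contrast to the runaway cascades a critical (n = 1) crust would permit.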

The implications are profound for seismic hazard assessment. Models that ignore spatial heterogeneity in background rates may inflate the branching ratio, leading to exaggerated forecasts of large‑event probabilities and potentially misguiding mitigation strategies. The authors argue that any realistic forecasting or risk‑evaluation framework must incorporate a spatially resolved μ(x, y) and recognise that the crust operates in a non‑critical, sub‑critical regime. This work thus provides both a methodological advance—through the extended EM algorithm—and a conceptual shift, challenging the notion that large earthquakes can be anticipated by detecting an approach to a critical point.

