Distribution of Maximum Earthquake Magnitudes in Future Time Intervals, Application to the Seismicity of Japan (1923-2007)

Distribution of Maximum Earthquake Magnitudes in Future Time Intervals,   Application to the Seismicity of Japan (1923-2007)
Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the [Original Paper Viewer] below or the Original ArXiv Source.

We modify the new method for the statistical estimation of the tail distribution of earthquake seismic moments introduced by Pisarenko et al. [2009] and apply it to the earthquake catalog of Japan (1923-2007). The method is based on the two main limit theorems of the theory of extreme values and on the derived duality between the Generalized Pareto Distribution (GPD) and Generalized Extreme Value distribution (GEV). We obtain the distribution of maximum earthquake magnitudes in future time intervals of arbitrary duration tau. This distribution can be characterized by its quantile Qq(tau) at any desirable statistical level q. The quantile Qq(tau) provides a much more stable and robust characteristic than the traditional absolute maximum magnitude Mmax (Mmax can be obtained as the limit of Qq(tau) as q tends to 1, and tau tends to infinity). The best estimates of the parameters governing the distribution of Qq(tay) for Japan (1923-2007) are the following: Form parameter for GEV = -0.1901 +- 0.0717; position parameter GEV(tau=200)= 6.3387 +- 0.0380; spread parameter for GEV(tau=200)= 0.5995 +- 0.0223; Q_0.90,GEV(tau=10)= 8.34 +- 0.32. We also estimate Qq(tau) for a set of q-values and future time periods in the range for tau between 1 and 50 years from 2007. For comparison, the absolute maximum estimate Mmax from GEV, which is equal to 9.57 +- 0.86, has a scatter more than twice that of the 90 percent quantile Q_{0.90,GEV}(tau=10) of the maximum magnitude over the next 10 years counted from 2007.


💡 Research Summary

The paper presents a refined statistical framework for estimating the distribution of the largest earthquake magnitudes that may occur in future time intervals, and applies this framework to the Japanese seismic catalog spanning 1923‑2007. Building on the method introduced by Pisarenko et al. (2009), the authors exploit the two fundamental limit theorems of extreme‑value theory: the Fisher‑Tippett‑Gnedenko theorem for block maxima and the Pickands‑Balkema‑de Haan theorem for threshold exceedances. By explicitly using the duality between the Generalized Pareto Distribution (GPD) and the Generalized Extreme Value (GEV) distribution, they are able to estimate the GEV parameters directly from the exceedance data, thereby improving the efficiency and robustness of the tail‑estimation process.

The data preparation stage involved cleaning the Japanese Meteorological Agency catalog, removing duplicate entries, and ensuring temporal independence by imposing a minimum inter‑event time. Only events with magnitude Mw ≥ 5.0 were retained for the extreme‑value analysis, and the catalog was treated as a stationary series over the 85‑year observation window. Parameter estimation was performed using maximum likelihood combined with bootstrap resampling to quantify uncertainties. The resulting GEV shape parameter ξ = –0.1901 ± 0.0717 is negative, indicating a Weibull‑type tail with a finite upper bound. The location parameter for a 200‑year block, μ(τ = 200) = 6.3387 ± 0.0380, reflects the typical maximum magnitude expected over a long horizon, while the scale parameter σ(τ = 200) = 0.5995 ± 0.0223 describes the spread of the distribution.

A central contribution of the study is the introduction of the quantile function Qq(τ), defined as the magnitude that will not be exceeded with probability q during a future interval of length τ. This provides a flexible, statistically stable alternative to the traditional absolute maximum magnitude Mmax, which is obtained as the limit of Qq(τ) when q → 1 and τ → ∞. For example, the 90 % quantile for a 10‑year horizon is Q0.90,GEV(τ = 10) = 8.34 ± 0.32, meaning that there is a 90 % chance that the largest earthquake in the next decade will be no larger than magnitude 8.34. By contrast, the Mmax estimate from the same GEV fit is 9.57 ± 0.86, exhibiting more than twice the scatter of the 90 % quantile. This demonstrates that Qq(τ) is far less sensitive to sampling variability and therefore more reliable for risk‑management purposes.

The authors compute Qq(τ) for a range of confidence levels (q = 0.5, 0.75, 0.9, 0.95) and future horizons (τ = 1–50 years). As expected, shorter τ values lead to lower quantiles but larger standard errors, while longer τ values increase the quantile values but reduce uncertainty. These results are presented in tables and plots that clearly illustrate the trade‑off between time horizon and confidence level.

From a practical standpoint, the paper argues that seismic hazard assessments should shift from reporting a single “maximum possible magnitude” to providing a set of quantiles tailored to specific planning horizons and risk tolerances. The negative shape parameter suggests that the Japanese seismic regime possesses a physical upper bound on earthquake size, aligning with geological constraints on fault dimensions and stress accumulation. Moreover, the methodology is readily transferable to other regions and to other natural‑hazard phenomena (e.g., floods, landslides) where extreme‑value theory is applicable.

The study concludes with several recommendations for future work: incorporating non‑stationarity to capture temporal changes in seismicity rates, extending the framework to spatially heterogeneous catalogs, and exploring multivariate extreme‑value models that jointly consider magnitude, depth, and rupture area. By doing so, the predictive power and relevance of extreme‑value based seismic risk assessments can be further enhanced, providing policymakers, engineers, and insurers with more actionable information for long‑term disaster preparedness.


Comments & Academic Discussion

Loading comments...

Leave a Comment