Long- and Short-Term Earthquake Forecasts during the Tohoku Sequence
We consider two issues related to the 2011 Tohoku mega-earthquake: (1) what is the repeat time for the largest earthquakes in this area, and (2) what are the prospects for numerical short-term forecasts during the 2011 earthquake sequence in the Tohoku area. Starting in 1999, we have carried out long- and short-term forecasts for Japan and the surrounding areas using the GCMT catalog. The forecasts predict the earthquake rate per unit area, time, and magnitude, together with earthquake focal mechanisms. Long-term forecasts indicate that the repeat time for an M9 earthquake in the Tohoku area is of the order of 350 years. We have archived several forecasts made before and after the Tohoku earthquake. The long-term rate estimates indicate that, as expected, the forecasted rate changed by only a few percent after the Tohoku earthquake, whereas, owing to the foreshocks, the short-term rate before the mainshock increased by a factor of more than 100 compared to the long-term rate. After the Tohoku mega-earthquake the rate increased by a factor of more than 1000. These results suggest that an operational earthquake forecasting strategy needs to be developed to take such increases in short-term rates into account.
💡 Research Summary
The paper investigates two fundamental questions raised by the 2011 Tohoku megathrust earthquake (Mw 9.0): (1) what is the characteristic recurrence interval for the largest earthquakes in the Tohoku region, and (2) how do short‑term earthquake rates behave before, during, and after such a catastrophic event, thereby assessing the feasibility of operational short‑term forecasts. Starting in 1999, the authors have continuously produced both long‑term and short‑term seismicity forecasts for Japan and its surrounding offshore areas using the Global Centroid Moment Tensor (GCMT) catalog, which provides moment‑tensor solutions, hypocentral locations, magnitudes, and focal mechanisms for all globally recorded earthquakes of magnitude 5.0 and above.
Methodology
Long‑term forecasts are generated by spatio‑temporal smoothing of historic seismicity under the assumption that earthquake occurrence follows a Poisson process. The study region is divided into 0.5° × 0.5° cells and one‑year time bins, and a rate λ(M) is estimated for each magnitude bin. These rates are then integrated over the Tohoku offshore area to obtain an annual expected number of Mw ≥ 9 events. Short‑term forecasts employ the same spatial grid but incorporate a moving‑window count of earthquakes that occurred within the preceding 30 days, assigning higher weights to more recent events. This approach captures rapid changes in seismicity that are not reflected in the long‑term Poisson baseline.
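The two rate estimates described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: `long_term_rate` is the stationary Poisson mean for one cell, and `short_term_rate` counts events in the preceding 30‑day window with exponentially higher weight on recent events. The decay constant `TAU_DAYS` and the normalization are assumptions made only for the sketch.

```python
import math
from datetime import datetime, timedelta

TAU_DAYS = 10.0  # hypothetical e-folding time for the recency weighting

def long_term_rate(event_count, catalog_years):
    """Mean annual rate in a cell, assuming a stationary Poisson process."""
    return event_count / catalog_years

def short_term_rate(event_times, now, window_days=30, tau_days=TAU_DAYS):
    """Weighted event rate (per year) over the preceding window_days,
    with exponentially higher weight assigned to more recent events."""
    total_weight = 0.0
    for t in event_times:
        age_days = (now - t).total_seconds() / 86400.0
        if 0.0 <= age_days <= window_days:
            total_weight += math.exp(-age_days / tau_days)
    # Normalize by the effective window length, expressed in years.
    return total_weight / (window_days / 365.25)
```

With no recent events the short-term rate is zero and the long-term Poisson baseline governs; a burst of events inside the window raises the short-term rate immediately, which is the behavior the paper exploits.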
Results – Long‑Term Forecasts
The long‑term analysis yields an annual Mw ≥ 9 rate of approximately 2.8 × 10⁻³ yr⁻¹ for the Tohoku offshore zone, corresponding to an average recurrence interval of roughly 350 years. This estimate aligns with historical records (e.g., the 869 AD Jogan earthquake) and with geological assessments of slip‑rate accumulation on the subduction interface. Importantly, the long‑term rate changes by only a few percent when the 2011 event is added to the catalog, confirming the stability of the long‑term seismic hazard model.
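The quoted numbers can be checked directly: an annual rate of 2.8 × 10⁻³ yr⁻¹ implies a mean recurrence interval of about 357 years, consistent with "roughly 350 years," and under the same Poisson assumption the probability of at least one event in an exposure time T is 1 − exp(−λT). The 50‑year horizon below is chosen only as an example.

```python
import math

rate = 2.8e-3            # Mw >= 9 events per year (from the paper)
recurrence = 1.0 / rate  # mean repeat time under a Poisson model

def prob_at_least_one(rate_per_year, years):
    """P(at least one event in `years`) for a Poisson process."""
    return 1.0 - math.exp(-rate_per_year * years)

print(round(recurrence, 1))                   # 357.1 years
print(round(prob_at_least_one(rate, 50), 3))  # 0.131
```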
Results – Short‑Term Forecasts
In stark contrast, the short‑term rates exhibit dramatic fluctuations. In the two weeks leading up to the mainshock, a cluster of foreshocks caused the short‑term rate in the relevant cells to exceed the long‑term baseline by more than a factor of 100. Immediately after the mainshock, the aftershock sequence drove the short‑term rate upward by more than a factor of 1,000 relative to the long‑term expectation. These spikes are directly linked to stress transfer processes: foreshocks represent partial rupture and stress redistribution that can accelerate the nucleation of a larger event, while the mainshock releases stress that is rapidly re‑accumulated in adjacent patches, fueling prolific aftershocks.
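The decay of such an amplification can be illustrated with the standard modified Omori law, n(t) = K/(t + c)^p, which is a textbook aftershock model rather than something taken from this paper; the parameter values and the baseline rate below are hypothetical, chosen only to show how a >1000× rate spike shrinks over days to weeks.

```python
def omori_rate(t_days, K=1000.0, c=0.05, p=1.1):
    """Aftershock rate (events/day) t_days after the mainshock,
    per the modified Omori law with illustrative K, c, p."""
    return K / (t_days + c) ** p

# Amplification relative to a hypothetical long-term baseline of 0.1 events/day:
baseline = 0.1
for t in (0.1, 1.0, 10.0, 30.0):
    print(t, round(omori_rate(t) / baseline, 1))
```

Even a month after the mainshock the ratio remains far above the long-term baseline, which is why the authors treat the short-term surge as forecast-relevant rather than transient noise.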
Discussion
The authors argue that the observed short‑term surges demonstrate the potential of real‑time seismicity monitoring to provide actionable warnings. However, they also acknowledge limitations: the Poisson assumption underlying the long‑term model neglects clustering, and the GCMT catalog’s completeness diminishes for offshore, smaller‑magnitude events, potentially biasing rate estimates. Moreover, the rarity of large foreshock sequences means that false alarms could be a concern if a purely statistical threshold were used. To mitigate these issues, the paper proposes an operational forecasting framework that continuously updates short‑term rates using automated ingestion of global seismic data, while retaining the long‑term model for baseline hazard assessment.
Implications and Recommendations
The study’s dual‑scale approach underscores that long‑term forecasts are indispensable for infrastructure design, insurance pricing, and regional risk mitigation, whereas short‑term forecasts are crucial for emergency response and public communication during periods of heightened seismic activity. The authors recommend the development of an integrated, multi‑time‑scale earthquake forecasting system that can automatically flag cells where the short‑term rate exceeds a predefined multiple of the long‑term baseline (e.g., >100×). Such a system would enable authorities to issue targeted advisories, allocate resources for rapid post‑event assessment, and ultimately reduce societal losses.
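The flagging rule recommended above reduces to a simple ratio test per cell. A minimal sketch, with invented cell identifiers and rates used purely for illustration:

```python
THRESHOLD = 100.0  # flag when short-term rate exceeds 100x the long-term baseline

def flag_cells(short_rates, long_rates, threshold=THRESHOLD):
    """Return cell IDs whose short-term/long-term rate ratio exceeds threshold."""
    flagged = []
    for cell, short in short_rates.items():
        baseline = long_rates.get(cell)
        if baseline and short / baseline > threshold:
            flagged.append(cell)
    return flagged

# Hypothetical rates in events/day for two illustrative cells:
short = {"tohoku_off": 3.5, "kanto": 0.02}
long = {"tohoku_off": 0.002, "kanto": 0.001}
print(flag_cells(short, long))  # ['tohoku_off']
```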
Conclusion
By analyzing the Tohoku sequence, the paper demonstrates that the recurrence interval for a magnitude‑9 megathrust in this subduction zone is on the order of 350 years, and that the long‑term seismicity rate is remarkably stable even after a catastrophic rupture. Conversely, short‑term rates can increase by two to three orders of magnitude in the days surrounding a major event, driven by foreshock clustering and aftershock productivity. These findings provide strong empirical support for the establishment of operational, short‑term earthquake forecasting strategies that complement traditional long‑term hazard models.