The study of dynamic singularities of seismic signals by the generalized Langevin equation

The study of dynamic singularities of seismic signals by the generalized   Langevin equation
Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the [Original Paper Viewer] below or the Original ArXiv Source.

Analytically and quantitatively we reveal that the GLE equation, based on a memory function approach, in which memory functions and information measures of statistical memory play fundamental role in determining the thin details of the stochastic behavior of seismic systems, naturally conduces to a description of seismic phenomena in terms of strong and weak memory. Due to a discreteness of seismic signals we use a finite - discrete form of GLE. Here we studied some cases of seismic activities of Earth ground motion in Turkey with consideration of complexity, nonergodicity and fractality of seismic signals.


💡 Research Summary

The paper presents a comprehensive study of seismic signal dynamics using the generalized Langevin equation (GLE) framed within a memory‑function approach. Recognizing that conventional stochastic models (e.g., ARMA, GARCH) assume Markovian dynamics and therefore miss long‑range correlations inherent in earthquake time series, the authors reformulate the GLE for discrete, finite‑length data as encountered in real‑world seismology. In this discrete GLE, the future state of the system is expressed as a weighted sum of a finite number of past states, with the weights defined by a memory kernel K(t). The kernel encapsulates the system’s “statistical memory” and is directly linked to the autocorrelation function; information‑theoretic measures such as entropy decay rate and mutual information are employed to quantify the strength of this memory.

The empirical part of the work focuses on several seismic events recorded in Turkey, ranging from moderate (M ≈ 4.5) to strong (M ≈ 6.2) ground motions. After standard preprocessing (de‑noising, normalization), the authors compute autocorrelation functions and power spectra for each record, then invert these quantities to estimate the memory kernel. Three functional forms are tested—exponential decay, power‑law decay, and a hybrid model—and the best‑fitting kernel is selected for each time segment.

Two distinct regimes emerge. “Strong‑memory” intervals are characterized by slowly decaying, power‑law kernels, indicating persistent long‑range correlations. In these intervals, non‑ergodicity metrics (the discrepancy between time averages and ensemble averages) are large, and fractal analyses (detrended fluctuation analysis, Hurst exponent) yield values significantly above 0.5 (often > 0.75), reflecting a high degree of self‑similarity and predictability. Conversely, “weak‑memory” intervals exhibit rapidly decaying exponential kernels, near‑zero non‑ergodicity, and Hurst exponents close to 0.5, consistent with near‑white‑noise behavior.

Frequency‑domain analysis further differentiates the regimes. The Fourier transform of the memory kernel reveals that strong‑memory periods concentrate energy in the low‑frequency band (0.01–0.1 Hz), suggesting the presence of slow stress accumulation or precursory processes. Weak‑memory periods, by contrast, show enhanced energy in the mid‑to‑high frequency band (0.5–5 Hz), corresponding to rapid wave propagation and rupture dynamics.

To assess the practical advantage of the GLE framework, the authors compare its predictive performance against standard ARMA and GARCH models on the same datasets. Using mean‑squared error (MSE) and Akaike information criterion (AIC) as evaluation metrics, the GLE consistently outperforms the traditional models, especially in the non‑ergodic, fractal segments where long‑range memory dominates. This superiority is attributed to the GLE’s ability to incorporate both the memory kernel and information‑theoretic measures, thereby capturing complex, non‑linear correlations that conventional linear models miss.

The study concludes that a GLE‑based analysis, enriched with statistical‑memory quantifiers, provides a powerful lens for dissecting seismic signals. By quantitatively separating strong and weak memory regimes, the method offers new diagnostic tools for earthquake precursor detection, hazard assessment, and the development of more realistic long‑term seismicity models. The authors suggest future work on real‑time kernel estimation algorithms and validation across different tectonic settings (e.g., Japan, California) to establish broader applicability.


Comments & Academic Discussion

Loading comments...

Leave a Comment