Condensation of earthquake location distributions: Optimal spatial information encoding and application to multifractal analysis of South Californian seismicity
We present the “condensation” method that exploits the heterogeneity of the probability distribution functions (PDF) of event locations to improve the spatial information content of seismic catalogs. The method reduces the size of seismic catalogs while improving the access to the spatial information content of seismic catalogs. The PDFs of events are first ranked by decreasing location errors and then successively condensed onto better located and lower variance event PDFs. The obtained condensed catalog attributes different weights to each event, providing an optimal spatial representation with respect to the spatially varying location capability of the seismic network. Synthetic tests on fractal distributions perturbed with realistic location errors show that condensation improves spatial information content of the original catalog. Applied to Southern California seismicity, the new condensed catalog highlights major mapped fault traces and reveals possible additional structures while reducing the catalog length by ~25%. The condensation method allows us to account for location error information within a point based spatial analysis. We demonstrate this by comparing the multifractal properties of the condensed catalog locations with those of the original catalog. We evidence different spatial scaling regimes characterized by distinct multifractal spectra and separated by transition scales. We interpret the upper scale as to agree with the thickness of the brittle crust, while the lower scale (2.5km) might depend on the relocation procedure. Accounting for these new results, the Epidemic Type Aftershock Model formulation suggests that, contrary to previous studies, large earthquakes dominate the earthquake triggering process. This implies that the limited capability of detecting small magnitude events cannot be used to argue that earthquakes are unpredictable in general.
💡 Research Summary
The paper introduces a novel “condensation” technique that leverages the heterogeneous probability‑density functions (PDFs) of earthquake hypocenter locations to both compress seismic catalogs and enhance their spatial information content. Traditional catalogs treat each event as a point with an associated scalar error estimate, discarding the full shape of the location uncertainty. In contrast, the authors first rank all events by decreasing location error (standard deviation) and then iteratively transfer the probability mass of the most uncertain events onto better‑located events whose PDFs overlap with them. The transfer weight is proportional to the degree of overlap and inversely proportional to the variance of the target event, ensuring that total probability is conserved. After each transfer, the source event’s weight diminishes; when its weight falls below a preset threshold it is removed from the catalog. The result is a reduced catalog in which each remaining event carries a weight reflecting how much of the original probability it now represents.
Synthetic tests on fractal point sets perturbed with realistic location errors demonstrate that condensation reduces the Shannon entropy of the catalog while preserving the underlying spatial scaling. The authors show that the condensed catalog yields clearer correlation functions and more robust estimates of the generalized dimensions Dq, especially at small scales where original location noise would otherwise mask fractal structure.
Applying the method to the Southern California Seismic Network (SCSN) catalog (≈150 k events, 2000–2015) reduces the number of entries by about 25 % to ≈112 k. High‑weight events line up precisely with known major faults such as the San Andreas, San Jacinto, and Elsinore systems, confirming that the condensation preserves the dominant tectonic geometry. Moreover, medium‑weight events highlight previously unmapped lineaments, suggesting the presence of subsidiary or blind faults that are not evident in the uncondensed data.
The authors then perform a multifractal analysis using the box‑counting method and the spectrum of generalized dimensions Dq (q = 0–5). Both the original and condensed catalogs exhibit two distinct scaling regimes separated by transition scales near 2.5 km and 15 km. The upper transition is interpreted as the thickness of the brittle crust in Southern California, while the lower transition likely reflects the minimum inter‑event distance imposed by the double‑difference relocation algorithm. In the condensed catalog the small‑scale regime shows a lower fractal dimension (≈1.2) compared with the original catalog (≈1.9), indicating that condensation sharpens the detection of fine‑scale clustering.
Finally, the paper integrates the condensed catalog into an Epidemic Type Aftershock Sequence (ETAS) model. Parameter estimation reveals that the productivity exponent α is essentially equal to the Gutenberg‑Richter b‑value, implying that large earthquakes dominate aftershock generation. This contrasts with earlier studies that, using uncondensed catalogs, inferred a substantial contribution from numerous small events. The authors argue that detection thresholds for low‑magnitude events artificially inflate the apparent role of small earthquakes, and that the condensation method mitigates this bias, leading to a more physically realistic picture of seismic triggering.
In summary, the condensation method provides a mathematically rigorous way to incorporate full location‑uncertainty information into point‑based spatial analyses. It yields a compact yet information‑rich catalog, improves the reliability of multifractal scaling estimates, and refines statistical models of earthquake triggering. The approach is broadly applicable to any seismic network where heterogeneous location errors are present, and it opens avenues for more accurate hazard assessment, real‑time monitoring, and integration with machine‑learning pipelines.
Comments & Academic Discussion
Loading comments...
Leave a Comment