Continuous mixtures of Gaussian processes as models for spatial extremes
Spatial modelling of extreme values allows studying the risk of joint occurrence of extreme events at different locations and is of significant interest in climatic and other environmental sciences. A popular class of dependence models for spatial extremes is that of random location-scale mixtures, in which a spatial “baseline” process is multiplied or shifted by a random variable, potentially altering its extremal dependence behaviour. Gaussian location-scale mixtures retain benefits of their Gaussian baseline processes while overcoming some of their limitations, such as symmetry, light tails and weak tail dependence. We review properties of Gaussian location-scale mixtures and develop novel constructions with interesting features, together with a general algorithm for conditional simulation from these models. We leverage their flexibility to propose extended extreme-value models that appropriately model not only the tails but also the bulk of the data. This is important in many applications and avoids the need to explicitly select the events considered as extreme. We propose new solutions for likelihood inference in parametric models of Gaussian location-scale mixtures, in order to avoid the numerical bottleneck posed by the latent location and scale variables, which can make standard likelihood evaluations computationally expensive. The effectiveness of the models and of the inference methods is confirmed with simulated data examples, and we present an application to wildfire-related weather variables in Portugal. Although not detailed here, the approaches would also be straightforward to use for modelling multivariate (non-spatial) data.
💡 Research Summary
This paper addresses the growing need for flexible spatial extreme‑value models that can capture both the tail behavior and the bulk of environmental data. Classical approaches based on max‑stable or generalized Pareto processes are limited because they focus exclusively on extreme thresholds, impose rigid tail dependence structures, and suffer from computationally intensive likelihood inference. To overcome these drawbacks, the authors introduce a general class of Gaussian location‑scale mixtures defined by
$$X(s) = S + R\,W(s), \qquad s \in \mathcal{S},$$
where $\{W(s)\}$ is a standardized Gaussian process with correlation function $\rho$, and $S$ (location) and $R \ge 0$ (scale) are independent latent random variables. By allowing $S$ and $R$ to follow a wide range of distributions (exponential, asymmetric Laplace, gamma, inverse-gamma, GPD, etc.), the model can simultaneously generate asymmetric marginal distributions, heavy tails, and a spectrum of extremal dependence ranging from asymptotic independence (AI) to asymptotic dependence (AD).
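The construction above is straightforward to prototype. Below is a minimal sketch (not the paper's code): it assumes an exponential correlation function, an exponential location variable, and a square-root-gamma scale variable, loosely in the spirit of the LSM1 construction; all parameter choices are illustrative.

```python
import numpy as np

def simulate_mixture(n_sites=50, corr_range=0.3, shape=2.0, seed=0):
    """Draw one realization of X(s) = S + R * W(s) on a 1-D grid."""
    rng = np.random.default_rng(seed)
    s = np.linspace(0.0, 1.0, n_sites)                    # sites in [0, 1]
    # Exponential correlation rho(h) = exp(-|h| / corr_range) (an assumption).
    corr = np.exp(-np.abs(s[:, None] - s[None, :]) / corr_range)
    L = np.linalg.cholesky(corr + 1e-10 * np.eye(n_sites))
    W = L @ rng.standard_normal(n_sites)                  # standardized GP
    S = rng.exponential(1.0)                              # latent location
    R = np.sqrt(rng.gamma(shape, 1.0))                    # latent scale >= 0
    return S + R * W
```

Because $S$ and $R$ are shared across all sites, every realization is a Gaussian field shifted and rescaled as a whole, which is what creates the non-Gaussian joint tails.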
The paper first reviews theoretical properties of these mixtures, focusing on the extremal dependence coefficients $\chi$ and $\bar\chi$. It shows how the choice of the distribution of $R$ controls the strength of upper-tail dependence (through $\chi$), while the distribution of $S$ governs asymmetry between upper and lower tails. Table 1 summarizes a suite of specific constructions, including well-known cases (Gaussian, Student-$t$, Laplace) and novel ones such as LSM1 (exponential location, square-root gamma scale) and LSM2 (asymmetric Laplace location, square-root gamma scale). These new models allow independent tuning of tail thickness and spatial correlation, a flexibility unavailable in standard Gaussian mixtures or pure max-stable models.
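The sub-asymptotic coefficient $\chi(u) = \Pr(U > u \mid V > u)$ on uniform-transformed margins can be estimated by rank transformation; a short sketch of this standard empirical estimator (not specific to this paper):

```python
import numpy as np

def empirical_chi(x, y, u=0.95):
    """Estimate chi(u) = P(U > u, V > u) / (1 - u) via rank transforms."""
    n = len(x)
    U = (np.argsort(np.argsort(x)) + 1) / (n + 1)   # pseudo-uniform margins
    V = (np.argsort(np.argsort(y)) + 1) / (n + 1)
    joint = np.mean((U > u) & (V > u))              # joint exceedance freq.
    return joint / (1.0 - u)
```

For asymptotically dependent pairs $\chi(u)$ stays bounded away from zero as $u \to 1$; for asymptotically independent pairs (e.g., a bivariate Gaussian with correlation below one) it decays to zero.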
A major contribution is a general algorithm for conditional simulation. Given observations at a subset of sites, the algorithm samples the latent variables $S$ and $R$ from their posterior distributions using a Gibbs or Metropolis-within-Gibbs scheme, then draws the conditional Gaussian field $\{W(s)\}$ analytically. This retains the computational efficiency of Gaussian conditioning while extending it to the non-Gaussian mixture setting, enabling fast generation of realistic extreme-value fields over large grids.
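A hedged sketch of the final, analytic step of such a scheme: given one posterior draw of $(S, R)$ (assumed here to be already available from the Gibbs stage), recover the Gaussian part at the observed sites and sample the new sites by standard Gaussian conditioning. The exponential correlation and all numeric choices are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def conditional_draw(x_obs, s_obs, s_new, S, R, corr_range=0.3, rng=None):
    """Given latent (S, R), draw X at s_new conditionally on X(s_obs) = x_obs."""
    rng = np.random.default_rng() if rng is None else rng
    dist = lambda a, b: np.abs(a[:, None] - b[None, :])
    C_oo = np.exp(-dist(s_obs, s_obs) / corr_range)   # obs-obs correlation
    C_no = np.exp(-dist(s_new, s_obs) / corr_range)   # new-obs correlation
    C_nn = np.exp(-dist(s_new, s_new) / corr_range)   # new-new correlation
    w_obs = (x_obs - S) / R                       # recover the Gaussian part
    A = np.linalg.solve(C_oo, C_no.T).T           # kriging weights
    mean = A @ w_obs                              # conditional mean of W(s_new)
    cov = C_nn - A @ C_no.T                       # conditional covariance
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(s_new)))
    w_new = mean + L @ rng.standard_normal(len(s_new))
    return S + R * w_new                          # map back to the mixture scale
```

At an observed site the conditional variance collapses to (numerically) zero, so the draw reproduces the observation, as Gaussian conditioning requires.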
Because direct evaluation of the full likelihood involves high-dimensional integration over $S$ and $R$, the authors propose two scalable inference strategies. The first is an EM-type algorithm where the E-step is approximated by Monte-Carlo integration, yielding unbiased estimates of the expected complete-data log-likelihood. The second is a variational Bayes approach that approximates the posterior of $(S, R)$ with a tractable family (e.g., Gaussian–Gamma), leading to closed-form updates for many parameter choices, especially when $R$ has an inverse-gamma prior. Both methods dramatically reduce computational cost while preserving statistical efficiency, as demonstrated in extensive simulation studies.
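To see the integration problem these strategies avoid, the marginal likelihood can be approximated by brute-force Monte Carlo over prior draws of the latent variables. The sketch below is a crude baseline, not the paper's EM or variational algorithm, and the latent distributions (exponential $S$, square-root-gamma $R$) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mc_loglik(x, corr, n_mc=500, shape=2.0, seed=0):
    """Monte-Carlo marginal log-likelihood, averaging over prior draws of (S, R)."""
    rng = np.random.default_rng(seed)
    S = rng.exponential(1.0, n_mc)                # assumed location prior
    R = np.sqrt(rng.gamma(shape, 1.0, n_mc))      # assumed scale prior
    dens = np.array([
        multivariate_normal.pdf(x, mean=np.full(len(x), s), cov=(r ** 2) * corr)
        for s, r in zip(S, R)                     # X | S, R is Gaussian
    ])
    return np.log(dens.mean())
```

The variance of this naive average grows quickly with dimension, which is exactly why the paper replaces it with targeted Monte-Carlo E-steps and variational approximations.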
The methodology is illustrated on a real dataset: the Fire Weather Index (FWI) measured at roughly 500 spatial locations across Portugal over more than two decades. FWI exhibits strong skewness, heavy upper tails, and spatially varying dependence—features that standard max‑stable models fail to capture. Fitting several location‑scale mixture models (including LSM1 and LSM2) yields substantially higher log‑likelihoods, lower AIC values, and improved predictive performance in cross‑validation compared with a benchmark max‑stable model. Moreover, conditional simulations from the fitted mixtures produce realistic synthetic fire‑weather fields, enabling risk assessment for extreme fire‑season scenarios.
In summary, the paper makes three key contributions: (1) it unifies bulk and tail modeling within a single, continuous framework; (2) it introduces novel mixture constructions that decouple tail heaviness from spatial correlation, allowing a continuum of extremal dependence structures; and (3) it provides practical algorithms for conditional simulation and likelihood‑based inference that scale to high‑dimensional spatial data. The proposed Gaussian location‑scale mixtures thus represent a powerful, versatile tool for modern environmental statistics, with potential extensions to multivariate non‑spatial settings.