Time-adaptive functional Gaussian Process regression
This paper proposes a new formulation of functional Gaussian Process regression on manifolds, based on an Empirical Bayes approach, in the spatiotemporal random field context. We apply the machinery of tight Gaussian measures in separable Hilbert spaces, exploiting the invariance property of covariance kernels under the group of isometries of the manifold. The identification of these measures with infinite-product Gaussian measures is then obtained via the eigenfunctions of the Laplace-Beltrami operator on the manifold. The time-varying angular spectra involved constitute the key tool for dimension reduction in the implementation of this regression approach, via a suitable truncation scheme depending on the functional sample size. A simulation study and a synthetic-data application illustrate the finite-sample and asymptotic properties of the proposed functional regression predictor.
💡 Research Summary
The paper introduces a novel framework for functional Gaussian Process (FGP) regression on compact Riemannian manifolds, addressing the computational bottleneck that arises from inverting large covariance matrices in traditional GP models. The authors adopt an Empirical Bayes (EB) strategy combined with a rigorous infinite‑dimensional probabilistic formulation. Central to the approach is the identification of a centered, non‑degenerate Gaussian measure μ_R on a separable Hilbert space H = L²(Mᵈ, dν) with an infinite product of one‑dimensional Gaussian measures. This identification relies on the eigenvalues λ_k and eigenfunctions e_k of the Laplace–Beltrami operator Δᵈ, which diagonalize any covariance kernel that is invariant under the manifold’s isometry group. Consequently, the covariance kernel C_Mᵈ(x, y, τ) can be expressed as a spectral sum C_Mᵈ(x, y, τ) = Σ_n B_n(τ) Σ_{j=1}^{Γ(n,d)} S_{n,j}(x) S_{n,j}(y), where B_n(τ) are time‑varying scalar spectra and S_{n,j} are the Laplace–Beltrami eigenfunctions.
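The spectral sum above can be illustrated on the simplest compact manifold, the circle S¹, where the Laplace–Beltrami eigenfunctions are cosines and sines and the pairwise eigenfunction sum collapses, by the addition theorem, to cos(n(x − y)). A minimal sketch, with a hypothetical decaying spectrum B_n(τ) = e^(−τ) n^(−2) standing in for the paper's time-varying spectra:

```python
import numpy as np

def circle_kernel(x, y, tau, N=50):
    """Truncated spectral sum for an isotropy-invariant kernel on S^1.

    Hypothetical spectrum B_n(tau) = exp(-tau) / n^2 (illustrative only);
    on the circle, sum_j S_{n,j}(x) S_{n,j}(y) reduces to cos(n (x - y)).
    """
    n = np.arange(1, N + 1)
    B = np.exp(-tau) / n**2          # hypothetical time-varying spectrum
    return float(np.sum(B * np.cos(n * (x - y))))
```

Because the kernel depends on x and y only through x − y, it is automatically symmetric and stationary, which is exactly the invariance under the isometry group (rotations of the circle) that the paper exploits.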
By expanding the functional random field Z_t(x) in this eigenbasis, the infinite-dimensional process decomposes into independent scalar Gaussian processes Z_{n,j}(t) with variance B_n(t). This “pure-point” spectral representation enables the authors to work entirely in the spectral domain, where each mode is a simple one-dimensional Gaussian. The key computational challenge, selecting a finite number of modes, is addressed through a time-adaptive truncation scheme. Two truncation strategies are examined: a logarithmic rule (N_T ≈ log T) and a power-law rule (N_T ≈ T^α, 0 < α < 1). The logarithmic rule yields substantial computational savings for large sample sizes while preserving asymptotic consistency; the power-law rule retains more high-frequency information and improves performance for small samples, at the cost of a higher computational load.
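The gap between the two truncation rules is easy to see numerically. A sketch (the proportionality constant c and exponent α below are illustrative, not values from the paper):

```python
import math

def n_trunc_log(T, c=1.0):
    """Logarithmic rule N_T ~ log T: few modes, cheap, asymptotically consistent."""
    return max(1, int(c * math.log(T)))

def n_trunc_power(T, alpha=0.5):
    """Power-law rule N_T ~ T^alpha, 0 < alpha < 1: keeps more high-frequency modes."""
    return max(1, int(T ** alpha))

for T in (50, 500, 5000):
    print(T, n_trunc_log(T), n_trunc_power(T))
# prints:
# 50 3 7
# 500 6 22
# 5000 8 70
```

Even at T = 5000 the logarithmic rule retains fewer than a tenth of the modes of the power-law rule, which is where the computational savings for large functional samples come from.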
The EB procedure proceeds sequentially in time. At each observation time t, the marginal likelihood p(Y_t | θ(t), σ(t)) is maximized with respect to the hyperparameters θ(t) (which parametrize the spectra B_n(t)) and the observation noise variance σ²(t). The marginal likelihood incorporates a Fredholm determinant det(I − ω R_t) that captures the infinite-dimensional nature of the covariance operator R_t. Maximization is performed via standard numerical optimization (e.g., L-BFGS), and the resulting estimates (θ̂(t), σ̂(t)) are used to construct the posterior Gaussian measure μ_{R_t|Y_t}. Thanks to the ℓ² identification, the posterior factorizes into a product of one-dimensional Gaussian measures with updated means μ_{n,t}^{post} and variances (B_n(t)^{-1} + σ̂(t)^{-2})^{-1}. The posterior mean of the functional field is then a truncated spectral sum:
Ẑ_t(x) = Σ_{n=1}^{N_T} Σ_{j=1}^{Γ(n,d)} μ_{n,j,t}^{post} S_{n,j}(x)
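Since the posterior factorizes across modes, each spectral coefficient is updated by ordinary one-dimensional conjugate Gaussian algebra. A minimal sketch of one such update under the standard conjugate model (illustrative names; this is the textbook Gaussian update, not the paper's implementation):

```python
def mode_posterior(y_nj, B_n, sigma2):
    """Conjugate update for a single spectral mode (time index suppressed).

    Prior:       Z_{n,j} ~ N(0, B_n)
    Observation: y_nj = Z_{n,j} + eps,  eps ~ N(0, sigma2)
    """
    post_var = 1.0 / (1.0 / B_n + 1.0 / sigma2)  # (B_n^{-1} + sigma^{-2})^{-1}
    post_mean = post_var * y_nj / sigma2         # shrinks y_nj toward the zero prior mean
    return post_mean, post_var

# With a strong prior mode (B_n large vs sigma2) the observation is kept
# almost intact; with a weak mode it is shrunk hard toward zero.
mean, var = mode_posterior(3.0, B_n=2.0, sigma2=1.0)
```

The truncated posterior mean field is then obtained by summing these per-mode means against the eigenfunctions S_{n,j}(x) up to the chosen truncation level N_T.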