Regularized Random Fourier Features and Finite Element Reconstruction for Operator Learning in Sobolev Space
Operator learning is the data-driven approximation of mappings between infinite-dimensional function spaces, such as the solution operators of partial differential equations. Kernel-based operator learning can offer accurate, theoretically justified approximations that require less training than standard methods. However, such methods can become computationally prohibitive for large training sets and can be sensitive to noise. We propose a regularized random Fourier feature (RRFF) approach, coupled with a finite element reconstruction map (RRFF-FEM), for learning operators from noisy data. The method uses random features drawn from multivariate Student’s $t$ distributions, together with frequency-weighted Tikhonov regularization that suppresses high-frequency noise. We establish high-probability bounds on the extreme singular values of the associated random feature matrix and show that when the number of features $N$ scales like $m \log m$ with the number of training samples $m$, the system is well-conditioned, which yields estimation and generalization guarantees. Detailed numerical experiments on benchmark PDE problems, including advection, Burgers’, Darcy flow, Helmholtz, Navier-Stokes, and structural mechanics, demonstrate that RRFF and RRFF-FEM are robust to noise and achieve improved performance with reduced training time compared to the unregularized random feature model, while maintaining competitive accuracy relative to kernel and neural operator methods.
💡 Research Summary
This paper addresses the problem of learning operators that map between infinite‑dimensional function spaces, a task that arises when one wishes to approximate solution operators of partial differential equations (PDEs) from data. Kernel‑based operator learning enjoys strong theoretical guarantees and high accuracy, but its computational cost scales quadratically with the number of training samples because it requires forming and inverting an $m \times m$ kernel matrix. Random Fourier features (RFF) provide a popular remedy by approximating the kernel with a low‑dimensional random feature map, yet standard RFF suffers from two major drawbacks: (1) the random frequencies are usually drawn from a Gaussian distribution, which does not explicitly control high‑frequency components, and (2) the learning problem is solved by minimum‑norm interpolation, making the method extremely sensitive to measurement noise.
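For context, the plain RFF construction referred to above can be sketched in a few lines. The bandwidth, dimensions, and feature count below are illustrative choices, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
d, N = 2, 2000      # input dimension and number of random features (illustrative)
sigma = 1.0         # bandwidth of the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))

# Standard RFF: Gaussian frequencies, matching the Gaussian kernel's spectral density.
omega = rng.normal(scale=1.0 / sigma, size=(N, d))

def phi(x):
    """Random Fourier feature map: phi(x) @ phi(y).T approximates k(x, y)."""
    z = x @ omega.T
    return np.hstack([np.cos(z), np.sin(z)]) / np.sqrt(N)

x = rng.normal(size=(1, d))
y = rng.normal(size=(1, d))
k_true = np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))
k_rff = (phi(x) @ phi(y).T).item()   # Monte Carlo estimate, error O(1/sqrt(N))
```

With `N = 2000` features the estimate `k_rff` typically agrees with `k_true` to a few percent; the point of the paper is what happens when the Gaussian frequency law and the minimum-norm fit are replaced.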
The authors propose a regularized random Fourier feature (RRFF) framework together with a finite‑element reconstruction (FEM) map, yielding the combined method RRFF‑FEM. The key innovations are:
- Student’s‑t random frequencies – Instead of Gaussian sampling, the random vectors $\{\omega_k\}_{k=1}^N$ are drawn i.i.d. from a multivariate Student’s‑t distribution with $\nu$ degrees of freedom. By varying $\nu$ one interpolates continuously between Gaussian ($\nu \to \infty$) and Cauchy ($\nu = 1$) tails. Heavy‑tailed distributions naturally down‑weight very high frequencies, which are the frequencies most affected by noise.
- Frequency‑weighted Tikhonov regularization – The coefficients $\mathbf{c}^{(j)}$ for each output component are obtained by solving a regularized least‑squares problem of the form
  $$
  \mathbf{c}^{(j)} = \arg\min_{\mathbf{c} \in \mathbb{R}^{N}} \big\| A\mathbf{c} - \mathbf{y}^{(j)} \big\|_2^2 + \lambda \sum_{k=1}^{N} w(\omega_k)\, |c_k|^2,
  $$
  where $A$ is the random feature matrix, $\lambda > 0$ is a regularization parameter, and the weight $w(\omega_k)$ grows with $\|\omega_k\|$, so that coefficients attached to high frequencies, which are the most contaminated by noise, are penalized more strongly.
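A minimal sketch combining the two ingredients above, Student's-t frequency sampling and frequency-weighted ridge regression, on a scalar toy regression problem. The weight $w(\omega) = (1 + \|\omega\|^2)^s$ and all parameter values here are illustrative assumptions, not the paper's exact choices:

```python
import numpy as np

rng = np.random.default_rng(1)
d, N, m = 1, 300, 200               # input dim, random features, training samples (illustrative)
nu, lam, s = 3.0, 1e-3, 1.0         # degrees of freedom, Tikhonov strength, weight exponent (assumed)

# Multivariate Student's-t frequencies: a Gaussian draw scaled by an inverse-chi factor.
# Small nu gives heavy tails; nu -> infinity recovers Gaussian sampling.
z = rng.normal(size=(N, d))
g = rng.chisquare(nu, size=(N, 1))
omega = z / np.sqrt(g / nu)

# Noisy samples of a toy target function (stand-in for operator-learning data).
x = rng.uniform(-1.0, 1.0, size=(m, d))
y = np.sin(np.pi * x[:, 0]) + 0.05 * rng.normal(size=m)

# Random feature matrix with paired cosine/sine features.
A = np.hstack([np.cos(x @ omega.T), np.sin(x @ omega.T)]) / np.sqrt(N)

# Frequency-weighted Tikhonov: the penalty on coefficient k grows with |omega_k|,
# so high-frequency (noise-dominated) components are suppressed.
w = (1.0 + np.sum(omega ** 2, axis=1)) ** s
W = np.tile(w, 2)                   # same weight for the cos and sin feature of each frequency
c = np.linalg.solve(A.T @ A + lam * np.diag(W), A.T @ y)

# Evaluate the fitted model against the clean target on a test grid.
xt = np.linspace(-1.0, 1.0, 50)[:, None]
At = np.hstack([np.cos(xt @ omega.T), np.sin(xt @ omega.T)]) / np.sqrt(N)
err = np.max(np.abs(At @ c - np.sin(np.pi * xt[:, 0])))
```

Setting `lam = 0` recovers the unregularized least-squares fit that the summary describes as noise-sensitive; the weighted penalty is what stabilizes the solve when heavy-tailed sampling produces very large frequencies.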