Finite Element Eigenfunction Network (FEENet): A Hybrid Framework for Solving PDEs on Complex Geometries
Neural operators aim to learn mappings between infinite-dimensional function spaces, but their performance often degrades on complex or irregular geometries due to the lack of geometry-aware representations. We propose the Finite Element Eigenfunction Network (FEENet), a hybrid spectral learning framework grounded in the eigenfunction theory of differential operators. For a given domain, FEENet leverages the Finite Element Method (FEM) to perform a one-time computation of an eigenfunction basis intrinsic to the geometry. PDE solutions are subsequently represented in this geometry-adapted basis, and learning is reduced to predicting the corresponding spectral coefficients. Numerical experiments conducted across a range of parameterized PDEs and complex two- and three-dimensional geometries, including benchmarks against the seminal DeepONet framework (1), demonstrate that FEENet consistently achieves superior accuracy and computational efficiency. We further highlight key advantages of the proposed approach, including resolution-independent inference, interpretability, and natural generalization to nonlocal operators defined as functions of differential operators. We envision that hybrid approaches of this form, which combine structure-preserving numerical methods with data-driven learning, offer a promising pathway toward solving real-world PDE problems on complex geometries.
💡 Research Summary
The paper introduces the Finite Element Eigenfunction Network (FEENet), a hybrid neural operator framework designed to solve partial differential equations (PDEs) on complex, irregular geometries with higher accuracy and efficiency than existing data‑driven approaches such as DeepONet and Fourier Neural Operators (FNOs). The authors begin by highlighting the limitations of current neural operators: they typically rely on geometry‑agnostic representations (uniform grids, Fourier bases, or coordinate embeddings), which hampers their ability to generalize to domains with intricate shapes where the geometry strongly influences the solution. While the Finite Element Method (FEM) remains the gold standard for handling such geometries, its repeated high‑fidelity solves become computationally prohibitive in many‑query settings (e.g., design optimization, uncertainty quantification).
FEENet addresses this gap by marrying FEM’s geometric robustness with the expressive power of neural networks. The core idea is to pre‑compute, once per domain, an orthogonal basis of eigenfunctions {ϕₖ} of the governing linear, self‑adjoint, strongly elliptic differential operator L (e.g., the Laplace‑Beltrami operator). Classical spectral theory guarantees that these eigenfunctions form a complete system in the appropriate Sobolev space H^m(Ω), allowing any solution u∈H^m(Ω) to be expressed as a convergent series u = Σₖ cₖ ϕₖ. By solving the eigenvalue problem L ϕₖ = λₖ ϕₖ with FEM (using tools such as DOLFINx), the authors obtain a geometry‑aware “trunk” that is fixed for the entire training process.
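The spectral-expansion idea above can be illustrated with a toy stand-in for the FEM eigenproblem L ϕₖ = λₖ ϕₖ: a 1-D finite-difference Laplacian on (0, 1) with Dirichlet boundary conditions replaces the FEM discretization on a complex mesh (which the paper computes with DOLFINx). The grid size, target function, and truncation level K below are illustrative choices, not values from the paper.

```python
import numpy as np

# Toy stand-in for the FEM eigenproblem L phi_k = lambda_k phi_k:
# a finite-difference Laplacian on (0, 1) with homogeneous Dirichlet BCs.
n = 200                                   # interior grid points (arbitrary)
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
L = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

# One-time eigen-decomposition: columns of Phi are the discrete basis functions.
lam, Phi = np.linalg.eigh(L)              # eigenvalues ascending, Phi orthonormal

# Expand a target function in the first K eigenfunctions: u = sum_k c_k phi_k.
K = 20
u = np.sin(np.pi * x) + 0.3 * np.sin(3 * np.pi * x)
c = Phi[:, :K].T @ u                      # spectral coefficients c_k = <u, phi_k>
u_hat = Phi[:, :K] @ c                    # truncated reconstruction

rel_err = np.linalg.norm(u - u_hat) / np.linalg.norm(u)
```

Because the chosen `u` lies in the span of the first few discrete sine modes, the truncated expansion recovers it to machine precision; for generic smooth functions the error instead decays with K at a rate governed by the operator's spectrum.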
The learning component, called the Branch Net, is a lightweight multilayer perceptron that maps discretized input functions (forcing terms, boundary conditions, material parameters, etc.) to the spectral coefficients cₖ. The predicted solution is reconstructed as û(x) = Σₖ cₖ ϕₖ(x). Because the basis functions are pre‑computed and remain unchanged, the network only needs to learn a low‑dimensional mapping, dramatically reducing the number of trainable parameters and accelerating convergence. For time‑dependent problems, the analytical factor e^{-λₖ t} can be inserted directly into the reconstruction, preserving the exact temporal decay dictated by the operator’s spectrum.
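A minimal sketch of this reconstruction step, including the analytic decay factor for the homogeneous heat equation, is below. The orthonormal basis and eigenvalues are random placeholders for what the one-time FEM solve would produce, and the `branch_net` function uses an exact projection in place of the trained MLP; all names and sizes here are illustrative, not from the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholders for the pre-computed trunk: Phi holds K orthonormal
# eigenfunctions sampled at n mesh nodes, lam the matching eigenvalues.
n, K = 200, 20
Phi, _ = np.linalg.qr(rng.standard_normal((n, K)))   # orthonormal columns
lam = np.sort(rng.uniform(1.0, 100.0, K))            # positive spectrum

def branch_net(f_disc):
    """Stand-in for the trained Branch Net MLP: maps a discretized input
    function to K spectral coefficients. Here we use the exact projection
    c_k = <f, phi_k> purely for illustration."""
    return Phi.T @ f_disc

def reconstruct(c, t=0.0):
    """u_hat(x, t) = sum_k c_k exp(-lam_k t) phi_k(x): a single
    matrix-vector product with the pre-computed eigenfunctions."""
    return Phi @ (np.exp(-lam * t) * c)

u0 = rng.standard_normal(n)        # discretized initial condition
c = branch_net(u0)
u_t = reconstruct(c, t=0.1)        # solution at a later time, no time-stepping
```

Since every eigenvalue is positive, the reconstructed solution's energy decays monotonically in t, exactly as the operator's spectrum dictates; no learned time-stepping is involved.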
The authors evaluate FEENet on three geometries of increasing complexity: a 2‑D unit square, a 2‑D “Fins” shape, and a 3‑D Stanford bunny. For each geometry they solve three benchmark PDEs: a Poisson problem, a homogeneous heat equation, and an inhomogeneous heat equation with a time‑independent source. Training data (2,000 samples per PDE) are generated from Gaussian random fields with geometry‑specific correlation lengths, and high‑fidelity FEM solutions serve as ground truth. All models (DeepONet, its multi‑input variant MIONet, and FEENet) are implemented in the DeepXDE framework with comparable MLP architectures.
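The Gaussian-random-field sampling used to generate training inputs can be sketched in 1-D via a Cholesky factorization of an RBF covariance kernel; the paper's geometry-specific correlation lengths and mesh-based sampling are replaced here by an arbitrary length scale on a uniform grid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample input functions from a Gaussian random field with RBF kernel
# k(x, x') = exp(-|x - x'|^2 / (2 l^2)). The correlation length l is the
# geometry-specific parameter mentioned in the paper; 0.2 is arbitrary.
n, ell = 100, 0.2
x = np.linspace(0, 1, n)
K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * ell**2))
L_chol = np.linalg.cholesky(K + 1e-8 * np.eye(n))    # jitter for stability

# Each column is one discretized input function (2,000 samples, as in the paper).
samples = L_chol @ rng.standard_normal((n, 2000))
```

Each column is one forcing term or initial condition; the corresponding FEM solve then supplies the ground-truth output for supervised training.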
Results show that FEENet consistently outperforms DeepONet/MIONet across all metrics. In L² error, FEENet achieves reductions ranging from a factor of 2 to 5 (e.g., 0.004 vs. 0.018 on the 2‑D Poisson case). Inference time is also lowered by roughly 30 % because the reconstruction involves only a matrix‑vector product with pre‑computed eigenfunctions, and the method supports resolution‑independent queries at arbitrary spatial points without re‑meshing. The authors emphasize that the eigenfunction basis acts as a “shape‑DNA,” encoding geometric information that would otherwise have to be learned from data, thereby improving sample efficiency and interpretability.
The discussion acknowledges two primary limitations: (1) the one‑time eigenvalue computation incurs a non‑trivial cost, especially for very high‑dimensional meshes, and (2) the current formulation assumes a linear, self‑adjoint operator; extending to strongly nonlinear or non‑self‑adjoint operators would require alternative spectral constructions or adaptive basis updates. Future work directions include (i) developing nonlinear or operator‑dependent bases (e.g., using Koopman or data‑driven modes), (ii) adaptive selection of the number of eigenfunctions to balance accuracy and computational load, and (iii) transfer learning strategies for evolving geometries.
In conclusion, FEENet demonstrates that integrating structure‑preserving numerical methods with neural networks yields a powerful, geometry‑aware operator learning paradigm. By leveraging FEM‑derived eigenfunctions as a fixed, physically meaningful trunk, the framework achieves superior accuracy, faster training, and resolution‑independent inference, offering a promising pathway for real‑world PDE simulations on complex domains.