A Multi-Level Deep Framework for Deep Solvers of Partial Differential Equations
In this paper, inspired by the multigrid method, we propose a multi-level deep framework for deep solvers. It divides the entire training process into different levels of training. At each level, an adaptive sampling method proposed in this paper is first employed to obtain new training points, so that these points become increasingly concentrated in computational regions corresponding to high-frequency components. Then, the generalization ability of deep neural networks is utilized to update the PDEs for the next level of training based on the results from all previous levels. Rigorous mathematical proofs and detailed numerical experiments demonstrate the effectiveness of the proposed method.
💡 Research Summary
The paper introduces a novel multi‑level deep learning framework for solving partial differential equations (PDEs), inspired by the classical multigrid method. Recognizing that deep neural networks, when trained as physics‑informed solvers (PINNs), tend to learn low‑frequency components of the solution quickly while struggling with high‑frequency modes—a phenomenon known as the “frequency principle”—the authors propose to mimic the multigrid hierarchy within the training process.
The methodology consists of two intertwined components: multi‑level adaptive sampling and multi‑level training. At the first level, training points are drawn uniformly across the domain, and a neural network θ₁ is trained to minimize the standard PINN loss (residuals in the interior and on the boundary). After this stage, the residual r₁(x) = 𝓛u_{θ₁}(x) − f(x) of the governing equation 𝓛u = f is evaluated; regions where it remains large signal unresolved high‑frequency components and guide the sampling and training at the next level.
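The residual-driven resampling step can be sketched as follows. This is a minimal illustration, not the authors' exact scheme: the helper `adaptive_resample`, the toy residual function, and the proportional-to-|r| weighting are all assumptions made for demonstration.

```python
import numpy as np

def adaptive_resample(candidates, residual_fn, n_new, rng):
    """Draw n_new training points from `candidates`, weighted by the
    magnitude of the PDE residual at each candidate point.

    Points where |r(x)| is large -- typically regions dominated by
    high-frequency components the network has not yet resolved --
    are sampled more often.
    """
    r = np.abs(residual_fn(candidates))
    probs = r / r.sum()
    idx = rng.choice(len(candidates), size=n_new, replace=False, p=probs)
    return candidates[idx]

# Toy residual: large near x = 0.5, mimicking an unresolved
# high-frequency region; small (1e-3) elsewhere.
residual = lambda x: np.exp(-200.0 * (x - 0.5) ** 2) + 1e-3

rng = np.random.default_rng(0)
candidates = rng.uniform(0.0, 1.0, size=10_000)   # uniform level-1 pool
new_points = adaptive_resample(candidates, residual, n_new=500, rng=rng)
# The new training points cluster around x = 0.5, where the residual peaks.
```

In a full solver, `residual_fn` would evaluate 𝓛u_{θ}(x) − f(x) via automatic differentiation of the trained network; here it is a stand-in that only illustrates how residual mass concentrates the new sample set.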