Preconditioning and Numerical Stability in Neural Network Training for Parametric PDEs


In the context of training neural network-based approximations of solutions of parameter-dependent PDEs, we investigate the effect of preconditioning via well-conditioned frame representations of operators and demonstrate a significant improvement in the performance of standard training methods. We also observe that standard representations of preconditioned matrices are insufficient for obtaining numerical stability, and we propose a generally applicable form of stable representations that enables computations with single- and half-precision floating-point numbers without loss of accuracy.
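The paper's concrete frame construction is not reproduced in this summary, but the underlying effect is easy to see on a toy problem. The NumPy sketch below is an illustrative assumption, not the paper's setup: it uses a 1D Poisson problem with a hierarchical-basis (multilevel) change of representation followed by diagonal rescaling. The condition number of the standard nodal stiffness matrix grows like $h^{-2}$ under mesh refinement, while the rescaled multilevel matrix stays perfectly conditioned (in 1D the hierarchical basis happens to diagonalize the Laplacian exactly).

```python
import numpy as np

def hierarchical_basis(L):
    """Matrix S whose columns are the hierarchical hat functions of
    levels 1..L, evaluated at the 2**L - 1 interior nodes of the fine grid."""
    n = 2**L - 1
    x = np.arange(1, n + 1) / 2**L            # interior nodes, h = 2**-L
    cols = []
    for lvl in range(1, L + 1):
        half_width = 1.0 / 2**lvl
        for k in range(1, 2**(lvl - 1) + 1):
            center = (2*k - 1) / 2**lvl       # node newly introduced on level lvl
            cols.append(np.maximum(0.0, 1.0 - np.abs(x - center) / half_width))
    return np.column_stack(cols)

for L in [4, 6, 8]:
    n, h = 2**L - 1, 2.0**-L
    # Nodal stiffness matrix of -u'' on (0,1): condition number grows like h**-2.
    K = (2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h
    S = hierarchical_basis(L)
    K_hb = S.T @ K @ S                        # stiffness in the multilevel basis
    d = np.sqrt(np.diag(K_hb))
    K_pre = K_hb / np.outer(d, d)             # symmetric diagonal rescaling
    print(f"h = 2^-{L}: cond(nodal) = {np.linalg.cond(K):9.1f}, "
          f"cond(preconditioned) = {np.linalg.cond(K_pre):.3f}")
```

Uniformly well-conditioned system matrices of this kind are what make first-order training methods effective, and they are also the starting point for the low-precision computations discussed in the abstract.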


💡 Research Summary

This paper addresses the training of neural network approximations for solutions of parameter-dependent partial differential equations (PDEs) by introducing a hybrid representation that separates spatial and parametric variables. The spatial dependence is captured by a fixed set of basis functions (e.g., finite-element or reduced-basis functions) $\{\phi_i\}_{i\in I}$, while the parametric coefficients $u_i(y;\theta)$ are modeled by a neural network with parameters $\theta$. The resulting ansatz reads

$$u(x, y; \theta) = \sum_{i \in I} u_i(y; \theta)\, \phi_i(x).$$
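As a concrete sketch of this separation of variables, consider the minimal PyTorch illustration below; the architecture, layer widths, and the names `CoefficientNet` and `evaluate_ansatz` are hypothetical choices for exposition, not taken from the paper.

```python
import torch

class CoefficientNet(torch.nn.Module):
    """Maps a PDE parameter vector y to the coefficients (u_i(y; theta))_{i in I}."""
    def __init__(self, param_dim: int, n_basis: int, width: int = 64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(param_dim, width), torch.nn.Tanh(),
            torch.nn.Linear(width, width), torch.nn.Tanh(),
            torch.nn.Linear(width, n_basis),
        )

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        return self.net(y)                     # (batch, n_basis)

def evaluate_ansatz(coeff_net, y, basis_at_x):
    """u(x, y; theta) = sum_i u_i(y; theta) * phi_i(x).

    basis_at_x: (n_points, n_basis) matrix of the fixed spatial basis
    functions phi_i evaluated at the query points x (e.g., FEM hats).
    """
    return coeff_net(y) @ basis_at_x.T         # (batch, n_points)

# Example: 8 parameters, 127 spatial basis functions, 500 query points.
net = CoefficientNet(param_dim=8, n_basis=127)
y = torch.randn(32, 8)                         # a batch of PDE parameters
basis_at_x = torch.rand(500, 127)              # placeholder for phi_i(x_j)
u = evaluate_ansatz(net, y, basis_at_x)        # (32, 500) solution values
```

Training then amounts to fitting $\theta$ so that the predicted coefficients solve the discretized PDE across sampled parameters $y$. Since the spatial basis is fixed, the conditioning of its associated system matrices feeds directly into the optimization landscape, which is where the paper's frame-based preconditioning enters.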

