Projected Sobolev Natural Gradient Descent for Efficient Neural Network Solution of the Gross-Pitaevskii Equation
This paper introduces a projected Sobolev natural gradient descent (NGD) method for computing ground states of the Gross-Pitaevskii equation. By projecting a continuous Riemannian Sobolev gradient flow onto the normalized neural network tangent space, we derive a discrete NGD algorithm that preserves the normalization constraint. The numerical implementation employs variational Monte Carlo with a hybrid sampling strategy to accurately account for the normalization constant arising from nonlinear interaction terms. To enhance computational efficiency, a matrix-free Nyström-preconditioned conjugate gradient solver is adopted to approximate the NGD operator without explicit matrix assembly. Numerical experiments demonstrate that the proposed method converges significantly faster than physics-informed neural network approaches and exhibits linear scalability with respect to spatial dimensions. Moreover, the resulting neural-network solutions provide high-quality initial guesses that substantially accelerate subsequent refinement by traditional high-precision solvers.
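As a concrete illustration of the last algorithmic ingredient, the following minimal NumPy sketch shows how a randomized Nyström preconditioner can be built for a symmetric positive-definite operator that is available only through matrix-vector products, and then used inside preconditioned CG. This follows the generic Frangella–Tropp–Udell construction rather than the paper's exact implementation; all function names and parameters below are illustrative assumptions.

```python
import numpy as np

def nystrom_preconditioner(matvec, p, rank, mu, rng):
    """Randomized Nystrom preconditioner for a p x p SPD operator seen only
    through matvec (sketch after Frangella, Tropp & Udell). Returns a
    function applying an approximate inverse of (G + mu * I)."""
    Omega, _ = np.linalg.qr(rng.standard_normal((p, rank)))          # test matrix
    Y = np.column_stack([matvec(Omega[:, j]) for j in range(rank)])  # Y = G @ Omega
    nu = np.sqrt(p) * np.finfo(Y.dtype).eps * np.linalg.norm(Y)      # stability shift
    C = np.linalg.cholesky(Omega.T @ (Y + nu * Omega))
    B = np.linalg.solve(C, (Y + nu * Omega).T).T                     # B = Y_nu @ C^{-T}
    U, s, _ = np.linalg.svd(B, full_matrices=False)
    lam = np.maximum(s ** 2 - nu, 0.0)                               # Nystrom eigenvalues

    def apply_inv(x):
        # P^{-1} x = (lam_min + mu) * U (Lam + mu I)^{-1} U^T x + (x - U U^T x)
        Ux = U.T @ x
        return U @ (((lam[-1] + mu) / (lam + mu)) * Ux) + (x - U @ Ux)

    return apply_inv

def pcg(matvec, b, precond, tol=1e-8, maxiter=500):
    """Matrix-free preconditioned conjugate gradient for matvec(x) = b."""
    x = np.zeros_like(b)
    r = b.copy()
    z = precond(r)
    d = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ad = matvec(d)
        alpha = rz / (d @ Ad)
        x += alpha * d
        r -= alpha * Ad
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = precond(r)
        rz_new = r @ z
        d = z + (rz_new / rz) * d
        rz = rz_new
    return x
```

With the NGD Gram matrix available only via Jacobian-vector products as `matvec_G`, one would solve the damped system (G + μI)δ = b by calling `precond = nystrom_preconditioner(matvec_G, p, rank, mu, rng)` followed by `delta = pcg(lambda v: matvec_G(v) + mu * v, b, precond)`, never assembling G explicitly.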
💡 Research Summary
The paper presents a novel computational framework for finding ground‑state solutions of the Gross‑Pitaevskii equation (GPE), a nonlinear Schrödinger equation that models Bose‑Einstein condensates in the mean‑field regime. The authors adopt an “optimize‑then‑discretize” philosophy: they first formulate a continuous Riemannian Sobolev gradient flow on the unit‑norm manifold of wavefunctions and then project this flow onto the finite‑dimensional manifold induced by a deep neural network (DNN) ansatz.
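For reference, the underlying constrained minimization problem is the Gross‑Pitaevskii energy over unit‑norm wavefunctions; in a common normalization (the paper's exact scaling of the interaction strength β may differ):

```latex
\min_{\|\psi\|_{L^2(\Omega)} = 1} E(\psi)
  = \int_{\Omega} \left( \tfrac{1}{2}\,|\nabla\psi|^{2}
    + V(x)\,|\psi|^{2} + \tfrac{\beta}{2}\,|\psi|^{4} \right) dx
```

Here V is the trapping potential, and the ground state is the global minimizer on the unit‑norm manifold.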
Key theoretical ingredients include the definition of an energy‑adaptive Sobolev metric a_ψ(v,w) = ∫_Ω (∇v·∇w + V v w + β|ψ|² v w) dx, which supplies the Riemannian structure for the gradient flow and, restricted to the network's tangent space, yields the discrete natural‑gradient direction.
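A minimal sketch of one resulting discrete step, assuming access to the network Jacobian and Monte Carlo samples (all names below are hypothetical; for brevity the Gram matrix uses a plain L² inner product in place of the energy‑adaptive metric a_ψ, and is assembled densely rather than applied matrix‑free as in the paper):

```python
import numpy as np

def projected_ngd_step(theta, psi_fn, jac_fn, grad_fn, xs, lr=1e-2, damping=1e-6):
    """One projected natural-gradient step on Monte Carlo samples xs.

    theta   : (p,) flattened network parameters
    psi_fn  : psi_fn(theta, xs) -> (n,) wavefunction values at the samples
    jac_fn  : jac_fn(theta, xs) -> (n, p) Jacobian d psi / d theta
    grad_fn : grad_fn(theta, xs) -> (n,) samples of the energy gradient E'(psi)
    """
    J = jac_fn(theta, xs)                    # basis of the network tangent space
    g = grad_fn(theta, xs)
    n = len(xs)
    # Monte Carlo Gram matrix of the metric on the tangent space
    # (L2 stand-in for the Sobolev metric a_psi), damped for stability.
    G = J.T @ J / n + damping * np.eye(theta.size)
    b = J.T @ g / n                          # projected energy gradient
    delta = np.linalg.solve(G, b)            # natural-gradient direction
    theta_new = theta - lr * delta
    # Monte Carlo estimate of ||psi|| (up to the domain-volume factor);
    # using the normalized ansatz psi / norm re-imposes the constraint.
    norm = np.sqrt(np.mean(psi_fn(theta_new, xs) ** 2))
    return theta_new, norm
```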