Physics-Informed Deep B-Spline Networks
Physics-informed machine learning offers a promising framework for solving complex partial differential equations (PDEs) by integrating observational data with governing physical laws. However, learning PDEs with varying parameters and changing initial conditions and boundary conditions (ICBCs) with theoretical guarantees remains an open challenge. In this paper, we propose physics-informed deep B-spline networks, a novel technique that approximates a family of PDEs with different parameters and ICBCs by learning B-spline control points through neural networks. The proposed B-spline representation reduces the learning task from predicting solution values over the entire domain to learning a compact set of control points, enforces strict compliance to initial and Dirichlet boundary conditions by construction, and enables analytical computation of derivatives for incorporating PDE residual losses. While existing approximation and generalization theories are not applicable in this setting - where solutions of parametrized PDE families are represented via B-spline bases - we fill this gap by showing that B-spline networks are universal approximators for such families under mild conditions. We also derive generalization error bounds for physics-informed learning in both elliptic and parabolic PDE settings, establishing new theoretical guarantees. Finally, we demonstrate in experiments that the proposed technique has improved efficiency-accuracy tradeoffs compared to existing techniques in a dynamical system problem with discontinuous ICBCs and can handle nonhomogeneous ICBCs and non-rectangular domains.
💡 Research Summary
The paper introduces Physics‑Informed Deep B‑Spline Networks (PI‑BSNet), a novel framework that combines the expressive power of B‑splines with physics‑informed learning to solve families of parametrized partial differential equations (PDEs) whose coefficients, domains, and initial/boundary conditions (ICBCs) may vary. Instead of learning the solution value at every point in the space‑time domain, PI‑BSNet learns a compact set of B‑spline control points. A coefficient neural network takes the system parameters (u) and ICBC parameters (α) as input and outputs a tensor of control points. These control points are multiplied by pre‑computed B‑spline basis functions (generated via the Cox‑de Boor recursion with clamped knots) to reconstruct the full solution. Because the first and last control points directly correspond to the solution at the domain boundaries, Dirichlet and initial conditions are satisfied exactly by construction, eliminating the need for soft penalty terms.
Training uses a physics loss that enforces the governing PDE residual, an optional data loss when observations are available, and, for Neumann or Robin conditions, an additional boundary loss. The analytical derivatives of B‑splines enable exact computation of PDE residuals without costly automatic differentiation, greatly reducing computational overhead, especially in high‑dimensional space‑time problems.
The authors provide three main theoretical contributions. First, they prove a universal approximation theorem for B‑spline networks: given enough control points and spline order, the network can approximate any continuous solution family of parametrized PDEs to arbitrary precision under mild smoothness assumptions. Second, they derive generalization error bounds for both elliptic and parabolic PDE settings using Rademacher complexity arguments; the bounds scale as O(1/√N) plus an approximation term, where N is the number of collocation points. Third, they formalize the hard‑constraint property of the clamped B‑spline construction, showing that the boundary values are exactly matched by the control points, guaranteeing zero boundary error.
Empirical evaluation focuses on two challenging scenarios. In a 2‑D heat equation with discontinuous initial conditions and non‑homogeneous Dirichlet boundaries, PI‑BSNet (30 control points, cubic splines) achieves a 30‑40 % reduction in L2 error compared with a standard PINN while using only 5 % of the grid points and cutting training time by more than half. In an L‑shaped domain solving a Poisson problem with mixed Dirichlet/Neumann conditions and varying domain rotation, the method again outperforms DeepONet and PINN baselines, delivering lower error, lower memory consumption, and robust generalization to unseen parameter combinations.
The paper also discusses limitations: the control‑point tensor grows exponentially with the number of spatial dimensions, which may hinder scalability to very high‑dimensional parametric spaces. The authors suggest future work on tensor decomposition or low‑rank approximations to mitigate this issue, as well as extending the hard‑constraint formulation to more complex, nonlinear boundary conditions.
Overall, PI‑BSNet offers a compact, theoretically grounded, and computationally efficient approach for solving parametrized PDE families with varying ICBCs. By enforcing hard boundary compliance, leveraging analytical spline derivatives, and providing universal approximation and generalization guarantees, it opens new possibilities for real‑time simulation, control, and digital twin applications where rapid, accurate PDE solutions across a range of parameters are essential.
Comments & Academic Discussion
Loading comments...
Leave a Comment