Guided Diffusion Sampling on Function Spaces with Applications to PDEs


We propose a general framework for conditional sampling in PDE-based inverse problems, targeting the recovery of whole solutions from extremely sparse or noisy measurements. This is accomplished by a function-space diffusion model and plug-and-play guidance for conditioning. Our method first trains an unconditional, discretization-agnostic denoising model using neural operator architectures. At inference, we refine the samples to satisfy sparse observation data via a gradient-based guidance mechanism. Through rigorous mathematical analysis, we extend Tweedie’s formula to infinite-dimensional Banach spaces, providing the theoretical foundation for our posterior sampling approach. Our method (FunDPS) accurately captures posterior distributions in function spaces under minimal supervision and severe data scarcity. Across five PDE tasks with only 3% of the solution observed, our method achieves an average 32% accuracy improvement over state-of-the-art fixed-resolution diffusion baselines while reducing sampling steps by 4x. Furthermore, multi-resolution fine-tuning ensures strong cross-resolution generalizability. To the best of our knowledge, this is the first diffusion-based framework to operate independently of discretization, offering a practical and flexible solution for forward and inverse problems in the context of PDEs. Code is available at https://github.com/neuraloperator/FunDPS.


💡 Research Summary

The paper introduces FunDPS (Function‑space Diffusion Posterior Sampling), a novel framework for conditional sampling in PDE‑based inverse problems where observations are extremely sparse or noisy. Traditional diffusion models operate on fixed‑resolution grids and require task‑specific conditional score models, limiting their flexibility for scientific computing. FunDPS overcomes these limitations by training an unconditional denoising model using neural operator architectures that treat the PDE parameters and solution as a single joint function defined on a continuous domain.
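The joint-function idea can be pictured concretely: on any given discretization, the PDE parameter field and the solution field become channels of one array that the denoiser processes together. The sketch below is an assumption-laden toy illustration (random stand-in fields, not the paper's data pipeline):

```python
import numpy as np

# Toy illustration (assumption: this mirrors the joint-modeling idea,
# not the paper's actual code): discretize a Darcy coefficient field
# a(x) and its solution u(x) on a grid, then stack them as channels of
# a single joint function that the diffusion model denoises as a whole.
rng = np.random.default_rng(0)
a = rng.random((64, 64))   # PDE parameter sample (e.g. permeability)
u = rng.random((64, 64))   # corresponding solution sample (stand-in)
joint = np.stack([a, u])   # shape (2, 64, 64): one joint function
```

Because the neural operator acts on functions rather than fixed grids, the same model can consume this joint field at other resolutions without retraining.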

A key theoretical contribution is the extension of Tweedie’s formula to infinite‑dimensional Banach spaces. By modeling measurement noise as a Gaussian random field (GRF) and invoking the Cameron‑Martin theorem, the authors derive a closed‑form relationship between the score of the noisy distribution and the conditional expectation of the clean sample. This yields a decomposition of the conditional score into a prior term (the pretrained unconditional score) and a likelihood gradient term that encodes the measurement operator. Consequently, a plug‑and‑play guidance term can be inserted into the reverse diffusion dynamics without training any additional conditional model.
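In finite-dimensional notation, the decomposition described above can be sketched as follows (the paper's infinite-dimensional statement replaces these gradients with their Cameron‑Martin counterparts):

```latex
% Tweedie's formula: posterior mean of the clean sample u_0 given the
% noisy sample u_t at noise level \sigma_t
\mathbb{E}[u_0 \mid u_t] = u_t + \sigma_t^2 \, \nabla_{u_t} \log p_t(u_t)

% Conditional score used for plug-and-play guidance, split into the
% pretrained unconditional prior score and a likelihood gradient term
\nabla_{u_t} \log p_t(u_t \mid y)
  = \underbrace{\nabla_{u_t} \log p_t(u_t)}_{\text{prior score}}
  + \underbrace{\nabla_{u_t} \log p(y \mid u_t)}_{\text{likelihood guidance}}
```

The second identity is what makes the method plug-and-play: the first term comes from the pretrained unconditional model, and only the second term depends on the measurements.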

The practical algorithm proceeds in two stages. First, a U‑shaped neural operator is pretrained on coarse grids using the EDM loss and GRF noise, then fine‑tuned on higher‑resolution grids. This multi‑resolution strategy reduces training compute by roughly 25% and enables most sampling steps to be performed at low resolution, with only a final up‑sampling phase, achieving an additional 2× speed‑up. During inference, samples are initialized with GRF noise and iteratively denoised using a deterministic second‑order sampler. At each timestep the update combines the unconditional score and the guidance term derived from the extended Tweedie formula, ensuring that the evolving sample remains consistent with the sparse observations or PDE constraints.
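A single guided update of this kind can be sketched as below. This is a simplified toy version, not the paper's exact algorithm: it uses a first‑order (Euler) step rather than the second‑order sampler, a masked‑identity measurement operator, and a hypothetical guidance strength `zeta`:

```python
import numpy as np

def fundps_step(x, y, mask, score_fn, sigma, sigma_next, zeta=1.0):
    """One guided reverse-diffusion update (illustrative sketch).

    Combines the unconditional score with a plug-and-play guidance term
    that pulls the Tweedie denoised estimate toward the sparse
    observations y at the masked locations. `zeta` and the
    masked-identity measurement operator are assumptions of this toy
    version, not FunDPS's exact formulation.
    """
    # Tweedie's formula: posterior-mean estimate of the clean sample
    x0_hat = x + sigma**2 * score_fn(x, sigma)
    # EDM probability-flow ODE direction, then an Euler step in sigma
    d = (x - x0_hat) / sigma
    x_next = x + (sigma_next - sigma) * d
    # Gradient-based guidance toward the observed values
    return x_next - zeta * mask * (x0_hat - y)
```

For intuition: with a trivial zero score and a fully observed mask, a single step with `zeta=1` lands exactly on the observations, since the guidance term reduces to replacing the estimate with `y`.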

Experiments cover five challenging PDE tasks—including Darcy flow, Helmholtz, and Navier‑Stokes—under a severe 3% observation regime. FunDPS consistently outperforms the state‑of‑the‑art fixed‑resolution diffusion baseline (DiffusionPDE), reducing average L2 error by about 32% while requiring only 200–500 reverse‑diffusion steps (up to 10× fewer). Wall‑clock time is cut by a factor of 25, and multi‑resolution inference further halves the runtime without sacrificing accuracy. The method also surpasses deterministic neural PDE solvers by a noticeable margin.

Compared with related work, FunDPS uniquely combines (1) an unconditional neural‑operator prior that can be reused across many downstream tasks, (2) a mathematically rigorous infinite‑dimensional Tweedie‑based guidance mechanism, and (3) resolution‑independent training and sampling. Limitations include reliance on the Gaussian noise assumption (which may not hold for all sensor modalities) and the need for further validation on highly nonlinear boundary conditions and irregular domains. Future directions suggested are non‑Gaussian noise modeling, adaptive resolution selection, and integration with real‑time physics simulators. Overall, FunDPS represents a significant step toward flexible, efficient, and theoretically grounded posterior sampling for PDE‑driven scientific applications.

