Memory-Conditioned Flow-Matching for Stable Autoregressive PDE Rollouts
Autoregressive generative PDE solvers can be accurate one step ahead yet drift over long rollouts, especially in coarse-to-fine regimes where each step must regenerate unresolved fine scales. This is the regime in which diffusion and flow-matching generators operate: although their internal (denoising-time) dynamics are Markovian, rollout stability in physical time is governed by per-step errors in the conditional law. Using the Mori–Zwanzig projection formalism, we show that eliminating unresolved variables yields an exact resolved evolution with a Markov term, a memory term, and an orthogonal forcing, exposing a structural limitation of memoryless closures. Motivated by this, we introduce memory-conditioned diffusion/flow-matching with a compact online state injected into denoising via latent features. Via disintegration, memory induces a structured conditional tail prior for unresolved scales and reduces the transport needed to populate missing frequencies. We prove Wasserstein stability of the resulting conditional kernel. We then derive discrete Grönwall rollout bounds that separate memory approximation from conditional generation error. Experiments on compressible flows with shocks and multiscale mixing show improved accuracy and markedly more stable long-horizon rollouts, with better fine-scale spectral and statistical fidelity.
💡 Research Summary
This paper tackles a fundamental limitation of modern generative surrogate models for time‑dependent partial differential equations (PDEs). Diffusion‑based and flow‑matching solvers are Markovian in their internal time, yet when they are used autoregressively in physical time they must repeatedly regenerate unresolved fine‑scale components. Small per‑step distributional errors therefore accumulate, leading to severe drift in long‑horizon rollouts, especially in coarse‑to‑fine regimes where only low‑frequency modes are conditioned on and the high‑frequency “tail” must be sampled anew at each step.
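To make the drift mechanism concrete, a memoryless coarse-to-fine rollout samples each physical step from a kernel that sees only the current resolved state. In the schematic below (our notation, not the paper's), \(\bar u_n\) is the resolved field at step \(n\) and \(q_\theta\) the learned per-step conditional:

\[
p_\theta(\bar u_{1:N}\mid \bar u_0) \;=\; \prod_{n=0}^{N-1} q_\theta(\bar u_{n+1}\mid \bar u_n),
\qquad\text{vs.}\qquad
p(\bar u_{n+1}\mid \bar u_{0:n}),
\]

so whatever gap exists between the memoryless kernel and the true history-dependent conditional is re-incurred at every step and can compound over the rollout horizon.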
The authors first formalize the problem using the Mori‑Zwanzig (MZ) projection formalism. By splitting the full state into resolved (low‑frequency) and unresolved (high‑frequency) parts, MZ shows that the exact reduced dynamics consist of three terms: a Markovian contribution, a history‑dependent memory integral, and an orthogonal forcing term. Consequently, any closure that conditions solely on the resolved variables (i.e., a memoryless kernel) is intrinsically misspecified because it neglects the memory term that carries essential information about the unresolved scales.
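A schematic form of the reduced (generalized Langevin) equation produced by the MZ projection makes the three contributions explicit; the symbols below are generic and not the paper's notation:

\[
\frac{d}{dt}\,\bar u(t)
\;=\; \underbrace{\Omega\,\bar u(t)}_{\text{Markov term}}
\;+\; \underbrace{\int_0^{t} K(t-s)\,\bar u(s)\,ds}_{\text{memory integral}}
\;+\; \underbrace{F(t)}_{\text{orthogonal forcing}}.
\]

The kernel \(K\) and forcing \(F\) carry the influence of the eliminated fine scales; dropping the integral recovers exactly the memoryless closure whose misspecification the paper analyzes.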
Motivated by this insight, the paper introduces Memory‑Conditioned Rectified Flow (MCRF), a novel architecture that augments conditional flow‑matching with a compact latent memory state. At each physical time step \(n\), a backbone network extracts a low‑frequency embedding \(E_n\) from the resolved field. A lightweight state‑space model (e.g., S4, HiPPO) updates a memory vector \(m_n\) from the sequence \(\{E_k\}_{k\le n}\). The memory is then injected back into the conditional flow‑matching ODE; schematically,

\[
\frac{d x_\tau}{d\tau} \;=\; v_\theta\big(x_\tau, \tau \mid E_n, m_n\big),
\]

so that the fine‑scale tail generated for the next step is conditioned on both the current resolved field and the accumulated history, rather than on the resolved field alone.
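The architecture reads naturally as a short rollout loop. The sketch below (PyTorch; the module names, the GRU cell standing in for the S4/HiPPO-style memory, and the forward-Euler integrator are illustrative assumptions, not the paper's released code) shows how the memory state threads through successive physical steps:

```python
# Minimal sketch of a memory-conditioned rectified-flow rollout (PyTorch).
# Shapes, module choices, and the Euler integrator are placeholders for illustration.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Backbone mapping the resolved (coarse) field to a low-frequency embedding E_n."""
    def __init__(self, field_dim: int, embed_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(field_dim, 256), nn.GELU(), nn.Linear(256, embed_dim))

    def forward(self, u_resolved: torch.Tensor) -> torch.Tensor:
        return self.net(u_resolved)

class MemoryUpdate(nn.Module):
    """Recurrent update m_n = f(m_{n-1}, E_n); a GRU cell stands in for an S4/HiPPO module."""
    def __init__(self, embed_dim: int, mem_dim: int):
        super().__init__()
        self.cell = nn.GRUCell(embed_dim, mem_dim)

    def forward(self, embed: torch.Tensor, mem: torch.Tensor) -> torch.Tensor:
        return self.cell(embed, mem)

class VelocityField(nn.Module):
    """Flow-matching velocity v_theta(x_tau, tau | E_n, m_n); memory is extra conditioning."""
    def __init__(self, field_dim: int, embed_dim: int, mem_dim: int):
        super().__init__()
        in_dim = field_dim + 1 + embed_dim + mem_dim  # state + internal time + conditioning
        self.net = nn.Sequential(nn.Linear(in_dim, 512), nn.GELU(), nn.Linear(512, field_dim))

    def forward(self, x, tau, embed, mem):
        tau_col = tau.expand(x.shape[0], 1)
        return self.net(torch.cat([x, tau_col, embed, mem], dim=-1))

@torch.no_grad()
def rollout(encoder, memory_update, velocity, u0, mem0, n_steps=10, n_ode_steps=20):
    """Autoregressive rollout: each physical step integrates the conditional flow-matching
    ODE from noise to a sample, conditioned on (E_n, m_n)."""
    u, mem, traj = u0, mem0, []
    for _ in range(n_steps):
        embed = encoder(u)                  # E_n from the current resolved field
        mem = memory_update(embed, mem)     # online memory state m_n
        x = torch.randn_like(u)             # start the internal ODE from noise
        dtau = 1.0 / n_ode_steps
        for k in range(n_ode_steps):        # forward-Euler steps in internal time tau
            tau = torch.tensor([[k * dtau]])
            x = x + dtau * velocity(x, tau, embed, mem)
        u = x                               # next-step field; its coarse part feeds the next E
        traj.append(u)
    return traj

if __name__ == "__main__":
    D, E, M = 64, 32, 16
    enc, upd, vel = Encoder(D, E), MemoryUpdate(E, M), VelocityField(D, E, M)
    states = rollout(enc, upd, vel, torch.randn(1, D), torch.zeros(1, M))
    print(len(states), states[-1].shape)
```

The design point the sketch emphasizes is that the memory update is cheap and runs once per physical step, while the conditioning it provides changes the target of every internal denoising step.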