Foundation Inference Models for Ordinary Differential Equations
Ordinary differential equations (ODEs) are central to scientific modelling, but inferring their vector fields from noisy trajectories remains challenging. Current approaches such as symbolic regression, Gaussian process (GP) regression, and Neural ODEs often require complex training pipelines and substantial machine learning expertise, or they depend strongly on system-specific prior knowledge. We propose FIM-ODE, a pretrained Foundation Inference Model that amortises low-dimensional ODE inference by predicting the vector field directly from noisy trajectory data in a single forward pass. We pretrain FIM-ODE on a prior distribution over ODEs with low-degree polynomial vector fields and represent the target field with neural operators. FIM-ODE achieves strong zero-shot performance, matching and often improving upon ODEFormer, a recent pretrained symbolic baseline, across a range of regimes despite using a simpler pretraining prior distribution. Pretraining also provides a strong initialisation for finetuning, enabling fast and stable adaptation that outperforms modern neural and GP baselines without requiring machine learning expertise.
💡 Research Summary
The paper introduces FIM-ODE, a Foundation Inference Model designed to infer the vector field of ordinary differential equations (ODEs) directly from noisy trajectory observations in a single forward pass. Traditional ODE inference methods—symbolic regression, Gaussian‑process (GP) regression, and Neural ODEs—typically require high‑quality derivative estimates, carefully crafted priors, or expensive back‑propagation through numerical solvers, making them difficult to apply without substantial machine‑learning expertise. FIM-ODE addresses these challenges by amortising the inference process through large‑scale pretraining on synthetic data.
The pretraining prior is deliberately simple: vector fields are constructed as sparse multivariate polynomials of total degree ≤ 3, with coefficients drawn from a standard normal distribution and random masks that enforce sparsity. Despite this simplicity, low‑degree polynomials can generate a rich repertoire of dynamical behaviours (fixed points, limit cycles, chaos) and are locally Lipschitz, guaranteeing existence and uniqueness of solutions. For each sampled field, multiple trajectories are generated by integrating the ODE from Gaussian‑distributed initial conditions over a fixed time window, and the resulting trajectories are corrupted with observation noise before being fed to the model.
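The sampling procedure described above can be sketched in a few lines. The following is a minimal illustration, not the paper's actual pipeline: the function names (`multi_indices`, `sample_polynomial_field`) and the specific sparsity level, time window, and noise scale are assumptions chosen for demonstration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def multi_indices(dim, degree):
    """All exponent tuples of length `dim` whose entries sum to `degree`."""
    if dim == 1:
        yield (degree,)
    else:
        for d in range(degree + 1):
            for rest in multi_indices(dim - 1, degree - d):
                yield (d,) + rest

def sample_polynomial_field(dim, max_degree=3, sparsity=0.5, rng=None):
    """Sample a sparse polynomial vector field f: R^dim -> R^dim.

    Monomials of total degree <= max_degree; coefficients drawn from
    N(0, 1) and zeroed out by a random mask to enforce sparsity.
    """
    rng = rng or np.random.default_rng()
    exponents = [e for deg in range(max_degree + 1)
                 for e in multi_indices(dim, deg)]
    coeffs = rng.standard_normal((dim, len(exponents)))
    mask = rng.random((dim, len(exponents))) < sparsity
    coeffs *= mask

    def f(t, x):
        monomials = np.array([np.prod(x ** np.array(e)) for e in exponents])
        return coeffs @ monomials
    return f

# One synthetic training example: a field, a Gaussian initial condition,
# a trajectory over a fixed time window, plus observation noise.
rng = np.random.default_rng(0)
f = sample_polynomial_field(dim=2, rng=rng)
x0 = rng.standard_normal(2)                       # Gaussian initial condition
sol = solve_ivp(f, (0.0, 2.0), x0, t_eval=np.linspace(0.0, 2.0, 64))
noisy = sol.y + 0.05 * rng.standard_normal(sol.y.shape)
# Note: polynomial ODEs can blow up in finite time; a real pipeline would
# reject diverging trajectories (e.g. when sol.success is False).
```

For a 2‑D system with degree ≤ 3 there are 10 monomials per component, so each sampled field is fully described by a 2 × 10 coefficient matrix, which is what makes generating millions of pretraining systems cheap.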