Probabilistic Forecasting via Autoregressive Flow Matching

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the original arXiv source.

In this work, we propose FlowTime, a generative model for probabilistic forecasting of multivariate time series data. Given historical measurements and optional future covariates, we formulate forecasting as sampling from a learned conditional distribution over future trajectories. Specifically, we decompose the joint distribution of future observations into a sequence of conditional densities, each modeled via a shared flow that transforms a simple base distribution into the next observation distribution, conditioned on observed covariates. To achieve this, we leverage the flow matching (FM) framework, enabling scalable and simulation-free learning of these transformations. By combining this factorization with the FM objective, FlowTime retains the benefits of autoregressive models – including strong extrapolation performance, compact model size, and well-calibrated uncertainty estimates – while also capturing complex multi-modal conditional distributions, as seen in modern transport-based generative models. We demonstrate the effectiveness of FlowTime on multiple dynamical systems and real-world forecasting tasks.


💡 Research Summary

The paper introduces FlowTime, a novel generative framework for probabilistic forecasting of multivariate time‑series, built on the recently proposed flow‑matching (FM) paradigm. Traditional probabilistic forecasting approaches either model the entire future horizon jointly—often leading to poor extrapolation and mis‑calibrated uncertainties—or rely on diffusion models that require costly iterative denoising steps. FlowTime addresses these limitations by combining two key ideas: (1) an autoregressive factorization of the future distribution and (2) the use of continuous normalizing flows (CNFs) trained via flow‑matching.
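The "simulation-free" training that flow matching provides can be sketched in a few lines. The snippet below is an illustrative assumption, not the paper's exact architecture or probability path: it uses a small MLP as the conditional vector field and the common linear (optimal-transport) path `x_t = (1 - t) x0 + t x1`, whose target velocity is simply `x1 - x0`.

```python
import torch
import torch.nn as nn

class CondVectorField(nn.Module):
    """Hypothetical conditional vector field v_theta(x, t | ctx).

    The MLP architecture here is an illustrative assumption; the
    paper's actual network is not specified in this summary.
    """
    def __init__(self, dim, ctx_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + ctx_dim + 1, hidden),
            nn.SiLU(),
            nn.Linear(hidden, hidden),
            nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t, ctx):
        # Concatenate state, flow time, and conditioning context.
        return self.net(torch.cat([x, t, ctx], dim=-1))

def fm_loss(model, x1, ctx):
    """Flow-matching regression loss for one batch of targets x1.

    No ODE is simulated during training: we sample a point on the
    linear path between base noise x0 and data x1, and regress the
    network output onto the path's known velocity x1 - x0.
    """
    x0 = torch.randn_like(x1)           # sample from base N(0, I)
    t = torch.rand(x1.shape[0], 1)      # flow time in [0, 1]
    xt = (1 - t) * x0 + t * x1          # point on the probability path
    target = x1 - x0                    # conditional target velocity
    pred = model(xt, t, ctx)
    return ((pred - target) ** 2).mean()
```

Because the loss is a plain regression at randomly sampled flow times, training avoids the iterative denoising rollouts that make diffusion-based forecasters costly.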

In the autoregressive formulation, the conditional distribution of the future trajectory \(Y_f\) given past observations \(Y_\ell\) and covariates \(C\) is decomposed into a product of one-step conditionals:

\[
p(Y_f \mid Y_\ell, C) = \prod_{t=\ell+1}^{\ell+h} p(y_t \mid y_{<t}, C),
\]

where \(h\) is the forecast horizon and each one-step conditional is modeled by the shared flow described above.
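At inference time, this factorization corresponds to an autoregressive rollout: each future observation is drawn by integrating the learned ODE from a base sample, then appended to the history before conditioning the next step. The sketch below assumes the trained vector field from the previous snippet plus a hypothetical `encode` function that summarizes the history into a context vector; the Euler integrator and its step count are illustrative choices, not the paper's solver.

```python
import torch

@torch.no_grad()
def sample_step(model, ctx, dim, n_steps=50):
    """Draw one observation by integrating dx/dt = v_theta(x, t | ctx)
    from a base sample at t=0 to t=1 (simple Euler scheme)."""
    x = torch.randn(ctx.shape[0], dim)
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = torch.full((ctx.shape[0], 1), i * dt)
        x = x + dt * model(x, t, ctx)
    return x

@torch.no_grad()
def forecast(model, encode, history, horizon, dim):
    """Autoregressive rollout over the forecast horizon.

    `encode` is a hypothetical summarizer mapping the history tensor
    (batch, length, dim) to a context vector (batch, ctx_dim); each
    sampled y_t is appended to the history before the next step.
    """
    trajectory = []
    for _ in range(horizon):
        ctx = encode(history)
        y_t = sample_step(model, ctx, dim)
        trajectory.append(y_t)
        history = torch.cat([history, y_t.unsqueeze(1)], dim=1)
    return torch.stack(trajectory, dim=1)  # (batch, horizon, dim)
```

Running the rollout many times with different base samples yields an ensemble of trajectories, from which quantiles or other uncertainty summaries can be read off.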

