Online conformal inference for multi-step time series forecasting


We consider the problem of constructing distribution-free prediction intervals for multi-step time series forecasting, with a focus on the temporal dependencies inherent in multi-step forecast errors. We establish that the optimal $h$-step-ahead forecast errors exhibit serial correlation up to lag $(h-1)$ under a general non-stationary autoregressive data generating process. To leverage these properties, we propose the Autocorrelated Multi-step Conformal Prediction (AcMCP) method, which effectively incorporates autocorrelations in multi-step forecast errors, resulting in more statistically efficient prediction intervals. This method guarantees asymptotic marginal coverage for multi-step prediction intervals, though we note that, for finite samples, the coverage error admits an upper bound that increases with the forecasting horizon. Additionally, we extend several easy-to-implement conformal prediction methods, originally designed for single-step forecasting, to accommodate multi-step scenarios. Through empirical evaluations, including simulations and applications to data, we demonstrate that AcMCP achieves coverage that closely aligns with the target within local windows, while providing adaptive prediction intervals that effectively respond to varying conditions.


💡 Research Summary

This paper tackles the challenge of constructing distribution‑free prediction intervals for multi‑step time‑series forecasting, where the usual exchangeability assumption of conformal prediction is violated by temporal dependence. The authors first establish a theoretical result: under a very general non‑stationary autoregressive data‑generating process, the optimal h‑step‑ahead forecast error can be expressed as a linear combination of at most its previous (h − 1) errors, implying serial correlation up to lag (h − 1). This insight reveals that multi‑step forecast errors are not independent and that ignoring this structure leads to overly conservative intervals, especially for larger horizons.
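The lag-(h − 1) correlation structure can be checked numerically. The sketch below is illustrative only, not the paper's setup: it assumes a simple stationary AR(1) process with known coefficient, for which the optimal h-step-ahead forecast of yₜ₊ₕ given yₜ is φʰyₜ and the resulting forecast error is an MA(h − 1) process. The sample autocorrelations of the 3-step errors should therefore be clearly nonzero at lags 1 and 2 and close to zero from lag 3 onward.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, h, n = 0.8, 3, 20000  # AR(1) coefficient, horizon, sample size (all illustrative)

# Simulate a stationary AR(1): y_t = phi * y_{t-1} + eps_t
eps = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

# Optimal h-step-ahead forecast from time t is phi^h * y_t, so the error
# e_{t+h|t} = sum_{j=0}^{h-1} phi^j * eps_{t+h-j} is an MA(h-1) process.
errors = y[h:] - phi**h * y[:-h]

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

for lag in range(1, h + 2):
    print(f"lag {lag}: acf = {acf(errors, lag):+.3f}")
```

For h = 3 the autocorrelations at lags 1 and 2 are substantial, while those at lags 3 and 4 are statistically indistinguishable from zero, matching the serial-correlation-up-to-lag-(h − 1) result.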

Building on this, the authors propose the Autocorrelated Multi‑step Conformal Prediction (AcMCP) method. AcMCP operates within the split‑conformal framework but replaces the usual random split with a sequential split that respects the time order. At each time t a forecasting model is trained on a proper training window, then H‑step‑ahead point forecasts are produced and the signed forecast errors (nonconformity scores) sₜ₊ₕ|ₜ are recorded for all horizons h ∈ {1, …, H}.
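The sequential-split loop can be sketched as follows. This is a minimal stand-in, not the AcMCP method itself: it uses a toy AR(1) series, an OLS-fitted AR(1) point forecaster, and naive empirical quantiles of past signed errors to form intervals, whereas AcMCP additionally models the autocorrelation structure of the multi-step errors. All variable names and the burn-in/window choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
H, alpha, burn_in, n = 3, 0.1, 200, 1000  # horizon, miscoverage level, warm-up, length

# Toy data: an AR(1) series stands in for the real forecasting task.
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal()

scores = {h: [] for h in range(1, H + 1)}  # signed errors s_{t+h|t} per horizon
covered = []  # 1-step coverage indicators

for t in range(burn_in, n - H):
    # Refit the point forecaster on data observed up to time t (OLS AR(1) slope).
    past, cur = y[:t - 1], y[1:t]
    phi_hat = np.dot(past, cur) / np.dot(past, past)

    for h in range(1, H + 1):
        forecast = phi_hat**h * y[t - 1]  # h-step-ahead point forecast
        # Interval from empirical quantiles of previously recorded h-step errors.
        if len(scores[h]) > 50:
            lo = forecast + np.quantile(scores[h], alpha / 2)
            hi = forecast + np.quantile(scores[h], 1 - alpha / 2)
            if h == 1:
                covered.append(lo <= y[t] <= hi)
        # Record the realized signed error; for simplicity it is stored now,
        # though in a strict online setting s_{t+h|t} only arrives h steps later.
        scores[h].append(y[t + h - 1] - forecast)

print(f"1-step empirical coverage: {np.mean(covered):.3f}")
```

Even this naive quantile scheme attains roughly the nominal 90% coverage for the 1-step horizon on this well-specified toy example; the paper's contribution is to exploit the MA(h − 1) error structure so that the multi-step intervals are efficient rather than conservative.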

