eCP: Informative uncertainty quantification via Equivariantized Conformal Prediction with pre-trained models
We study the effect of group symmetrization of pre-trained models on conformal prediction (CP), a post-hoc, distribution-free, finite-sample method of uncertainty quantification that offers formal coverage guarantees under the assumption of data exchangeability. Unfortunately, CP uncertainty regions can grow significantly in long-horizon missions, rendering the statistical guarantees uninformative. To address this, we propose infusing CP with geometric information via group-averaging of the pre-trained predictor, distributing the non-conformity mass across orbits. Each sample is now treated as a representative of an orbit, so its uncertainty can be mitigated by the other samples coupled to it through the orbit-inducing elements of the symmetry group. Our approach provably yields non-conformity scores contracted in increasing convex order, implying improved exponential-tail bounds and sharper conformal prediction sets in expectation, especially at high confidence levels. We then propose an experimental design to test these theoretical claims in pedestrian trajectory prediction.
💡 Research Summary
This paper addresses a well‑known limitation of conformal prediction (CP): while CP guarantees finite‑sample, distribution‑free coverage under the mild exchangeability assumption, the resulting prediction sets can become excessively large, especially at high confidence levels or over long prediction horizons. The authors propose a simple yet powerful post‑hoc augmentation called Equivariantized Conformal Prediction (eCP) that leverages known geometric symmetries of the data and of a pretrained predictor to shrink these sets without sacrificing coverage.
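To make the limitation concrete, here is a minimal sketch of split conformal prediction for a toy 1-D regression task. The predictor `f`, the noise scale, and all numbers are illustrative assumptions, not from the paper; the point is that the finite-sample-valid quantile, and hence the prediction-set halfwidth, grows sharply as the confidence level approaches 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumption): a hypothetical pretrained predictor f(x) = 2x
# and a held-out calibration set with additive Gaussian noise.
f = lambda x: 2.0 * x
x_cal = rng.uniform(-1.0, 1.0, size=500)
y_cal = f(x_cal) + rng.normal(scale=0.3, size=500)

# Split conformal: absolute-residual non-conformity scores on calibration data.
scores = np.abs(y_cal - f(x_cal))

def conformal_halfwidth(scores, alpha):
    """Finite-sample-valid quantile: the ceil((n+1)(1-alpha))-th order statistic."""
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return np.sort(scores)[min(k, n) - 1]

# The prediction set for a new x is [f(x) - q, f(x) + q]; q inflates
# as the confidence level 1 - alpha approaches 1.
for alpha in (0.1, 0.01):
    q = conformal_halfwidth(scores, alpha)
    print(f"confidence {1 - alpha:.2f}: halfwidth q = {q:.3f}")
```

Marginal coverage holds for any exchangeable data, but nothing bounds how large `q` becomes at high confidence, which is the looseness eCP targets.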
The core idea is to treat each calibration sample not as an isolated point but as a representative of its entire orbit under a group (G) (e.g., rotations, translations, reflections, time-shifts). For a given non-conformity score (s(f(x),y)), eCP replaces it with a symmetrized version (\Pi_G s), obtained by averaging the score over the orbit of each sample under (G). This group average is contracted in increasing convex order relative to the original score, which tightens the high quantiles that determine the conformal set size while leaving the exchangeability-based coverage guarantee intact.
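A minimal sketch of the symmetrized score, under illustrative assumptions: the group is C4 (planar rotations by multiples of 90 degrees), the "pretrained" predictor is a stand-in non-equivariant linear map, and the data distribution is rotation-invariant. None of these choices come from the paper; they only demonstrate how orbit-averaging the score contracts its upper tail relative to the plain score.

```python
import numpy as np

rng = np.random.default_rng(0)

def rot(k):
    """Planar rotation by k * 90 degrees (entries rounded to exact 0/±1)."""
    c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
    return np.round(np.array([[c, -s], [s, c]]))

G = [rot(k) for k in range(4)]           # assumed symmetry group C4

A = np.array([[1.2, 0.0], [0.0, 0.8]])   # stand-in, non-equivariant predictor
f = lambda x: A @ x
score = lambda pred, y: float(np.linalg.norm(pred - y))

def symmetrized_score(x, y):
    """Group-averaged non-conformity score over the orbit of (x, y)."""
    return float(np.mean([score(f(g @ x), g @ y) for g in G]))

# Rotation-invariant calibration data with equivariant ground truth (assumption).
x_cal = rng.normal(size=(1000, 2))
y_cal = x_cal + rng.normal(scale=0.1, size=(1000, 2))

plain = np.array([score(f(x), y) for x, y in zip(x_cal, y_cal)])
sym = np.array([symmetrized_score(x, y) for x, y in zip(x_cal, y_cal)])

# Averaging within each orbit lightens the upper tail of the score
# distribution, so the high quantile used by CP tends to shrink.
print("plain 99% quantile:", np.quantile(plain, 0.99))
print("sym   99% quantile:", np.quantile(sym, 0.99))
```

In this toy run the symmetrized scores have a visibly smaller 99% quantile than the plain scores, which is the mechanism behind the sharper prediction sets claimed at high confidence levels.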