Asymptotics for conformal inference


Conformal inference is a versatile tool for building prediction sets in regression or classification. We study the false coverage proportion (FCP) in a simultaneous inference setting with a calibration sample of $n$ points and a test sample of $m$ points. We identify the exact, distribution-free, asymptotic distribution of the FCP when both $n$ and $m$ tend to infinity. This shows in particular that FCP control can be achieved by using the well-known Kolmogorov distribution, and that the asymptotic variance is decreasing in the ratio $n/m$. We then provide a number of extensions by considering the problems of novelty detection, weighted conformal inference, and distribution shift between the calibration sample and the test sample. In particular, our asymptotic results allow us to quantify accurately the asymptotic behavior of the errors (miscovering an interval or declaring a false novelty) when weighted conformal inference is used.
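As a minimal illustration of the setting (a sketch under assumed i.i.d. Gaussian nonconformity scores, not the paper's code), the following simulates split-conformal prediction and computes the empirical FCP on a test sample:

```python
import numpy as np

rng = np.random.default_rng(0)

def split_conformal_fcp(n, m, alpha=0.1):
    """Empirical FCP of a split-conformal set: calibration size n, test size m.

    Illustrative assumption: nonconformity scores are i.i.d. |N(0, 1)| draws,
    exchangeable across the calibration and test samples.
    """
    cal = np.abs(rng.normal(size=n))   # calibration scores
    test = np.abs(rng.normal(size=m))  # test scores
    # Split-conformal threshold: the ceil((1 - alpha)(n + 1))-th smallest
    # calibration score (the whole space if that index exceeds n).
    k = int(np.ceil((1 - alpha) * (n + 1)))
    threshold = np.sort(cal)[k - 1] if k <= n else np.inf
    # False coverage proportion: fraction of test points outside the set.
    return np.mean(test > threshold)

# Averaged over repetitions, the FCP concentrates near the nominal level alpha.
mean_fcp = np.mean([split_conformal_fcp(n=1000, m=1000) for _ in range(200)])
print(f"mean FCP ~ {mean_fcp:.3f}")
```

Around the nominal level, individual FCP realizations fluctuate at the scale $1/\sqrt{\tau_{n,m}}$ studied in the paper, which is what motivates the asymptotic analysis below.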


💡 Research Summary

This paper investigates the false coverage proportion (FCP) of split-conformal prediction when both the calibration sample size $n$ and the test sample size $m$ grow to infinity. The authors treat the collection of conformal p-values $\{p_i^{(n)}\}_{i=1}^m$ through its empirical cumulative distribution function (ECDF) and study its joint asymptotic behavior under a double-asymptotic regime where $\tau_{n,m} = nm/(n+m) \to \infty$.

The main theoretical contribution (Theorem 3.1) shows that, after scaling by $\sqrt{\tau_{n,m}}$, the process $\mathrm{FCP}_{n}^{m}(\alpha) - I_n(\alpha)$ converges in distribution to a standard Brownian bridge $B(\alpha)$, uniformly over $\alpha \in [0,1]$.

