ST-BCP: Tightening Coverage Bound for Backward Conformal Prediction via Non-Conformity Score Transformation
Conformal Prediction (CP) provides a statistical framework for uncertainty quantification that constructs prediction sets with coverage guarantees. While CP guarantees coverage, it leaves prediction set sizes uncontrolled; Backward Conformal Prediction (BCP) inverts this paradigm by enforcing a predefined upper bound on set size and estimating the resulting coverage guarantee. However, the looseness induced by Markov's inequality within the BCP framework causes a significant gap between the estimated coverage bound and the empirical coverage. In this work, we introduce ST-BCP, a novel method that applies a data-dependent transformation of nonconformity scores to narrow the coverage gap. In particular, we develop a computable transformation and prove that it outperforms the baseline identity transformation. Extensive experiments demonstrate the effectiveness of our method, reducing the average coverage gap from 4.20% to 1.12% on common benchmarks.
💡 Research Summary
Conformal Prediction (CP) offers distribution‑free uncertainty quantification by constructing prediction sets that contain the true label with a user‑specified probability. While theoretically appealing, CP often yields overly large prediction sets, limiting its practical usefulness. Backward Conformal Prediction (BCP) inverts the CP paradigm: it imposes a hard upper bound T on the size of the prediction set and then estimates the resulting coverage guarantee. BCP achieves this by defining a data‑dependent miscoverage level \tilde α as the smallest α that satisfies the size constraint, and then applying Markov's inequality to an e‑variable to obtain a lower bound on coverage.
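The data-dependent level \tilde α described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it assumes standard split-conformal machinery (nonconformity scores where lower means more conforming, and the usual finite-sample-corrected quantile), and the function name, score convention, and α grid are illustrative assumptions.

```python
import numpy as np

def tilde_alpha(calib_scores, test_scores, T, grid=None):
    """Smallest miscoverage level alpha whose split-conformal
    prediction set for one test input has size <= T.

    calib_scores: nonconformity scores of the calibration examples
                  (evaluated at their true labels).
    test_scores : nonconformity scores of one test input,
                  one score per candidate label.
    T           : hard upper bound on the prediction-set size.
    """
    n = len(calib_scores)
    if grid is None:
        # candidate alpha values; finer grids give a tighter tilde-alpha
        grid = np.arange(1, n + 1) / (n + 1)
    for alpha in grid:  # increasing alpha shrinks the prediction set
        # split-conformal quantile with the finite-sample correction
        level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
        q = np.quantile(calib_scores, level)
        if np.sum(test_scores <= q) <= T:  # set = {y : s(x, y) <= q}
            return alpha
    return grid[-1]
```

Because the prediction set shrinks monotonically as α grows, the first grid value meeting the size constraint is the smallest feasible one; \tilde α is then a random variable over test inputs, which is why BCP bounds coverage through its expectation rather than a fixed α.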