130 papers with code • 0 benchmarks • 0 datasets
Libraries: Use these libraries to find Conformal Prediction models and implementations
Convolutional image classifiers can achieve high predictive accuracy, but quantifying their uncertainty remains an unresolved challenge, hindering their deployment in consequential settings.
Conformal prediction is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of such models.
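To make the paradigm concrete, here is a minimal sketch of split conformal prediction for regression. The function name, the synthetic calibration data, and the choice of absolute residuals as the nonconformity score are illustrative assumptions, not part of any specific paper's method.

```python
import numpy as np

def split_conformal_interval(cal_residuals, y_pred, alpha=0.1):
    """Split conformal interval from held-out calibration residuals.

    cal_residuals: |y_i - yhat_i| on a calibration set the model never saw.
    Returns bounds with ~(1 - alpha) marginal coverage under exchangeability.
    """
    n = len(cal_residuals)
    # Finite-sample-corrected quantile level of the nonconformity scores.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    qhat = np.quantile(cal_residuals, q_level, method="higher")
    return y_pred - qhat, y_pred + qhat

# Toy usage with synthetic calibration residuals (illustrative only).
rng = np.random.default_rng(0)
cal_residuals = np.abs(rng.normal(size=500))
lo, hi = split_conformal_interval(cal_residuals, y_pred=np.array([2.0]))
```

The same recipe yields prediction *sets* for classifiers by swapping the nonconformity score, e.g. one minus the softmax probability of the true class.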
Estimating the uncertainties associated with the predictions of Machine Learning (ML) models is crucial for assessing their robustness and predictive power.
We introduce new inference procedures for counterfactual and synthetic control methods for policy evaluation.
We develop a method to construct distribution-free prediction intervals for dynamic time series, called EnbPI, which wraps around any bootstrap ensemble estimator to produce sequential prediction intervals.
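A simplified sketch of the bootstrap-ensemble idea behind such intervals is below: train models on bootstrap resamples, score each training point using only the models whose resample excluded it (out-of-bag), and widen the ensemble prediction by a quantile of those residuals. This is an assumption-laden approximation — it omits EnbPI's sliding-window residual updates — and the function names and toy estimator are hypothetical.

```python
import numpy as np

def bootstrap_ensemble_interval(X_train, y_train, X_test, fit_predict,
                                B=20, alpha=0.1, seed=None):
    """EnbPI-style sketch: bootstrap ensemble + out-of-bag residual quantile.

    fit_predict(X_fit, y_fit, X_eval) -> point predictions; any estimator works.
    """
    rng = np.random.default_rng(seed)
    n = len(y_train)
    preds_train = np.zeros((B, n))
    preds_test = np.zeros((B, len(X_test)))
    in_bag = np.zeros((B, n), dtype=bool)
    for b in range(B):
        idx = rng.integers(0, n, size=n)        # bootstrap resample
        in_bag[b, np.unique(idx)] = True
        preds_train[b] = fit_predict(X_train[idx], y_train[idx], X_train)
        preds_test[b] = fit_predict(X_train[idx], y_train[idx], X_test)
    # Out-of-bag residuals: aggregate only models that never saw point i.
    resid = np.empty(n)
    for i in range(n):
        oob = ~in_bag[:, i]
        agg = preds_train[oob, i].mean() if oob.any() else preds_train[:, i].mean()
        resid[i] = abs(y_train[i] - agg)
    qhat = np.quantile(resid, 1 - alpha)
    center = preds_test.mean(axis=0)
    return center - qhat, center + qhat

# Toy usage with a trivial mean predictor on synthetic data (illustrative).
def mean_fit_predict(X_fit, y_fit, X_eval):
    return np.full(len(X_eval), y_fit.mean())

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 1))
y = rng.normal(size=100)
lo, hi = bootstrap_ensemble_interval(X, y, X[:5], mean_fit_predict, B=10, seed=2)
```

The out-of-bag trick is what removes the need for a separate calibration split, which matters for time series where held-out data is scarce.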