Search Results for author: Emmanuel J. Candès

Found 19 papers, 14 papers with code

Conformalized Quantile Regression

4 code implementations · NeurIPS 2019 · Yaniv Romano, Evan Patterson, Emmanuel J. Candès

Conformal prediction is a technique for constructing prediction intervals that attain valid coverage in finite samples, without making distributional assumptions.

Conformal Prediction · Prediction Intervals · +2
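The coverage guarantee described above comes from a simple split-conformal calibration step. The sketch below illustrates that idea for conformalized quantile regression; it is a minimal illustration assuming quantile predictions from any model, not the authors' released implementation, and the function name is ours.

```python
import numpy as np

def cqr_interval(q_lo, q_hi, y_calib, q_lo_calib, q_hi_calib, alpha=0.1):
    """Split-conformal adjustment of a quantile-regression band (sketch).

    q_lo, q_hi:            lower/upper quantile predictions at the test point.
    y_calib:               held-out calibration labels.
    q_lo_calib, q_hi_calib: the model's quantile predictions on the calibration set.
    """
    # Conformity score: how far each calibration label falls outside its
    # predicted quantile band (negative when it lies inside the band).
    scores = np.maximum(q_lo_calib - y_calib, y_calib - q_hi_calib)
    n = len(y_calib)
    # Finite-sample-corrected empirical quantile of the scores.
    k = int(np.ceil((n + 1) * (1 - alpha)))
    qhat = np.sort(scores)[min(k, n) - 1]
    # Widen (or shrink) the band by qhat to obtain valid marginal coverage.
    return q_lo - qhat, q_hi + qhat
```

Because the correction uses only a rank statistic of exchangeable scores, the resulting interval covers the test label with probability at least 1 − alpha in finite samples, with no distributional assumptions.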

With Malice Towards None: Assessing Uncertainty via Equalized Coverage

1 code implementation · 15 Aug 2019 · Yaniv Romano, Rina Foygel Barber, Chiara Sabatti, Emmanuel J. Candès

An important factor in guaranteeing fair use of data-driven recommendation systems is the ability to communicate their uncertainty to decision makers.

Prediction Intervals · Recommendation Systems

Cross-Prediction-Powered Inference

2 code implementations · 28 Sep 2023 · Tijana Zrnic, Emmanuel J. Candès

We show that cross-prediction is consistently more powerful than an adaptation of prediction-powered inference in which a fraction of the labeled data is split off and used to train the model.

Decision Making · Missing Labels

Learn then Test: Calibrating Predictive Algorithms to Achieve Risk Control

1 code implementation · 3 Oct 2021 · Anastasios N. Angelopoulos, Stephen Bates, Emmanuel J. Candès, Michael I. Jordan, Lihua Lei

We introduce a framework for calibrating machine learning models so that their predictions satisfy explicit, finite-sample statistical guarantees.

BIG-bench Machine Learning · Instance Segmentation · +3

Deep Knockoffs

4 code implementations · 16 Nov 2018 · Yaniv Romano, Matteo Sesia, Emmanuel J. Candès

This paper introduces a machine for sampling approximate model-X knockoffs for arbitrary and unspecified data distributions using deep generative models.

Variable Selection

Classification with Valid and Adaptive Coverage

2 code implementations · NeurIPS 2020 · Yaniv Romano, Matteo Sesia, Emmanuel J. Candès

Conformal inference, cross-validation+, and the jackknife+ are hold-out methods that can be combined with virtually any machine learning algorithm to construct prediction sets with guaranteed marginal coverage.

Classification · General Classification · +1
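For classification, the same hold-out calibration yields prediction *sets* with guaranteed marginal coverage. The sketch below shows the split-conformal version with one simple conformity score (one minus the predicted probability of the true class); the paper studies more adaptive constructions, and the function name here is illustrative rather than the authors' API.

```python
import numpy as np

def conformal_set(probs_test, probs_calib, y_calib, alpha=0.1):
    """Split-conformal prediction set for classification (sketch).

    probs_test:  predicted class probabilities at the test point, shape (K,).
    probs_calib: predicted class probabilities on calibration data, shape (n, K).
    y_calib:     true calibration labels, values in {0, ..., K-1}.
    """
    y_calib = np.asarray(y_calib)
    n = len(y_calib)
    # Conformity score: 1 - probability assigned to the true class.
    scores = 1.0 - probs_calib[np.arange(n), y_calib]
    # Finite-sample-corrected quantile of the calibration scores.
    k = int(np.ceil((n + 1) * (1 - alpha)))
    qhat = np.sort(scores)[min(k, n) - 1]
    # The set contains every label whose score would pass the threshold.
    return [c for c in range(len(probs_test)) if 1.0 - probs_test[c] <= qhat]
```

The returned set contains the true label with probability at least 1 − alpha marginally over test points, for any underlying classifier.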

Conformal Inference of Counterfactuals and Individual Treatment Effects

2 code implementations · 11 Jun 2020 · Lihua Lei, Emmanuel J. Candès

At the moment, much emphasis is placed on the estimation of the conditional average treatment effect via flexible machine learning algorithms.

Decision Making · Uncertainty Quantification

Selection by Prediction with Conformal p-values

2 code implementations · 4 Oct 2022 · Ying Jin, Emmanuel J. Candès

Decision-making and scientific-discovery pipelines, such as job hiring and drug discovery, often involve multiple stages: before any resource-intensive step, an initial screening uses predictions from a machine learning model to shortlist a few candidates from a large pool.

Decision Making · Drug Discovery
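The screening described above can be driven by conformal p-values computed from a held-out calibration set. The sketch below shows the generic rank-based construction, not the paper's full selection procedure; the function name is ours.

```python
import numpy as np

def conformal_pvalue(test_score, calib_scores):
    """Generic conformal p-value (sketch).

    If the test score is exchangeable with the calibration scores,
    this p-value is (super-)uniform, so standard multiple-testing
    procedures (e.g. Benjamini-Hochberg) can control error rates
    when shortlisting candidates.
    """
    calib_scores = np.asarray(calib_scores)
    n = len(calib_scores)
    # Rank of the test score among calibration scores, with a +1
    # correction that makes the p-value valid in finite samples.
    return (1 + np.sum(calib_scores >= test_score)) / (n + 1)
```

Candidates with small p-values are unusual relative to the calibration pool and can be passed to the resource-intensive stage.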

A comparison of some conformal quantile regression methods

1 code implementation · 12 Sep 2019 · Matteo Sesia, Emmanuel J. Candès

We compare two recently proposed methods that combine ideas from conformal inference and quantile regression to produce locally adaptive and marginally valid prediction intervals under sample exchangeability (Romano et al., 2019; Kivaranovic et al., 2019).

Prediction Intervals · regression · +1

Statistical Inference for Fairness Auditing

1 code implementation · 5 May 2023 · John J. Cherian, Emmanuel J. Candès

Our methods can be used to flag subpopulations affected by model underperformance, and certify subpopulations for which the model performs adequately.

Fairness

Active Statistical Inference

1 code implementation · 5 Mar 2024 · Tijana Zrnic, Emmanuel J. Candès

This means that for the same number of collected samples, active inference enables smaller confidence intervals and more powerful p-values.

Active Learning

Interpretable Classification of Bacterial Raman Spectra with Knockoff Wavelets

1 code implementation · 8 Jun 2020 · Charmaine Chia, Matteo Sesia, Chi-Sing Ho, Stefanie S. Jeffrey, Jennifer Dionne, Emmanuel J. Candès, Roger T. Howe

Deep neural networks and other sophisticated machine learning models are widely applied to biomedical signal data because they can detect complex patterns and compute accurate predictions.

Classification · General Classification · +2

Conformalized Survival Analysis

2 code implementations · 17 Mar 2021 · Emmanuel J. Candès, Lihua Lei, Zhimei Ren

Existing survival analysis techniques heavily rely on strong modelling assumptions and are, therefore, prone to model misspecification errors.

Conformal Prediction · Prediction Intervals · +3

Achieving Equalized Odds by Resampling Sensitive Attributes

1 code implementation · NeurIPS 2020 · Yaniv Romano, Stephen Bates, Emmanuel J. Candès

We present a flexible framework for learning predictive models that approximately satisfy the equalized odds notion of fairness.

Attribute · Fairness · +3

The Likelihood Ratio Test in High-Dimensional Logistic Regression Is Asymptotically a Rescaled Chi-Square

no code implementations · 5 Jun 2017 · Pragya Sur, Yuxin Chen, Emmanuel J. Candès

When used for the purpose of statistical inference, logistic models produce p-values for the regression coefficients by using an approximation to the distribution of the likelihood-ratio test.

regression

Robust subspace clustering

no code implementations · 11 Jan 2013 · Mahdi Soltanolkotabi, Ehsan Elhamifar, Emmanuel J. Candès

Subspace clustering refers to the task of finding a multi-subspace representation that best fits a collection of points taken from a high-dimensional space.

Clustering

The limits of distribution-free conditional predictive inference

no code implementations · 12 Mar 2019 · Rina Foygel Barber, Emmanuel J. Candès, Aaditya Ramdas, Ryan J. Tibshirani

We consider the problem of distribution-free predictive inference, with the goal of producing predictive coverage guarantees that hold conditionally rather than marginally.

Statistics Theory

SLOPE - Adaptive variable selection via convex optimization

no code implementations · 14 Jul 2014 · Małgorzata Bogdan, Ewout van den Berg, Chiara Sabatti, Weijie Su, Emmanuel J. Candès

SLOPE, short for Sorted L-One Penalized Estimation, is the solution to \[\min_{b\in\mathbb{R}^p}\frac{1}{2}\Vert y-Xb\Vert _{\ell_2}^2+\lambda_1\vert b\vert _{(1)}+\lambda_2\vert b\vert_{(2)}+\cdots+\lambda_p\vert b\vert_{(p)},\] where $\lambda_1\ge\lambda_2\ge\cdots\ge\lambda_p\ge0$ and $\vert b\vert_{(1)}\ge\vert b\vert_{(2)}\ge\cdots\ge\vert b\vert_{(p)}$ are the decreasing absolute values of the entries of $b$.

Methodology
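The sorted-L1 penalty in the SLOPE objective above is straightforward to evaluate: sort the absolute coefficients in decreasing order and pair them with the nonincreasing weights. The sketch below evaluates the penalty and the full objective; it is an illustrative implementation of the formula, not the authors' solver (which requires a dedicated proximal algorithm to minimize the objective).

```python
import numpy as np

def slope_penalty(b, lam):
    """Sorted L-One penalty: sum_i lam_i * |b|_(i),
    where |b|_(1) >= |b|_(2) >= ... and lam is nonincreasing, nonnegative."""
    lam = np.asarray(lam, dtype=float)
    assert np.all(np.diff(lam) <= 0) and lam[-1] >= 0, "lam must be nonincreasing, >= 0"
    # Pair the largest weight with the largest absolute coefficient.
    abs_desc = np.sort(np.abs(b))[::-1]
    return float(lam @ abs_desc)

def slope_objective(b, X, y, lam):
    """Least-squares loss plus the sorted-L1 penalty (the SLOPE objective)."""
    resid = y - X @ b
    return 0.5 * float(resid @ resid) + slope_penalty(b, lam)
```

With all lambda_i equal, the penalty reduces to the ordinary lasso; decreasing weights penalize the largest coefficients most, which is what gives SLOPE its adaptive false-discovery-rate control.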
