Search Results for author: Emmanuel J. Candès

Found 15 papers, 10 with code

Learn then Test: Calibrating Predictive Algorithms to Achieve Risk Control

1 code implementation · 3 Oct 2021 · Anastasios N. Angelopoulos, Stephen Bates, Emmanuel J. Candès, Michael I. Jordan, Lihua Lei

We introduce a framework for calibrating machine learning models so that their predictions satisfy explicit, finite-sample statistical guarantees.
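The Learn-then-Test recipe can be sketched numerically: scan a grid of candidate parameters, attach a valid p-value to the hypothesis "this parameter fails to control the risk," and keep only parameters that survive a multiple-testing correction. The sketch below assumes a [0, 1]-bounded loss, Hoeffding p-values, and a Bonferroni correction; the toy scores and risk function are illustrative inventions, not the paper's experiments.

```python
import numpy as np

# Sketch of Learn-then-Test under simplifying assumptions: losses bounded
# in [0, 1], Hoeffding p-values, Bonferroni over the grid. Toy data only.
rng = np.random.default_rng(0)

alpha = 0.2    # target risk level
delta = 0.05   # tolerated probability of a false risk-control claim
n = 2000
scores = rng.uniform(0, 1, n)      # toy calibration scores
grid = np.linspace(0, 1, 101)      # candidate thresholds lambda

def hoeffding_pvalue(mean_loss, n, alpha):
    """Super-uniform p-value for H0: risk(lambda) > alpha,
    from Hoeffding's inequality applied to a [0, 1]-bounded loss."""
    if mean_loss >= alpha:
        return 1.0
    return float(np.exp(-2 * n * (alpha - mean_loss) ** 2))

selected = []
for lam in grid:
    # Toy loss: an error occurs when the score exceeds the threshold,
    # so the true risk is 1 - lam and shrinks as lam grows.
    losses = (scores > lam).astype(float)
    p = hoeffding_pvalue(losses.mean(), n, alpha)
    if p <= delta / len(grid):     # Bonferroni correction
        selected.append(float(lam))

# With probability at least 1 - delta, every lambda in `selected`
# controls the risk at level alpha.
```

Only thresholds well inside the low-risk region survive the correction, which is exactly the finite-sample guarantee being bought.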

Instance Segmentation Multi-Label Classification +2

Conformalized Survival Analysis

2 code implementations · 17 Mar 2021 · Emmanuel J. Candès, Lihua Lei, Zhimei Ren

Existing survival analysis techniques heavily rely on strong modelling assumptions and are, therefore, prone to model misspecification errors.

Survival Analysis Survival Prediction

Conformal Inference of Counterfactuals and Individual Treatment Effects

2 code implementations · 11 Jun 2020 · Lihua Lei, Emmanuel J. Candès

At the moment, much emphasis is placed on the estimation of the conditional average treatment effect via flexible machine learning algorithms.

Decision Making

Achieving Equalized Odds by Resampling Sensitive Attributes

1 code implementation · NeurIPS 2020 · Yaniv Romano, Stephen Bates, Emmanuel J. Candès

We present a flexible framework for learning predictive models that approximately satisfy the equalized odds notion of fairness.

Fairness Multi-class Classification +1

Interpretable Classification of Bacterial Raman Spectra with Knockoff Wavelets

1 code implementation · 8 Jun 2020 · Charmaine Chia, Matteo Sesia, Chi-Sing Ho, Stefanie S. Jeffrey, Jennifer Dionne, Emmanuel J. Candès, Roger T. Howe

Deep neural networks and other sophisticated machine learning models are widely applied to biomedical signal data because they can detect complex patterns and compute accurate predictions.

Classification General Classification +1

Classification with Valid and Adaptive Coverage

1 code implementation · NeurIPS 2020 · Yaniv Romano, Matteo Sesia, Emmanuel J. Candès

Conformal inference, cross-validation+, and the jackknife+ are hold-out methods that can be combined with virtually any machine learning algorithm to construct prediction sets with guaranteed marginal coverage.
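For classification, the simplest such hold-out construction turns any estimated class probabilities into prediction sets: score each calibration point by one minus the probability assigned to its true label, take a finite-sample quantile, and include every label whose score clears that bar. The sketch below uses a toy, hand-built probability model in place of a trained classifier and the plain probability score rather than the paper's adaptive construction; all numbers are illustrative assumptions.

```python
import numpy as np

# Split-conformal prediction sets for a 3-class toy problem, using the
# simple score 1 - p_model(y | x). A randomly generated probability table
# stands in for "virtually any machine learning algorithm"; this is a
# sketch of the generic recipe, not the paper's adaptive method.
rng = np.random.default_rng(1)
n, K = 1000, 3

# Toy model outputs on the calibration set, with labels drawn from them
# so the "model" is well calibrated by construction.
probs = rng.dirichlet(np.ones(K), size=n)
labels = np.array([rng.choice(K, p=p) for p in probs])

# Conformity score of the true label: 1 - predicted probability.
scores = 1 - probs[np.arange(n), labels]

# Finite-sample quantile at level ceil((n+1)(1-alpha))/n.
alpha = 0.1
k = int(np.ceil((n + 1) * (1 - alpha)))
qhat = float(np.sort(scores)[k - 1])

def prediction_set(p):
    """All labels y whose score 1 - p[y] does not exceed qhat."""
    return [y for y in range(K) if 1 - p[y] <= qhat]

# A confident test point yields a small set; an uncertain one, a large set.
confident = prediction_set(np.array([0.90, 0.07, 0.03]))
uncertain = prediction_set(np.array([0.36, 0.33, 0.31]))
```

Marginal coverage at level 1 - alpha follows from exchangeability of the calibration and test points, with no assumption on how the probabilities were produced.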

Classification General Classification

A comparison of some conformal quantile regression methods

1 code implementation · 12 Sep 2019 · Matteo Sesia, Emmanuel J. Candès

We compare two recently proposed methods that combine ideas from conformal inference and quantile regression to produce locally adaptive and marginally valid prediction intervals under sample exchangeability (Romano et al., 2019; Kivaranovic et al., 2019).

Prediction Intervals

With Malice Towards None: Assessing Uncertainty via Equalized Coverage

1 code implementation · 15 Aug 2019 · Yaniv Romano, Rina Foygel Barber, Chiara Sabatti, Emmanuel J. Candès

An important factor in guaranteeing fair use of data-driven recommendation systems is the ability to communicate their uncertainty to decision makers.

Prediction Intervals Recommendation Systems

Conformalized Quantile Regression

4 code implementations · NeurIPS 2019 · Yaniv Romano, Evan Patterson, Emmanuel J. Candès

Conformal prediction is a technique for constructing prediction intervals that attain valid coverage in finite samples, without making distributional assumptions.
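The basic split-conformal recipe that Conformalized Quantile Regression refines looks like this in its simplest, constant-width form: fit any model on one half of the data, compute absolute residuals on the other half, and inflate predictions by a finite-sample quantile of those residuals. The data, the least-squares "model," and the split below are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

# Plain split-conformal regression (constant-width intervals) — the
# baseline that CQR improves on by conformalizing quantile regressors.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 1000)
y = 2 * x + rng.normal(0, 0.1, 1000)

# Split: fit a least-squares line on one half, calibrate on the other.
x_fit, y_fit, x_cal, y_cal = x[:500], y[:500], x[500:], y[500:]
slope, intercept = np.polyfit(x_fit, y_fit, 1)
predict = lambda t: slope * t + intercept

# Conformity scores: absolute residuals on the calibration set.
cal_scores = np.abs(y_cal - predict(x_cal))

# Finite-sample-valid quantile: the ceil((n+1)(1-alpha))-th order statistic.
alpha = 0.1
n = len(cal_scores)
qhat = float(np.sort(cal_scores)[int(np.ceil((n + 1) * (1 - alpha))) - 1])

# Interval for a new point; marginal coverage >= 1 - alpha holds for
# exchangeable data, with no distributional assumptions.
x0 = 0.5
interval = (predict(x0) - qhat, predict(x0) + qhat)
```

CQR replaces the single fitted line with a pair of quantile regressors, so the resulting intervals adapt their width to local noise instead of being constant.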

Prediction Intervals

The limits of distribution-free conditional predictive inference

no code implementations · 12 Mar 2019 · Rina Foygel Barber, Emmanuel J. Candès, Aaditya Ramdas, Ryan J. Tibshirani

We consider the problem of distribution-free predictive inference, with the goal of producing predictive coverage guarantees that hold conditionally rather than marginally.

Statistics Theory

Dual-Reference Design for Holographic Coherent Diffraction Imaging

no code implementations · 7 Feb 2019 · David A. Barmherzig, Ju Sun, Emmanuel J. Candès, T. J. Lane, Po-Nan Li

A new reference design is introduced for holographic coherent diffraction imaging.

Deep Knockoffs

4 code implementations · 16 Nov 2018 · Yaniv Romano, Matteo Sesia, Emmanuel J. Candès

This paper introduces a machine for sampling approximate model-X knockoffs for arbitrary and unspecified data distributions using deep generative models.

Variable Selection

The Likelihood Ratio Test in High-Dimensional Logistic Regression Is Asymptotically a Rescaled Chi-Square

no code implementations · 5 Jun 2017 · Pragya Sur, Yuxin Chen, Emmanuel J. Candès

When used for the purpose of statistical inference, logistic models produce p-values for the regression coefficients by using an approximation to the distribution of the likelihood-ratio test.

SLOPE - Adaptive variable selection via convex optimization

no code implementations · 14 Jul 2014 · Małgorzata Bogdan, Ewout van den Berg, Chiara Sabatti, Weijie Su, Emmanuel J. Candès

SLOPE, short for Sorted L-One Penalized Estimation, is the solution to \[\min_{b\in\mathbb{R}^p}\frac{1}{2}\Vert y-Xb\Vert _{\ell_2}^2+\lambda_1\vert b\vert _{(1)}+\lambda_2\vert b\vert_{(2)}+\cdots+\lambda_p\vert b\vert_{(p)},\] where $\lambda_1\ge\lambda_2\ge\cdots\ge\lambda_p\ge0$ and $\vert b\vert_{(1)}\ge\vert b\vert_{(2)}\ge\cdots\ge\vert b\vert_{(p)}$ are the decreasing absolute values of the entries of $b$.
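The sorted-ℓ1 penalty in the display above is easy to evaluate directly: pair the i-th largest weight λ_i with the i-th largest absolute coefficient and sum the products. The sketch below just evaluates the penalty and the full SLOPE objective on made-up numbers; it is not a solver for the minimization problem.

```python
import numpy as np

def slope_penalty(b, lam):
    """Sorted-L1 penalty sum_i lambda_i * |b|_(i): the i-th largest
    weight multiplies the i-th largest absolute coefficient."""
    lam = np.asarray(lam, dtype=float)
    assert np.all(np.diff(lam) <= 0) and lam[-1] >= 0, \
        "need lambda_1 >= ... >= lambda_p >= 0"
    abs_desc = np.sort(np.abs(b))[::-1]    # |b|_(1) >= |b|_(2) >= ...
    return float(np.dot(lam, abs_desc))

def slope_objective(b, X, y, lam):
    """Full SLOPE objective 0.5*||y - Xb||_2^2 + sorted-L1 penalty."""
    resid = y - X @ b
    return 0.5 * float(resid @ resid) + slope_penalty(b, lam)

# Illustrative numbers (not from the paper):
lam = [3.0, 2.0, 1.0]
b = np.array([-1.0, 4.0, 2.0])   # |b| sorted decreasing: 4, 2, 1
# slope_penalty(b, lam) = 3*4 + 2*2 + 1*1 = 17
```

Because the weights decrease while the sorted magnitudes decrease, larger coefficients are penalized more heavily, which is what lets SLOPE adapt its effective regularization to the number of nonzero coefficients.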

Methodology

Robust subspace clustering

no code implementations · 11 Jan 2013 · Mahdi Soltanolkotabi, Ehsan Elhamifar, Emmanuel J. Candès

Subspace clustering refers to the task of finding a multi-subspace representation that best fits a collection of points taken from a high-dimensional space.
