Search Results for author: Eric W. Tramel

Found 16 papers, 3 papers with code

Differentially Private Federated Learning for Cancer Prediction

1 code implementation • 8 Jan 2021 • Constance Beguier, Jean Ogier du Terrail, Iqraa Meah, Mathieu Andreux, Eric W. Tramel

Since 2014, the NIH funded iDASH (integrating Data for Analysis, Anonymization, SHaring) National Center for Biomedical Computing has hosted yearly competitions on the topic of private computing for genomic data.

Federated Learning

Siloed Federated Learning for Multi-Centric Histopathology Datasets

no code implementations • 17 Aug 2020 • Mathieu Andreux, Jean Ogier du Terrail, Constance Beguier, Eric W. Tramel

While federated learning is a promising approach for training deep learning models over distributed sensitive datasets, it presents new challenges for machine learning, especially when applied in the medical domain where multi-centric data heterogeneity is common.

Domain Adaptation • Federated Learning +1

Efficient Sparse Secure Aggregation for Federated Learning

no code implementations • 29 Jul 2020 • Constance Beguier, Mathieu Andreux, Eric W. Tramel

Federated Learning enables one to jointly train a machine learning model across distributed clients holding sensitive datasets.

Federated Learning

Efficient Per-Example Gradient Computations in Convolutional Neural Networks

1 code implementation • 12 Dec 2019 • Gaspar Rochette, Andre Manoel, Eric W. Tramel

One notable application comes from the field of differential privacy, where per-example gradients must be norm-bounded in order to limit the impact of each example on the aggregated batch gradient.
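The snippet above describes the mechanism behind differentially private training: each example's gradient is computed individually and clipped to a fixed norm before aggregation, so no single example dominates the batch gradient. As a minimal illustration of this clipping step (shown for a linear model with squared loss, not the paper's CNN setting; the function name and setup are hypothetical):

```python
import numpy as np

def clipped_batch_gradient(X, y, w, C=1.0):
    """Per-example gradients of the squared loss for a linear model,
    each clipped to L2 norm <= C, then averaged (as in DP-SGD)."""
    r = X @ w - y                      # per-example residuals, shape (n,)
    grads = r[:, None] * X             # per-example gradients, shape (n, d)
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, C / np.maximum(norms, 1e-12))
    return (grads * scale).mean(axis=0)

X = np.array([[1.0, 0.0], [0.0, 3.0]])
y = np.array([0.0, 0.0])
w = np.array([1.0, 1.0])
g = clipped_batch_gradient(X, y, w, C=1.0)   # → array([0.5, 0.5])
```

The second example's unclipped gradient has norm 9, so it is scaled down by 1/9 before averaging; the paper's contribution is computing such per-example gradients efficiently in convolutional layers.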

ToxicBlend: Virtual Screening of Toxic Compounds with Ensemble Predictors

no code implementations • 12 Jun 2018 • Mikhail Zaslavskiy, Simon Jégou, Eric W. Tramel, Gilles Wainrib

Timely assessment of compound toxicity is one of the biggest challenges facing the pharmaceutical industry today.

Drug Discovery

Robust Detection of Covariate-Treatment Interactions in Clinical Trials

no code implementations • 21 Dec 2017 • Baptiste Goujaud, Eric W. Tramel, Pierre Courtiol, Mikhail Zaslavskiy, Gilles Wainrib

Detection of interactions between treatment effects and patient descriptors in clinical trials is critical for optimizing the drug development process.

Streaming Bayesian inference: theoretical limits and mini-batch approximate message-passing

no code implementations • 2 Jun 2017 • Andre Manoel, Florent Krzakala, Eric W. Tramel, Lenka Zdeborová

In statistical learning for real-world large-scale data problems, one must often resort to "streaming" algorithms which operate sequentially on small batches of data.

Bayesian Inference
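The streaming setting above means the posterior is updated one mini-batch at a time rather than over the full dataset at once. A toy illustration of this idea (conjugate Gaussian mean estimation with known noise variance, not the paper's mini-batch AMP algorithm; all names here are hypothetical):

```python
import numpy as np

def stream_posterior(batches, sigma2=1.0, mu0=0.0, tau2=10.0):
    """Posterior over an unknown mean mu, updated one mini-batch at a time.
    Each step uses the previous posterior as the prior for the next batch."""
    mu, tau2_post = mu0, tau2
    for batch in batches:
        n = len(batch)
        prec = 1.0 / tau2_post + n / sigma2          # posterior precision
        mu = (mu / tau2_post + batch.sum() / sigma2) / prec
        tau2_post = 1.0 / prec
    return mu, tau2_post

rng = np.random.default_rng(0)
data = rng.normal(3.0, 1.0, 50)
mu_stream, var_stream = stream_posterior([data[:20], data[20:]])
mu_full, var_full = stream_posterior([data])
```

Because the model is conjugate, streaming over two mini-batches yields exactly the same posterior as a single pass over all the data; the interesting regime studied in the paper is when such exact sequential updates are unavailable and approximate message-passing must be used instead.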

A Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines

no code implementations • 10 Feb 2017 • Eric W. Tramel, Marylou Gabrié, Andre Manoel, Francesco Caltagirone, Florent Krzakala

Restricted Boltzmann machines (RBMs) are energy-based neural networks which are commonly used as building blocks for deep neural architectures.

Denoising • Latent Variable Models

Inferring Sparsity: Compressed Sensing using Generalized Restricted Boltzmann Machines

no code implementations • 13 Jun 2016 • Eric W. Tramel, Andre Manoel, Francesco Caltagirone, Marylou Gabrié, Florent Krzakala

In this work, we consider compressed sensing reconstruction from $M$ measurements of $K$-sparse structured signals which do not possess a writable correlation model.
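The setting above is standard compressed sensing: recover a $K$-sparse signal of length $N$ from only $M < N$ linear measurements. As a baseline illustration of such recovery, here is plain ISTA (iterative soft-thresholding) on a random Gaussian sensing matrix; this is a standard textbook method, not the paper's generalized-RBM-prior approach, and the problem sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 100, 40, 5                 # signal length, measurements, sparsity
x_true = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x_true[support] = rng.standard_normal(K)
A = rng.standard_normal((M, N)) / np.sqrt(M)   # iid Gaussian sensing matrix
y = A @ x_true                                  # noiseless measurements

L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the data-fit gradient
lam = 0.01                      # l1 penalty weight
x = np.zeros(N)
for _ in range(500):
    z = x + A.T @ (y - A @ x) / L                       # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0) # soft threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

ISTA exploits only plain sparsity; the paper's point is that a learned RBM prior can capture structure beyond sparsity that a simple $\ell_1$ penalty cannot.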

Training Restricted Boltzmann Machine via the Thouless-Anderson-Palmer free energy

no code implementations • NeurIPS 2015 • Marylou Gabrié, Eric W. Tramel, Florent Krzakala

Restricted Boltzmann machines are undirected neural networks which have been shown to be effective in many applications, including serving as initializations for training deep multi-layer neural networks.

Training Restricted Boltzmann Machines via the Thouless-Anderson-Palmer Free Energy

no code implementations • 9 Jun 2015 • Marylou Gabrié, Eric W. Tramel, Florent Krzakala

Restricted Boltzmann machines are undirected neural networks which have been shown to be effective in many applications, including serving as initializations for training deep multi-layer neural networks.

Approximate Message Passing with Restricted Boltzmann Machine Priors

no code implementations • 23 Feb 2015 • Eric W. Tramel, Angélique Drémeau, Florent Krzakala

Approximate Message Passing (AMP) has been shown to be an excellent statistical approach to signal inference and compressed sensing problems.

Statistical Estimation: From Denoising to Sparse Regression and Hidden Cliques

no code implementations • 19 Sep 2014 • Eric W. Tramel, Santhosh Kumar, Andrei Giurgiu, Andrea Montanari

These notes review six lectures given by Prof. Andrea Montanari on the topic of statistical estimation for linear models.

Denoising

Sparse Estimation with the Swept Approximated Message-Passing Algorithm

1 code implementation • 17 Jun 2014 • Andre Manoel, Florent Krzakala, Eric W. Tramel, Lenka Zdeborová

Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensionality measurements, both in terms of reconstruction accuracy and in computational efficiency.
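For reference, the basic AMP iteration for sparse recovery alternates a soft-threshold denoising step with a residual update carrying an Onsager correction term. The sketch below shows this standard "AMP for LASSO" loop on a random instance; it does not reproduce the paper's contribution (the swept, sequential update schedule), and the sizes and threshold rule are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 200, 80, 10
x_true = np.zeros(N)
x_true[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
A = rng.standard_normal((M, N)) / np.sqrt(M)   # iid Gaussian sensing matrix
y = A @ x_true

x = np.zeros(N)
z = y.copy()
for _ in range(30):
    theta = 2.0 * np.linalg.norm(z) / np.sqrt(M)        # threshold ~ residual level
    pseudo = x + A.T @ z                                 # effective AWGN observation
    x = np.sign(pseudo) * np.maximum(np.abs(pseudo) - theta, 0.0)
    # Onsager term (fraction of active components) distinguishes AMP
    # from plain iterative thresholding
    z = y - A @ x + z * (np.count_nonzero(x) / M)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

The full parallel update above is what SwAMP modifies: updating variables one at a time ("sweeping") improves stability on sensing matrices that violate the iid Gaussian assumption.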
