1 code implementation • 12 Dec 2019 • Gaspar Rochette, Andre Manoel, Eric W. Tramel
One notable application comes from the field of differential privacy, where per-example gradients must be norm-bounded in order to limit the impact of each example on the aggregated batch gradient.
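Below is a minimal NumPy sketch of the norm-bounding step described above: each example's gradient is clipped before aggregation so no single example dominates the batch gradient. The linear least-squares model, the `clip_norm` value, and the function name are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def clipped_batch_gradient(X, y, w, clip_norm=1.0):
    """Per-example gradient clipping for a toy linear least-squares model.

    Each example's gradient is norm-bounded to `clip_norm` before averaging,
    limiting its influence on the aggregated batch gradient (the clipping
    step used in differentially private SGD). Model and loss are illustrative.
    """
    residuals = X @ w - y                        # shape (B,)
    per_example_grads = residuals[:, None] * X   # shape (B, D): grad of 0.5*(x.w - y)^2
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    return (per_example_grads * scale).mean(axis=0)  # norm-bounded batch gradient

# Toy usage
rng = np.random.default_rng(0)
X, y, w = rng.normal(size=(32, 5)), rng.normal(size=32), np.zeros(5)
g = clipped_batch_gradient(X, y, w)
```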
1 code implementation • 17 Jun 2014 • Andre Manoel, Florent Krzakala, Eric W. Tramel, Lenka Zdeborová
Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensional measurements, in terms of both reconstruction accuracy and computational efficiency.
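As a point of reference, here is a compact NumPy sketch of the standard AMP iteration with a soft-threshold denoiser for noisy compressed sensing. The threshold schedule, problem sizes, and stopping rule are illustrative choices, not those of the paper.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def amp_sparse_recovery(A, y, n_iter=30, alpha=1.5):
    """Standard soft-thresholding AMP for y = A x + noise.

    A     : (M, N) sensing matrix with columns scaled ~ 1/sqrt(M)
    alpha : threshold multiplier (illustrative tuning choice)
    """
    M, N = A.shape
    delta = M / N
    x, z = np.zeros(N), y.copy()
    for _ in range(n_iter):
        pseudo = x + A.T @ z                      # effective AWGN observation of x
        tau = alpha * np.sqrt(np.mean(z ** 2))    # threshold from residual energy
        x_new = soft_threshold(pseudo, tau)
        # Onsager correction term keeps the effective noise Gaussian
        onsager = (z / delta) * np.mean(np.abs(x_new) > 0)
        z = y - A @ x_new + onsager
        x = x_new
    return x

# Toy usage: K-sparse signal, Gaussian sensing matrix
rng = np.random.default_rng(1)
N, M, K = 500, 250, 25
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.normal(size=K)
A = rng.normal(scale=1 / np.sqrt(M), size=(M, N))
y = A @ x_true + 0.01 * rng.normal(size=M)
x_hat = amp_sparse_recovery(A, y)
```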
1 code implementation • 8 Jan 2021 • Constance Beguier, Jean Ogier du Terrail, Iqraa Meah, Mathieu Andreux, Eric W. Tramel
Since 2014, the NIH-funded iDASH (integrating Data for Analysis, Anonymization, SHaring) National Center for Biomedical Computing has hosted yearly competitions on the topic of private computing for genomic data.
1 code implementation • 9 May 2023 • Enmao Diao, Eric W. Tramel, Jie Ding, Tao Zhang
Keyword Spotting (KWS) is a critical aspect of audio-based applications on mobile devices and virtual assistants.
no code implementations • 12 Jun 2018 • Mikhail Zaslavskiy, Simon Jégou, Eric W. Tramel, Gilles Wainrib
Timely assessment of compound toxicity is one of the biggest challenges facing the pharmaceutical industry today.
Ranked #4 on Drug Discovery on Tox21
no code implementations • ICLR 2018 • Pierre Courtiol, Eric W. Tramel, Marc Sanselme, Gilles Wainrib
Analysis of histopathology slides is a critical step for many diagnoses, and in particular in oncology where it defines the gold standard.
no code implementations • 21 Dec 2017 • Baptiste Goujaud, Eric W. Tramel, Pierre Courtiol, Mikhail Zaslavskiy, Gilles Wainrib
Detection of interactions between treatment effects and patient descriptors in clinical trials is critical for optimizing the drug development process.
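For intuition only, the snippet below shows the textbook way to flag a treatment-by-covariate interaction with an ordinary linear model and an interaction term. The variable names, synthetic data, and use of statsmodels are assumptions for illustration; they are not the method developed in the paper above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic trial: the treatment helps only when the biomarker is high
rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),   # 0 = control, 1 = treated
    "biomarker": rng.normal(size=n),
})
df["outcome"] = (0.2 * df["treatment"]
                 + 0.8 * df["treatment"] * df["biomarker"]
                 + rng.normal(scale=1.0, size=n))

# `treatment * biomarker` expands to both main effects plus their interaction
fit = smf.ols("outcome ~ treatment * biomarker", data=df).fit()
print(fit.pvalues["treatment:biomarker"])  # a small p-value flags the interaction
```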
no code implementations • 10 Feb 2017 • Eric W. Tramel, Marylou Gabrié, Andre Manoel, Francesco Caltagirone, Florent Krzakala
Restricted Boltzmann machines (RBMs) are energy-based neural networks commonly used as building blocks for deep neural architectures.
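For orientation, here is a minimal Bernoulli-Bernoulli RBM with one step of contrastive divergence (CD-1). The sizes, learning rate, and CD-1 rule are the generic sampling-based baseline, not the deterministic framework studied in the paper above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BernoulliRBM:
    """Minimal Bernoulli-Bernoulli RBM trained with CD-1.

    Energy: E(v, h) = -v.W.h - b.v - c.h (standard definition).
    """
    def __init__(self, n_visible, n_hidden, rng=None):
        self.rng = rng or np.random.default_rng(0)
        self.W = 0.01 * self.rng.normal(size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases

    def cd1_step(self, v0, lr=0.05):
        ph0 = sigmoid(v0 @ self.W + self.c)                 # P(h=1 | v0)
        h0 = (self.rng.random(ph0.shape) < ph0).astype(float)
        pv1 = sigmoid(h0 @ self.W.T + self.b)               # reconstruction
        ph1 = sigmoid(pv1 @ self.W + self.c)
        # Contrastive-divergence gradient estimate
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
        self.b += lr * (v0 - pv1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)

# Toy usage on random binary data
rng = np.random.default_rng(3)
data = (rng.random((64, 20)) < 0.3).astype(float)
rbm = BernoulliRBM(n_visible=20, n_hidden=8)
for _ in range(100):
    rbm.cd1_step(data)
```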
no code implementations • 2 Jun 2017 • Andre Manoel, Florent Krzakala, Eric W. Tramel, Lenka Zdeborová
In statistical learning for real-world large-scale data problems, one must often resort to "streaming" algorithms which operate sequentially on small batches of data.
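The sketch below illustrates the streaming setting in its simplest Bayesian form: the posterior from the batches seen so far becomes the prior for the next batch, so the data are visited once, in small pieces. This is a generic assumed-density-style illustration with a toy linear model, not the mini-batch message-passing scheme of the paper above.

```python
import numpy as np

def streaming_bayes_linreg(batches, dim, sigma2=0.1, prior_var=1.0):
    """Streaming Bayesian linear regression over small batches.

    After each batch, the updated posterior (mean `mu`, precision `Lam`)
    serves as the prior for the next batch; no batch is revisited.
    """
    Lam = np.eye(dim) / prior_var      # prior precision
    mu = np.zeros(dim)                 # prior mean
    for Xb, yb in batches:
        Lam_new = Lam + Xb.T @ Xb / sigma2
        mu = np.linalg.solve(Lam_new, Lam @ mu + Xb.T @ yb / sigma2)
        Lam = Lam_new
    return mu, Lam

# Toy usage: stream 20 batches of 10 examples each
rng = np.random.default_rng(4)
w_true = rng.normal(size=5)
batches = []
for _ in range(20):
    Xb = rng.normal(size=(10, 5))
    batches.append((Xb, Xb @ w_true + np.sqrt(0.1) * rng.normal(size=10)))
mu, Lam = streaming_bayes_linreg(batches, dim=5)
```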
no code implementations • 13 Jun 2016 • Eric W. Tramel, Andre Manoel, Francesco Caltagirone, Marylou Gabrié, Florent Krzakala
In this work, we consider compressed sensing reconstruction from $M$ measurements of $K$-sparse structured signals which do not possess a writable correlation model.
no code implementations • 5 Oct 2015 • Boshra Rajaei, Eric W. Tramel, Sylvain Gigan, Florent Krzakala, Laurent Daudet
In this paper, the problem of compressive imaging is addressed using natural randomization by means of a multiply scattering medium.
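The toy forward model below shows what "natural randomization" by a scattering medium amounts to under the common i.i.d. complex Gaussian transmission-matrix assumption: the medium mixes the scene into random projections, and a camera records only intensities. The sizes and the sparse toy object are illustrative, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(5)
n_pixels, n_sensors = 1024, 256        # illustrative sizes

# The multiply scattering medium is commonly modeled as an i.i.d. complex
# Gaussian transmission matrix: light mixing provides the randomization.
H = (rng.normal(size=(n_sensors, n_pixels))
     + 1j * rng.normal(size=(n_sensors, n_pixels))) / np.sqrt(2 * n_pixels)

x = np.zeros(n_pixels)                 # sparse toy scene
x[rng.choice(n_pixels, 20, replace=False)] = 1.0

field = H @ x                          # complex field at the sensor plane
y = np.abs(field) ** 2                 # the camera records intensities only
```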
no code implementations • 23 Feb 2015 • Eric W. Tramel, Angélique Drémeau, Florent Krzakala
Approximate Message Passing (AMP) has been shown to be an excellent statistical approach to signal inference and compressed sensing problems.
no code implementations • 9 Jun 2015 • Marylou Gabrié, Eric W. Tramel, Florent Krzakala
Restricted Boltzmann machines are undirected neural networks which have been shown to be effective in many applications, including serving as initializations for training deep multi-layer neural networks.
no code implementations • 19 Sep 2014 • Eric W. Tramel, Santhosh Kumar, Andrei Giurgiu, Andrea Montanari
These notes review six lectures given by Prof. Andrea Montanari on the topic of statistical estimation for linear models.
no code implementations • NeurIPS 2015 • Marylou Gabrie, Eric W. Tramel, Florent Krzakala
Restricted Boltzmann machines are undirected neural networks which have been shown to be effective in many applications, including serving as initializations for training deep multi-layer neural networks.
no code implementations • 29 Jul 2020 • Constance Beguier, Mathieu Andreux, Eric W. Tramel
Federated Learning enables one to jointly train a machine learning model across distributed clients holding sensitive datasets.
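A minimal FedAvg-style sketch of the federated setup described above is given below: clients train locally on private data and a server averages their parameters, so raw datasets never leave the clients. The toy least-squares model, client sizes, and local optimizer are generic illustrations, not the specific protocol studied in the paper.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """A client's local update on its private data (toy least-squares model)."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def federated_averaging(client_data, dim, rounds=20):
    """Minimal FedAvg loop: clients train locally, the server averages weights.

    Only model parameters travel between clients and server; averaging is
    weighted by client dataset size.
    """
    w_global = np.zeros(dim)
    sizes = np.array([len(y) for _, y in client_data], dtype=float)
    for _ in range(rounds):
        local_ws = [local_sgd(w_global, X, y) for X, y in client_data]
        w_global = np.average(local_ws, axis=0, weights=sizes)
    return w_global

# Toy usage: three clients with private datasets
rng = np.random.default_rng(6)
w_true = rng.normal(size=4)
clients = []
for n in (50, 80, 30):
    X = rng.normal(size=(n, 4))
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=n)))
w = federated_averaging(clients, dim=4)
```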
no code implementations • 17 Aug 2020 • Mathieu Andreux, Jean Ogier du Terrail, Constance Beguier, Eric W. Tramel
While federated learning is a promising approach for training deep learning models over distributed sensitive datasets, it presents new challenges for machine learning, especially when applied in the medical domain where multi-centric data heterogeneity is common.