Search Results for author: Daniel Rothchild

Found 8 papers, 3 papers with code

Communication-Efficient Federated Learning with Sketching

no code implementations • ICML 2020 • Daniel Rothchild, Ashwinee Panda, Enayat Ullah, Nikita Ivkin, Vladimir Braverman, Joseph Gonzalez, Ion Stoica, Raman Arora

A key insight in the design of FedSketchedSGD is that, because the Count Sketch is linear, momentum and error accumulation can both be carried out within the sketch.

Federated Learning
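
The abstract above turns on the linearity of the Count Sketch: the sketch of a weighted sum of vectors equals the same weighted sum of their sketches, so momentum and error feedback can be maintained on sketched gradients without materializing a full gradient. Below is a minimal illustrative sketch of that idea in Python; the `CountSketch` class, its table sizes, and the update loop are simplifications chosen for readability, not the authors' implementation or exact update rule.

```python
import numpy as np

class CountSketch:
    """Minimal Count Sketch of a d-dimensional vector (illustrative only)."""

    def __init__(self, d, rows=5, cols=1000, seed=0):
        rng = np.random.default_rng(seed)
        self.d, self.rows, self.cols = d, rows, cols
        # One hash bucket and one random sign per (row, coordinate) pair.
        self.buckets = rng.integers(0, cols, size=(rows, d))
        self.signs = rng.choice([-1.0, 1.0], size=(rows, d))
        self.table = np.zeros((rows, cols))

    def accumulate(self, vec, scale=1.0):
        """table += scale * sketch(vec); sketching is linear, so this is exact in sketch space."""
        for r in range(self.rows):
            np.add.at(self.table[r], self.buckets[r], scale * self.signs[r] * vec)

    def scale(self, factor):
        self.table *= factor

    def add_(self, other):
        """In-place sum of two sketches built with the same hashes (same seed)."""
        self.table += other.table

    def estimate(self, i):
        """Median-of-rows estimate of coordinate i."""
        vals = self.signs[:, i] * self.table[np.arange(self.rows), self.buckets[:, i]]
        return np.median(vals)

# Momentum and error accumulation carried out entirely in sketch space
# (a rough illustration of the pattern in the abstract; rho, k, and d are placeholders).
d, k, rho = 5_000, 10, 0.9
momentum, error = CountSketch(d), CountSketch(d)

for step in range(3):
    grad = np.random.default_rng(100 + step).normal(size=d)  # stand-in for an aggregated gradient
    g_sketch = CountSketch(d)
    g_sketch.accumulate(grad)

    momentum.scale(rho)       # S_u <- rho * S_u
    momentum.add_(g_sketch)   # S_u <- S_u + S(g)
    error.add_(momentum)      # S_e <- S_e + S_u   (error accumulation, still in the sketch)

    # "Unsketch": estimate every coordinate and keep the k largest in magnitude.
    est = np.array([error.estimate(i) for i in range(d)])
    top_k = np.argsort(np.abs(est))[-k:]
    update = np.zeros(d)
    update[top_k] = est[top_k]
    error.accumulate(update, scale=-1.0)  # remove the applied part from the error sketch
    # ... the model would be updated with the sparse `update` here.
```

The key point is that scaling and adding sketches commutes with sketching itself, which is what lets the momentum and error state live in a fixed-size sketch rather than in full d-dimensional vectors.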

Investigating the Behavior of Diffusion Models for Accelerating Electronic Structure Calculations

no code implementations • 2 Nov 2023 • Daniel Rothchild, Andrew S. Rosen, Eric Taw, Connie Robinson, Joseph E. Gonzalez, Aditi S. Krishnapriyan

We present an investigation into diffusion models for molecular generation, with the aim of better understanding how their predictions compare to the results of physics-based calculations.

C5T5: Controllable Generation of Organic Molecules with Transformers

1 code implementation • 23 Aug 2021 • Daniel Rothchild, Alex Tamkin, Julie Yu, Ujval Misra, Joseph Gonzalez

Methods for designing organic materials with desired properties have high potential impact across fields such as medicine, renewable energy, petrochemical engineering, and agriculture.

Drug Discovery • Molecular Representation

Carbon Emissions and Large Neural Network Training

no code implementations • 21 Apr 2021 • David Patterson, Joseph Gonzalez, Quoc Le, Chen Liang, Lluis-Miquel Munguia, Daniel Rothchild, David So, Maud Texier, Jeff Dean

To help reduce the carbon footprint of ML, we believe energy usage and CO2e should be key metrics in evaluating models, and we are collaborating with MLPerf developers to include energy usage during training and inference in this industry-standard benchmark.

Neural Architecture Search • Scheduling
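
To make the proposed metric concrete, here is a rough sketch of the CO2e accounting implied above (accelerator energy, scaled by datacenter PUE, times the grid's carbon intensity); the function name and every numeric input are illustrative placeholders, not figures from the paper.

```python
def training_co2e_kg(accelerator_hours: float,
                     avg_power_watts: float,
                     pue: float,
                     grid_kg_co2e_per_kwh: float) -> float:
    """Back-of-the-envelope CO2e estimate for a training run.

    energy (kWh) = accelerator-hours * average power (kW), scaled by the
    datacenter's power usage effectiveness (PUE); emissions then follow from
    the grid's carbon intensity. All inputs used below are placeholders.
    """
    energy_kwh = accelerator_hours * (avg_power_watts / 1000.0) * pue
    return energy_kwh * grid_kg_co2e_per_kwh

# Hypothetical example: 1,000 accelerator-hours at 300 W, PUE 1.1,
# on a grid emitting 0.4 kg CO2e per kWh.
print(f"{training_co2e_kg(1_000, 300, 1.1, 0.4):.1f} kg CO2e")
```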

FetchSGD: Communication-Efficient Federated Learning with Sketching

no code implementations • 15 Jul 2020 • Daniel Rothchild, Ashwinee Panda, Enayat Ullah, Nikita Ivkin, Ion Stoica, Vladimir Braverman, Joseph Gonzalez, Raman Arora

A key insight in the design of FetchSGD is that, because the Count Sketch is linear, momentum and error accumulation can both be carried out within the sketch.

Federated Learning

SqueezeWave: Extremely Lightweight Vocoders for On-device Speech Synthesis

1 code implementation • 16 Jan 2020 • Bohan Zhai, Tianren Gao, Flora Xue, Daniel Rothchild, Bichen Wu, Joseph E. Gonzalez, Kurt Keutzer

Automatic speech synthesis is a challenging task that is becoming increasingly important as edge devices begin to interact with users through speech.

Sound • Audio and Speech Processing

Communication-efficient distributed SGD with Sketching

2 code implementations • NeurIPS 2019 • Nikita Ivkin, Daniel Rothchild, Enayat Ullah, Vladimir Braverman, Ion Stoica, Raman Arora

Large-scale distributed training of neural networks is often limited by network bandwidth, wherein the communication time overwhelms the local computation time.
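
As a rough illustration of the bandwidth argument above, the toy calculation below compares the per-step payload of a full fp32 gradient with that of a fixed-size sketch, and notes that sketches from different workers can be merged by simple addition; the parameter count and sketch dimensions are made-up values, not numbers from the paper.

```python
import numpy as np

d = 25_000_000               # hypothetical model with 25M parameters
rows, cols = 5, 100_000      # hypothetical sketch dimensions (independent of d)

full_gradient_mb = d * 4 / 1e6    # fp32 gradient per worker per step
sketch_mb = rows * cols * 4 / 1e6
print(f"full gradient: {full_gradient_mb:.0f} MB per worker, "
      f"sketch: {sketch_mb:.0f} MB (~{full_gradient_mb / sketch_mb:.0f}x less traffic)")

# Because sketching is linear, the server (or an all-reduce) can combine the
# workers' compressed gradients simply by summing their sketch tables.
# These arrays are stand-ins for real sketch tables built with shared hashes.
worker_sketches = [np.random.default_rng(i).normal(size=(rows, cols)) for i in range(4)]
merged = sum(worker_sketches)
```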
