no code implementations • 10 Sep 2024 • Sherry Yang, Simon Batzner, Ruiqi Gao, Muratahan Aykol, Alexander L. Gaunt, Brendan McMorrow, Danilo J. Rezende, Dale Schuurmans, Igor Mordatch, Ekin D. Cubuk
We confirm that GenMS is able to generate common crystal structures such as double perovskites or spinels solely from natural language input, and hence can form the foundation for more complex structure generation in the near future.
no code implementations • 19 Oct 2022 • Gary Wang, Ekin D. Cubuk, Andrew Rosenberg, Shuyang Cheng, Ron J. Weiss, Bhuvana Ramabhadran, Pedro J. Moreno, Quoc V. Le, Daniel S. Park
Data augmentation is a ubiquitous technique used to provide robustness to automatic speech recognition (ASR) training.
Ranked #1 on Speech Recognition on CHiME-6 eval
Automatic Speech Recognition (ASR) +2
no code implementations • 9 Mar 2022 • Manoj Kumar, Neil Houlsby, Nal Kalchbrenner, Ekin D. Cubuk
Perceptual distances between images, as measured in the space of pre-trained deep features, have outperformed prior low-level, pixel-based metrics on assessing perceptual similarity.
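Such metrics compare images by embedding them with a pre-trained network and measuring distances between the resulting features rather than between pixels. Below is a minimal sketch of that pattern, assuming an arbitrary feature extractor; the random projection standing in for real conv features is purely illustrative, not the paper's setup.

```python
import numpy as np

def deep_perceptual_distance(img_a, img_b, feature_fn):
    """Distance between two images in the space of deep features.

    feature_fn is any pre-trained feature extractor mapping an image
    (H, W, C) to a list of feature maps; here it is a placeholder.
    """
    dist = 0.0
    for fa, fb in zip(feature_fn(img_a), feature_fn(img_b)):
        # Unit-normalize each spatial location's feature vector, then
        # accumulate the mean squared difference (LPIPS-style).
        fa = fa / (np.linalg.norm(fa, axis=-1, keepdims=True) + 1e-8)
        fb = fb / (np.linalg.norm(fb, axis=-1, keepdims=True) + 1e-8)
        dist += np.mean((fa - fb) ** 2)
    return dist

# Toy "feature extractor": a random projection standing in for conv features.
rng = np.random.default_rng(0)
proj = rng.normal(size=(3, 16))
toy_features = lambda img: [img @ proj]

x = rng.uniform(size=(32, 32, 3))
y = x + 0.05 * rng.normal(size=x.shape)
print(deep_perceptual_distance(x, y, toy_features))
```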
no code implementations • ICLR 2022 • Raphael Gontijo-Lopes, Yann Dauphin, Ekin D. Cubuk
Despite being able to capture a range of features of the data, high accuracy models trained with supervision tend to make similar predictions.
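That observation can be made concrete with a simple agreement statistic between the predictions of independently trained models. The sketch below, on synthetic labels, only illustrates what "similar predictions" means operationally and is not the paper's analysis.

```python
import numpy as np

def pairwise_agreement(predictions):
    """Fraction of examples on which each pair of models predicts the same class.

    predictions: array of shape (num_models, num_examples) holding class ids.
    """
    m = len(predictions)
    agree = np.zeros((m, m))
    for i in range(m):
        for j in range(m):
            agree[i, j] = np.mean(predictions[i] == predictions[j])
    return agree

# Toy example: three "models" labeling 1000 examples over 10 classes.
rng = np.random.default_rng(0)
truth = rng.integers(0, 10, size=1000)
noisy = lambda p: np.where(rng.uniform(size=truth.shape) < p, truth,
                           rng.integers(0, 10, size=truth.shape))
preds = np.stack([noisy(0.9), noisy(0.9), noisy(0.9)])
print(pairwise_agreement(preds))
```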
no code implementations • ICCV 2021 • Golnaz Ghiasi, Barret Zoph, Ekin D. Cubuk, Quoc V. Le, Tsung-Yi Lin
The results suggest self-training is a promising direction to aggregate labeled and unlabeled training data for learning general feature representations.
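Self-training in this sense follows a familiar loop: a teacher trained on the labeled set pseudo-labels the unlabeled pool, and a student is then trained on both. The sketch below uses a deliberately tiny nearest-centroid "model" so the loop is runnable end to end; it is a generic illustration, not the paper's training setup.

```python
import numpy as np

def fit_centroids(x, y, num_classes):
    """Tiny stand-in model: per-class mean feature vectors."""
    return np.stack([x[y == c].mean(axis=0) for c in range(num_classes)])

def predict(centroids, x):
    return np.argmin(((x[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)

rng = np.random.default_rng(0)
x_lab = rng.normal(size=(20, 2)) + np.array([[2.0, 0.0]]) * (np.arange(20) % 2)[:, None]
y_lab = np.arange(20) % 2
x_unlab = rng.normal(size=(200, 2)) + np.array([[2.0, 0.0]]) * rng.integers(0, 2, 200)[:, None]

# Teacher fit on labeled data pseudo-labels the unlabeled pool; the student
# is fit on the union of labeled and pseudo-labeled examples.
teacher = fit_centroids(x_lab, y_lab, num_classes=2)
pseudo = predict(teacher, x_unlab)
student = fit_centroids(np.concatenate([x_lab, x_unlab]),
                        np.concatenate([y_lab, pseudo]), num_classes=2)
print(predict(student, x_lab[:5]), y_lab[:5])
```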
3 code implementations • NeurIPS 2021 • Irwan Bello, William Fedus, Xianzhi Du, Ekin D. Cubuk, Aravind Srinivas, Tsung-Yi Lin, Jonathon Shlens, Barret Zoph
Using improved training and scaling strategies, we design a family of ResNet architectures, ResNet-RS, which are 1.7x - 2.7x faster than EfficientNets on TPUs, while achieving similar accuracies on ImageNet.
5 code implementations • CVPR 2021 • Golnaz Ghiasi, Yin Cui, Aravind Srinivas, Rui Qian, Tsung-Yi Lin, Ekin D. Cubuk, Quoc V. Le, Barret Zoph
Our baseline model outperforms the LVIS 2020 Challenge winning entry by +3.6 mask AP on rare categories.
Ranked #1 on Object Detection on PASCAL VOC 2007
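The augmentation this entry refers to is, to the best of my understanding, copy-paste of instance masks between training images. The sketch below shows that basic operation on toy arrays and omits the scale jittering and loss details a real instance-segmentation pipeline would add.

```python
import numpy as np

def copy_paste(dst_img, dst_masks, src_img, src_masks, rng):
    """Paste a random subset of instances from src onto dst.

    Images are (H, W, 3) arrays; masks are lists of boolean (H, W) arrays,
    one per instance. Returns the augmented image and combined mask list.
    """
    out = dst_img.copy()
    keep = [m for m in src_masks if rng.uniform() < 0.5]
    for m in keep:
        out[m] = src_img[m]  # copy pixels under the instance mask
    # Occlude destination masks where pasted instances now cover them.
    pasted = np.any(keep, axis=0) if keep else np.zeros(dst_img.shape[:2], bool)
    new_dst_masks = [m & ~pasted for m in dst_masks]
    return out, new_dst_masks + keep

rng = np.random.default_rng(0)
h = w = 64
img_a, img_b = rng.uniform(size=(2, h, w, 3))
mask_a = [np.zeros((h, w), bool)]; mask_a[0][10:20, 10:20] = True
mask_b = [np.zeros((h, w), bool)]; mask_b[0][30:50, 30:50] = True
aug_img, aug_masks = copy_paste(img_a, mask_a, img_b, mask_b, rng)
print(aug_img.shape, len(aug_masks))
```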
no code implementations • 5 Dec 2020 • Gowoon Cheon, Lusann Yang, Kevin McCloskey, Evan J. Reed, Ekin D. Cubuk
We illustrate the usage of the dataset by training graph neural networks to predict structural relaxations from randomly generated structures.
1 code implementation • 17 Sep 2020 • Li Li, Stephan Hoyer, Ryan Pederson, Ruoxi Sun, Ekin D. Cubuk, Patrick Riley, Kieron Burke
Including prior knowledge is important for effective machine learning models in physics, and is usually achieved by explicitly adding loss terms or constraints on model architectures.
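The most common form of the first option, an explicit loss term, looks like the sketch below: a data-fitting loss plus a weighted penalty on a physics residual. The linear model and the symmetry constraint used here are placeholders, not the regularizer studied in the paper.

```python
import numpy as np

def total_loss(params, x, y, predict_fn, physics_residual_fn, lam=0.1):
    """Data loss plus an explicit penalty that encodes prior knowledge.

    predict_fn and physics_residual_fn are placeholders for a model and
    for whatever physical constraint the problem provides.
    """
    pred = predict_fn(params, x)
    data_loss = np.mean((pred - y) ** 2)
    prior_loss = np.mean(physics_residual_fn(params, x, pred) ** 2)
    return data_loss + lam * prior_loss

# Toy example: a linear model whose output we additionally ask to be
# symmetric under x -> -x (a stand-in "physics" constraint).
predict = lambda w, x: x @ w
residual = lambda w, x, pred: pred - predict(w, -x)
rng = np.random.default_rng(0)
w = rng.normal(size=(3,))
x = rng.normal(size=(100, 3))
y = np.abs(x).sum(axis=1)
print(total_loss(w, x, y, predict, residual))
```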
2 code implementations • NeurIPS 2020 • Barret Zoph, Golnaz Ghiasi, Tsung-Yi Lin, Yin Cui, Hanxiao Liu, Ekin D. Cubuk, Quoc V. Le
For example, on the COCO object detection dataset, pre-training benefits when we use one fifth of the labeled data, and hurts accuracy when we use all labeled data.
Ranked #1 on Semantic Segmentation on PASCAL VOC 2012 val
1 code implementation • ECCV 2020 • Liang-Chieh Chen, Raphael Gontijo Lopes, Bowen Cheng, Maxwell D. Collins, Ekin D. Cubuk, Barret Zoph, Hartwig Adam, Jonathon Shlens
We view this work as a notable step towards building a simple procedure to harness unlabeled video sequences and extra images to surpass state-of-the-art performance on core computer vision tasks.
1 code implementation • ICLR 2020 • David Berthelot, Nicholas Carlini, Ekin D. Cubuk, Alex Kurakin, Kihyuk Sohn, Han Zhang, Colin Raffel
We improve the recently proposed MixMatch semi-supervised learning algorithm by introducing two new techniques: distribution alignment and augmentation anchoring.
no code implementations • 20 Feb 2020 • Raphael Gontijo-Lopes, Sylvia J. Smullin, Ekin D. Cubuk, Ethan Dyer
Though data augmentation has become a standard component of deep neural network training, the underlying mechanism behind the effectiveness of these techniques remains poorly understood.
26 code implementations • NeurIPS 2020 • Kihyuk Sohn, David Berthelot, Chun-Liang Li, Zizhao Zhang, Nicholas Carlini, Ekin D. Cubuk, Alex Kurakin, Han Zhang, Colin Raffel
Semi-supervised learning (SSL) provides an effective means of leveraging unlabeled data to improve a model's performance.
1 code implementation • 9 Dec 2019 • Samuel S. Schoenholz, Ekin D. Cubuk
We introduce JAX MD, a software package for performing differentiable physics simulations with a focus on molecular dynamics.
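The core trick that makes such simulations differentiable is computing forces as automatic derivatives of an energy function. The JAX snippet below shows only that idea on a Lennard-Jones toy cluster; it does not use the JAX MD API, which adds integrators, neighbor lists, and hardware acceleration on top.

```python
import jax
import jax.numpy as jnp

def lennard_jones_energy(positions, sigma=1.0, epsilon=1.0):
    """Total pairwise Lennard-Jones energy for a small cluster (no cutoff)."""
    disp = positions[:, None, :] - positions[None, :, :]
    n = positions.shape[0]
    r2 = jnp.sum(disp ** 2, axis=-1) + jnp.eye(n)  # avoid r = 0 on the diagonal
    inv6 = (sigma ** 2 / r2) ** 3
    pair = 4.0 * epsilon * (inv6 ** 2 - inv6)
    return 0.5 * jnp.sum(pair * (1.0 - jnp.eye(n)))

# Forces come for free by differentiating the energy with respect to positions.
forces = jax.grad(lambda r: -lennard_jones_energy(r))

key = jax.random.PRNGKey(0)
r0 = jax.random.uniform(key, (8, 3)) * 3.0
dt = 1e-3
r1 = r0 + dt * forces(r0)  # one crude relaxation step along the forces
print(lennard_jones_energy(r0), lennard_jones_energy(r1))
```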
15 code implementations • ICLR 2020 • Dan Hendrycks, Norman Mu, Ekin D. Cubuk, Barret Zoph, Justin Gilmer, Balaji Lakshminarayanan
We propose AugMix, a data processing technique that is simple to implement, adds limited computational overhead, and helps models withstand unforeseen corruptions.
Ranked #1 on Out-of-Distribution Generalization on ImageNet-W
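As I understand it, the technique samples several short chains of augmentations, mixes their outputs with random convex weights, and then interpolates the mixture with the original image; the paper's additional consistency loss is omitted here. The ops and hyperparameters in this sketch are toy placeholders.

```python
import numpy as np

def augmix(image, operations, rng, width=3, depth=3, alpha=1.0):
    """Mix several short chains of random augmentations into one image."""
    mix_weights = rng.dirichlet([alpha] * width)
    mixed = np.zeros_like(image, dtype=np.float64)
    for w in mix_weights:
        aug = image.copy()
        for _ in range(rng.integers(1, depth + 1)):
            op = operations[rng.integers(len(operations))]
            aug = op(aug)
        mixed += w * aug
    # Finally interpolate between the original image and the mixture.
    m = rng.beta(alpha, alpha)
    return (1.0 - m) * image + m * mixed

rng = np.random.default_rng(0)
ops = [np.fliplr, np.flipud,
       lambda x: np.roll(x, 4, axis=0),
       lambda x: np.clip(x * 1.2, 0.0, 1.0)]
img = rng.uniform(size=(32, 32, 3))
print(augmix(img, ops, rng).shape)
```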
3 code implementations • 21 Nov 2019 • David Berthelot, Nicholas Carlini, Ekin D. Cubuk, Alex Kurakin, Kihyuk Sohn, Han Zhang, Colin Raffel
Distribution alignment encourages the marginal distribution of predictions on unlabeled data to be close to the marginal distribution of ground-truth labels.
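One straightforward way to implement that alignment is to rescale each prediction by the ratio of the labeled-set class marginal to a running average of the model's predictions on unlabeled data, then renormalize; the sketch below does exactly that on random probabilities.

```python
import numpy as np

def align_distribution(pred_probs, label_marginal, running_pred_marginal):
    """Rescale per-example predictions so that their marginal moves toward the
    marginal distribution of the ground-truth labels, then renormalize."""
    scaled = pred_probs * (label_marginal / (running_pred_marginal + 1e-8))
    return scaled / scaled.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(4), size=6)        # model guesses on unlabeled data
label_marginal = np.array([0.4, 0.3, 0.2, 0.1])  # class frequencies of labeled set
running = probs.mean(axis=0)                     # running average of predictions
aligned = align_distribution(probs, label_marginal, running)
print(aligned.mean(axis=0), label_marginal)
```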
16 code implementations • NeurIPS 2020 • Ekin D. Cubuk, Barret Zoph, Jonathon Shlens, Quoc V. Le
Additionally, due to the separate search phase, these approaches are unable to adjust the regularization strength based on model or dataset size.
Ranked #12 on Data Augmentation on ImageNet
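This motivates collapsing the search into two directly tunable hyperparameters: how many transformations to apply and at what shared magnitude. The sketch below follows that RandAugment-style recipe with placeholder array ops and an ad hoc magnitude scaling.

```python
import numpy as np

def rand_augment(image, operations, num_ops=2, magnitude=9, rng=None):
    """Apply num_ops transformations chosen uniformly at random, each at a
    single global magnitude, with no learned search phase."""
    if rng is None:
        rng = np.random.default_rng()
    for _ in range(num_ops):
        op = operations[rng.integers(len(operations))]
        image = op(image, magnitude / 30.0)  # map magnitude to op strength in [0, 1]
    return image

ops = [
    lambda x, s: np.roll(x, int(s * x.shape[0]), axis=0),                   # translate
    lambda x, s: np.clip(x * (1.0 + s), 0.0, 1.0),                          # brightness
    lambda x, s: np.clip((x - x.mean()) * (1.0 + s) + x.mean(), 0.0, 1.0),  # contrast
]
rng = np.random.default_rng(0)
img = rng.uniform(size=(32, 32, 3))
print(rand_augment(img, ops, num_ops=2, magnitude=9, rng=rng).shape)
```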
no code implementations • 25 Sep 2019 • Samuel S. Schoenholz, Ekin D. Cubuk
In this work we bring the substantial advances in software that have taken place in machine learning to MD with JAX MD.
6 code implementations • ECCV 2020 • Barret Zoph, Ekin D. Cubuk, Golnaz Ghiasi, Tsung-Yi Lin, Jonathon Shlens, Quoc V. Le
Importantly, the best policy found on COCO may be transferred unchanged to other detection datasets and models to improve predictive accuracy.
Ranked #6 on Robust Object Detection on Cityscapes
no code implementations • NeurIPS 2019 • Dong Yin, Raphael Gontijo Lopes, Jonathon Shlens, Ekin D. Cubuk, Justin Gilmer
Achieving robustness to distributional shift is a longstanding and challenging goal of computer vision.
no code implementations • 8 Jun 2019 • Luke Metz, Niru Maheswaranathan, Jonathon Shlens, Jascha Sohl-Dickstein, Ekin D. Cubuk
State-of-the-art vision models can achieve superhuman performance on image classification tasks when testing and training data come from the same distribution.
2 code implementations • 6 Jun 2019 • Raphael Gontijo Lopes, Dong Yin, Ben Poole, Justin Gilmer, Ekin D. Cubuk
Deploying machine learning systems in the real world requires both high accuracy on clean data and robustness to naturally occurring corruptions.
3 code implementations • CVPR 2019 • Ekin D. Cubuk, Barret Zoph, Dandelion Mane, Vijay Vasudevan, Quoc V. Le
In our implementation, we have designed a search space where a policy consists of many sub-policies, one of which is randomly chosen for each image in each mini-batch.
Ranked #13 on Domain Generalization on VizWiz-Classification
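That sampling scheme is easy to make concrete: a policy is a list of sub-policies, each a short sequence of (operation, probability, magnitude) triples, and one sub-policy is drawn per image. The ops and the two example sub-policies below are made-up placeholders, not searched policies.

```python
import numpy as np

def apply_policy(image, policy, rng):
    """Pick one sub-policy at random and apply its ops to the image."""
    sub_policy = policy[rng.integers(len(policy))]
    for op, prob, magnitude in sub_policy:
        if rng.uniform() < prob:
            image = op(image, magnitude)
    return image

shift = lambda x, m: np.roll(x, m, axis=1)
brighten = lambda x, m: np.clip(x + 0.1 * m, 0.0, 1.0)
flip = lambda x, m: np.fliplr(x)

policy = [
    [(shift, 0.8, 4), (brighten, 0.6, 3)],
    [(flip, 0.5, 0), (shift, 0.9, 2)],
]
rng = np.random.default_rng(0)
batch = rng.uniform(size=(8, 32, 32, 3))
augmented = np.stack([apply_policy(img, policy, rng) for img in batch])
print(augmented.shape)
```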
30 code implementations • 18 Apr 2019 • Daniel S. Park, William Chan, Yu Zhang, Chung-Cheng Chiu, Barret Zoph, Ekin D. Cubuk, Quoc V. Le
On LibriSpeech, we achieve 6.8% WER on test-other without the use of a language model, and 5.8% WER with shallow fusion with a language model.
Ranked #1 on Speech Recognition on Hub5'00 SwitchBoard
Automatic Speech Recognition (ASR) +2
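The augmentation behind these results acts directly on the log-mel spectrogram, masking random frequency bands and time spans (a time-warping step is omitted here). The sketch below uses illustrative mask counts and widths rather than the paper's settings.

```python
import numpy as np

def spec_mask(log_mel, num_freq_masks=2, freq_width=8,
              num_time_masks=2, time_width=20, rng=None):
    """Zero out random frequency bands and time spans of a spectrogram.

    log_mel has shape (time, mel_bins); mask sizes are illustrative defaults.
    """
    if rng is None:
        rng = np.random.default_rng()
    out = log_mel.copy()
    t, f = out.shape
    for _ in range(num_freq_masks):
        w = rng.integers(0, freq_width + 1)
        f0 = rng.integers(0, max(1, f - w))
        out[:, f0:f0 + w] = 0.0
    for _ in range(num_time_masks):
        w = rng.integers(0, time_width + 1)
        t0 = rng.integers(0, max(1, t - w))
        out[t0:t0 + w, :] = 0.0
    return out

rng = np.random.default_rng(0)
spec = rng.normal(size=(400, 80))  # ~4 s of 10 ms frames, 80 mel bins
print(spec_mask(spec, rng=rng).shape)
```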
1 code implementation • 18 Aug 2018 • Paul Z. Hanakata, Ekin D. Cubuk, David K. Campbell, Harold S. Park
Making kirigami-inspired cuts into a sheet has been shown to be an effective way of designing stretchable materials with metamorphic properties where the 2D shape can transform into complex 3D shapes.
Computational Physics • Disordered Systems and Neural Networks
33 code implementations • 24 May 2018 • Ekin D. Cubuk, Barret Zoph, Dandelion Mane, Vijay Vasudevan, Quoc V. Le
In our implementation, we have designed a search space where a policy consists of many sub-policies, one of which is randomly chosen for each image in each mini-batch.
Ranked #6 on Data Augmentation on ImageNet
7 code implementations • NeurIPS 2018 • Avital Oliver, Augustus Odena, Colin Raffel, Ekin D. Cubuk, Ian J. Goodfellow
However, we argue that these benchmarks fail to address many issues that these algorithms would face in real-world applications.
no code implementations • 4 Mar 2018 • Tristan A. Sharp, Spencer L. Thomas, Ekin D. Cubuk, Samuel S. Schoenholz, David J. Srolovitz, Andrea J. Liu
In polycrystalline materials, grain boundaries are sites of enhanced atomic motion, but the complexity of the atomic structures within a grain boundary network makes it difficult to link the structure and atomic dynamics.
Materials Science
no code implementations • ICLR 2018 • Ekin D. Cubuk, Barret Zoph, Samuel S. Schoenholz, Quoc V. Le
Finally, we study the effect of network architectures on adversarial sensitivity.