Search Results for author: Ryan Spring

Found 7 papers, 4 papers with code

Training Question Answering Models From Synthetic Data

no code implementations • EMNLP 2020 • Raul Puri, Ryan Spring, Mostofa Patwary, Mohammad Shoeybi, Bryan Catanzaro

On the SQuAD1.1 question answering task, we achieve higher accuracy using solely synthetic questions and answers than when using the SQuAD1.1 training set questions alone.

Answer Generation · Data Augmentation · +1

Compressing Gradient Optimizers via Count-Sketches

1 code implementation • 1 Feb 2019 • Ryan Spring, Anastasios Kyrillidis, Vijai Mohan, Anshumali Shrivastava

The memory overhead of the auxiliary parameters kept by these optimizers is becoming more severe as deep learning models continue to grow larger in order to learn from complex, large-scale datasets.
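The excerpt above does not show the mechanism, so here is a minimal, hypothetical sketch of the count-sketch data structure the title refers to, assuming the use case is replacing a dense tensor of auxiliary optimizer state (e.g. accumulated squared gradients) with a small hashed table. The CountSketch class name, the toy hashes, and the parameters are illustrative assumptions, not the paper's released implementation.

```python
import numpy as np

class CountSketch:
    """Sub-linear-memory approximation of a large vector of accumulated
    values (standing in for dense optimizer state such as Adagrad's
    per-parameter sum of squared gradients). Illustrative sketch only."""

    def __init__(self, depth=3, width=2**16, seed=0):
        rng = np.random.default_rng(seed)
        self.depth, self.width = depth, width
        self.table = np.zeros((depth, width))
        # One bucket hash and one sign hash per row; toy salted hashes here,
        # a real implementation would use stronger universal hashing.
        self.bucket_seeds = rng.integers(1, 2**31 - 1, size=depth)
        self.sign_seeds = rng.integers(1, 2**31 - 1, size=depth)

    def _bucket(self, idx, row):
        return hash((idx, int(self.bucket_seeds[row]))) % self.width

    def _sign(self, idx, row):
        return 1.0 if hash((idx, int(self.sign_seeds[row]))) % 2 == 0 else -1.0

    def update(self, idx, value):
        """Fold `value` into the counters for parameter index `idx`."""
        for r in range(self.depth):
            self.table[r, self._bucket(idx, r)] += self._sign(idx, r) * value

    def query(self, idx):
        """Median-of-estimates recovery of the accumulated value for `idx`."""
        ests = [self._sign(idx, r) * self.table[r, self._bucket(idx, r)]
                for r in range(self.depth)]
        return float(np.median(ests))
```

In an optimizer loop, `sketch.update(i, g_i ** 2)` and `sketch.query(i)` would replace reads and writes into a dense second-moment tensor, capping the auxiliary memory at depth × width counters regardless of the number of model parameters.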

Ultra Large-Scale Feature Selection using Count-Sketches

1 code implementation • ICML 2018 • Amirali Aghazadeh, Ryan Spring, Daniel LeJeune, Gautam Dasarathy, Anshumali Shrivastava, Richard G. Baraniuk

We demonstrate that MISSION accurately and efficiently performs feature selection on real-world, large-scale datasets with billions of dimensions.

BIG-bench Machine Learning · Feature Selection

MISSION: Ultra Large-Scale Feature Selection using Count-Sketches

1 code implementation • 12 Jun 2018 • Amirali Aghazadeh, Ryan Spring, Daniel LeJeune, Gautam Dasarathy, Anshumali Shrivastava, Richard G. Baraniuk

We demonstrate that MISSION accurately and efficiently performs feature selection on real-world, large-scale datasets with billions of dimensions.

BIG-bench Machine Learning · Feature Selection
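The two listings above index the same MISSION paper. The excerpt only states the result, so here is a hedged illustration (reusing the CountSketch class sketched earlier) of the kind of heavy-hitters workflow the title suggests: fold stochastic gradient updates for a sparse linear model into the sketch, then keep the features with the largest estimated weights. The function name, the squared loss, and the simplification of collecting seen indices at the end are assumptions for readability; the actual algorithm maintains a bounded top-k heap online rather than storing every touched index.

```python
import heapq
import numpy as np

def sketch_feature_selection(data_stream, k=100, lr=0.1, depth=3, width=2**18):
    """Illustrative count-sketch feature selection for a sparse linear model.
    `data_stream` is a list of (indices, values, label) triples, one sparse
    example each; `CountSketch` is the toy class sketched above."""
    sketch = CountSketch(depth=depth, width=width)
    seen = set()
    for indices, values, y in data_stream:
        seen.update(indices)
        x = np.asarray(values, dtype=float)
        # Predict using sketched weight estimates of the active features only.
        w = np.array([sketch.query(i) for i in indices])
        y_hat = float(w @ x)
        # One SGD step on squared loss, folded straight into the sketch
        # instead of into a dense weight vector.
        grad = 2.0 * (y_hat - y)
        for i, x_i in zip(indices, x):
            sketch.update(i, -lr * grad * x_i)
    # Report the k features with the largest estimated |weight|.
    return heapq.nlargest(k, seen, key=lambda i: abs(sketch.query(i)))
```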

A New Unbiased and Efficient Class of LSH-Based Samplers and Estimators for Partition Function Computation in Log-Linear Models

1 code implementation • 15 Mar 2017 • Ryan Spring, Anshumali Shrivastava

We propose a new sampling scheme and an unbiased estimator that estimates the partition function accurately in sub-linear time.
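The excerpt names the goal (an unbiased, sub-linear estimator of the partition function) but not the construction. As a hedged illustration of the unbiasedness argument only, the toy below estimates Z = Σ_y exp(θ·φ(y)) for a log-linear model by importance sampling; a uniform proposal stands in for the paper's LSH-derived proposal, so this version is not sub-linear, and the function name is an assumption.

```python
import numpy as np

def estimate_log_linear_partition(scores, num_samples=1000, seed=0):
    """Unbiased importance-sampling estimate of Z = sum_i exp(scores[i]),
    where scores[i] plays the role of theta . phi(y_i). The uniform proposal
    q(i) = 1/N is a stand-in for an LSH-based proposal."""
    scores = np.asarray(scores, dtype=float)
    rng = np.random.default_rng(seed)
    n = len(scores)
    idx = rng.integers(0, n, size=num_samples)
    # Each term exp(scores[i]) / q(i) = n * exp(scores[i]) has expectation Z;
    # averaging over samples reduces variance and keeps the estimate unbiased.
    return float(np.mean(n * np.exp(scores[idx])))

# Sanity check against the exact sum on a toy model.
scores = np.random.default_rng(1).normal(size=10_000)
print(estimate_log_linear_partition(scores), float(np.exp(scores).sum()))
```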

Scalable and Sustainable Deep Learning via Randomized Hashing

no code implementations • 26 Feb 2016 • Ryan Spring, Anshumali Shrivastava

A unique property of the proposed hashing-based back-propagation is that the updates are always sparse.
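The excerpt highlights that hashing-based back-propagation yields sparse updates. The sketch below is a simplified, hypothetical illustration of that idea rather than the paper's implementation: SimHash (random-hyperplane) signatures bucket the neurons of a layer, the input is hashed to retrieve a small active set of likely-high-activation neurons, and only those rows of the weight matrix are computed and updated. The function names are assumptions; a real system would use several hash tables and periodically re-hash the weights.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
n_in, n_out, n_bits = 256, 4096, 8

W = rng.normal(scale=0.05, size=(n_out, n_in))   # one hidden layer's weights
planes = rng.normal(size=(n_bits, n_in))         # random hyperplanes for SimHash

def signature(v):
    """SimHash signature: the sign pattern of n_bits random projections."""
    bits = (planes @ v) > 0
    return int("".join("1" if b else "0" for b in bits), 2)

# Pre-hash every neuron's weight vector into a bucket.
buckets = defaultdict(list)
for j in range(n_out):
    buckets[signature(W[j])].append(j)

def sparse_forward_and_update(x, upstream_grad, lr=0.01):
    """Forward pass and weight update restricted to the neurons whose bucket
    the input hashes into; `upstream_grad[j]` is dLoss/d(pre-activation_j)."""
    active = buckets.get(signature(x), [])
    pre_act = W[active] @ x                                  # sparse forward pass
    W[active] -= lr * np.outer(upstream_grad[active], x)     # sparse weight update
    return active, pre_act

x = rng.normal(size=n_in)
g = rng.normal(size=n_out)
active, _ = sparse_forward_and_update(x, g)
print(f"updated {len(active)} of {n_out} neurons")
```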
