Search Results for author: Abhay Gupta

Found 5 papers, 2 papers with code

Sparse-IFT: Sparse Iso-FLOP Transformations for Maximizing Training Efficiency

2 code implementations • 21 Mar 2023 • Vithursan Thangarasa, Shreyas Saxena, Abhay Gupta, Sean Lie

Recent research has focused on weight sparsity in neural network training to reduce FLOPs, aiming for improved efficiency (test accuracy w.r.t. training FLOPs).
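To make the "iso-FLOP" idea concrete, here is a small illustrative calculation (not the paper's exact transformations): one way to hold a square layer's training FLOPs fixed under weight sparsity s is to widen it, since the FLOPs of a sparse matmul scale with the number of nonzero weights. All names below are illustrative.

```python
import math

# Illustrative arithmetic only: for a square (width x width) layer, FLOPs
# scale with k**2 * (1 - s) when the layer is widened by k and a fraction s
# of its weights is zeroed, so k = 1/sqrt(1 - s) keeps the FLOP budget fixed.

def iso_flop_width(width, sparsity):
    return width * (1.0 / math.sqrt(1.0 - sparsity))

dense_flops = 2 * 512 * 512            # dense matmul FLOP budget
s = 0.75                               # target weight sparsity
w = iso_flop_width(512, s)             # widened dimension
sparse_flops = 2 * w * w * (1 - s)     # matches the dense budget
```

The point of such a transformation is that the sparse, wider layer spends the same training FLOPs as the dense baseline while having more representational width.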

SPDF: Sparse Pre-training and Dense Fine-tuning for Large Language Models

no code implementations • 18 Mar 2023 • Vithursan Thangarasa, Abhay Gupta, William Marshall, Tianda Li, Kevin Leong, Dennis Decoste, Sean Lie, Shreyas Saxena

In this work, we show the benefits of using unstructured weight sparsity to train only a subset of weights during pre-training (Sparse Pre-training) and then recover the representational capacity by allowing the zeroed weights to learn (Dense Fine-tuning).

Text Generation Text Summarization
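The sparse-then-dense training recipe described above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation; the layer shapes, learning rate, and function names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy layer; shapes are illustrative, not from the paper.
d_in, d_out = 8, 4
W = rng.normal(size=(d_in, d_out))

# Sparse pre-training: a fixed binary mask zeroes a subset of weights,
# so only the surviving entries are trained.
sparsity = 0.75
mask = (rng.random(W.shape) >= sparsity).astype(W.dtype)

def sparse_step(W, grad, lr=0.1):
    # Updates flow only through unmasked weights; masked entries stay zero.
    return (W - lr * grad) * mask

def dense_step(W, grad, lr=0.1):
    # Dense fine-tuning: the mask is dropped so zeroed weights may learn.
    return W - lr * grad

W_sparse = sparse_step(W * mask, rng.normal(size=W.shape))
W_dense = dense_step(W_sparse, rng.normal(size=W.shape))
```

After the dense step, previously masked weights become trainable again, which is how the recipe recovers the representational capacity lost during sparse pre-training.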

RevBiFPN: The Fully Reversible Bidirectional Feature Pyramid Network

1 code implementation • 28 Jun 2022 • Vitaliy Chiley, Vithursan Thangarasa, Abhay Gupta, Anshul Samar, Joel Hestness, Dennis Decoste

However, training them requires substantial accelerator memory for saving large, multi-resolution activations.

Ranked #310 on Image Classification on ImageNet (using extra training data)

General Classification Image Classification +2
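A RevNet-style coupling block illustrates why reversible architectures avoid the activation-memory cost mentioned above: inputs can be reconstructed exactly from outputs, so intermediate activations need not be saved for the backward pass. This is a generic sketch of the reversible-coupling idea, not RevBiFPN's actual blocks; f and g are arbitrary stand-ins.

```python
import numpy as np

def f(x):  # stand-in for an arbitrary sub-network
    return np.tanh(x)

def g(x):  # stand-in for a second sub-network
    return 0.5 * x

def forward(x1, x2):
    # Additive coupling: each half of the input is updated in turn.
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def inverse(y1, y2):
    # Exact reconstruction of the inputs from the outputs,
    # so no intermediate activation has to be stored.
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=3), rng.normal(size=3)
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
```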

Approximating Wisdom of Crowds using K-RBMs

no code implementations • 16 Nov 2016 • Abhay Gupta

We propose a method to aggregate noisy labels collected from a crowd of workers or annotators.

Clustering
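For context on the label-aggregation setting, here is the simplest baseline, majority voting over worker labels. The paper's K-RBM approach is not reproduced here; this sketch only illustrates the problem setup, and all names are illustrative.

```python
from collections import Counter

# Baseline only: take the most common label each set of workers assigned
# to an item. More sophisticated aggregators (such as the paper's K-RBMs)
# model worker reliability instead of counting votes uniformly.

def aggregate(labels_per_item):
    """labels_per_item: one inner list of worker labels per item."""
    return [Counter(labels).most_common(1)[0][0] for labels in labels_per_item]

votes = [[1, 1, 0], [0, 0, 0], [2, 1, 2]]
consensus = aggregate(votes)
```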

DAiSEE: Towards User Engagement Recognition in the Wild

no code implementations • 7 Sep 2016 • Abhay Gupta, Arjun D'Cunha, Kamal Awasthi, Vineeth Balasubramanian

We introduce DAiSEE, the first multi-label video classification dataset, comprising 9068 video snippets captured from 112 users, for recognizing the user affective states of boredom, confusion, engagement, and frustration in the wild.

General Classification Video Classification