Search Results for author: Akshay Budhkar

Found 4 papers, 2 papers with code

A Package for Learning on Tabular and Text Data with Transformers

1 code implementation • NAACL (maiworkshop) 2021 • Ken Gu, Akshay Budhkar

Recent progress in natural language processing has led to Transformer architectures becoming the predominant model used for natural language tasks.
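The package described here pairs a Transformer text encoder with structured tabular features. As an illustration only, not the package's actual API, the sketch below fuses a Hugging Face text embedding with numeric tabular columns by simple concatenation; the model name, feature dimensions, and MLP head are assumptions.

```python
# Illustrative sketch: fuse a Transformer text embedding with tabular features.
# This is NOT the package's API; model name, dims, and the MLP head are assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class TextTabularClassifier(nn.Module):
    def __init__(self, text_model="bert-base-uncased", num_tabular=8, num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(text_model)
        hidden = self.encoder.config.hidden_size
        # Concatenate the [CLS] embedding with raw tabular features, then classify.
        self.head = nn.Sequential(
            nn.Linear(hidden + num_tabular, 256),
            nn.ReLU(),
            nn.Linear(256, num_labels),
        )

    def forward(self, input_ids, attention_mask, tabular_feats):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]           # [batch, hidden]
        fused = torch.cat([cls, tabular_feats], dim=-1)
        return self.head(fused)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["a product review", "another review"], padding=True, return_tensors="pt")
tabular = torch.randn(2, 8)                         # stand-in numeric columns
model = TextTabularClassifier()
logits = model(batch["input_ids"], batch["attention_mask"], tabular)
```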

Shrinking Bigfoot: Reducing wav2vec 2.0 footprint

no code implementations • EMNLP (sustainlp) 2021 • Zilun Peng, Akshay Budhkar, Ilana Tuil, Jason Levy, Parinaz Sobhani, Raphael Cohen, Jumana Nassour

Using a teacher-student approach, we distilled the knowledge from the original wav2vec 2.0 model into a student model, which is 2 times faster and 4.8 times smaller than the original model.

Model Compression, Speech Recognition, +1
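For readers unfamiliar with the teacher-student setup mentioned above, the sketch below shows a generic knowledge-distillation loss on softened output distributions. It illustrates the general technique only, not the paper's exact recipe for compressing wav2vec 2.0; the temperature, loss weighting, and toy logits are assumptions.

```python
# Generic knowledge-distillation loss (illustration, not the paper's exact recipe).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: match the teacher's softened distribution (scaled by T^2).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy example: batch of 4, 10 output classes.
student = torch.randn(4, 10, requires_grad=True)
teacher = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
loss = distillation_loss(student, teacher, labels)
loss.backward()
```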

Generative Adversarial Networks for text using word2vec intermediaries

1 code implementation • WS 2019 • Akshay Budhkar, Krishnapriya Vishnubhotla, Safwan Hossain, Frank Rudzicz

Generative adversarial networks (GANs) have shown considerable success, especially in the realistic generation of images.

Word Embeddings
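The title points at the core idea: the GAN operates in word2vec space rather than over discrete tokens. The sketch below is a minimal reading of that setup, with a generator that emits word-embedding-sized vectors, a discriminator that scores them, and generated vectors decoded to their nearest vocabulary embeddings; the dimensions, architecture, and decoding step are assumptions, not the paper's exact design.

```python
# Minimal sketch of a GAN whose generator outputs vectors in word2vec space.
# Dimensions, architecture, and nearest-neighbour decoding are assumptions.
import torch
import torch.nn as nn

EMB_DIM, NOISE_DIM, SEQ_LEN = 100, 64, 5

generator = nn.Sequential(           # noise -> a short sequence of word vectors
    nn.Linear(NOISE_DIM, 256), nn.ReLU(),
    nn.Linear(256, SEQ_LEN * EMB_DIM),
)
discriminator = nn.Sequential(       # word-vector sequence -> real/fake score
    nn.Linear(SEQ_LEN * EMB_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

z = torch.randn(8, NOISE_DIM)
fake_vectors = generator(z).view(8, SEQ_LEN, EMB_DIM)
fake_scores = discriminator(fake_vectors.view(8, -1))

# Decode generated vectors to words by cosine similarity against a (stand-in) vocabulary.
vocab_vectors = torch.randn(1000, EMB_DIM)   # would come from a trained word2vec model
sims = torch.nn.functional.cosine_similarity(
    fake_vectors.unsqueeze(2), vocab_vectors.view(1, 1, -1, EMB_DIM), dim=-1
)
nearest_word_ids = sims.argmax(dim=-1)       # [batch, seq_len] indices into the vocabulary
```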

Augmenting word2vec with latent Dirichlet allocation within a clinical application

no code implementations • NAACL 2019 • Akshay Budhkar, Frank Rudzicz

This paper presents three hybrid models that directly combine latent Dirichlet allocation and word embedding for distinguishing between speakers with and without Alzheimer's disease from transcripts of picture descriptions.
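One plausible reading of "directly combine" is feature-level fusion: each transcript is represented by its LDA topic distribution concatenated with an averaged word embedding, and a standard classifier is trained on top. The sketch below illustrates that reading with scikit-learn and gensim; the toy corpus, topic count, and classifier are assumptions, not the paper's actual three hybrid models.

```python
# One possible LDA + word2vec hybrid: concatenate topic proportions with
# averaged word vectors per document (illustration only, not the paper's models).
import numpy as np
from gensim.models import Word2Vec
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

docs = ["the boy is stealing a cookie", "the sink is overflowing with water"]  # toy transcripts
labels = [0, 1]                                                                 # toy class labels

# LDA topic proportions per document.
counts = CountVectorizer().fit_transform(docs)
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(counts)

# Averaged word2vec vector per document.
tokens = [d.split() for d in docs]
w2v = Word2Vec(sentences=tokens, vector_size=50, min_count=1, seed=0)
doc_vecs = np.stack([np.mean([w2v.wv[t] for t in toks], axis=0) for toks in tokens])

# Hybrid representation: topic proportions and embeddings concatenated, fed to a classifier.
features = np.hstack([topics, doc_vecs])
clf = LogisticRegression(max_iter=1000).fit(features, labels)
```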
