Search Results for author: Arun Babu

Found 10 papers, 6 papers with code

data2vec: A General Framework for Self-supervised Learning in Speech, Vision and Language

9 code implementations Preprint 2022 Alexei Baevski, Wei-Ning Hsu, Qiantong Xu, Arun Babu, Jiatao Gu, Michael Auli

While the general idea of self-supervised learning is identical across modalities, the actual algorithms and objectives differ widely because they were developed with a single modality in mind.

Image Classification Linguistic Acceptability +5
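
To make the data2vec entry above more concrete, here is a minimal, self-contained sketch of the core idea: a student Transformer regresses the contextualized representations that an exponential-moving-average (EMA) teacher produces on the unmasked input, at the time-steps that were masked for the student. This is an illustrative reconstruction, not the released code; the module sizes, mask rate, top-k averaging depth, and EMA decay are placeholder values, and the paper's target normalization is omitted.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class Data2VecSketch(nn.Module):
    """Schematic of the data2vec idea: a student predicts the EMA teacher's
    contextualized representations at masked positions. All hyperparameters
    here are illustrative, not the paper's settings."""

    def __init__(self, dim=256, depth=4, top_k=2, ema_decay=0.999):
        super().__init__()
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.student = nn.TransformerEncoder(layer, num_layers=depth)
        self.teacher = copy.deepcopy(self.student)  # EMA copy, never trained by gradients
        for p in self.teacher.parameters():
            p.requires_grad = False
        self.mask_token = nn.Parameter(torch.randn(dim))
        self.top_k, self.ema_decay = top_k, ema_decay

    @torch.no_grad()
    def ema_update(self):
        # Teacher weights track the student as an exponential moving average.
        for ps, pt in zip(self.student.parameters(), self.teacher.parameters()):
            pt.mul_(self.ema_decay).add_(ps, alpha=1 - self.ema_decay)

    @torch.no_grad()
    def targets(self, x):
        # Run the teacher on the *unmasked* input and average its top-k layer outputs.
        outs, h = [], x
        for layer in self.teacher.layers:
            h = layer(h)
            outs.append(h)
        return torch.stack(outs[-self.top_k:]).mean(0)

    def forward(self, x, mask):
        # x: (B, T, dim) modality-agnostic features; mask: (B, T) bool, True = masked.
        y = self.targets(x)
        x_masked = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(x), x)
        pred = self.student(x_masked)
        # Regress the teacher's representations only at the masked time-steps.
        return F.smooth_l1_loss(pred[mask], y[mask])


# Toy usage with random features and a random mask.
model = Data2VecSketch()
x = torch.randn(2, 16, 256)
mask = torch.rand(2, 16) < 0.5
loss = model(x, mask)
loss.backward()
model.ema_update()
```

Because the loss is defined over latent representations rather than modality-specific targets, the same training loop applies to speech, vision, and language features, which is the unification the abstract refers to.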

Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization

no code implementations ICML Workshop AutoML 2021 David Eriksson, Pierce I-Jen Chuang, Samuel Daulton, Peng Xia, Akshat Shrivastava, Arun Babu, Shicong Zhao, Ahmed Aly, Ganesh Venkatesh, Maximilian Balandat

When tuning the architecture and hyperparameters of large machine learning models for on-device deployment, it is desirable to understand the optimal trade-offs between on-device latency and model accuracy.

Bayesian Optimization Natural Language Understanding +1
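
The entry above casts on-device architecture tuning as a trade-off between latency and accuracy, which multi-objective Bayesian optimization explores directly. The sketch below sets up such a search with the open-source Ax Service API; it is not the paper's code, and the two-parameter search space, the `evaluate` stub, and the 20-trial budget are hypothetical placeholders.

```python
from ax.service.ax_client import AxClient
from ax.service.utils.instantiation import ObjectiveProperties


def evaluate(params):
    """Hypothetical stand-in: train the candidate model and measure it on device."""
    accuracy = 0.9 - 0.02 * params["num_layers"] + 0.0005 * params["hidden_size"]
    latency_ms = 1.5 * params["num_layers"] + 0.01 * params["hidden_size"]
    return accuracy, latency_ms


ax_client = AxClient()
ax_client.create_experiment(
    name="latency_aware_nas",
    parameters=[
        {"name": "hidden_size", "type": "range", "bounds": [64, 512], "value_type": "int"},
        {"name": "num_layers", "type": "range", "bounds": [1, 8], "value_type": "int"},
    ],
    # Two objectives: maximize accuracy while minimizing on-device latency.
    objectives={
        "accuracy": ObjectiveProperties(minimize=False),
        "latency_ms": ObjectiveProperties(minimize=True),
    },
)

for _ in range(20):
    params, trial_index = ax_client.get_next_trial()
    accuracy, latency_ms = evaluate(params)
    ax_client.complete_trial(
        trial_index=trial_index,
        raw_data={"accuracy": accuracy, "latency_ms": latency_ms},
    )

# Architectures on the estimated accuracy/latency Pareto front.
pareto = ax_client.get_pareto_optimal_parameters()
```

In the on-device setting the paper targets, `latency_ms` would come from measurements on the actual hardware rather than a closed-form proxy, and the object of interest is the estimated Pareto front returned at the end.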

Span Pointer Networks for Non-Autoregressive Task-Oriented Semantic Parsing

no code implementations Findings (EMNLP) 2021 Akshat Shrivastava, Pierce Chuang, Arun Babu, Shrey Desai, Abhinav Arora, Alexander Zotov, Ahmed Aly

An effective recipe for building seq2seq, non-autoregressive, task-oriented parsers to map utterances to semantic frames proceeds in three steps: encoding an utterance $x$, predicting a frame's length $|y|$, and decoding a $|y|$-sized frame with utterance and ontology tokens.

Cross-Lingual Transfer Quantization +2
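
The abstract above spells out a three-step recipe (encode $x$, predict the length $|y|$, decode a $|y|$-sized frame in parallel), so a schematic skeleton may help. This is not the paper's span pointer model, only a generic non-autoregressive encoder / length-predictor / parallel-decoder sketch in PyTorch with illustrative sizes.

```python
import torch
import torch.nn as nn


class NonAutoregressiveParser(nn.Module):
    """Schematic of the recipe: encode x, predict |y|, fill a |y|-sized frame in parallel."""

    def __init__(self, vocab_size, hidden=256, max_len=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        enc_layer = nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.length_head = nn.Linear(hidden, max_len)        # step 2: predict the frame length |y|
        self.slot_embed = nn.Parameter(torch.randn(hidden))  # placeholder inputs for the parallel decoder
        dec_layer = nn.TransformerDecoderLayer(hidden, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
        self.out = nn.Linear(hidden, vocab_size)              # step 3: fill every slot with an utterance/ontology token

    def forward(self, utterance_ids):
        # Step 1: encode the utterance x.
        memory = self.encoder(self.embed(utterance_ids))
        # Step 2: predict the frame length |y| (here from the first encoder state, shared across the batch).
        length_logits = self.length_head(memory[:, 0])
        frame_len = int(length_logits.argmax(dim=-1).max().item()) + 1
        # Step 3: decode all |y| positions at once -- no left-to-right generation.
        slots = self.slot_embed.expand(memory.size(0), frame_len, -1)
        decoded = self.decoder(slots, memory)
        return length_logits, self.out(decoded)


# Toy usage with a hypothetical 100-token vocabulary.
model = NonAutoregressiveParser(vocab_size=100)
utterance_ids = torch.randint(0, 100, (2, 10))
length_logits, token_logits = model(utterance_ids)
```

Since every slot of the frame is filled in a single parallel decoder pass rather than token by token, decoding cost no longer grows with frame length, which is what makes the non-autoregressive recipe attractive for on-device task-oriented parsing.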

Non-Autoregressive Semantic Parsing for Compositional Task-Oriented Dialog

1 code implementation NAACL 2021 Arun Babu, Akshat Shrivastava, Armen Aghajanyan, Ahmed Aly, Angela Fan, Marjan Ghazvininejad

Semantic parsing using sequence-to-sequence models allows parsing of deeper representations compared to traditional word-tagging-based models.

Semantic Parsing

Lightweight Convolutional Representations for On-Device Natural Language Processing

no code implementations 4 Feb 2020 Shrey Desai, Geoffrey Goh, Arun Babu, Ahmed Aly

The increasing computational and memory complexities of deep neural networks have made it difficult to deploy them on low-resource electronic devices (e.g., mobile phones, tablets, wearables).

Model Compression
