Search Results for author: Raul Puri

Found 11 papers, 7 papers with code

Local Knowledge Powered Conversational Agents

1 code implementation • 20 Oct 2020 • Sashank Santhanam, Wei Ping, Raul Puri, Mohammad Shoeybi, Mostofa Patwary, Bryan Catanzaro

State-of-the-art conversational agents have advanced significantly in conjunction with the use of large transformer-based language models.


BioMegatron: Larger Biomedical Domain Language Model

1 code implementation • EMNLP 2020 • Hoo-chang Shin, Yang Zhang, Evelina Bakhturina, Raul Puri, Mostofa Patwary, Mohammad Shoeybi, Raghav Mani

There has been an influx of biomedical domain-specific language models, showing language models pre-trained on biomedical text perform better on biomedical domain benchmarks than those trained on general domain text corpora such as Wikipedia and Books.

Language Modelling • Named Entity Recognition • +3

Large Scale Multi-Actor Generative Dialog Modeling

no code implementations • ACL 2020 • Alex Boyd, Raul Puri, Mohammad Shoeybi, Mostofa Patwary, Bryan Catanzaro

This work introduces the Generative Conversation Control model, an augmented and fine-tuned GPT-2 language model that conditions on past reference conversations to probabilistically model multi-turn conversations in the actor's persona.

Goal-Oriented Dialog • Language Modelling
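The conditioning setup described above — a GPT-2-style model conditioned on past reference conversations before generating a turn in the target persona — can be sketched as simple context construction. The separator and `speaker:` formatting tokens below are illustrative assumptions, not the paper's exact scheme.

```python
# Hedged sketch: flatten past reference conversations plus the current dialog
# into one context string that a GPT-2-style model would be conditioned on.
# The "---" separator and "Name:" turn format are assumptions for illustration.

def build_context(reference_conversations, current_turns, speaker):
    parts = []
    for conv in reference_conversations:
        for name, utterance in conv:
            parts.append(f"{name}: {utterance}")
    parts.append("---")  # assumed separator between references and live dialog
    for name, utterance in current_turns:
        parts.append(f"{name}: {utterance}")
    parts.append(f"{speaker}:")  # prompt the model to continue in this persona
    return "\n".join(parts)

refs = [[("Ana", "I love hiking."), ("Ben", "Me too!")]]
turns = [("Ben", "Any weekend plans?")]
print(build_context(refs, turns, "Ana"))
```

A real system would tokenize this context and sample the next turn from the fine-tuned language model.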

Training Question Answering Models From Synthetic Data

no code implementations • EMNLP 2020 • Raul Puri, Ryan Spring, Mostofa Patwary, Mohammad Shoeybi, Bryan Catanzaro

On the SQuAD1.1 question answering task, we achieve higher accuracy using solely synthetic questions and answers than when using the SQuAD1.1 training set questions alone.

Answer Generation • Data Augmentation • +1

Zero-shot Text Classification With Generative Language Models

no code implementations • 10 Dec 2019 • Raul Puri, Bryan Catanzaro

This work investigates the use of natural language to enable zero-shot model adaptation to new tasks.

Classification • General Classification • +4
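The zero-shot idea above — describing the task in natural language so a generative model can adapt without task-specific training — can be sketched as scoring each candidate label as an answer to a natural-language question. `continuation_score` below is a trivial keyword-table stand-in (an assumption, purely to keep the example self-contained); a real system would use the language model's log-likelihood of the label tokens.

```python
# Hedged sketch of zero-shot classification via a natural-language prompt.
# The keyword table is a hypothetical stand-in for an actual language model.
INDICATIVE = {
    "sports": {"match", "score", "team", "goal"},
    "politics": {"vote", "election", "senate"},
}

def continuation_score(prompt, label):
    # Stand-in scorer: counts prompt words associated with this label.
    # A real implementation would compute the LM's log-probability of the
    # label text as a continuation of the prompt.
    return sum(word in INDICATIVE[label] for word in prompt.lower().split())

def zero_shot_classify(text, labels):
    # The task itself is phrased in natural language, the key idea above.
    prompt = f"{text} Is this text about {' or '.join(labels)}?"
    return max(labels, key=lambda label: continuation_score(prompt, label))

print(zero_shot_classify("the team tied the match late", ["sports", "politics"]))  # sports
```

Swapping the stand-in scorer for an LM's token likelihoods yields the actual zero-shot setup, with no gradient updates for the new task.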

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism

9 code implementations • 17 Sep 2019 • Mohammad Shoeybi, Mostofa Patwary, Raul Puri, Patrick LeGresley, Jared Casper, Bryan Catanzaro

To demonstrate that large language models can further advance the state of the art (SOTA), we train an 8.3 billion parameter transformer language model similar to GPT-2 and a 3.9 billion parameter model similar to BERT.

LAMBADA • Language Modelling • +1
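The model parallelism that makes these sizes trainable splits individual layers across workers. A minimal single-process sketch of the column-wise weight split, run locally here rather than across GPUs, is below; it is illustrative, not the paper's implementation, where the concatenation step would be a distributed all-gather.

```python
# Hedged sketch of intra-layer (tensor) model parallelism: a linear layer's
# weight matrix is split column-wise, each simulated worker computes a partial
# output, and concatenating the shards recovers the full matmul.

def matmul(a, b):
    """Plain-Python matrix multiply for small illustrative matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

x = [[1.0, 2.0], [3.0, 4.0]]                       # activations, shape (2, 2)
W = [[1.0, 0.0, 2.0, 1.0], [0.0, 1.0, 1.0, 2.0]]   # weights, shape (2, 4)

# Column-split W into two shards, one per simulated worker.
W0 = [row[:2] for row in W]
W1 = [row[2:] for row in W]

# Each worker computes its partial output independently, with no communication.
y0 = matmul(x, W0)
y1 = matmul(x, W1)

# Concatenating shard outputs (an all-gather in a real setup) recovers the
# unpartitioned result exactly.
y_parallel = [r0 + r1 for r0, r1 in zip(y0, y1)]
assert y_parallel == matmul(x, W)
```

Because each shard's matmul is independent, the layer's compute and weight memory divide across workers, which is what lets a multi-billion parameter model fit where a single device could not.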

Practical Text Classification With Large Pre-Trained Language Models

1 code implementation • 4 Dec 2018 • Neel Kant, Raul Puri, Nikolai Yakovenko, Bryan Catanzaro

Multi-emotion sentiment classification is a natural language processing (NLP) problem with valuable use cases on real-world data.

Classification • Emotion Classification • +4

Large Scale Language Modeling: Converging on 40GB of Text in Four Hours

1 code implementation • 3 Aug 2018 • Raul Puri, Robert Kirby, Nikolai Yakovenko, Bryan Catanzaro

We provide a learning rate schedule that allows our model to converge with a 32k batch size.

Language Modelling
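Schedules of the kind the abstract mentions typically warm the learning rate up before decaying it, since very large batches diverge easily at a high initial rate. The linear-warmup-plus-cosine-decay shape and the hyperparameters below are assumptions for illustration, not the paper's exact schedule.

```python
import math

# Hedged sketch of a large-batch learning rate schedule: linear warmup from
# zero, then cosine decay to zero. Shape and constants are illustrative.

def lr_schedule(step, total_steps, peak_lr, warmup_steps):
    if step < warmup_steps:
        # Linear warmup guards against early divergence at large batch sizes.
        return peak_lr * step / warmup_steps
    # Cosine decay from peak_lr toward zero over the remaining steps.
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return peak_lr * 0.5 * (1 + math.cos(math.pi * progress))

lrs = [lr_schedule(s, total_steps=1000, peak_lr=1e-3, warmup_steps=100)
       for s in range(1000)]
```

The rate rises linearly to its peak at step 100, then decays smoothly, so the largest updates happen only after the optimizer state has stabilized.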
