Search Results for author: Daniel Pressel

Found 10 papers, 4 papers with code

Lightweight Transformers for Conversational AI

no code implementations • NAACL (ACL) 2022 • Daniel Pressel, Wenshuo Liu, Michael Johnston, Minhua Chen

To understand how training on conversational language impacts performance of pre-trained models on downstream dialogue tasks, we build compact Transformer-based Language Models from scratch on several large corpora of conversational data.

Intent Detection • Natural Language Understanding

Multiple Word Embeddings for Increased Diversity of Representation

1 code implementation • 30 Sep 2020 • Brian Lester, Daniel Pressel, Amy Hemmeter, Sagnik Ray Choudhury, Srinivas Bangalore

Most state-of-the-art models in natural language processing (NLP) are neural models built on top of large, pre-trained, contextual language models that generate representations of words in context and are fine-tuned for the task at hand.

Word Embeddings
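
The entry above is about drawing on multiple pre-trained word embeddings to diversify a model's input representation. A minimal sketch of the basic idea, concatenating per-token vectors from several embedding tables; the vocabulary, table names, and dimensions here are toy stand-ins for illustration, not taken from the paper:

```python
import numpy as np

# Toy vocabulary and two random tables standing in for different
# pre-trained embeddings (e.g. GloVe-like and word2vec-like).
rng = np.random.default_rng(0)
vocab = {"the": 0, "cat": 1, "sat": 2}
table_a = rng.normal(size=(len(vocab), 50))   # 50-dim embeddings
table_b = rng.normal(size=(len(vocab), 100))  # 100-dim embeddings

def embed(tokens, tables, vocab):
    """Look up each token in every table and concatenate the vectors."""
    ids = [vocab[t] for t in tokens]
    return np.concatenate([tab[ids] for tab in tables], axis=-1)

x = embed(["the", "cat"], [table_a, table_b], vocab)
print(x.shape)  # (2, 150): each token gets a 50 + 100 dim vector
```

The concatenated vectors would then feed a downstream tagger or classifier, which can learn to weight the complementary information each embedding source provides.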

Computationally Efficient NER Taggers with Combined Embeddings and Constrained Decoding

1 code implementation • 5 Jan 2020 • Brian Lester, Daniel Pressel, Amy Hemmeter, Sagnik Ray Choudhury

The CRF layer is used to facilitate global coherence between labels, and the contextual embeddings provide a better representation of words in context.

Named Entity Recognition +2
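
The snippet above mentions a CRF layer enforcing global coherence between labels. At decode time this is typically Viterbi search over per-token emission scores plus label-transition scores. A minimal NumPy sketch of constrained decoding; the label set and all scores below are made up for illustration and are not from the paper:

```python
import numpy as np

def viterbi(emissions, transitions):
    """Return the highest-scoring label sequence.

    emissions:   (T, L) unnormalized per-token label scores.
    transitions: (L, L) score for moving from label i to label j.
    """
    T, L = emissions.shape
    score = emissions[0].copy()          # best score ending at each label
    back = np.zeros((T, L), dtype=int)   # backpointers for recovery
    for t in range(1, T):
        # total[i, j]: best path ending in i, then stepping to label j
        total = score[:, None] + transitions + emissions[t][None, :]
        back[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    best = [int(score.argmax())]
    for t in range(T - 1, 1 - 1, -1):
        if t > 0:
            best.append(int(back[t, best[-1]]))
    return best[::-1]

# Toy BIO-style labels: O=0, B=1, I=2. A large negative O->I score
# acts as the constraint that I may not follow O.
trans = np.array([[0.0, 0.0, -10.0],
                  [0.0, 0.0,   1.0],
                  [0.0, 0.0,   1.0]])
emis = np.array([[2.0, 1.0, 0.0],
                 [0.0, 1.0, 2.0],
                 [0.0, 0.0, 2.0]])
print(viterbi(emis, trans))  # [1, 2, 2]: B, I, I — never I after O
```

Even though the first token's emission favors O, the transition constraint makes the globally best path start with B so the later I labels are reachable.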
