no code implementations • 17 Aug 2022 • Emad Elwany, Allison Hegel, Marina Shah, Brendan Roof, Genevieve Peaslee, Quentin Rivet
Weak supervision has been applied to various Natural Language Understanding tasks in recent years.
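Weak supervision typically replaces hand-labeled data with noisy heuristic "labeling functions" whose votes are aggregated into training labels. A minimal sketch of that idea, with illustrative keyword heuristics and majority-vote aggregation (none of these functions or labels come from the paper itself):

```python
from collections import Counter

ABSTAIN = None  # a labeling function may decline to vote on an example

def lf_positive_keywords(text):
    # Hypothetical heuristic: crude keyword match for a "positive" label.
    if any(w in text.lower() for w in ("great", "excellent")):
        return "positive"
    return ABSTAIN

def lf_negative_keywords(text):
    # Hypothetical heuristic: crude keyword match for a "negative" label.
    if any(w in text.lower() for w in ("terrible", "awful")):
        return "negative"
    return ABSTAIN

def weak_label(text, labeling_functions):
    """Aggregate noisy votes into one weak label by majority vote."""
    votes = [lf(text) for lf in labeling_functions]
    votes = [v for v in votes if v is not ABSTAIN]
    if not votes:
        return ABSTAIN  # no heuristic fired; leave the example unlabeled
    return Counter(votes).most_common(1)[0][0]
```

In practice the aggregation step is usually a learned generative model rather than a raw majority vote, but the pipeline shape is the same: many cheap heuristics in, one denoised label out.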
no code implementations • 16 Jul 2021 • Allison Hegel, Marina Shah, Genevieve Peaslee, Brendan Roof, Emad Elwany
Large, pre-trained transformer models like BERT have achieved state-of-the-art results on document understanding tasks, but most implementations can only consider 512 tokens at a time.
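One common workaround for the 512-token input limit is to split a long document into overlapping fixed-size windows and encode each window separately. A sketch of that chunking step over a list of token ids, with illustrative window and stride values (the paper's own method may differ):

```python
def chunk_tokens(token_ids, window=512, stride=384):
    """Split a long token-id sequence into overlapping windows that each
    fit a 512-token transformer input.

    The overlap (window - stride tokens) preserves context that would
    otherwise be cut at a hard chunk boundary.
    """
    if len(token_ids) <= window:
        return [list(token_ids)]
    chunks = []
    start = 0
    while start < len(token_ids):
        chunks.append(list(token_ids[start:start + window]))
        if start + window >= len(token_ids):
            break  # the final window already reaches the end of the document
        start += stride
    return chunks
```

Each chunk can then be fed to the model independently, with the per-chunk outputs pooled or aggregated downstream.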
1 code implementation • EMNLP 2020 • Allison Hegel, Sudha Rao, Asli Celikyilmaz, Bill Dolan
Existing language models excel at writing from scratch, but many real-world scenarios require rewriting an existing document to fit a set of constraints.