Search Results for author: Aaron Mueller

Found 11 papers, 5 papers with code

Causal Analysis of Syntactic Agreement Mechanisms in Neural Language Models

1 code implementation ACL 2021 Matthew Finlayson, Aaron Mueller, Sebastian Gehrmann, Stuart Shieber, Tal Linzen, Yonatan Belinkov

Targeted syntactic evaluations have demonstrated the ability of language models to perform subject-verb agreement given difficult contexts.
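The targeted-evaluation paradigm the paper builds on compares a model's scores on minimal pairs that differ only in the agreement feature. A minimal sketch of that setup (not the authors' code; `score_fn` stands in for a language model's sentence log-probability, and the example pairs are illustrative):

```python
def targeted_eval(score_fn, minimal_pairs):
    # Score each (grammatical, ungrammatical) minimal pair and report
    # the fraction where the model prefers the grammatical variant.
    correct = sum(score_fn(good) > score_fn(bad) for good, bad in minimal_pairs)
    return correct / len(minimal_pairs)

# Hypothetical minimal pairs with an "attractor" noun between subject and verb,
# the kind of difficult context such evaluations probe.
pairs = [
    ("the keys to the cabinet are here", "the keys to the cabinet is here"),
    ("the author of the books writes well", "the author of the books write well"),
]
```

In practice `score_fn` would be a neural language model; here it is left abstract so the evaluation logic stands on its own.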

Fine-tuning Encoders for Improved Monolingual and Zero-shot Polylingual Neural Topic Modeling

1 code implementation NAACL 2021 Aaron Mueller, Mark Dredze

Neural topic models can augment or replace bag-of-words inputs with the learned representations of deep pre-trained transformer-based word prediction models.

Classification · Cross-Lingual Transfer · +3

Decoding Methods for Neural Narrative Generation

1 code implementation 14 Oct 2020 Alexandra DeLucia, Aaron Mueller, Xiang Lisa Li, João Sedoc

Narrative generation is an open-ended NLP task in which a model generates a story given a prompt.
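Open-ended generation depends heavily on the decoding strategy used to pick each next token. As a rough illustration of one widely used method in this family (a sketch, not the paper's implementation), nucleus (top-p) sampling over a toy next-token distribution:

```python
import random

def nucleus_sample(probs, p=0.9, rng=random):
    """Sample from the smallest set of highest-probability tokens
    whose cumulative mass reaches p (the "nucleus")."""
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, total = [], 0.0
    for token, prob in items:
        nucleus.append((token, prob))
        total += prob
        if total >= p:
            break
    # Sample within the nucleus, weighted by the retained probabilities.
    tokens, weights = zip(*nucleus)
    return rng.choices(tokens, weights=weights, k=1)[0]
```

Low p truncates toward greedy decoding; p close to 1 approaches sampling from the full distribution, trading coherence against diversity.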

Demographic Representation and Collective Storytelling in the Me Too Twitter Hashtag Activism Movement

no code implementations 13 Oct 2020 Aaron Mueller, Zach Wood-Doughty, Silvio Amir, Mark Dredze, Alicia L. Nobles

The #MeToo movement on Twitter has drawn attention to the pervasive nature of sexual harassment and violence.

An Analysis of Massively Multilingual Neural Machine Translation for Low-Resource Languages

no code implementations LREC 2020 Aaron Mueller, Garrett Nicolai, Arya D. McCarthy, Dylan Lewis, Winston Wu, David Yarowsky

We find that best practices in this domain are highly language-specific: adding more languages to a training set is often better, but too many harms performance; the best number depends on the source language.

Low-Resource Neural Machine Translation · Translation

Fine-grained Morphosyntactic Analysis and Generation Tools for More Than One Thousand Languages

no code implementations LREC 2020 Garrett Nicolai, Dylan Lewis, Arya D. McCarthy, Aaron Mueller, Winston Wu, David Yarowsky

Exploiting the broad translation of the Bible into the world's languages, we train and distribute morphosyntactic tools for approximately one thousand languages, vastly outstripping previous distributions of tools devoted to the processing of inflectional morphology.

Translation

Modeling Color Terminology Across Thousands of Languages

1 code implementation IJCNLP 2019 Arya D. McCarthy, Winston Wu, Aaron Mueller, Bill Watson, David Yarowsky

There is an extensive history of scholarship into what constitutes a "basic" color term, as well as a broadly attested acquisition sequence of basic color terms across many languages, as articulated in the seminal work of Berlin and Kay (1969).

Quantity doesn't buy quality syntax with neural language models

no code implementations IJCNLP 2019 Marten van Schijndel, Aaron Mueller, Tal Linzen

We investigate to what extent these shortcomings can be mitigated by increasing the size of the network and the corpus on which it is trained.
