no code implementations • 4 Dec 2024 • Tim Vieira, Ben LeBrun, Mario Giulianelli, Juan Luis Gastaldi, Brian DuSell, John Terilla, Timothy J. O'Donnell, Ryan Cotterell
Modern language models are, internally and mathematically, distributions over token strings rather than character strings, posing numerous challenges for programmers building user applications on top of them.
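As an illustration of this mismatch, here is a minimal sketch (the toy vocabulary and independent token probabilities are assumptions for illustration, not anything from the paper): a single character string can correspond to several distinct token strings, so its character-level probability is a sum over tokenizations.

```python
# Toy sketch: one character string, several token strings.
from itertools import product

VOCAB = {"he": 0.2, "llo": 0.1, "hel": 0.15, "lo": 0.2, "hello": 0.05}  # assumed toy probabilities

def tokenizations(s):
    """Enumerate all ways to segment s into vocabulary tokens."""
    if not s:
        yield []
        return
    for tok in VOCAB:
        if s.startswith(tok):
            for rest in tokenizations(s[len(tok):]):
                yield [tok] + rest

def char_string_prob(s):
    """Character-string probability = sum over tokenizations of the (toy, independent) token probabilities."""
    total = 0.0
    for toks in tokenizations(s):
        p = 1.0
        for t in toks:
            p *= VOCAB[t]
        total += p
    return total

print(list(tokenizations("hello")))  # [['he', 'llo'], ['hel', 'lo'], ['hello']]
print(char_string_prob("hello"))     # 0.02 + 0.03 + 0.05 = 0.1
```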
1 code implementation • 17 Jun 2024 • Eva Portelance, Siva Reddy, Timothy J. O'Donnell
Here, we argue that they are instead both contingent on a more general learning strategy for language acquisition: joint learning.
no code implementations • 7 Jun 2024 • Amanda Doucette, Ryan Cotterell, Morgan Sonderegger, Timothy J. O'Donnell
It has been claimed that within a language, morphologically irregular words are more likely to be phonotactically simple and morphologically regular words are more likely to be phonotactically complex.
no code implementations • 14 Feb 2023 • Kushal Arora, Timothy J. O'Donnell, Doina Precup, Jason Weston, Jackie C. K. Cheung
State-of-the-art language generation models can degenerate when applied to open-ended generation problems such as text completion, story generation, or dialog modeling.
no code implementations • ICLR 2022 • Benjamin LeBrun, Alessandro Sordoni, Timothy J. O'Donnell
To address this gap, we develop a controlled evaluation scheme which uses generative models trained on natural data as artificial languages from which we can exactly compute sequence probabilities.
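A minimal sketch of the underlying idea, with a toy bigram model standing in for a trained generative model (the vocabulary and probabilities are assumptions): when the data-generating model is known, the exact probability of any sequence can be computed by chaining its conditional distributions rather than estimated from samples.

```python
import math

# Toy autoregressive (bigram) model playing the role of the "artificial language"
# whose sequence probabilities are known exactly.
BIGRAM = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.3, "</s>": 0.2},
    "a":   {"cat": 0.4, "dog": 0.4, "</s>": 0.2},
    "cat": {"</s>": 1.0},
    "dog": {"</s>": 1.0},
}

def exact_log_prob(tokens):
    """Exact log-probability of a token sequence under the generating model."""
    logp, prev = 0.0, "<s>"
    for tok in tokens + ["</s>"]:
        logp += math.log(BIGRAM[prev][tok])
        prev = tok
    return logp

print(exact_log_prob(["the", "cat"]))  # log(0.6 * 0.5 * 1.0)
```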
1 code implementation • NeurIPS 2021 • Leon Bergen, Timothy J. O'Donnell, Dzmitry Bahdanau
Recent research suggests that systematic generalization in natural language understanding remains a challenge for state-of-the-art neural models such as Transformers and Graph Neural Networks.
no code implementations • ACL 2022 • Emily Goodwin, Siva Reddy, Timothy J. O'Donnell, Dzmitry Bahdanau
To test compositional generalization in semantic parsing, Keysers et al. (2020) introduced Compositional Freebase Queries (CFQ).
1 code implementation • EMNLP 2021 • Jacob Louis Hoover, Alessandro Sordoni, Wenyu Du, Timothy J. O'Donnell
Are pairs of words that tend to occur together also likely to stand in a linguistic dependency?
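One standard way to make "tend to occur together" precise is pointwise mutual information; here is a minimal sketch with invented counts (the numbers and the helper function are purely illustrative, not the paper's estimator).

```python
import math

def pmi(count_xy, count_x, count_y, n_pairs):
    """PMI(x, y) = log[ p(x, y) / (p(x) p(y)) ], estimated from co-occurrence counts."""
    p_xy = count_xy / n_pairs
    p_x, p_y = count_x / n_pairs, count_y / n_pairs
    return math.log(p_xy / (p_x * p_y))

# E.g. two words co-occur 30 times out of 10,000 observed pairs,
# appearing 200 and 150 times respectively.
print(pmi(30, 200, 150, 10_000))  # positive PMI: the pair co-occurs more often than chance
```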
no code implementations • ACL 2022 • Michaela Socolof, Jackie Chi Kit Cheung, Michael Wagner, Timothy J. O'Donnell
Second, the non-canonical meanings of words in an idiom are contingent on the presence of other words in the idiom.
no code implementations • 14 Apr 2021 • Leon Bergen, Dzmitry Bahdanau, Timothy J. O'Donnell
We present a model that jointly learns the denotations of words together with their groundings using a truth-conditional semantics.
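A minimal sketch of what a truth-conditional setup looks like, assuming a toy world and lexicon of my own (not the paper's model): word denotations are predicates over grounded objects, and a simple sentence is true when some object satisfies all of them.

```python
# Toy grounded world.
WORLD = [
    {"name": "obj1", "color": "red",  "shape": "cube"},
    {"name": "obj2", "color": "blue", "shape": "sphere"},
]

# Lexical denotations: predicates over objects in the world.
LEXICON = {
    "red":    lambda obj: obj["color"] == "red",
    "cube":   lambda obj: obj["shape"] == "cube",
    "sphere": lambda obj: obj["shape"] == "sphere",
}

def is_true(words, world=WORLD):
    """A phrase like 'red cube' is true iff some object satisfies every word's denotation."""
    return any(all(LEXICON[w](obj) for w in words) for obj in world)

print(is_true(["red", "cube"]))    # True: obj1 satisfies both predicates
print(is_true(["red", "sphere"]))  # False
```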
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Shawn Tan, Yikang Shen, Timothy J. O'Donnell, Alessandro Sordoni, Aaron Courville
We model the recursive production property of context-free grammars for natural and synthetic languages.
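A minimal sketch of the recursive production property itself, assuming a toy one-rule grammar (not the paper's model): a production that rewrites a nonterminal in terms of itself generates unboundedly nested strings.

```python
import random

# S -> ( S ) | a   -- recursive in S
GRAMMAR = {
    "S": [["(", "S", ")"], ["a"]],
}

def generate(symbol="S"):
    """Expand a nonterminal by recursively applying productions until only terminals remain."""
    if symbol not in GRAMMAR:
        return symbol
    rule = random.choice(GRAMMAR[symbol])
    return "".join(generate(sym) for sym in rule)

random.seed(0)
print(generate())  # e.g. "((a))" -- nesting depth depends on the random choices
```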
1 code implementation • ACL 2020 • Wenyu Du, Zhouhan Lin, Yikang Shen, Timothy J. O'Donnell, Yoshua Bengio, Yue Zhang
It is commonly believed that knowledge of syntactic structure should improve language modeling.
1 code implementation • ACL 2020 • Emily Goodwin, Koustuv Sinha, Timothy J. O'Donnell
Recently, there has been much interest in the question of whether deep natural language understanding models exhibit systematicity: generalizing such that units like words make consistent contributions to the meaning of the sentences in which they appear.
3 code implementations • 12 Dec 2019 • Dzmitry Bahdanau, Harm de Vries, Timothy J. O'Donnell, Shikhar Murty, Philippe Beaudoin, Yoshua Bengio, Aaron Courville
In this work, we study how systematic the generalization of such models is, that is, to what extent they are capable of handling novel combinations of known linguistic constructs.
1 code implementation • ACL 2019 • Shijie Wu, Ryan Cotterell, Timothy J. O'Donnell
We present a study of morphological irregularity.
1 code implementation • 31 Oct 2017 • Eva Portelance, Amelia Bruno, Daniel Harasim, Leon Bergen, Timothy J. O'Donnell
The following technical report presents a formal approach to probabilistic minimalist grammar parameter estimation.
no code implementations • 31 Oct 2017 • Daniel Harasim, Chris Bruno, Eva Portelance, Martin Rohrmeier, Timothy J. O'Donnell
This technical report presents a general framework for parsing a variety of grammar formalisms.