1 code implementation • WS 2020 • Stephen Mayhew, Klinton Bicknell, Chris Brust, Bill McDowell, Will Monroe, Burr Settles
We present the task of Simultaneous Translation and Paraphrasing for Language Education (STAPLE).
1 code implementation • NAACL 2018 • Will Monroe, Jennifer Hu, Andrew Jong, Christopher Potts
Contextual influences on language often exhibit substantial cross-lingual regularities; for example, we are more verbose in situations that require finer distinctions.
no code implementations • 8 Mar 2018 • Shuqing Bian, Zhenpeng Deng, Fei Li, Will Monroe, Peng Shi, Zijun Sun, Wei Wu, Sikuang Wang, William Yang Wang, Arianna Yuan, Tianwei Zhang, Jiwei Li
In the best setting, the proposed system identifies scam ICO projects with a precision of 0.83.
1 code implementation • TACL 2017 • Will Monroe, Robert X. D. Hawkins, Noah D. Goodman, Christopher Potts
We present a model of pragmatic referring expression interpretation in a grounded communication task (identifying colors from descriptions) that draws upon predictions from two recurrent neural network classifiers, a speaker and a listener, unified by a recursive pragmatic reasoning framework.
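A rough sketch of the idea, assuming toy stand-in scorers in place of the paper's trained RNN speaker and listener: a pragmatic listener combines speaker and listener predictions along RSA lines. The function names, the blending step, and the example colors below are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch of pragmatic (RSA-style) interpretation that blends a
# "literal listener" scorer and a "speaker" scorer. Both scorers are random
# stand-ins here; a real system would run trained RNN classifiers.
import numpy as np

def literal_listener_scores(utterance, referents):
    """Stand-in for a neural listener: log P_L0(referent | utterance)."""
    rng = np.random.default_rng(0)  # hypothetical scores only
    logits = rng.normal(size=len(referents))
    return logits - np.logaddexp.reduce(logits)

def speaker_scores(utterance, referents, all_utterances):
    """Stand-in for a neural speaker: log P_S0(utterance | referent)."""
    rng = np.random.default_rng(1)
    logits = rng.normal(size=(len(referents), len(all_utterances)))
    log_probs = logits - np.logaddexp.reduce(logits, axis=1, keepdims=True)
    return log_probs[:, all_utterances.index(utterance)]

def pragmatic_listener(utterance, referents, all_utterances, prior=None):
    """L1(referent | utterance) ∝ S0(utterance | referent) * P(referent),
    here also blended with the literal listener's score (an assumption)."""
    if prior is None:
        prior = np.full(len(referents), 1.0 / len(referents))
    log_post = speaker_scores(utterance, referents, all_utterances) + np.log(prior)
    log_post += literal_listener_scores(utterance, referents)  # blending step
    return np.exp(log_post - np.logaddexp.reduce(log_post))

colors = ["light blue", "dark blue", "teal"]
utterances = ["blue", "dark blue", "teal"]
print(pragmatic_listener("blue", colors, utterances))
```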
no code implementations • 22 Feb 2017 • Jiwei Li, Will Monroe, Dan Jurafsky
We show that from such a set of subsystems, one can use reinforcement learning to build a system that tailors its output to different input contexts at test time.
no code implementations • 23 Jan 2017 • Jiwei Li, Will Monroe, Dan Jurafsky
We introduce a simple, general strategy to manipulate the behavior of a neural decoder that enables it to generate outputs that have specific properties of interest (e.g., sequences of a pre-specified length).
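As a loose illustration of decode-time control over an output property, the sketch below rescores candidate responses by how close they come to a target length. The candidates, model scores, and weighting are invented for illustration and are not the paper's specific strategy.

```python
# Toy rescoring: combine a candidate's model log-probability with a
# property-specific score (here, a penalty for missing a target length).

def rescore(candidates, target_len, weight=1.0):
    """Return candidates sorted by log-probability plus a length penalty."""
    rescored = []
    for tokens, log_prob in candidates:
        length_penalty = -abs(len(tokens) - target_len)  # property score
        rescored.append((tokens, log_prob + weight * length_penalty))
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)

candidates = [
    ("i do not know".split(), -2.1),
    ("i am not sure what you mean by that".split(), -5.4),
    ("maybe".split(), -3.0),
]
best_tokens, best_score = rescore(candidates, target_len=8)[0]
print(" ".join(best_tokens), best_score)
```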
8 code implementations • EMNLP 2017 • Jiwei Li, Will Monroe, Tianlin Shi, Sébastien Jean, Alan Ritter, Dan Jurafsky
In this paper, drawing intuition from the Turing test, we propose using adversarial training for open-domain dialogue generation: the system is trained to produce sequences that are indistinguishable from human-generated dialogue utterances.
Ranked #1 on Dialogue Generation on Amazon-5
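The adversarial setup described in the abstract above can be pictured at toy scale: a discriminator learns to separate human responses from sampled ones, and the generator receives a REINFORCE-style update whose reward is the discriminator's belief that the sample is human. The bag-of-words generator and discriminator below are stand-ins, not the paper's sequence models.

```python
# Self-contained toy of an adversarial training loop for response generation.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["i", "am", "fine", "thanks", "ok", "sure"]
human_responses = [["i", "am", "fine", "thanks"], ["ok", "sure"]]

gen_logits = np.zeros(len(vocab))   # generator: i.i.d. token sampler
disc_w = np.zeros(len(vocab))       # discriminator: logistic regression
disc_b = 0.0

def bow(tokens):
    vec = np.zeros(len(vocab))
    for t in tokens:
        vec[vocab.index(t)] += 1
    return vec

def sample_response(length=3):
    probs = np.exp(gen_logits) / np.exp(gen_logits).sum()
    idx = rng.choice(len(vocab), size=length, p=probs)
    return [vocab[i] for i in idx], probs

def disc_prob_human(tokens):
    return 1.0 / (1.0 + np.exp(-(disc_w @ bow(tokens) + disc_b)))

lr = 0.1
for step in range(200):
    fake_tokens, probs = sample_response()
    # Discriminator update: human -> 1, generated -> 0 (logistic loss).
    for tokens, label in [(human_responses[step % 2], 1.0), (fake_tokens, 0.0)]:
        grad = disc_prob_human(tokens) - label
        disc_w -= lr * grad * bow(tokens)
        disc_b -= lr * grad
    # Generator update: REINFORCE with reward = D(sample looks human).
    reward = disc_prob_human(fake_tokens)
    grad_logits = bow(fake_tokens) - len(fake_tokens) * probs  # d log P / d logits
    gen_logits += lr * reward * grad_logits

print("Sample after training:", " ".join(sample_response()[0]))
```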
no code implementations • 24 Dec 2016 • Jiwei Li, Will Monroe, Dan Jurafsky
While neural networks have been successfully applied to many natural language processing tasks, they come at the cost of interpretability.
1 code implementation • 25 Nov 2016 • Jiwei Li, Will Monroe, Dan Jurafsky
We further propose a variation that is capable of automatically adjusting its diversity decoding rates for different inputs using reinforcement learning (RL).
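One way to picture diversity-promoting decoding is the rank penalty sketched below: continuations that share a parent beam are penalized in proportion to their within-parent rank, so the beam is less likely to fill up with near-duplicates. The constant gamma stands in for the per-input rate that the RL variant would learn, and all scores are made up.

```python
# Minimal sketch of rank-penalized ("diverse") beam expansion.

def diverse_expand(parent_expansions, gamma=0.5, beam_size=2):
    """parent_expansions: {parent_id: [(token, log_prob), ...]}"""
    pool = []
    for parent_id, continuations in parent_expansions.items():
        ranked = sorted(continuations, key=lambda c: c[1], reverse=True)
        for rank, (token, log_prob) in enumerate(ranked, start=1):
            pool.append((parent_id, token, log_prob - gamma * rank))
    pool.sort(key=lambda item: item[2], reverse=True)
    return pool[:beam_size]

expansions = {
    0: [("the", -0.1), ("a", -0.2), ("this", -1.5)],
    1: [("i", -0.4), ("we", -0.9)],
}
print(diverse_expand(expansions))
```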
1 code implementation • EMNLP 2016 • Will Monroe, Noah D. Goodman, Christopher Potts
The production of color language is essential for grounded language generation.
8 code implementations • EMNLP 2016 • Jiwei Li, Will Monroe, Alan Ritter, Michel Galley, Jianfeng Gao, Dan Jurafsky
Recent neural models of dialogue generation offer great promise for generating responses for conversational agents, but tend to be shortsighted, predicting utterances one at a time while ignoring their influence on future outcomes.
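In generic policy-gradient notation (mine, not necessarily the paper's exact formulation), optimizing for future outcomes amounts to treating the decoder as a policy over responses and maximizing expected cumulative reward across simulated turns:

```latex
% Generic policy-gradient form: p_theta is the decoder viewed as a policy over
% responses a_t given dialogue state s_t; R_t is the observed return from turn t
% and b is a baseline.
\begin{aligned}
J(\theta) &= \mathbb{E}_{a_{1:T} \sim p_\theta}\Big[\sum_{t=1}^{T} R(s_t, a_t)\Big], \\
\nabla_\theta J(\theta) &\approx \sum_{t=1}^{T} \nabla_\theta \log p_\theta(a_t \mid s_t)\,\big(R_t - b\big).
\end{aligned}
```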
no code implementations • 23 Oct 2015 • Will Monroe, Christopher Potts
The Rational Speech Acts (RSA) model treats language use as a recursive process in which probabilistic speaker and listener agents reason about each other's intentions to enrich the literal semantics of their language along broadly Gricean lines.
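For reference, the RSA recursion in its common textbook notation (not necessarily the exact parameterization or learning objective used in the paper):

```latex
% Standard RSA recursion: a literal listener grounded in the semantics,
% a pragmatic speaker reasoning about that listener, and a pragmatic
% listener reasoning about the speaker. C(u) is utterance cost.
\begin{aligned}
L_0(m \mid u) &\propto [\![u]\!](m)\, P(m)                                   && \text{literal listener} \\
S_1(u \mid m) &\propto \exp\!\big(\lambda\,(\log L_0(m \mid u) - C(u))\big)  && \text{pragmatic speaker} \\
L_1(m \mid u) &\propto S_1(u \mid m)\, P(m)                                  && \text{pragmatic listener}
\end{aligned}
```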
no code implementations • IJCNLP 2015 • Angel Chang, Will Monroe, Manolis Savva, Christopher Potts, Christopher D. Manning
The ability to map descriptions of scenes to 3D geometric representations has many applications in areas such as art, education, and robotics.