no code implementations • 2 Jul 2022 • Aaron Chan, Shaoliang Nie, Liang Tan, Xiaochang Peng, Hamed Firooz, Maziar Sanjabi, Xiang Ren
Following how humans communicate, free-text rationales aim to use natural language to explain neural language model (LM) behavior.
1 code implementation • BigScience (ACL) 2022 • Aaron Chan, Maziar Sanjabi, Lambert Mathias, Liang Tan, Shaoliang Nie, Xiaochang Peng, Xiang Ren, Hamed Firooz
An extractive rationale explains a language model's (LM's) prediction on a given task instance by highlighting the text inputs that most influenced the prediction.
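The idea can be illustrated with a minimal sketch. The snippet below uses occlusion-based saliency — score each token by how much the model's prediction score drops when that token is removed — as one simple stand-in for the attribution methods studied in the paper; the function names and the toy keyword-counting scorer are illustrative assumptions, not the paper's actual setup.

```python
def extractive_rationale(tokens, score_fn, k=2):
    """Return the indices of the k tokens whose removal most reduces
    the model's score (occlusion saliency; a generic illustration)."""
    base = score_fn(tokens)
    importance = []
    for i in range(len(tokens)):
        masked = tokens[:i] + tokens[i + 1:]   # drop one token
        importance.append(base - score_fn(masked))
    top = sorted(range(len(tokens)), key=lambda i: -importance[i])[:k]
    return sorted(top)

# Toy "model": counts positive sentiment words.
score = lambda toks: sum(t in {"love", "great"} for t in toks)
rationale = extractive_rationale(["i", "love", "this", "great", "movie"], score)
```

Here `rationale` picks out the two tokens ("love", "great") that most influenced the toy score, which is the shape of output an extractive rationale provides.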
no code implementations • WS 2019 • Fan Yang, Xiaochang Peng, Gargi Ghosh, Reshef Shilon, Hao Ma, Eider Moore, Goran Predovic
Interactions among users on social network platforms are usually positive, constructive and insightful.
no code implementations • CL 2019 • Daniel Gildea, Giorgio Satta, Xiaochang Peng
Our algorithms are based on finding a tree decomposition of smallest width, relative to the vertex order, and then extracting one rule for each node in this structure.
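The notion of "width relative to the vertex order" can be made concrete with a short sketch: eliminating vertices in the given order (adding fill-in edges among each vertex's later neighbors) induces a tree decomposition whose width is the largest such neighbor set. This is a standard elimination-ordering construction, shown here only to illustrate the quantity being minimized, not the paper's rule-extraction algorithm itself.

```python
def elimination_width(edges, order):
    """Width of the tree decomposition induced by eliminating
    vertices in `order`, counting fill-in edges."""
    adj = {v: set() for v in order}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    pos = {v: i for i, v in enumerate(order)}
    width = 0
    for v in order:
        later = {u for u in adj[v] if pos[u] > pos[v]}
        width = max(width, len(later))
        # fill-in: later neighbors of v become pairwise adjacent
        for a in later:
            for b in later:
                if a != b:
                    adj[a].add(b)
    return width
```

For example, any elimination order of a 4-cycle gives width 2, while a simple path gives width 1, matching the treewidth of those graphs.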
no code implementations • CL 2019 • Hao Zhang, Richard Sproat, Axel H. Ng, Felix Stahlberg, Xiaochang Peng, Kyle Gorman, Brian Roark
One problem that has been somewhat resistant to effective machine learning solutions is text normalization for speech applications such as text-to-speech synthesis (TTS).
1 code implementation • ACL 2018 • Xiaochang Peng, Linfeng Song, Daniel Gildea, Giorgio Satta
In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to AMR semantic graphs.
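A prerequisite for any sequence-to-sequence approach to AMR is linearizing the graph into a token sequence. The sketch below shows one common depth-first, PENMAN-style linearization in which a reentrant node is emitted as a bare variable on later visits; the exact linearization used in the paper may differ, and the toy graph encoding (`var -> (concept, edge list)`) is an assumption for illustration.

```python
def linearize(graph, root):
    """Depth-first linearization of an AMR-style graph into tokens,
    reusing variable names for reentrant nodes."""
    tokens = []
    visited = set()

    def visit(var):
        if var in visited:
            tokens.append(var)          # reentrancy: variable only
            return
        visited.add(var)
        concept, edges = graph[var]
        tokens.extend(["(", var, "/", concept])
        for role, target in edges:
            tokens.append(role)
            visit(target)
        tokens.append(")")

    visit(root)
    return tokens

# "the boy wants to go": note "b" (boy) is the ARG0 of both verbs
amr = {
    "w": ("want-01", [(":ARG0", "b"), (":ARG1", "g")]),
    "b": ("boy", []),
    "g": ("go-01", [(":ARG0", "b")]),
}
seq = linearize(amr, "w")
```

The resulting token sequence is what a seq2seq model would be trained to produce from the input sentence.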
no code implementations • CL 2018 • Daniel Gildea, Giorgio Satta, Xiaochang Peng
Motivated by the task of semantic parsing, we describe a transition system that generalizes standard transition-based dependency parsing techniques to generate a graph rather than a tree.
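The key generalization can be sketched in a few lines: in standard transition-based dependency parsing, attaching a node to a head removes it from further consideration, so every node gets exactly one head and the output is a tree. If arc actions leave the stack untouched (with an explicit reduce action instead), a node can acquire several heads, yielding a graph. The action set below is a generic illustration, not the paper's exact transition system.

```python
class GraphParser:
    """Minimal stack/buffer transition system whose arc actions do not
    pop, so nodes may receive multiple heads (reentrancies)."""

    def __init__(self, n_tokens):
        self.stack = []
        self.buffer = list(range(n_tokens))
        self.edges = set()              # (head, label, dependent)

    def shift(self):
        self.stack.append(self.buffer.pop(0))

    def left_arc(self, label):
        # arc from buffer front to stack top; stack top stays put
        self.edges.add((self.buffer[0], label, self.stack[-1]))

    def right_arc(self, label):
        # arc from stack top to buffer front; buffer front stays put
        self.edges.add((self.stack[-1], label, self.buffer[0]))

    def reduce(self):
        self.stack.pop()

# "boy wants go" with tokens 0, 1, 2: node 0 ends up with two heads.
p = GraphParser(3)
p.shift()                 # stack [0]
p.left_arc(":ARG0")       # 1 -> 0
p.shift()                 # stack [0, 1]
p.right_arc(":ARG1")      # 1 -> 2
p.reduce()                # stack [0]
p.left_arc(":ARG0")       # 2 -> 0  (reentrancy)
```

After this derivation, node 0 has heads 1 and 2, which no tree-producing transition system could derive.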
no code implementations • EACL 2017 • Xiaochang Peng, Chuan Wang, Daniel Gildea, Nianwen Xue
Neural attention models have achieved great success in different NLP tasks.
no code implementations • ACL 2017 • Linfeng Song, Xiaochang Peng, Yue Zhang, Zhiguo Wang, Daniel Gildea
This paper addresses the task of AMR-to-text generation by leveraging synchronous node replacement grammar.
no code implementations • EMNLP 2016 • Linfeng Song, Yue Zhang, Xiaochang Peng, Zhiguo Wang, Daniel Gildea
The task of AMR-to-text generation is to generate grammatical text that preserves the semantic meaning of a given AMR graph.
no code implementations • 21 Jul 2016 • Xiaochang Peng, Daniel Gildea
In this paper, we introduce a variation of the skip-gram model which jointly learns distributed word vector representations and how to compose them into phrase embeddings.
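A minimal sketch of the joint objective, assuming NumPy is available: the phrase embedding is a weighted sum of its component word vectors, and a skip-gram negative-sampling update pushes gradients into the context vectors, the component word vectors, and the composition weights together. The tiny vocabulary, learning rate, and the simple weighted-sum composition are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["new", "york", "city", "visited", "i"]
idx = {w: i for i, w in enumerate(vocab)}
dim = 8

W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # target word vectors
W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # context vectors
comp = np.array([0.5, 0.5])                            # learned composition weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def phrase_vec():
    # phrase "new york" composed from its component word vectors
    return comp[0] * W_in[idx["new"]] + comp[1] * W_in[idx["york"]]

def train_pair(ctx, label, lr=0.5):
    """One negative-sampling step: label 1 for an observed context
    word, 0 for a negative sample; updates all three parameter sets."""
    v = phrase_vec()
    g = sigmoid(v @ W_out[idx[ctx]]) - label
    grad_v = g * W_out[idx[ctx]]
    W_out[idx[ctx]] -= lr * g * v
    c0, c1 = comp
    comp[0] -= lr * (grad_v @ W_in[idx["new"]])    # composition weights
    comp[1] -= lr * (grad_v @ W_in[idx["york"]])
    W_in[idx["new"]] -= lr * c0 * grad_v           # component word vectors
    W_in[idx["york"]] -= lr * c1 * grad_v

before = sigmoid(phrase_vec() @ W_out[idx["city"]])
for _ in range(50):
    train_pair("city", 1.0)   # observed context of "new york"
    train_pair("i", 0.0)      # negative sample
after = sigmoid(phrase_vec() @ W_out[idx["city"]])
```

After training, the composed phrase vector assigns a higher probability to its true context word than before, while the word vectors and composition weights have been fit jointly rather than in separate stages.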