Search Results for author: Xiaochang Peng

Found 15 papers, 2 papers with code

FRAME: Evaluating Rationale-Label Consistency Metrics for Free-Text Rationales

no code implementations • 2 Jul 2022 • Aaron Chan, Shaoliang Nie, Liang Tan, Xiaochang Peng, Hamed Firooz, Maziar Sanjabi, Xiang Ren

Following how humans communicate, free-text rationales aim to use natural language to explain neural language model (LM) behavior.

Hallucination • Language Modelling • +2
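
Since the entry above is terse, here is a minimal sketch of what a rationale-label consistency check can look like: does conditioning a classifier on its own rationale leave the predicted label unchanged? The `predict_label` function is a hypothetical placeholder, not the paper's model, and the metric shown is just one simple instance of the family of metrics FRAME evaluates.

```python
# Toy sketch of a rationale-label consistency check: does conditioning the
# model on its own rationale keep the predicted label stable?

def predict_label(text: str) -> str:
    # Hypothetical placeholder classifier, not the paper's model.
    return "positive" if "good" in text.lower() else "negative"

def rationale_label_consistency(pairs):
    """pairs: iterable of (input_text, rationale) tuples."""
    agree = 0
    for text, rationale in pairs:
        label_plain = predict_label(text)
        label_with_rationale = predict_label(text + " " + rationale)
        agree += (label_plain == label_with_rationale)
    return agree / len(pairs)

examples = [
    ("The movie was good fun.", "It praises the movie, so sentiment is positive."),
    ("Dull and overlong.", "The review criticizes pacing, so sentiment is negative."),
]
print(rationale_label_consistency(examples))  # 1.0 for this toy classifier
```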

UNIREX: A Unified Learning Framework for Language Model Rationale Extraction

1 code implementation • BigScience (ACL) 2022 • Aaron Chan, Maziar Sanjabi, Lambert Mathias, Liang Tan, Shaoliang Nie, Xiaochang Peng, Xiang Ren, Hamed Firooz

An extractive rationale explains a language model's (LM's) prediction on a given task instance by highlighting the text inputs that most influenced the prediction.

Language Modelling • text-classification • +1
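
As a rough illustration of extractive rationale extraction, the sketch below scores tokens by input-gradient saliency on a toy bag-of-embeddings classifier and keeps the top-k as the rationale. The model, vocabulary, and gradient-norm heuristic are assumptions for illustration; UNIREX itself is a framework that unifies many extractor and training-objective choices rather than prescribing this one.

```python
import torch

torch.manual_seed(0)
vocab = {"the": 0, "plot": 1, "was": 2, "brilliant": 3, "acting": 4, "wooden": 5}
emb = torch.nn.Embedding(len(vocab), 8)
clf = torch.nn.Linear(8, 2)

tokens = ["the", "acting", "was", "brilliant"]
ids = torch.tensor([vocab[t] for t in tokens])

vecs = emb(ids)                 # (seq_len, dim); non-leaf, so retain its grad
vecs.retain_grad()
logits = clf(vecs.mean(dim=0))  # bag-of-embeddings prediction
pred = logits.argmax()
logits[pred].backward()

# Saliency per token = L2 norm of the gradient wrt its embedding.
saliency = vecs.grad.norm(dim=1)
top_k = saliency.topk(2).indices.tolist()
print("rationale tokens:", [tokens[i] for i in sorted(top_k)])
```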

Ordered Tree Decomposition for HRG Rule Extraction

no code implementations • CL 2019 • Daniel Gildea, Giorgio Satta, Xiaochang Peng

Our algorithms are based on finding a tree decomposition of smallest width, relative to the vertex order, and then extracting one rule for each node in this structure.

Tree Decomposition
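
For intuition, the sketch below implements the textbook construction that such order-based algorithms build on: given a fixed vertex order, the bag for each vertex is the vertex plus its later neighbors after fill-in, and the width is the largest bag size minus one. This is a simplification for illustration, not a reimplementation of the paper's algorithm.

```python
# Width of the tree decomposition induced by a vertex (elimination) order:
# each vertex's bag is itself plus its later neighbors after fill-in.

def tree_decomposition_width(edges, order):
    pos = {v: i for i, v in enumerate(order)}
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    width = 0
    for v in order:
        later = {u for u in adj[v] if pos[u] > pos[v]}
        width = max(width, len(later))       # |bag| - 1
        for u in later:                      # fill-in: later neighbors of v
            adj[u] |= later - {u}            # become mutually adjacent
    return width

# A 4-cycle has treewidth 2, and this order achieves it.
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]
print(tree_decomposition_width(edges, ["a", "b", "c", "d"]))  # -> 2
```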

Neural Models of Text Normalization for Speech Applications

no code implementations • CL 2019 • Hao Zhang, Richard Sproat, Axel H. Ng, Felix Stahlberg, Xiaochang Peng, Kyle Gorman, Brian Roark

One problem that has been somewhat resistant to effective machine learning solutions is text normalization for speech applications such as text-to-speech synthesis (TTS).

BIG-bench Machine Learning • Speech Synthesis • +1
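
As a concrete picture of the task itself (not of the paper's neural models), the toy normalizer below verbalizes digit tokens into their spoken form, the kind of written-to-spoken mapping a TTS front end needs before synthesis.

```python
# Toy rule-based text normalizer: map written-form number tokens (0-99)
# to their spoken form for TTS.

ONES = ["zero", "one", "two", "three", "four", "five", "six", "seven",
        "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
        "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
TENS = ["", "", "twenty", "thirty", "forty", "fifty", "sixty", "seventy",
        "eighty", "ninety"]

def verbalize_number(n: int) -> str:
    if n < 20:
        return ONES[n]
    if n < 100:
        tens, rest = divmod(n, 10)
        return TENS[tens] + ("" if rest == 0 else " " + ONES[rest])
    raise ValueError("toy normalizer handles 0-99 only")

def normalize(sentence: str) -> str:
    return " ".join(verbalize_number(int(t)) if t.isdigit() else t
                    for t in sentence.split())

print(normalize("gate 42 closes in 15 minutes"))
# -> gate forty two closes in fifteen minutes
```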

Sequence-to-sequence Models for Cache Transition Systems

1 code implementation • ACL 2018 • Xiaochang Peng, Linfeng Song, Daniel Gildea, Giorgio Satta

In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to Abstract Meaning Representation (AMR) semantic graphs.

AMR Parsing • Hard Attention • +1
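
The sketch below shows the general shape such a system can take: a GRU encoder-decoder reads the sentence and emits a sequence of transition actions, which a cache transition system would then replay into a graph (see the next entry). The vocabularies, action inventory, and architecture here are toy placeholders, not the paper's.

```python
import torch
import torch.nn as nn

WORDS = {"<pad>": 0, "the": 1, "boy": 2, "wants": 3, "to": 4, "go": 5}
ACTS = {"<s>": 0, "SHIFT": 1, "PUSH": 2, "ARC": 3, "POP": 4, "</s>": 5}

class Seq2Seq(nn.Module):
    def __init__(self, dim=32):
        super().__init__()
        self.src_emb = nn.Embedding(len(WORDS), dim)
        self.tgt_emb = nn.Embedding(len(ACTS), dim)
        self.enc = nn.GRU(dim, dim, batch_first=True)
        self.dec = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, len(ACTS))

    def forward(self, src, tgt):
        _, h = self.enc(self.src_emb(src))        # final encoder state
        dec_out, _ = self.dec(self.tgt_emb(tgt), h)
        return self.out(dec_out)                  # (batch, tgt_len, |ACTS|)

model = Seq2Seq()
src = torch.tensor([[1, 2, 3, 4, 5]])             # "the boy wants to go"
tgt_in = torch.tensor([[0, 1, 2, 3, 4]])          # <s> SHIFT PUSH ARC POP
tgt_gold = torch.tensor([[1, 2, 3, 4, 5]])        # same actions, shifted by one
logits = model(src, tgt_in)
loss = nn.functional.cross_entropy(logits.view(-1, len(ACTS)), tgt_gold.view(-1))
loss.backward()
print(float(loss))
```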

Cache Transition Systems for Graph Parsing

no code implementations • CL 2018 • Daniel Gildea, Giorgio Satta, Xiaochang Peng

Motivated by the task of semantic parsing, we describe a transition system that generalizes standard transition-based dependency parsing techniques to generate a graph rather than a tree.

Semantic Parsing • Transition-Based Dependency Parsing • +1
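
A toy replay of a cache-style transition system is sketched below: a configuration holds a stack, a fixed-size cache, and a buffer, and edges are created between cache elements, which is how a graph (with reentrancies) rather than a tree can be produced. The action names and semantics are simplified guesses for illustration, not the paper's exact system.

```python
# Toy cache transition system: replay an action sequence into graph edges.

def run(actions, tokens, cache_size=2):
    stack, cache, buffer = [], [None] * cache_size, list(tokens)
    edges = []
    for act in actions:
        if act[0] == "shift":        # move next token into cache slot i,
            i = act[1]               # spilling the old occupant to the stack
            stack.append((i, cache[i]))
            cache[i] = buffer.pop(0)
        elif act[0] == "arc":        # connect two cache elements
            i, j = act[1], act[2]
            edges.append((cache[i], cache[j]))
        elif act[0] == "pop":        # restore the spilled element
            i, old = stack.pop()
            cache[i] = old
    return edges

tokens = ["boy", "want", "go"]
actions = [("shift", 0), ("shift", 1), ("arc", 1, 0),    # want -> boy
           ("shift", 1), ("arc", 1, 0),                  # go -> boy (reentrancy)
           ("pop",), ("pop",), ("pop",)]
print(run(actions, tokens))  # [('want', 'boy'), ('go', 'boy')]
```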

AMR-to-text generation as a Traveling Salesman Problem

no code implementations • EMNLP 2016 • Linfeng Song, Yue Zhang, Xiaochang Peng, Zhiguo Wang, Daniel Gildea

The task of AMR-to-text generation is to generate grammatical text that preserves the semantic meaning of a given AMR graph.

AMR-to-Text Generation • Text Generation • +2
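
The core of the reduction can be pictured as follows: choose an order for the text fragments realizing the AMR concepts that minimizes pairwise transition costs, i.e., solve a (path) traveling salesman problem. The costs below are made-up values, and the brute-force search is a stand-in; the paper learns the costs from data and uses a proper TSP solver.

```python
from itertools import permutations

fragments = ["the boy", "wants", "to go"]
cost = {                 # cost[i][j]: penalty for fragment j following fragment i
    0: {1: 0.1, 2: 0.9},
    1: {0: 0.8, 2: 0.1},
    2: {0: 0.7, 1: 0.9},
}

def tour_cost(order):
    return sum(cost[a][b] for a, b in zip(order, order[1:]))

best = min(permutations(range(len(fragments))), key=tour_cost)
print(" ".join(fragments[i] for i in best))  # -> the boy wants to go
```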

Exploring phrase-compositionality in skip-gram models

no code implementations • 21 Jul 2016 • Xiaochang Peng, Daniel Gildea

In this paper, we introduce a variation of the skip-gram model which jointly learns distributed word vector representations and the way in which they compose to form phrase embeddings.

Dependency Parsing
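
A minimal sketch of the idea, under assumptions of my own: represent a phrase vector as a weighted sum of its word vectors, with the composition weights learned jointly via skip-gram negative sampling. The additive composition with one scalar weight per word is an illustrative guess, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 6, 16                        # vocab size, embedding dim
W_in = rng.normal(0, 0.1, (V, D))   # input vectors
W_out = rng.normal(0, 0.1, (V, D))  # output (context) vectors
comp = np.ones(V)                   # per-word composition weights (assumed)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def phrase_vec(words):
    return sum(comp[w] * W_in[w] for w in words)

def sgns_step(phrase, context, negatives, lr=0.1):
    v = phrase_vec(phrase)
    for c, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        g = sigmoid(v @ W_out[c]) - label   # gradient of logistic loss wrt score
        grad_v = g * W_out[c]
        W_out[c] -= lr * g * v
        for w in phrase:                    # backprop into words and weights
            comp[w] -= lr * (grad_v @ W_in[w])
            W_in[w] -= lr * comp[w] * grad_v

# Train the phrase "new york" (words 0, 1) to predict context word 2.
for _ in range(50):
    sgns_step([0, 1], 2, negatives=[rng.integers(3, V)])
print(sigmoid(phrase_vec([0, 1]) @ W_out[2]))  # should approach 1
```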
