Search Results for author: Jiangming Liu

Found 19 papers, 5 papers with code

Universal Discourse Representation Structure Parsing

no code implementations • CL (ACL) 2021 • Jiangming Liu, Shay B. Cohen, Mirella Lapata, Johan Bos

We consider the task of cross-lingual semantic parsing in the style of Discourse Representation Theory (DRT), where knowledge from annotated corpora in a resource-rich language is transferred via bitext to guide learning in other languages.

Semantic Parsing

VFed-SSD: Towards Practical Vertical Federated Advertising

no code implementations • 31 May 2022 • Wenjie Li, Qiaolin Xia, Junfeng Deng, Hao Cheng, Jiangming Liu, Kouying Xue, Yong Cheng, Shu-Tao Xia

As an emerging secure learning paradigm in leveraging cross-agency private data, vertical federated learning (VFL) is expected to improve advertising models by enabling the joint learning of complementary user attributes privately owned by the advertiser and the publisher.

Federated Learning • Knowledge Distillation +1

Text Generation from Discourse Representation Structures

no code implementations • NAACL 2021 • Jiangming Liu, Shay B. Cohen, Mirella Lapata

We propose neural models to generate text from formal meaning representations based on Discourse Representation Structures (DRSs).

Text Generation

Evaluating NLP Models via Contrast Sets

no code implementations • 1 Oct 2020 • Matt Gardner, Yoav Artzi, Victoria Basmova, Jonathan Berant, Ben Bogin, Sihao Chen, Pradeep Dasigi, Dheeru Dua, Yanai Elazar, Ananth Gottumukkala, Nitish Gupta, Hanna Hajishirzi, Gabriel Ilharco, Daniel Khashabi, Kevin Lin, Jiangming Liu, Nelson F. Liu, Phoebe Mulcaire, Qiang Ning, Sameer Singh, Noah A. Smith, Sanjay Subramanian, Reut Tsarfaty, Eric Wallace, A. Zhang, Ben Zhou

Unfortunately, when a dataset has systematic gaps (e.g., annotation artifacts), these evaluations are misleading: a model can learn simple decision rules that perform well on the test set but do not capture a dataset's intended capabilities (a toy example is sketched below).

Reading Comprehension • Sentiment Analysis
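For illustration only (a hypothetical toy example, not drawn from the paper's released data): a contrast set pairs a test instance with a minimally edited variant whose gold label flips, so a model relying on a lexical shortcut can pass the original yet fail the edit.

    # Toy contrast-set pair for sentiment analysis (hypothetical data).
    original = {"text": "The acting was great.", "label": "positive"}
    contrast = {"text": "The acting was not great.", "label": "negative"}

    def shortcut_model(example):
        # A bag-of-words rule that ignores negation ("great" => positive).
        return "positive" if "great" in example["text"] else "negative"

    assert shortcut_model(original) == original["label"]  # right, for the wrong reason
    assert shortcut_model(contrast) != contrast["label"]  # exposed by the contrast set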

Dscorer: A Fast Evaluation Metric for Discourse Representation Structure Parsing

no code implementations • ACL 2020 • Jiangming Liu, Shay B. Cohen, Mirella Lapata

Discourse representation structures (DRSs) are scoped semantic representations for texts of arbitrary length.
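As a quick illustration (textbook DRT, not an excerpt from the paper): in a common linearized notation, a DRS is a box of discourse referents and conditions, and scoped operators such as negation embed one box inside another.

    [x | man(x), walk(x)]       "A man walks."
    ¬[x | man(x), walk(x)]      "No man walks."  (negation scopes over the embedded box)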

DRTS Parsing with Structure-Aware Encoding and Decoding

no code implementations • ACL 2020 • Qiankun Fu, Yue Zhang, Jiangming Liu, Meishan Zhang

Discourse representation tree structure (DRTS) parsing is a novel semantic parsing task that has recently attracted attention.

Graph Attention • Semantic Parsing

Multi-Step Inference for Reasoning Over Paragraphs

no code implementations • EMNLP 2020 • Jiangming Liu, Matt Gardner, Shay B. Cohen, Mirella Lapata

Complex reasoning over text requires understanding and chaining together free-form predicates and logical connectives.

Logical Reasoning

Discourse Representation Parsing for Sentences and Documents

no code implementations • ACL 2019 • Jiangming Liu, Shay B. Cohen, Mirella Lapata

We introduce a novel semantic parsing task based on Discourse Representation Theory (DRT; Kamp and Reyle 1993).

Semantic Parsing • Sentence

Discourse Representation Structure Parsing

1 code implementation • ACL 2018 • Jiangming Liu, Shay B. Cohen, Mirella Lapata

We introduce an open-domain neural semantic parser which generates formal meaning representations in the style of Discourse Representation Theory (DRT; Kamp and Reyle 1993).

Question Answering • Semantic Parsing

In-Order Transition-based Constituent Parsing

2 code implementations • TACL 2017 • Jiangming Liu, Yue Zhang

Both bottom-up and top-down strategies have been used for neural transition-based constituent parsing.
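A minimal sketch of the in-order idea (my reconstruction in Python, not the authors' released code): each constituent projects its nonterminal after its first child is built, sitting between the bottom-up order (project after all children) and the top-down order (project before any child).

    # Toy oracle producing in-order transition sequences with a
    # SHIFT / PJ-X (project nonterminal X) / REDUCE action inventory.
    # Trees are (label, [children]) tuples; leaves are word strings.
    def inorder_actions(node, actions=None):
        if actions is None:
            actions = []
        if isinstance(node, str):
            actions.append("SHIFT")                # push the next word
            return actions
        label, children = node
        inorder_actions(children[0], actions)      # first child comes first
        actions.append(f"PJ-{label}")              # then project the nonterminal
        for child in children[1:]:
            inorder_actions(child, actions)        # remaining children
        actions.append("REDUCE")                   # close the constituent
        return actions

    tree = ("S", [("NP", ["The", "dog"]), ("VP", ["barks"])])
    print(inorder_actions(tree))
    # ['SHIFT', 'PJ-NP', 'SHIFT', 'REDUCE', 'PJ-S', 'SHIFT', 'PJ-VP', 'REDUCE', 'REDUCE']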

Encoder-Decoder Shift-Reduce Syntactic Parsing

1 code implementation • WS 2017 • Jiangming Liu, Yue Zhang

Starting from neural machine translation (NMT), encoder-decoder neural networks have been used for many NLP problems.

Dependency Parsing • NMT

Attention Modeling for Targeted Sentiment

no code implementations • EACL 2017 • Jiangming Liu, Yue Zhang

However, existing neural models do not explicitly model the contribution of each word in a sentence with respect to targeted sentiment polarities.

General Classification • Sentence +2

Shift-Reduce Constituent Parsing with Neural Lookahead Features

1 code implementation • TACL 2017 • Jiangming Liu, Yue Zhang

In particular, we build a bidirectional LSTM model, which leverages the full sentence information to predict the hierarchy of constituents that each word starts and ends (a minimal sketch follows below).

Sentence
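A minimal sketch of the lookahead component (assuming PyTorch; the dimensions and single-label heads are illustrative simplifications, not the paper's exact configuration): a BiLSTM reads the full sentence, and two heads predict per word the constituents it starts and the constituents it ends, which can then feed a shift-reduce parser as lookahead features.

    import torch
    import torch.nn as nn

    class LookaheadTagger(nn.Module):
        # Sketch: per-word prediction of constituent start/end labels
        # from a bidirectional LSTM over the whole sentence.
        def __init__(self, vocab_size, n_labels, emb_dim=100, hidden=200):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                                  bidirectional=True)
            self.start_head = nn.Linear(2 * hidden, n_labels)  # constituents a word starts
            self.end_head = nn.Linear(2 * hidden, n_labels)    # constituents a word ends

        def forward(self, token_ids):                  # (batch, seq_len)
            states, _ = self.bilstm(self.embed(token_ids))
            return self.start_head(states), self.end_head(states)

    model = LookaheadTagger(vocab_size=10000, n_labels=30)
    start_logits, end_logits = model(torch.randint(0, 10000, (1, 6)))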
