Search Results for author: Tianze Shi

Found 18 papers, 12 papers with code

Transition-based Bubble Parsing: Improvements on Coordination Structure Prediction

1 code implementation ACL 2021 Tianze Shi, Lillian Lee

We propose a transition-based bubble parser to perform coordination structure identification and dependency-based syntactic analysis simultaneously.

Diversity-Aware Batch Active Learning for Dependency Parsing

1 code implementation NAACL 2021 Tianze Shi, Adrian Benton, Igor Malioutov, Ozan İrsoy

While the predictive performance of modern statistical dependency parsers relies heavily on the availability of expensive expert-annotated treebank data, not all annotations contribute equally to the training of the parsers.

Active Learning, Dependency Parsing +1

Learning Syntax from Naturally-Occurring Bracketings

1 code implementation NAACL 2021 Tianze Shi, Ozan İrsoy, Igor Malioutov, Lillian Lee

Naturally-occurring bracketings, such as answer fragments to natural language questions and hyperlinks on webpages, can reflect human syntactic intuition regarding phrasal boundaries.

Constituency Parsing

Semantic Role Labeling as Syntactic Dependency Parsing

1 code implementation EMNLP 2020 Tianze Shi, Igor Malioutov, Ozan İrsoy

We reduce the task of (span-based) PropBank-style semantic role labeling (SRL) to syntactic dependency parsing.

Dependency Parsing, Semantic Role Labeling
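The reduction can be illustrated with a toy sketch: each PropBank argument span is collapsed to its syntactic head word, which is attached to the predicate as a dependency edge labeled with the semantic role. The function names and data layout below are illustrative assumptions, not the paper's actual conversion code.

```python
# Toy sketch (assumed representation): reduce span-based SRL to labeled
# head-to-predicate dependency edges, one edge per argument span.

def srl_to_dependencies(predicate_idx, arguments, span_head):
    """arguments: list of (start, end, role) spans (inclusive word indices).
    span_head: function mapping a span to its syntactic head index."""
    edges = []
    for start, end, role in arguments:
        head = span_head(start, end)
        # Edge: argument head word -> predicate, labeled with the SRL role.
        edges.append((head, predicate_idx, role))
    return edges

# Example: "The cat chased the mouse" with predicate "chased" (index 2).
# For illustration, assume the head of each NP span is its last word.
edges = srl_to_dependencies(
    2,
    [(0, 1, "ARG0"), (3, 4, "ARG1")],
    span_head=lambda s, e: e,
)
```

Recovering argument spans from the predicted dependency edges then requires an inverse step (projecting each head back to a span), which is where the modeling choices in the paper come in.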

Extracting Headless MWEs from Dependency Parse Trees: Parsing, Tagging, and Joint Modeling Approaches

1 code implementation ACL 2020 Tianze Shi, Lillian Lee

An interesting and frequent type of multi-word expression (MWE) is the headless MWE, for which there are no true internal syntactic dominance relations; examples include many named entities ("Wells Fargo") and dates ("July 5, 2020") as well as certain productive constructions ("blow for blow", "day after day").

Valency-Augmented Dependency Parsing

1 code implementation EMNLP 2018 Tianze Shi, Lillian Lee

We present a complete, automated, and efficient approach for utilizing valency analysis in making dependency parsing decisions.

Dependency Parsing

IncSQL: Training Incremental Text-to-SQL Parsers with Non-Deterministic Oracles

no code implementations 13 Sep 2018 Tianze Shi, Kedar Tatwawadi, Kaushik Chakrabarti, Yi Mao, Oleksandr Polozov, Weizhu Chen

We present a sequence-to-action parsing approach for the natural language to SQL task that incrementally fills the slots of a SQL query with feasible actions from a pre-defined inventory.

Action Parsing, Text-To-SQL
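The sequence-to-action idea can be sketched as executing a sequence of slot-filling actions drawn from a small pre-defined inventory. The action names and query layout below are illustrative assumptions, not the IncSQL system's actual inventory; note also that a non-deterministic oracle may accept several action orders that yield the same final query.

```python
# Illustrative sketch (not the paper's code): build a SQL query by applying
# actions from a small, pre-defined inventory, one slot at a time.

def apply_actions(actions):
    query = {"select": [], "from": None, "where": []}
    for act, arg in actions:
        if act == "SELECT_COL":
            query["select"].append(arg)
        elif act == "FROM_TABLE":
            query["from"] = arg
        elif act == "WHERE_COND":
            query["where"].append(arg)
        else:
            raise ValueError(f"unknown action: {act}")
    sql = "SELECT " + ", ".join(query["select"]) + " FROM " + query["from"]
    if query["where"]:
        sql += " WHERE " + " AND ".join(query["where"])
    return sql

# One feasible action sequence for a simple question:
sql = apply_actions([
    ("SELECT_COL", "name"),
    ("FROM_TABLE", "employees"),
    ("WHERE_COND", "salary > 50000"),
])
```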

Global Transition-based Non-projective Dependency Parsing

1 code implementation ACL 2018 Carlos Gómez-Rodríguez, Tianze Shi, Lillian Lee

Shi, Huang, and Lee (2017) obtained state-of-the-art results for English and Chinese dependency parsing by combining dynamic-programming implementations of transition-based dependency parsers with a minimal set of bidirectional LSTM features.

Chinese Dependency Parsing, Dependency Parsing

Improving Coverage and Runtime Complexity for Exact Inference in Non-Projective Transition-Based Dependency Parsers

1 code implementation NAACL 2018 Tianze Shi, Carlos Gómez-Rodríguez, Lillian Lee

We generalize Cohen, Gómez-Rodríguez, and Satta's (2011) parser to a family of non-projective transition-based dependency parsers allowing polynomial-time exact inference.

Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set

1 code implementation EMNLP 2017 Tianze Shi, Liang Huang, Lillian Lee

We first present a minimal feature set for transition-based dependency parsing, continuing a recent trend started by Kiperwasser and Goldberg (2016a) and Cross and Huang (2016a) of using bi-directional LSTM features.

Transition-Based Dependency Parsing
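Transition-based dependency parsing, the shared setting of several papers above, can be sketched with a minimal arc-standard system. This is a generic textbook illustration under simplified assumptions, not the authors' implementation (which adds bi-LSTM features, exact decoding, and global training on top of such a system).

```python
# Minimal arc-standard transition system (textbook sketch, not the paper's code).
# SHIFT moves the next buffer word onto the stack; LEFT/RIGHT create a
# dependency arc between the top two stack items and pop the dependent.

def parse(n_words, transitions):
    """Apply a transition sequence to word indices 1..n_words.
    Returns the resulting set of (head, dependent) arcs."""
    stack, buffer, arcs = [], list(range(1, n_words + 1)), set()
    for t in transitions:
        if t == "SHIFT":
            stack.append(buffer.pop(0))
        elif t == "LEFT":            # second-from-top becomes dependent of top
            dep = stack.pop(-2)
            arcs.add((stack[-1], dep))
        elif t == "RIGHT":           # top becomes dependent of second-from-top
            dep = stack.pop()
            arcs.add((stack[-1], dep))
    return arcs

# "She eats fish": "eats" (2) heads both "She" (1) and "fish" (3).
arcs = parse(3, ["SHIFT", "SHIFT", "LEFT", "SHIFT", "RIGHT"])
```

A statistical parser scores the candidate transitions at each step (e.g. from bi-LSTM features of the stack and buffer tops) instead of being handed a gold sequence.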

Linking GloVe with word2vec

no code implementations 20 Nov 2014 Tianze Shi, Zhiyuan Liu

Global Vectors for Word Representation (GloVe), introduced by Jeffrey Pennington et al., is reported to be an efficient and effective method for learning vector representations of words.
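For context, GloVe's per-pair training objective (the standard formula from Pennington et al., which this paper relates to word2vec's skip-gram objective) is a weighted least-squares fit of word-vector dot products to log co-occurrence counts. The sketch below just evaluates that loss term for one word pair with toy values.

```python
import math

# GloVe's weighted least-squares loss for a single word pair (i, j):
#     f(x_ij) * (w_i . w~_j + b_i + b~_j - log x_ij)^2
# Generic formula from the GloVe paper, not this paper's code.

def glove_pair_loss(w, w_tilde, b, b_tilde, x_ij, x_max=100.0, alpha=0.75):
    # Weighting function: down-weight rare pairs, cap frequent ones at 1.
    f = (x_ij / x_max) ** alpha if x_ij < x_max else 1.0
    dot = sum(wi * wj for wi, wj in zip(w, w_tilde))
    return f * (dot + b + b_tilde - math.log(x_ij)) ** 2

# Toy 2-dimensional vectors and a co-occurrence count of 1:
loss = glove_pair_loss([1.0, 0.0], [0.5, 0.0], 0.25, 0.25, 1.0)
```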
