Transition-Based Dependency Parsing

12 papers with code • 0 benchmarks • 0 datasets

Transition-based dependency parsing constructs a sentence's dependency tree incrementally: a classifier reads the sentence left to right and, at each step, chooses a transition (such as shift, left-arc, or right-arc) that manipulates a stack and a buffer of tokens until a complete tree is built.
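As a concrete illustration, here is a minimal sketch of the arc-standard transition system, one common instance of transition-based parsing. The example sentence, its gold tree, and the `toy_oracle` are invented for demonstration only.

```python
# Minimal arc-standard transition parser (illustrative sketch; the sentence,
# gold arcs, and oracle below are hypothetical examples).

def parse(words, oracle):
    """Run arc-standard transitions chosen by `oracle`; return {dep: head}."""
    stack = [0]                      # token positions; 0 is the artificial ROOT
    buffer = list(range(1, len(words) + 1))
    heads = {}                       # dependent position -> head position
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer, heads)
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":   # second-from-top takes top as its head
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        elif action == "RIGHT-ARC":  # top takes second-from-top as its head
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads

# Static oracle for the toy sentence "She eats fish" with gold heads
# She->eats, fish->eats, eats->ROOT (positions: 1->2, 3->2, 2->0).
def toy_oracle(stack, buffer, heads):
    gold = {1: 2, 3: 2, 2: 0}
    if len(stack) >= 2:
        s1, s2 = stack[-1], stack[-2]
        if gold.get(s2) == s1:
            return "LEFT-ARC"
        # RIGHT-ARC only once the top token has collected all its dependents
        if gold.get(s1) == s2 and all(heads.get(d) is not None
                                      for d, h in gold.items() if h == s1):
            return "RIGHT-ARC"
    return "SHIFT"

heads = parse(["She", "eats", "fish"], toy_oracle)
# -> {1: 2, 3: 2, 2: 0}
```

In a trained parser, the oracle above is replaced by a learned classifier over features of the current stack and buffer; greedy parsers simply take the highest-scoring transition at each step.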

Latest papers with no code

Structured Sentiment Analysis as Transition-based Dependency Parsing

no code yet • 9 May 2023

Structured sentiment analysis (SSA) aims to automatically extract people's opinions from a text in natural language and adequately represent that information in a graph structure.

Greedy Transition-Based Dependency Parsing with Discrete and Continuous Supertag Features

no code yet • 9 Jul 2020

We study the effect of rich supertag features in greedy transition-based dependency parsing.

Transition-Based Dependency Parsing using Perceptron Learner

no code yet • 22 Jan 2020

Syntactic parsing using dependency structures has become a standard technique in natural language processing with many different parsing models, in particular data-driven models that can be trained on syntactically annotated corpora.
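To make the data-driven setting concrete, here is a hedged sketch of a perceptron learner that scores parser transitions from sparse features. The feature names and the training example are made up; the paper's actual feature templates and update schedule may differ.

```python
# Sketch: multiclass perceptron over parser transitions (feature names below
# are hypothetical, not from the paper).
from collections import defaultdict

class TransitionPerceptron:
    def __init__(self, actions):
        self.actions = actions
        self.weights = {a: defaultdict(float) for a in actions}

    def score(self, action, feats):
        w = self.weights[action]
        return sum(w[f] for f in feats)

    def predict(self, feats):
        # ties broken by action order in self.actions
        return max(self.actions, key=lambda a: self.score(a, feats))

    def update(self, feats, gold, pred):
        # standard perceptron rule: reward the gold action, penalize the mistake
        if gold != pred:
            for f in feats:
                self.weights[gold][f] += 1.0
                self.weights[pred][f] -= 1.0

model = TransitionPerceptron(["SHIFT", "LEFT-ARC", "RIGHT-ARC"])
feats = ["stack0=eats", "buffer0=fish", "stack0_pos=VERB"]
pred = model.predict(feats)                       # untrained: defaults to SHIFT
model.update(feats, gold="RIGHT-ARC", pred=pred)  # one mistake-driven update
```

Training iterates this update over transitions derived from a treebank; averaged weights are commonly used at test time to reduce overfitting.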

Vietnamese transition-based dependency parsing with supertag features

no code yet • 9 Nov 2019

In recent years, dependency parsing has been a fascinating research topic with many applications in natural language processing.

Neural Syntactic Generative Models with Exact Marginalization

no code yet • NAACL 2018

We present neural syntactic generative models with exact marginalization that support both dependency parsing and language modeling.

Cache Transition Systems for Graph Parsing

no code yet • CL 2018

Motivated by the task of semantic parsing, we describe a transition system that generalizes standard transition-based dependency parsing techniques to generate a graph rather than a tree.
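The key structural difference from tree parsing can be shown in a few lines. This is a simplified illustration of graph output in general, not the paper's cache transition system: a graph parser may attach an arc without removing the dependent, so a node can receive multiple incoming arcs (reentrancy), which a tree forbids.

```python
# Simplified illustration (not the paper's actual transition system): storing
# a *set* of heads per node permits reentrant arcs, i.e. a graph, not a tree.
def add_arc(arcs, head, dep):
    arcs.setdefault(dep, set()).add(head)

arcs = {}
add_arc(arcs, "want", "boy")   # "boy" is the subject of "want"
add_arc(arcs, "go", "boy")     # "boy" is also the subject of "go" (reentrant)
# "boy" now has two heads -- legal in a semantic graph, illegal in a tree
```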

Incremental Graph-based Neural Dependency Parsing

no code yet • EMNLP 2017

Very recently, some studies on neural dependency parsers have shown advantages over traditional ones across a wide variety of languages.

Stack-based Multi-layer Attention for Transition-based Dependency Parsing

no code yet • EMNLP 2017

Although sequence-to-sequence (seq2seq) networks have achieved significant success in many NLP tasks such as machine translation and text summarization, simply applying this approach to transition-based dependency parsing does not yield performance gains comparable to other state-of-the-art methods, such as stack-LSTM and head selection.