
Dependency Parsing

69 papers with code · Natural Language Processing

Dependency parsing is the task of extracting a dependency parse of a sentence that represents its grammatical structure and defines the relationships between "head" words and the words that modify those heads.

Example:

I prefer the morning flight through Denver

root(ROOT, prefer)
nsubj(prefer, I)
dobj(prefer, flight)
det(flight, the)
nmod(flight, morning)
nmod(flight, Denver)
case(Denver, through)

Relations among the words are directed, labeled arcs from heads to dependents, written here as label(head, dependent).
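
A parse like this can be produced with off-the-shelf tools. A minimal sketch, assuming spaCy and its en_core_web_sm model are installed (spaCy's label inventory differs slightly from the one above, so the printed labels will not match exactly):

    # Minimal dependency-parsing example with spaCy
    # (pip install spacy && python -m spacy download en_core_web_sm).
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("I prefer the morning flight through Denver")

    # Print label(head, dependent) triples like those listed above.
    for token in doc:
        print(f"{token.dep_}({token.head.text}, {token.text})")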


Greatest papers with code

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

13 Mar 2017 · tensorflow/models

In this work, we present a compact, modular framework for constructing novel recurrent neural architectures. Our basic module is a new generic unit, the Transition Based Recurrent Unit (TBRU).

DEPENDENCY PARSING MULTI-TASK LEARNING
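
The defining property of a TBRU is that its recurrent connections are chosen dynamically by a transition system rather than fixed left-to-right. A toy sketch of that idea only, with random weights and a hand-written linking rule standing in for the paper's framework:

    # Toy sketch of a dynamically connected recurrence (TBRU-like idea):
    # each state recurs over a previous state selected by a linking rule,
    # not necessarily its left neighbour. Not the paper's implementation.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 4
    W_h = rng.normal(scale=0.1, size=(dim, dim))
    W_x = rng.normal(scale=0.1, size=(dim, dim))

    def tbru(inputs, link):
        """State i recurs over the state chosen by link(i, states)."""
        states = []
        for i, x in enumerate(inputs):
            prev = link(i, states)              # dynamically connected recurrence
            states.append(np.tanh(W_h @ prev + W_x @ x))
        return states

    inputs = [rng.normal(size=dim) for _ in range(5)]
    # Left-to-right chaining recovers an ordinary RNN ...
    chain = tbru(inputs, lambda i, s: s[-1] if s else np.zeros(dim))
    # ... while linking each step to, e.g., a syntactic head gives a dynamic structure.
    heads = [-1, 0, 0, 2, 2]                    # toy head indices (-1 = no head)
    tree = tbru(inputs, lambda i, s: s[heads[i]] if heads[i] >= 0 else np.zeros(dim))
    print(len(chain), len(tree))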

Globally Normalized Transition-Based Neural Networks

ACL 2016 tensorflow/models

We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing and sentence compression results. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models.

DEPENDENCY PARSING PART-OF-SPEECH TAGGING SENTENCE COMPRESSION
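
For dependency parsing, the task-specific transition system behind this line of work is a shift-reduce system such as arc-standard. A minimal sketch of arc-standard SHIFT / LEFT-ARC / RIGHT-ARC actions, with a hand-picked action sequence standing in for the neural scoring model:

    # Arc-standard transition system over a stack and a buffer.
    # The action sequence below is hand-picked for the example sentence;
    # a real parser predicts actions with a learned model.

    def parse(words, actions):
        """Apply SHIFT / LEFT-ARC / RIGHT-ARC; return (head, dependent) arcs."""
        stack, buffer, arcs = [], list(range(len(words))), []
        for act in actions:
            if act == "SHIFT":               # move next buffer word onto the stack
                stack.append(buffer.pop(0))
            elif act == "LEFT-ARC":          # second-from-top depends on the top
                dep = stack.pop(-2)
                arcs.append((words[stack[-1]], words[dep]))
            elif act == "RIGHT-ARC":         # top depends on second-from-top
                dep = stack.pop()
                arcs.append((words[stack[-1]], words[dep]))
        return arcs

    words = ["I", "prefer", "the", "morning", "flight", "through", "Denver"]
    actions = ["SHIFT", "SHIFT", "LEFT-ARC",            # nsubj(prefer, I)
               "SHIFT", "SHIFT", "SHIFT", "LEFT-ARC",   # nmod(flight, morning)
               "LEFT-ARC",                              # det(flight, the)
               "SHIFT", "SHIFT", "LEFT-ARC",            # case(Denver, through)
               "RIGHT-ARC",                             # nmod(flight, Denver)
               "RIGHT-ARC"]                             # dobj(prefer, flight)
    print(parse(words, actions))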

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data. On unlabeled examples, CVT teaches auxiliary prediction modules that see restricted views of the input (e.g., only part of a sentence) to match the predictions of the full model seeing the whole input.

CCG SUPERTAGGING DEPENDENCY PARSING MACHINE TRANSLATION MULTI-TASK LEARNING NAMED ENTITY RECOGNITION UNSUPERVISED REPRESENTATION LEARNING
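
The consistency objective at the core of CVT is easy to state: on unlabeled data, minimise a divergence (e.g., KL) between the full model's prediction, held fixed, and each restricted-view module's prediction. A minimal numerical sketch with dummy distributions standing in for model outputs:

    # Cross-view consistency loss on one unlabeled example, sketched with
    # dummy probability vectors in place of real model outputs.
    import numpy as np

    def kl(p, q, eps=1e-12):
        """KL(p || q) between two discrete distributions."""
        p, q = np.asarray(p) + eps, np.asarray(q) + eps
        return float(np.sum(p * np.log(p / q)))

    # Full model's predicted distribution (treated as a fixed target).
    p_full = np.array([0.7, 0.2, 0.1])

    # Predictions of auxiliary modules that see restricted views of the input
    # (e.g., only the forward or only the backward context of a token).
    p_views = [np.array([0.5, 0.3, 0.2]),
               np.array([0.6, 0.3, 0.1])]

    # Sum of divergences from the full-view prediction to each restricted view.
    cvt_loss = sum(kl(p_full, view) for view in p_views)
    print(cvt_loss)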

Stack-Pointer Networks for Dependency Parsing

ACL 2018 XuezheMax/NeuroNLP2

We introduce a novel architecture for dependency parsing: stack-pointer networks (StackPtr). Combining pointer networks (Vinyals et al., 2015) with an internal stack, the proposed model first reads and encodes the whole sentence, then builds the dependency tree top-down (from root to leaf) in a depth-first fashion.

DEPENDENCY PARSING
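
The decoding loop can be sketched as follows: the word on top of the stack is the current head, a pointer selects its next child among the still-unattached words, and pointing back at the head pops it. The scorer below is a toy lookup wired to the example sentence's gold arcs, not the paper's biaffine pointer over BiLSTM encodings:

    def decode(words, score):
        """Top-down, depth-first decoding with an explicit stack (greedy toy version)."""
        n = len(words)
        stack, attached, arcs = [0], {0}, []        # index 0 is an artificial root
        while stack:
            head = stack[-1]
            candidates = [j for j in range(n) if j not in attached] + [head]
            child = max(candidates, key=lambda j: score(head, j))
            if child == head:                       # no child chosen: pop the head
                stack.pop()
            else:                                   # attach the child, then descend into it
                arcs.append((words[head], words[child]))
                attached.add(child)
                stack.append(child)
        return arcs

    words = ["<root>", "I", "prefer", "the", "morning", "flight", "through", "Denver"]
    gold = {("<root>", "prefer"), ("prefer", "I"), ("prefer", "flight"),
            ("flight", "the"), ("flight", "morning"), ("flight", "Denver"),
            ("Denver", "through")}

    def score(h, c):
        # Toy scorer favouring the gold arcs of the example sentence.
        if h == c:
            return 0.0                              # score of stopping (popping the head)
        return 1.0 if (words[h], words[c]) in gold else -1.0

    print(decode(words, score))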

Deep Biaffine Attention for Neural Dependency Parsing

6 Nov 2016 · XuezheMax/NeuroNLP2

This paper builds off recent work from Kiperwasser & Goldberg (2016) using neural attention in a simple graph-based dependency parser. We use a larger but more thoroughly regularized parser than other recent BiLSTM-based approaches, with biaffine classifiers to predict arcs and labels.

DEPENDENCY PARSING
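
The biaffine arc scorer itself fits in a few lines. The sketch below uses random vectors in place of the BiLSTM outputs and illustrative dimensions, not the paper's configuration; a full parser would also decode with a maximum-spanning-tree algorithm to guarantee a well-formed tree:

    # Biaffine arc scoring: score(dep i, head j) = h_j^T U d_i + u^T h_j.
    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 7, 4                      # sentence length, representation size
    H = rng.normal(size=(n, k))      # head representations (one row per word)
    D = rng.normal(size=(n, k))      # dependent representations
    U = rng.normal(size=(k, k))      # biaffine weight matrix
    u = rng.normal(size=k)           # head-side bias vector

    # scores[i, j] = score of word j being the head of word i
    scores = D @ U @ H.T + H @ u     # bilinear term + head-only bias term

    # Greedy head prediction for each word (illustration only).
    pred_heads = scores.argmax(axis=1)
    print(pred_heads)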

Transition-Based Dependency Parsing with Stack Long Short-Term Memory

IJCNLP 2015 elikip/bist-parser

We propose a technique for learning representations of parser states in transition-based dependency parsers. Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM.

TRANSITION-BASED DEPENDENCY PARSING
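
The key operation is that popping restores the hidden state summarising the remaining stack contents, while pushing extends it. A minimal sketch with a plain tanh recurrence and random weights standing in for the LSTM cell:

    # Stack-structured recurrence: the "current" state always summarises the
    # stack contents; pop simply reverts to the previous summary.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 4
    W_h = rng.normal(scale=0.1, size=(dim, dim))
    W_x = rng.normal(scale=0.1, size=(dim, dim))

    class StackRNN:
        def __init__(self):
            self.states = [np.zeros(dim)]       # state of the empty stack

        def push(self, x):
            # New summary from the previous top-of-stack state and the input.
            self.states.append(np.tanh(W_h @ self.states[-1] + W_x @ x))

        def pop(self):
            # Revert to the summary of the remaining stack contents.
            self.states.pop()

        def summary(self):
            return self.states[-1]

    s = StackRNN()
    s.push(rng.normal(size=dim))   # e.g. shift a word embedding onto the stack
    s.push(rng.normal(size=dim))
    s.pop()                        # a reduce action removes the top element
    print(s.summary())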

Improved Transition-Based Parsing by Modeling Characters instead of Words with LSTMs

4 Aug 2015 · clab/lstm-parser

We present extensions to a continuous-state dependency parsing method that make it applicable to morphologically rich languages. Starting with a high-performance transition-based parser that uses long short-term memory (LSTM) recurrent neural networks to learn representations of the parser state, we replace lookup-based word representations with representations constructed from the orthographic representations of the words, also using LSTMs.

DEPENDENCY PARSING
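
Concretely, each word's vector is built by running recurrences over its characters in both directions and concatenating the two final states. A minimal sketch with a plain tanh recurrence and random weights standing in for the paper's bidirectional LSTMs:

    # Character-based word representations: forward and backward passes over
    # character embeddings, final states concatenated.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 8
    char_emb = {c: rng.normal(scale=0.1, size=dim) for c in "abcdefghijklmnopqrstuvwxyz"}
    W_h = rng.normal(scale=0.1, size=(dim, dim))
    W_x = rng.normal(scale=0.1, size=(dim, dim))

    def run(chars):
        h = np.zeros(dim)
        for c in chars:
            h = np.tanh(W_h @ h + W_x @ char_emb[c])
        return h

    def word_repr(word):
        # Concatenation of forward and backward character-level summaries.
        return np.concatenate([run(word), run(reversed(word))])

    # Unseen or morphologically complex words still get representations
    # built from their spelling.
    print(word_repr("flights").shape)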

SparseMAP: Differentiable Sparse Structured Inference

ICML 2018 mblondel/fenchel-young-losses

Structured prediction requires searching over a combinatorial number of structures. To tackle it, we introduce SparseMAP: a new method for sparse structured inference, and its natural loss function.

DEPENDENCY PARSING NATURAL LANGUAGE INFERENCE STRUCTURED PREDICTION
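
In the unstructured case, SparseMAP reduces to the sparsemax projection onto the probability simplex, which already shows the key property of returning exactly sparse solutions. A minimal sketch of that special case, following Martins and Astudillo (2016), not the paper's structured active-set algorithm:

    # Sparsemax: Euclidean projection of a score vector onto the probability
    # simplex; low-scoring entries receive exactly zero probability.
    import numpy as np

    def sparsemax(z):
        z = np.asarray(z, dtype=float)
        z_sorted = np.sort(z)[::-1]
        k = np.arange(1, len(z) + 1)
        cumsum = np.cumsum(z_sorted)
        support = 1 + k * z_sorted > cumsum      # coordinates kept in the support
        k_z = k[support][-1]
        tau = (cumsum[support][-1] - 1) / k_z    # threshold
        return np.maximum(z - tau, 0.0)

    print(sparsemax([1.5, 1.2, 0.1, -0.5]))      # -> [0.65, 0.35, 0.0, 0.0]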