Dependency Parsing

214 papers with code • 12 benchmarks • 7 datasets

Dependency parsing is the task of extracting a dependency parse of a sentence that represents its grammatical structure and defines the relationships between "head" words and the words that modify those heads.

Example:

     root
      |
      | +-------dobj---------+
      | |                    |
nsubj | |   +------det-----+ | +-----nmod------+
+--+  | |   |              | | |               |
|  |  | |   |      +-nmod-+| | |      +-case-+ |
+  |  + |   +      +      || + |      +      | |
I  prefer  the  morning   flight  through  Denver

Relations among the words are illustrated above the sentence with directed, labeled arcs from heads to dependents (+ indicates the dependent).
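The same parse can also be written as a table of (head, relation) pairs, one per word, as in the CoNLL format. A minimal sketch that reproduces the arcs from the diagram above (1-based head indices, 0 for the root):

```python
# The example parse as head indices and relation labels (CoNLL-style).
# Each word points to the 1-based index of its head; 0 marks the root.
sentence = ["I", "prefer", "the", "morning", "flight", "through", "Denver"]
heads    = [2, 0, 5, 5, 2, 7, 5]
deprels  = ["nsubj", "root", "det", "nmod", "dobj", "case", "nmod"]

arcs = []
for word, head, rel in zip(sentence, heads, deprels):
    head_word = "ROOT" if head == 0 else sentence[head - 1]
    arcs.append(f"{rel}({head_word}, {word})")

print(arcs)
# e.g. the first arc is nsubj(prefer, I): "prefer" is the head, "I" the dependent
```

This flat head-index representation is what most parsers actually predict; the arc diagram is just a visualization of it.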

Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

tensorflow/models EMNLP 2018

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG Supertagging Dependency Parsing +5

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

tensorflow/models 13 Mar 2017

In this work, we present a compact, modular framework for constructing novel recurrent neural architectures.

Dependency Parsing Extractive Summarization +1

Globally Normalized Transition-Based Neural Networks

tensorflow/models ACL 2016

Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models.

Dependency Parsing Part-Of-Speech Tagging +1
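Transition-based parsers like this one build the tree incrementally through a sequence of shift/reduce actions over a stack and a buffer. A minimal sketch of the arc-standard transition system (the function and action names here are illustrative, not taken from the paper's code):

```python
# Minimal arc-standard transition system (illustrative sketch, not the paper's code).
# State: a stack of partially processed words, a buffer of remaining words,
# and the set of (head, dependent) arcs built so far.

def parse(words, actions):
    """Apply a sequence of actions: SHIFT, LEFT_ARC, RIGHT_ARC."""
    stack, buffer, arcs = [], list(range(len(words))), []
    for action in actions:
        if action == "SHIFT":
            stack.append(buffer.pop(0))        # move next word onto the stack
        elif action == "LEFT_ARC":
            dep = stack.pop(-2)                # second-from-top depends on top
            arcs.append((stack[-1], dep))
        elif action == "RIGHT_ARC":
            dep = stack.pop()                  # top depends on second-from-top
            arcs.append((stack[-1], dep))
    return arcs

words = ["I", "prefer", "flight"]
# I <-nsubj- prefer -dobj-> flight
actions = ["SHIFT", "SHIFT", "LEFT_ARC", "SHIFT", "RIGHT_ARC"]
print(parse(words, actions))  # → [(1, 0), (1, 2)]
```

In a real parser the action sequence is not given; a classifier (a feed-forward network in this paper) predicts the next action from features of the current stack/buffer state.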

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages

stanfordnlp/stanza ACL 2020

We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages.

Coreference Resolution Dependency Parsing +4

N-LTP: An Open-source Neural Chinese Language Technology Platform with Pretrained Models

HIT-SCIR/ltp 24 Sep 2020

In addition, knowledge distillation where the single-task model teaches the multi-task model is further introduced to encourage the multi-task model to surpass its single-task teacher.

Chinese Word Segmentation Dependency Parsing +7

Deep Biaffine Attention for Neural Dependency Parsing

dmlc/gluon-nlp 6 Nov 2016

This paper builds off recent work from Kiperwasser & Goldberg (2016) using neural attention in a simple graph-based dependency parser.

Dependency Parsing
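In a graph-based parser of this kind, a biaffine scorer assigns every (dependent, head) word pair a score, and each word's head is the highest-scoring candidate. A sketch of the scoring step with illustrative shapes (the actual model computes the representations with a BiLSTM followed by MLPs, which is omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8                            # 5 words, hidden size 8 (illustrative sizes)

H_dep  = rng.standard_normal((n, d))   # dependent representations, one row per word
H_head = rng.standard_normal((n, d))   # head representations, one row per word
U = rng.standard_normal((d, d))        # biaffine weight matrix
u = rng.standard_normal(d)             # bias term for head words

# scores[i, j] = score of word j being the head of word i:
#   h_dep_i^T U h_head_j + u^T h_head_j
scores = H_dep @ U @ H_head.T + H_head @ u
pred_heads = scores.argmax(axis=1)     # greedy head choice per word
print(scores.shape, pred_heads.shape)  # (5, 5) (5,)
```

The head-only bias term lets the model learn a prior over which words tend to be heads at all, independent of any particular dependent.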

DisSent: Sentence Representation Learning from Explicit Discourse Relations

facebookresearch/InferSent 12 Oct 2017

Learning effective representations of sentences is one of the core missions of natural language understanding.

Dependency Parsing Natural Language Understanding +1

Towards Better UD Parsing: Deep Contextualized Word Embeddings, Ensemble, and Treebank Concatenation

HIT-SCIR/ELMoForManyLangs CoNLL 2018

This paper describes our system (HIT-SCIR) submitted to the CoNLL 2018 shared task on Multilingual Parsing from Raw Text to Universal Dependencies.

Dependency Parsing Word Embeddings