CCG Supertagging

4 papers with code · Natural Language Processing

Combinatory Categorial Grammar (CCG; Steedman, 2000) is a highly lexicalized formalism. The standard parsing model of Clark and Curran (2007) uses over 400 lexical categories (or supertags), compared to about 50 part-of-speech tags for typical parsers.

Example:

Vinken  ,  61   years  old
N       ,  N/N  N      (S[adj]\NP)\NP
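
Because the category inventory is so large, supertagging is usually framed as sequence labeling with a neural encoder over the full sentence. Below is a minimal sketch; the toy vocabulary, model sizes, and untrained weights are illustrative assumptions, not the Clark and Curran (2007) model.

```python
# A toy BiLSTM supertagger treating CCG supertagging as sequence labeling.
# The word/tag vocabularies and dimensions are illustrative placeholders,
# not the 400+-category lexicon of Clark and Curran (2007).
import torch
import torch.nn as nn

WORDS = {"<unk>": 0, "Vinken": 1, ",": 2, "61": 3, "years": 4, "old": 5}
SUPERTAGS = ["N", ",", "N/N", "(S[adj]\\NP)\\NP"]

class BiLSTMSupertagger(nn.Module):
    def __init__(self, n_words, n_tags, emb_dim=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(n_words, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_tags)  # one score per supertag

    def forward(self, token_ids):
        states, _ = self.lstm(self.embed(token_ids))
        return self.out(states)  # (batch, seq_len, n_tags) logits

model = BiLSTMSupertagger(len(WORDS), len(SUPERTAGS))
sentence = ["Vinken", ",", "61", "years", "old"]
ids = torch.tensor([[WORDS.get(w, 0) for w in sentence]])
pred = model(ids).argmax(-1)[0].tolist()  # untrained, so tags are random
print(list(zip(sentence, [SUPERTAGS[i] for i in pred])))
```

In practice the tag set would be the 400+ categories extracted from CCGbank, and the tagger would be trained with per-token cross-entropy.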

State-of-the-art leaderboards

Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING · DEPENDENCY PARSING · MACHINE TRANSLATION · MULTI-TASK LEARNING · NAMED ENTITY RECOGNITION · UNSUPERVISED REPRESENTATION LEARNING
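
A hedged sketch of the CVT idea from the abstract: on unlabeled data, auxiliary prediction modules that see only restricted views of the BiLSTM encoder (e.g., just the forward direction) are trained to agree with the full-view primary module. All module names and sizes below are illustrative assumptions, not the paper's exact architecture.

```python
# Sketch of the Cross-View Training signal on unlabeled data: auxiliary
# softmax layers with partial views of a BiLSTM encoder are trained to
# match the primary, full-view prediction module (acting as a teacher).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVTTagger(nn.Module):
    def __init__(self, n_words, n_tags, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(n_words, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.primary = nn.Linear(2 * hidden, n_tags)  # sees both directions
        self.fwd_only = nn.Linear(hidden, n_tags)     # restricted view
        self.bwd_only = nn.Linear(hidden, n_tags)     # restricted view

    def forward(self, ids):
        h, _ = self.lstm(self.embed(ids))
        fwd, bwd = h.chunk(2, dim=-1)  # split the two LSTM directions
        return self.primary(h), self.fwd_only(fwd), self.bwd_only(bwd)

def cvt_unlabeled_loss(model, unlabeled_ids):
    primary, fwd, bwd = model(unlabeled_ids)
    target = primary.softmax(-1).detach()  # primary module is not updated here
    loss = 0.0
    for aux in (fwd, bwd):
        loss = loss + F.kl_div(aux.log_softmax(-1), target,
                               reduction="batchmean")
    return loss

model = CVTTagger(n_words=100, n_tags=10)
print(cvt_unlabeled_loss(model, torch.randint(0, 100, (2, 5))).item())
```

On labeled data, the primary module is trained with ordinary supervised cross-entropy; the agreement loss above is what lets the unlabeled data improve the shared encoder.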

Probing What Different NLP Tasks Teach Machines about Function Word Comprehension

*SEM 2019 nyu-mll/jiant

Our results show that pretraining on language modeling performs best on average across our probing tasks, supporting its widespread use for pretraining state-of-the-art NLP models; CCG supertagging and NLI pretraining perform comparably.

CCG SUPERTAGGING · LANGUAGE MODELLING · NATURAL LANGUAGE INFERENCE

Targeted Syntactic Evaluation of Language Models

EMNLP 2018 yoavg/bert-syntax

We automatically construct a large number of minimally different pairs of English sentences, each consisting of a grammatical and an ungrammatical sentence.

CCG SUPERTAGGING · LANGUAGE MODELLING
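
The evaluation protocol these pairs enable fits in a few lines: a language model "passes" a minimal pair if it assigns higher probability to the grammatical sentence. The GPT-2 model and the single agreement pair below are illustrative assumptions; the paper evaluates LSTM language models on automatically constructed pair sets.

```python
# Minimal-pair evaluation sketch: score each sentence by its total
# log-probability under a language model and check that the grammatical
# one wins. GPT-2 here is an illustrative stand-in for the paper's LMs.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def sentence_log_prob(sentence: str) -> float:
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        # .loss is the mean negative log-likelihood over the n-1 predicted tokens
        loss = model(ids, labels=ids).loss
    return -loss.item() * (ids.size(1) - 1)  # total sentence log-probability

good, bad = "The authors laugh.", "The authors laughs."
print("pass" if sentence_log_prob(good) > sentence_log_prob(bad) else "fail")
```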

Keystroke dynamics as signal for shallow syntactic parsing

COLING 2016 bplank/coling2016ks

Keystroke dynamics have been extensively used in psycholinguistic and writing research to gain insights into cognitive processing.

CCG SUPERTAGGING · CHUNKING