CCG Supertagging

6 papers with code • 1 benchmark • 2 datasets

Combinatory Categorial Grammar (CCG; Steedman, 2000) is a highly lexicalized formalism. The standard parsing model of Clark and Curran (2007) uses over 400 lexical categories (or supertags), compared with the roughly 50 part-of-speech tags used by typical parsers.

Example:

Vinken | , | 61  | years | old
N      | , | N/N | N     | (S[adj]\NP)\NP
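As an illustration, here is a minimal Python sketch (all names are ours, not from any repository listed below) that treats the example above as a sequence-labelling output and applies a toy forward-application rule, showing why a good supertag sequence already does much of the parsing work:

```python
# Minimal sketch: CCG supertagging viewed as sequence labelling.
# Category strings follow the example above; the combination rule below is a
# toy forward application only (no parenthesised categories, no backward
# rules), meant to show that supertags encode most syntactic decisions.

example = [
    ("Vinken", "N"),
    (",",      ","),
    ("61",     "N/N"),
    ("years",  "N"),
    ("old",    r"(S[adj]\NP)\NP"),
]

def forward_apply(left, right):
    """X/Y + Y -> X (forward application on flat category strings)."""
    if "/" in left:
        x, y = left.rsplit("/", 1)
        if y == right:
            return x
    return None

# "61" (N/N) combines with "years" (N) to yield N:
print(forward_apply("N/N", "N"))  # -> N
```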

Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

tensorflow/models EMNLP 2018

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.
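The gist of CVT, sketched below in PyTorch with illustrative names (the official implementation in tensorflow/models uses more, and different, auxiliary prediction modules): auxiliary modules that only see restricted views of the input, such as the forward or backward half of the Bi-LSTM output, are trained on unlabeled data to match the predictions of the full-view primary module.

```python
# Hedged sketch of the Cross-View Training idea; names are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVTTagger(nn.Module):
    def __init__(self, vocab_size, n_tags, emb_dim=100, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.primary = nn.Linear(2 * hidden, n_tags)   # sees the full Bi-LSTM output
        self.aux_fwd = nn.Linear(hidden, n_tags)       # restricted view: forward states only
        self.aux_bwd = nn.Linear(hidden, n_tags)       # restricted view: backward states only

    def forward(self, tokens):
        h, _ = self.bilstm(self.embed(tokens))         # (batch, time, 2 * hidden)
        fwd, bwd = h.chunk(2, dim=-1)                  # forward / backward halves
        return self.primary(h), self.aux_fwd(fwd), self.aux_bwd(bwd)

def cvt_unlabeled_loss(model, unlabeled_tokens):
    """On unlabeled data, auxiliary modules learn to match the primary prediction."""
    primary, aux_f, aux_b = model(unlabeled_tokens)
    teacher = F.softmax(primary, dim=-1).detach()      # no gradient through the teacher
    return sum(
        F.kl_div(F.log_softmax(aux, dim=-1), teacher, reduction="batchmean")
        for aux in (aux_f, aux_b)
    )
```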

CCG Supertagging • Dependency Parsing • +6

Targeted Syntactic Evaluation of Language Models

yoavg/bert-syntax EMNLP 2018

We automatically construct a large number of minimally different pairs of English sentences, each consisting of a grammatical and an ungrammatical sentence.
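The evaluation protocol reduces to comparing model scores on each pair: the model is credited when it assigns higher probability to the grammatical sentence. Below is a hedged sketch using an off-the-shelf GPT-2 model from the transformers library; the model choice and the example pair are ours and differ from the models and materials evaluated in the paper and in yoavg/bert-syntax.

```python
# Sketch of minimal-pair evaluation: score both sentences with a language
# model and check that the grammatical one gets the higher log-probability.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def sentence_logprob(sentence):
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)        # out.loss = mean NLL per predicted token
    return -out.loss.item() * (ids.size(1) - 1)

pairs = [
    # (grammatical, ungrammatical) subject-verb agreement minimal pair
    ("The author that the guards like laughs.",
     "The author that the guards like laugh."),
]

correct = sum(sentence_logprob(good) > sentence_logprob(bad) for good, bad in pairs)
print(f"accuracy: {correct / len(pairs):.2f}")
```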

CCG Supertagging • Language Modelling

Supertagging Combinatory Categorial Grammar with Attentive Graph Convolutional Networks

cuhksz-nlp/NeST-CCG EMNLP 2020

Specifically, we build the graph from chunks (n-grams) extracted from a lexicon and apply attention over the graph, so that different word pairs from the contexts within and across chunks are weighted in the model and facilitate the supertagging accordingly.
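A hedged sketch of the two pieces described above, written in PyTorch with our own names (it does not follow the cuhksz-nlp/NeST-CCG code): an adjacency matrix built from lexicon n-gram chunks, and a GAT-style attention layer that weights word pairs over that graph before aggregation.

```python
# Illustrative only: chunk-based word graph plus attentive graph convolution.
import torch
import torch.nn as nn
import torch.nn.functional as F

def chunk_adjacency(tokens, lexicon_chunks, max_n=3):
    """Connect words that co-occur in any lexicon chunk (n-gram) of the sentence."""
    T = len(tokens)
    adj = torch.eye(T)                                   # keep self-loops
    for start in range(T):
        for n in range(2, max_n + 1):
            if tuple(tokens[start:start + n]) in lexicon_chunks:
                for i in range(start, min(start + n, T)):
                    for j in range(start, min(start + n, T)):
                        adj[i, j] = 1.0
    return adj

class AttentiveGraphConv(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.score = nn.Linear(2 * dim, 1)

    def forward(self, h, adj):
        """h: (T, D) word representations; adj: (T, T) chunk co-occurrence graph."""
        T, D = h.shape
        hi = h.unsqueeze(1).expand(T, T, D)              # word i in each pair
        hj = h.unsqueeze(0).expand(T, T, D)              # word j in each pair
        scores = self.score(torch.cat([hi, hj], dim=-1)).squeeze(-1)   # (T, T)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        attn = F.softmax(scores, dim=-1)                 # weight pairs within/across chunks
        return F.relu(self.proj(attn @ h))               # attention-weighted aggregation

# Usage sketch on the running example:
tokens = ["Vinken", ",", "61", "years", "old"]
adj = chunk_adjacency(tokens, {("61", "years"), ("years", "old")})
layer = AttentiveGraphConv(dim=8)
out = layer(torch.randn(len(tokens), 8), adj)            # (5, 8) contextualised features
```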

CCG Supertagging

Exploring the Syntactic Abilities of RNNs with Multi-task Learning

emengd/multitask-agreement CoNLL 2017

Recent work has explored the syntactic abilities of RNNs using the subject-verb agreement task, which diagnoses sensitivity to sentence structure.

CCG Supertagging • Language Modelling • +1

Keystroke dynamics as signal for shallow syntactic parsing

bplank/coling2016ks COLING 2016

Keystroke dynamics have been extensively used in psycholinguistic and writing research to gain insights into cognitive processing.

CCG Supertagging • Chunking