CCG Supertagging

8 papers with code • 1 benchmark • 2 datasets

Combinatory Categorial Grammar (CCG; Steedman, 2000) is a highly lexicalized formalism. The standard parsing model of Clark and Curran (2007) uses over 400 lexical categories (or supertags), compared to about 50 part-of-speech tags for typical parsers.

Example:

Vinken   ,   61    years   old
N        ,   N/N   N       (S[adj]\NP)\NP
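
A supertagger is usually framed as per-token sequence labeling: each word receives one lexical category from the large tag set. The snippet below is a minimal, self-contained sketch of that framing, assuming a toy lexicon and a most-frequent-tag baseline; neither is the model of any paper listed on this page.

```python
# Minimal sketch: CCG supertagging as per-token sequence labeling.
# The toy lexicon and most-frequent-tag baseline are illustrative assumptions.

from collections import Counter
from typing import List

# Hypothetical (word, supertag) counts, as might be read off a treebank lexicon.
toy_lexicon = {
    "Vinken": Counter({"N": 3}),
    ",": Counter({",": 10}),
    "61": Counter({"N/N": 2, "N": 1}),
    "years": Counter({"N": 4}),
    "old": Counter({r"(S[adj]\NP)\NP": 2}),
}

def supertag(tokens: List[str], backoff: str = "N") -> List[str]:
    """Assign each token its most frequent supertag, falling back to `backoff`
    for unseen words. Real supertaggers condition on sentence context instead."""
    return [
        toy_lexicon[tok].most_common(1)[0][0] if tok in toy_lexicon else backoff
        for tok in tokens
    ]

print(supertag(["Vinken", ",", "61", "years", "old"]))
# ['N', ',', 'N/N', 'N', '(S[adj]\\NP)\\NP']
```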

Most implemented papers

Targeted Syntactic Evaluation of Language Models

BeckyMarvin/LM_syneval EMNLP 2018

We automatically construct a large number of minimally different pairs of English sentences, each consisting of a grammatical and an ungrammatical sentence.

Semi-Supervised Sequence Modeling with Cross-View Training

tensorflow/models EMNLP 2018

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.
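
As a rough illustration of the cross-view idea, the sketch below combines a supervised loss on labeled data with a consistency loss that trains restricted-view auxiliary modules to match the full-view primary module on unlabeled data. It assumes a generic PyTorch setup; the module interfaces and the choice of views are illustrative, not the exact CVT architecture.

```python
# Sketch of cross-view training (hypothetical PyTorch setup; modules and views
# are illustrative, not the exact CVT architecture).
import torch
import torch.nn.functional as F

def cvt_step(primary, auxiliary_views, labeled_batch, unlabeled_batch):
    """One combined step: supervised loss on labeled data, plus a consistency
    loss that trains restricted-view auxiliary modules to match the full-view
    primary module's predictions on unlabeled data."""
    x, y = labeled_batch
    supervised = F.cross_entropy(primary(x), y)

    with torch.no_grad():                      # the primary module acts as a fixed teacher here
        teacher = F.softmax(primary(unlabeled_batch), dim=-1)

    consistency = sum(
        F.kl_div(F.log_softmax(aux(unlabeled_batch), dim=-1),
                 teacher, reduction="batchmean")
        for aux in auxiliary_views             # e.g. forward-only / backward-only encoders
    )
    return supervised + consistency
```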

Hierarchically-Refined Label Attention Network for Sequence Labeling

Nealcly/LAN IJCNLP 2019

CRF has been used as a powerful model for statistical sequence labeling.

Something Old, Something New: Grammar-based CCG Parsing with Transformer Models

cqcl/lambeq 21 Sep 2021

This report describes the parsing problem for Combinatory Categorial Grammar (CCG), showing how a combination of Transformer-based neural models and a symbolic CCG grammar can lead to substantial gains over existing approaches.

Keystroke dynamics as signal for shallow syntactic parsing

bplank/coling2016ks COLING 2016

Keystroke dynamics have been extensively used in psycholinguistic and writing research to gain insights into cognitive processing.

Exploring the Syntactic Abilities of RNNs with Multi-task Learning

emengd/multitask-agreement CoNLL 2017

Recent work has explored the syntactic abilities of RNNs using the subject-verb agreement task, which diagnoses sensitivity to sentence structure.

Supertagging Combinatory Categorial Grammar with Attentive Graph Convolutional Networks

cuhksz-nlp/NeST-CCG EMNLP 2020

Specifically, we build the graph from chunks (n-grams) extracted from a lexicon and apply attention over the graph, so that different word pairs from contexts within and across chunks are weighted in the model and inform supertagging accordingly.
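
A minimal sketch of that construction, under simplifying assumptions: the chunk inventory, dot-product scoring, and plain weighted averaging below are illustrative choices, not the NeST-CCG implementation.

```python
# Sketch: connect words that co-occur in a lexicon chunk, then weight each
# word's graph neighbours with attention before classification.
import numpy as np

def chunk_graph(tokens, chunks):
    """Add an edge between every pair of words that co-occurs in some chunk,
    where each chunk is a span given as (start index, length)."""
    n = len(tokens)
    adj = np.eye(n)
    for start, length in chunks:
        for i in range(start, start + length):
            for j in range(start, start + length):
                adj[i, j] = 1.0
    return adj

def attentive_aggregate(embeddings, adj):
    """Weight each word's graph neighbours by dot-product attention and average."""
    scores = embeddings @ embeddings.T
    scores = np.where(adj > 0, scores, -np.inf)     # attend only along graph edges
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ embeddings                     # context features for the supertag classifier

tokens = ["Vinken", ",", "61", "years", "old"]
emb = np.random.default_rng(0).normal(size=(len(tokens), 8))
adj = chunk_graph(tokens, chunks=[(2, 2), (3, 2)])  # hypothetical chunks "61 years", "years old"
print(attentive_aggregate(emb, adj).shape)          # (5, 8)
```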

Geometry-Aware Supertagging with Heterogeneous Dynamic Convolutions

konstantinoskokos/dynamic-graph-supertagging 23 Mar 2022

The syntactic categories of categorial grammar formalisms are structured units made of smaller, indivisible primitives, bound together by the underlying grammar's category formation rules.
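
Because categories are recursive objects rather than flat labels, they can be represented and built up piece by piece. The sketch below shows only that underlying data structure: atomic primitives combined by slash rules. The class names are assumptions for illustration, not the paper's representation.

```python
# Sketch of CCG categories as structured units: atomic primitives bound
# together by slash (category formation) rules.
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Atom:
    """An indivisible primitive such as N, NP, or S[adj]."""
    name: str
    def __str__(self):
        return self.name

@dataclass(frozen=True)
class Slash:
    """A complex category `result / argument` or `result \\ argument`."""
    result: "Category"
    argument: "Category"
    direction: str                    # "/" looks right, "\\" looks left
    def __str__(self):
        return f"({self.result}{self.direction}{self.argument})"

Category = Union[Atom, Slash]

# (S[adj]\NP)\NP, the supertag of "old" in the example above:
old = Slash(Slash(Atom("S[adj]"), Atom("NP"), "\\"), Atom("NP"), "\\")
print(old)   # ((S[adj]\NP)\NP)
```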