Semantic Composition

18 papers with code • 0 benchmarks • 2 datasets

Understanding the meaning of text by composing the meanings of the individual words in the text.
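The simplest instance of this idea is additive composition: approximate a phrase's meaning by summing its word vectors. A minimal sketch, using hypothetical toy embeddings for illustration:

```python
import numpy as np

# Hypothetical 3-dimensional word embeddings, for illustration only.
embeddings = {
    "red":  np.array([0.9, 0.1, 0.0]),
    "car":  np.array([0.1, 0.8, 0.3]),
    "fast": np.array([0.7, 0.2, 0.5]),
}

def compose_additive(words):
    """Additive semantic composition: the phrase vector is the sum
    of its word vectors."""
    return np.sum([embeddings[w] for w in words], axis=0)

phrase = compose_additive(["red", "car"])  # -> [1.0, 0.9, 0.3]
```

Addition is only the simplest composition function; the papers below replace it with learned, structure-sensitive alternatives.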

Most implemented papers

The Lifted Matrix-Space Model for Semantic Composition

NYU-MLL/spinn CoNLL 2018

Tree-structured neural network architectures for sentence encoding draw inspiration from the approach to semantic composition generally seen in formal linguistics, and have shown empirical improvements over comparable sequence models by doing so.
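The tree-structured approach can be sketched as classic recursive composition over a binary parse tree: each node's vector is a learned function of its children's vectors. This is a generic recursive-network sketch (with random illustrative weights), not the paper's lifted matrix-space model:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4                                   # embedding dimension (illustrative)
W = rng.standard_normal((D, 2 * D)) * 0.1  # shared composition weights

def compose(left, right):
    """Classic recursive-NN composition: tanh(W [left; right])."""
    return np.tanh(W @ np.concatenate([left, right]))

def encode(tree, embed):
    """Recursively encode a binary parse tree into a single vector:
    leaves are word embeddings, internal nodes compose their children."""
    if isinstance(tree, str):
        return embed[tree]
    left, right = tree
    return compose(encode(left, embed), encode(right, embed))

embed = {w: rng.standard_normal(D) for w in ["the", "cat", "sleeps"]}
vec = encode((("the", "cat"), "sleeps"), embed)  # one vector per sentence
```

The composition follows the parse structure, so "the cat" is composed first and then combined with "sleeps", mirroring how formal semantics builds meaning bottom-up.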

SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics

WadeYin9712/SentiBERT ACL 2020

We propose SentiBERT, a variant of BERT that effectively captures compositional sentiment semantics.

SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data

Shaoli-Huang/SnapMix 9 Dec 2020

As the main discriminative information of a fine-grained image usually resides in subtle regions, methods along this line are prone to heavy label noise in fine-grained recognition.
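The mixing idea can be sketched in CutMix style: paste a patch from one image into another and mix the labels by the patch's area ratio. SnapMix's contribution, per the abstract, is to replace that area ratio with a semantic proportion (derived from class activation maps), which reduces the label noise this snippet's naive area weighting causes on fine-grained data:

```python
import numpy as np

def cutmix(img_a, img_b, label_a, label_b, box):
    """Paste a rectangular patch of img_b into img_a and mix labels by
    the patch's AREA ratio. (SnapMix instead derives the mixing ratio
    from class activation maps, so labels reflect semantic content.)"""
    y0, y1, x0, x1 = box
    mixed = img_a.copy()
    mixed[y0:y1, x0:x1] = img_b[y0:y1, x0:x1]
    area = (y1 - y0) * (x1 - x0) / (img_a.shape[0] * img_a.shape[1])
    label = (1 - area) * label_a + area * label_b
    return mixed, label

img_a = np.zeros((8, 8))            # toy "image" A, class [1, 0]
img_b = np.ones((8, 8))             # toy "image" B, class [0, 1]
la, lb = np.array([1.0, 0.0]), np.array([0.0, 1.0])
mixed, label = cutmix(img_a, img_b, la, lb, (0, 4, 0, 4))  # 25% patch
```

With a quarter of the pixels replaced, area mixing yields the label [0.75, 0.25] even if the pasted patch contains no discriminative region at all, which is precisely the failure mode the paper addresses.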

Modeling Relation Paths for Representation Learning of Knowledge Bases

Mrlyk423/Relation_Extraction EMNLP 2015

Representation learning of knowledge bases (KBs) aims to embed both entities and relations into a low-dimensional space.
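A minimal TransE-style sketch of such an embedding space: a triple (h, r, t) is scored by how well the head vector translated by the relation vector approximates the tail. The paper's path-based extension additionally composes relations along multi-hop paths (e.g., by adding their vectors); the snippet below shows only the basic translation scoring:

```python
import numpy as np

def score(h, r, t):
    """TransE-style score: distance between the translated head (h + r)
    and the tail t. Lower is better; 0 means a perfect translation."""
    return np.linalg.norm(h + r - t)

# Toy 2-d embeddings, for illustration only.
h = np.array([0.2, 0.5])   # head entity
r = np.array([0.3, -0.1])  # relation
t = np.array([0.5, 0.4])   # tail entity

direct = score(h, r, t)    # h + r lands exactly on t here
```

Training then pushes observed triples toward low scores and corrupted triples toward high scores, so that both entities and relations live in one low-dimensional space.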

A Semantically Compositional Annotation Scheme for Time Normalization

bethard/timenorm LREC 2016

We present a new annotation scheme for normalizing time expressions, such as "three days ago", to computer-readable forms, such as 2016-03-07.

Improving Sparse Word Representations with Distributional Inference for Semantic Composition

tttthomasssss/apt-toolkit EMNLP 2016

Distributional models are derived from co-occurrences in a corpus, where only a small proportion of all possible plausible co-occurrences will be observed.
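The starting point of such models can be sketched as a window-based co-occurrence count over a corpus; in any realistically small corpus, most plausible (word, context) pairs simply never occur, which is the sparsity problem the paper targets:

```python
from collections import Counter

def cooccurrence_counts(corpus, window=2):
    """Count (word, context) co-occurrences within a symmetric window
    of each token. Pairs that never appear get no entry at all."""
    counts = Counter()
    for sent in corpus:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if i != j:
                    counts[(w, sent[j])] += 1
    return counts

corpus = [["dogs", "chase", "cats"], ["cats", "chase", "mice"]]
counts = cooccurrence_counts(corpus)
# ("cats", "chase") is seen twice, but the plausible pair
# ("dogs", "mice") is never observed, so it gets zero mass.
```

Distributional inference, as proposed in the paper, fills in such plausible-but-unseen co-occurrences rather than treating every zero count as genuine implausibility.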

Semantic Compositional Networks for Visual Captioning

zhegan27/Semantic_Compositional_Nets CVPR 2017

The degree to which each member of the ensemble is used to generate an image caption is tied to the image-dependent probability of the corresponding tag.
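The weighting idea described above can be sketched as mixing per-tag parameters in proportion to the image's predicted tag probabilities. This is only a sketch of the weighting mechanism under that reading, not the paper's full captioning network:

```python
import numpy as np

def compose_weights(tag_probs, tag_experts):
    """Image-dependent parameter mixing: W(image) = sum_k p_k * W_k,
    where p_k is the predicted probability of semantic tag k and W_k
    is that tag's parameter matrix."""
    return np.tensordot(tag_probs, tag_experts, axes=1)

# Two hypothetical tag experts and one image's tag probabilities.
experts = np.stack([np.eye(2), 2 * np.eye(2)])
probs = np.array([0.25, 0.75])

W = compose_weights(probs, experts)  # -> 1.75 * identity
```

An image where a tag is near-certain is thus captioned almost entirely with that tag's parameters, while ambiguous images blend several experts.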

Table Filling Multi-Task Recurrent Neural Network for Joint Entity and Relation Extraction

pgcool/TF-MTRNN COLING 2016

This paper proposes a context-aware approach to joint entity and word-level relation extraction through semantic composition of words. Its Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) reduces entity recognition and relation classification to a single table-filling problem and models the interdependencies between the two tasks.
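The table-filling view can be sketched concretely: for a sentence of n words, an n-by-n table holds entity tags on the diagonal and relation labels in off-diagonal cells, so one model can fill both kinds of label jointly. The labels below are illustrative, not the paper's tag set:

```python
# Table-filling sketch: entity tags live on the diagonal,
# relation labels in off-diagonal cells (hypothetical labels).
words = ["John", "works", "at", "Acme"]
n = len(words)
table = [["-" for _ in range(n)] for _ in range(n)]

table[0][0] = "PER"       # diagonal: entity tag for "John"
table[3][3] = "ORG"       # diagonal: entity tag for "Acme"
table[0][3] = "works_at"  # off-diagonal: relation between words 0 and 3
```

Because entity and relation decisions share one table, a filled cell like `works_at` can reinforce the entity tags in its row and column, which is the interdependency the model exploits.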

Improving Semantic Composition with Offset Inference

tttthomasssss/acl2017 ACL 2017

Count-based distributional semantic models suffer from sparsity due to unobserved but plausible co-occurrences in any text collection.
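One family of remedies, sketched generically here, infers plausible-but-unseen co-occurrences by blending a word's sparse count vector with vectors of its distributional neighbours. This is a generic distributional-inference sketch, not the paper's exact offset-based formulation:

```python
import numpy as np

def smooth(vectors, word, neighbors, alpha=0.5):
    """Enrich a sparse count vector with the average of its neighbours'
    vectors, so zero entries that the neighbours attest become nonzero
    (inferred) co-occurrence mass."""
    avg = np.mean([vectors[n] for n in neighbors], axis=0)
    return vectors[word] + alpha * avg

# Toy count vectors over 3 contexts, for illustration only.
vectors = {
    "car":  np.array([2.0, 0.0, 1.0]),  # never seen with context 1
    "auto": np.array([0.0, 3.0, 1.0]),
    "bus":  np.array([0.0, 1.0, 1.0]),
}

enriched = smooth(vectors, "car", ["auto", "bus"])  # -> [2.0, 1.0, 1.5]
```

The zero entry for "car" with context 1 becomes nonzero because its neighbours "auto" and "bus" do co-occur with that context, turning an unobserved-but-plausible pair into usable signal for composition.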