# Semantic Composition

16 papers with code • 0 benchmarks • 2 datasets

Understanding the meaning of a text by composing the meanings of its individual words. (Source: https://arxiv.org/pdf/1405.7908.pdf)
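The simplest instance of this idea composes a phrase vector from word vectors by element-wise averaging. A minimal sketch, using toy illustrative vectors (the values and the `compose_additive` helper are assumptions, not from any paper below):

```python
import numpy as np

# Toy distributional word vectors (hypothetical values for illustration).
word_vectors = {
    "red": np.array([1.0, 0.0, 0.5]),
    "car": np.array([0.0, 1.0, 0.5]),
}

def compose_additive(words, vectors):
    """Compose a phrase vector as the average of its word vectors."""
    return np.mean([vectors[w] for w in words], axis=0)

phrase = compose_additive(["red", "car"], word_vectors)
```

Additive composition is a common baseline; the papers listed here propose richer, often syntax-aware alternatives.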

# SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data

9 Dec 2020

As the main discriminative information of a fine-grained image usually resides in subtle regions, mixing-based augmentation methods are prone to heavy label noise in fine-grained recognition.

# The Lifted Matrix-Space Model for Semantic Composition

Tree-structured neural network architectures for sentence encoding draw inspiration from the approach to semantic composition generally seen in formal linguistics, and have shown empirical improvements over comparable sequence models by doing so.

# SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics

We propose SentiBERT, a variant of BERT that effectively captures compositional sentiment semantics.

# Modeling Relation Paths for Representation Learning of Knowledge Bases

Representation learning of knowledge bases (KBs) aims to embed both entities and relations into a low-dimensional space.
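A widely used embedding scheme of this kind is TransE, which this line of work builds on by additionally modeling multi-step relation paths. A minimal sketch of the base triple scoring, with illustrative toy vectors:

```python
import numpy as np

# Toy TransE-style triple: a plausible (head, relation, tail) should satisfy
# h + r ≈ t in the embedding space. Vector values are illustrative only.
h = np.array([1.0, 0.0])  # head entity
r = np.array([0.0, 1.0])  # relation
t = np.array([1.0, 1.0])  # tail entity

def transe_score(h, r, t):
    """Lower score = more plausible triple: ||h + r - t||."""
    return np.linalg.norm(h + r - t)

score = transe_score(h, r, t)  # 0.0 for this perfectly consistent triple
```

Path-based extensions compose the relation vectors along a path (e.g. by addition) and score the composed relation against the same translation objective.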

# A Semantically Compositional Annotation Scheme for Time Normalization

We present a new annotation scheme for normalizing time expressions, such as "three days ago", to computer-readable forms, such as 2016-03-07.

# Improving Sparse Word Representations with Distributional Inference for Semantic Composition

Distributional models are derived from co-occurrences in a corpus, where only a small proportion of all possible plausible co-occurrences will be observed.
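The sparsity the snippet describes is easy to see on a toy corpus: even with a handful of words, most plausible word pairs are never observed together. A minimal sketch (the corpus and helper are illustrative assumptions):

```python
from collections import Counter
from itertools import combinations

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

def cooccurrences(sentences):
    """Count sentence-level co-occurrences of unordered word pairs."""
    counts = Counter()
    for s in sentences:
        for a, b in combinations(set(s.split()), 2):
            counts[tuple(sorted((a, b)))] += 1
    return counts

counts = cooccurrences(corpus)
vocab = {w for s in corpus for w in s.split()}
possible = len(vocab) * (len(vocab) - 1) // 2  # all possible word pairs
observed = len(counts)                          # pairs actually seen
```

Distributional inference techniques aim to fill in the plausible-but-unseen portion of this pair space.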

# Semantic Compositional Networks for Visual Captioning

The degree to which each member of the ensemble is used to generate an image caption is tied to the image-dependent probability of the corresponding tag.
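The mechanism described — blending ensemble members in proportion to image-dependent tag probabilities — can be sketched as a probability-weighted mixture of per-tag projections. All names, shapes, and values below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

# Hypothetical per-tag "experts": each tag owns a projection contributing to
# next-word logits; mixing weights are image-dependent tag probabilities.
vocab_size, hidden = 4, 3
rng = np.random.default_rng(0)
experts = {tag: rng.standard_normal((vocab_size, hidden))
           for tag in ["dog", "ball"]}
tag_probs = {"dog": 0.9, "ball": 0.1}  # e.g. from an image tagger

def mixed_logits(h, experts, tag_probs):
    """Blend expert projections in proportion to tag probabilities."""
    return sum(p * (experts[t] @ h) for t, p in tag_probs.items())

h = np.ones(hidden)  # stand-in for a decoder hidden state
logits = mixed_logits(h, experts, tag_probs)
```

Tags the tagger deems likely for the image thus dominate the caption generator's word predictions.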

# Table Filling Multi-Task Recurrent Neural Network for Joint Entity and Relation Extraction

This paper proposes a novel context-aware approach to joint entity and word-level relation extraction through semantic composition of words. It introduces a Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) model that reduces entity recognition and relation classification to a table-filling problem and models their interdependencies.

# Improving Semantic Composition with Offset Inference

Count-based distributional semantic models suffer from sparsity due to unobserved but plausible co-occurrences in any text collection.
