Semantic Composition
18 papers with code • 0 benchmarks • 2 datasets
Semantic composition is the task of understanding the meaning of a text by composing the meanings of its individual words. (Source: https://arxiv.org/pdf/1405.7908.pdf)
Most implemented papers
The Lifted Matrix-Space Model for Semantic Composition
Tree-structured neural network architectures for sentence encoding draw inspiration from the approach to semantic composition generally seen in formal linguistics, and have shown empirical improvements over comparable sequence models by doing so.
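Classic matrix-space models represent every word as a matrix and compose phrases by matrix multiplication over the parse tree; the Lifted Matrix-Space model instead learns the lifting from word vectors to matrices and the pairwise composition function. A minimal sketch of the matrix-multiplication baseline (plain matmul stands in for the learned LMS functions, and the toy vocabulary and dimensions are illustrative):

```python
import numpy as np

def compose(tree, word_mats):
    """Recursively compose a binary parse tree in a matrix-space model.

    tree: either a word (str) or a pair (left_subtree, right_subtree).
    word_mats: dict mapping each word to a d x d matrix.
    In classic matrix-space models a phrase is the product of its parts'
    matrices; the Lifted Matrix-Space model *learns* the lifting and
    composition functions -- matmul here is only a stand-in.
    """
    if isinstance(tree, str):
        return word_mats[tree]
    left, right = tree
    return compose(left, word_mats) @ compose(right, word_mats)

d = 4
rng = np.random.default_rng(0)
word_mats = {w: rng.normal(scale=0.5, size=(d, d)) for w in ["very", "good", "movie"]}
phrase = compose(("very", ("good", "movie")), word_mats)  # (very (good movie))
print(phrase.shape)  # (4, 4): the phrase representation
```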
SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics
We propose SentiBERT, a variant of BERT that effectively captures compositional sentiment semantics.
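A toy sketch of composing a constituency span on top of contextual token embeddings, in the spirit of SentiBERT's attention-based composition module. Random embeddings stand in for BERT outputs, and the single attention vector and per-node sentiment head are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def node_repr(span, token_embs, w_att):
    """Compose a phrase node as an attention-weighted sum of its tokens.

    SentiBERT composes phrase representations over the constituency tree
    on top of BERT token embeddings; this sketch uses one toy attention
    vector `w_att` and random embeddings in place of BERT.
    """
    toks = token_embs[span[0]:span[1]]          # tokens covered by the phrase
    att = softmax(toks @ w_att)                 # attention over the span
    return att @ toks                           # weighted sum -> phrase vector

rng = np.random.default_rng(0)
d = 8
token_embs = rng.normal(size=(5, d))            # stand-in for BERT outputs
w_att = rng.normal(size=d)
w_cls = rng.normal(size=(d, 3))                 # toy 3-way sentiment head

phrase = node_repr((1, 4), token_embs, w_att)   # phrase covering tokens 1..3
print(softmax(phrase @ w_cls))                  # sentiment distribution
```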
SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data
Because the main discriminative information in a fine-grained image usually resides in subtle regions, mixing-based augmentation methods are prone to heavy label noise in fine-grained recognition.
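The labeling rule that gives SnapMix its name can be sketched as follows: when a patch is cut from one image and pasted into another, the mixed-label weights follow the class-activation mass of the affected regions rather than their area (as in CutMix). The CAMs, boxes, and function name below are illustrative, and the patch pasting and training loop are omitted:

```python
import numpy as np

def snapmix_label_weights(cam_a, cam_b, box_a, box_b):
    """Label-weighting rule sketched from SnapMix: weight mixed labels by
    the class-activation mass the cut/pasted regions carry, not by area.
    cam_*: class activation maps for the two images; box_*: patch
    coordinates (y0, y1, x0, x1). The two weights need not sum to one.
    """
    y0, y1, x0, x1 = box_a
    removed = cam_a[y0:y1, x0:x1].sum() / cam_a.sum()  # semantic mass cut from A
    y0, y1, x0, x1 = box_b
    pasted = cam_b[y0:y1, x0:x1].sum() / cam_b.sum()   # semantic mass pasted from B
    return 1.0 - removed, pasted                       # weights for A's and B's labels

rng = np.random.default_rng(0)
cam_a, cam_b = rng.random((14, 14)), rng.random((14, 14))
print(snapmix_label_weights(cam_a, cam_b, (2, 8, 2, 8), (4, 10, 4, 10)))
```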
Modeling Relation Paths for Representation Learning of Knowledge Bases
Representation learning of knowledge bases (KBs) aims to embed both entities and relations into a low-dimensional space.
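A minimal sketch of path-based scoring in a TransE-style embedding space, assuming additive composition of relation embeddings along a path (the paper also considers multiplicative and RNN composition); the toy relations and dimensions are illustrative:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE-style plausibility: smaller ||h + r - t|| is better."""
    return np.linalg.norm(h + r - t, ord=1)

def path_score(h, path, t):
    """Path modeling with additive composition: a multi-step path
    (r1, ..., rn) is represented as r1 + ... + rn and scored like a
    single relation."""
    return transe_score(h, sum(path), t)

rng = np.random.default_rng(0)
d = 16
h, t = rng.normal(size=d), rng.normal(size=d)
born_in, city_of = rng.normal(size=d), rng.normal(size=d)
print(path_score(h, [born_in, city_of], t))  # h -born_in-> x -city_of-> t
```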
A Semantically Compositional Annotation Scheme for Time Normalization
We present a new annotation scheme for normalizing time expressions, such as "three days ago", to computer-readable forms, such as 2016-03-07.
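A hedged sketch of the compositional idea: "three days ago" is normalized by combining small reusable operators (a period, a shift, an anchor) rather than matching the whole phrase as one opaque pattern. The operator names below are illustrative, not the scheme's actual labels:

```python
from datetime import date, timedelta

def period(days):              # a duration entity, e.g. "three days"
    return timedelta(days=days)

def before(anchor, dur):       # shift an anchor time back by a duration
    return anchor - dur

anchor = date(2016, 3, 10)     # e.g. the document creation time
print(before(anchor, period(3)).isoformat())  # "three days ago" -> 2016-03-07
```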
Improving Sparse Word Representations with Distributional Inference for Semantic Composition
Distributional models are derived from co-occurrences in a corpus, where only a small proportion of all possible plausible co-occurrences will be observed.
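A minimal sketch of distributional inference on a word-by-context count matrix: a word's sparse vector is enriched with features from its nearest neighbours, so that plausible but unobserved co-occurrences receive nonzero weight. Raw counts and cosine neighbours are simplifications here; the paper works with weighted, typed co-occurrence features:

```python
import numpy as np

def infer_cooccurrences(M, i, k=2):
    """Enrich word i's sparse co-occurrence vector with features from
    its k nearest neighbours (by cosine similarity), a simplified form
    of distributional inference."""
    norms = np.linalg.norm(M, axis=1, keepdims=True) + 1e-12
    sims = (M / norms) @ (M[i] / norms[i])   # cosine similarity to word i
    sims[i] = -np.inf                        # exclude the word itself
    nbrs = np.argsort(sims)[-k:]             # k most similar words
    return M[i] + M[nbrs].mean(axis=0)       # inferred (smoothed) vector

M = np.array([[5, 0, 1, 0],   # toy sparse word-by-context count matrix
              [4, 1, 0, 0],
              [0, 0, 3, 2],
              [5, 1, 1, 0]], dtype=float)
print(infer_cooccurrences(M, 0))
```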
Semantic Compositional Networks for Visual Captioning
The degree to which each member of the ensemble is used to generate an image caption is tied to the image-dependent probability of the corresponding tag.
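The tag-weighted ensemble can be sketched as a mixture of per-tag weight matrices, where the mixture coefficients are the image-dependent tag probabilities. The paper factorizes this tensor for efficiency; the dense mixture below only illustrates the idea, and all names and sizes are toy choices:

```python
import numpy as np

def scn_weights(tag_probs, expert_mats):
    """Effective weight matrix as a tag-probability-weighted mixture of
    per-tag matrices: W = sum_k p_k * W_k. The paper uses a factorized
    form of this tensor; the dense version here is a sketch only."""
    return np.tensordot(tag_probs, expert_mats, axes=1)

rng = np.random.default_rng(0)
K, d_in, d_out = 5, 8, 8                           # 5 toy semantic tags
expert_mats = rng.normal(size=(K, d_in, d_out))
tag_probs = np.array([0.7, 0.1, 0.1, 0.05, 0.05])  # e.g. p("dog") = 0.7
W = scn_weights(tag_probs, expert_mats)            # image-specific weights
print(W.shape)  # (8, 8)
```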
Table Filling Multi-Task Recurrent Neural Network for Joint Entity and Relation Extraction
This paper proposes a context-aware approach to joint entity and word-level relation extraction through semantic composition of words. It introduces a Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) model that reduces entity recognition and relation classification to a table-filling problem and models their interdependencies.
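A minimal sketch of the table-filling formulation, assuming the common convention that diagonal cells hold entity tags and off-diagonal cells hold relation labels between word pairs; the example sentence and label names are illustrative:

```python
# For an n-word sentence, cell (i, i) holds word i's entity tag and
# cell (i, j) holds the relation label between words i and j, so entity
# recognition and relation classification become one labeling problem.
words = ["John", "works", "at", "Acme"]
n = len(words)
table = [["O"] * n for _ in range(n)]

table[0][0] = "B-PER"        # diagonal: entity tags
table[3][3] = "B-ORG"
table[0][3] = "works_for"    # off-diagonal: relation between word pair

for row in table:
    print(row)
```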
Improving Semantic Composition with Offset Inference
Count-based distributional semantic models suffer from sparsity due to unobserved but plausible co-occurrences in any text collection.
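A loosely hedged sketch of offset composition in a dependency-typed co-occurrence space (in the spirit of the Anchored Packed Tree framework this line of work builds on): a word's features are keyed by dependency paths, and using the word in a particular grammatical role shifts ("offsets") those paths before features are combined. The path syntax, toy features, and union-style combination below are all simplifications:

```python
def offset(features, rel):
    """Shift each dependency-path feature by relation `rel`."""
    return {f"{rel}>{path}": count for path, count in features.items()}

def compose(head_features, dep_features, rel):
    """Combine a head's features with a dependent's offset features."""
    out = dict(head_features)
    for path, count in offset(dep_features, rel).items():
        out[path] = out.get(path, 0) + count
    return out

white = {"x": 3, "y": 1}     # toy path-typed co-occurrence counts
clothes = {"z": 4}
print(compose(clothes, white, "amod"))  # composed vector for "white clothes"
```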