Semantic Composition
20 papers with code • 0 benchmarks • 2 datasets
Understanding the meaning of a text by composing the meanings of its individual words (Source: https://arxiv.org/pdf/1405.7908.pdf)
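The definition above can be illustrated with the simplest composition model: build a phrase representation by summing toy word vectors, then compare phrases with cosine similarity. A minimal sketch, assuming made-up vectors (not trained embeddings):

```python
import math

# Toy word vectors; the values are illustrative, not from any real model.
word_vecs = {
    "red":  [0.9, 0.1, 0.0],
    "car":  [0.2, 0.8, 0.5],
    "auto": [0.3, 0.7, 0.6],
}

def compose(words):
    """Additive composition: the phrase vector is the sum of its word vectors."""
    dims = range(len(next(iter(word_vecs.values()))))
    return [sum(word_vecs[w][i] for w in words) for i in dims]

def cosine(u, v):
    """Cosine similarity between two composed phrase vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# "red car" and "red auto" should compose to similar phrase vectors.
sim = cosine(compose(["red", "car"]), compose(["red", "auto"]))
```

Additive composition is only a baseline; the papers listed below study richer, often non-linear, composition functions.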
Benchmarks
These leaderboards are used to track progress in Semantic Composition
Most implemented papers
Table Filling Multi-Task Recurrent Neural Network for Joint Entity and Relation Extraction
This paper proposes a context-aware approach to joint entity and word-level relation extraction through semantic composition of words. It introduces a Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) model that casts entity recognition and relation classification as a single table-filling problem and models their interdependencies.
Improving Semantic Composition with Offset Inference
Count-based distributional semantic models suffer from sparsity due to unobserved but plausible co-occurrences in any text collection.
From Characters to Time Intervals: New Paradigms for Evaluation and Neural Parsing of Time Normalizations
This paper presents the first model for time normalization trained on the SCATE corpus.
Semantic Hilbert Space for Text Representation Learning
The paper proposes a new framework that models different levels of semantic units (e.g. sememes, words, sentences, and semantic abstractions) in a single Semantic Hilbert Space, which naturally admits non-linear semantic composition by means of complex-valued word vector representations.
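The core mechanism can be sketched in a few lines: each word becomes a complex vector (amplitude plus phase), and a sentence is a weighted superposition of word states, renormalised to unit length. This is a hedged illustration loosely inspired by the Semantic Hilbert Space idea; the amplitudes, phases, and weights below are toy values, not the paper's trained parameters.

```python
import cmath

def to_complex(amplitudes, phases):
    """Build a complex-valued word state: amplitude * e^(i * phase) per dimension."""
    return [a * cmath.exp(1j * p) for a, p in zip(amplitudes, phases)]

def superpose(word_states, weights):
    """Weighted superposition of word states, renormalised to unit L2 norm."""
    dim = len(word_states[0])
    mixed = [sum(w * state[i] for w, state in zip(weights, word_states))
             for i in range(dim)]
    norm = sum(abs(c) ** 2 for c in mixed) ** 0.5
    return [c / norm for c in mixed]

# Two toy word states combined into one normalised sentence state.
w1 = to_complex([0.6, 0.8], [0.0, 0.5])
w2 = to_complex([1.0, 0.2], [0.3, 0.0])
sentence = superpose([w1, w2], [0.5, 0.5])
```

Because the result is renormalised, the composed sentence state always has unit length, which is what makes the Hilbert-space reading (a quantum-style superposition) natural.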
No Word is an Island -- A Transformation Weighting Model for Semantic Composition
Composition models of distributional semantics are used to construct phrase representations from the representations of their words.
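Two standard composition functions from this literature are the weighted additive and elementwise multiplicative baselines, which transformation-weighting models aim to improve on. A minimal sketch with toy vectors (not trained embeddings):

```python
def weighted_additive(u, v, alpha=0.5, beta=0.5):
    """p = alpha * u + beta * v  (weighted additive composition)."""
    return [alpha * a + beta * b for a, b in zip(u, v)]

def multiplicative(u, v):
    """p_i = u_i * v_i  (elementwise multiplicative composition)."""
    return [a * b for a, b in zip(u, v)]

# Composing a toy adjective vector with a toy noun vector.
adj = [2.0, 0.0, 1.0]
noun = [0.0, 2.0, 1.0]
p_add = weighted_additive(adj, noun)
p_mul = multiplicative(adj, noun)
```

Multiplicative composition keeps only dimensions on which both words are active, while additive composition blends all dimensions; which works better is phrase-type dependent.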
Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics
Functional Distributional Semantics provides a linguistically interpretable framework for distributional semantics, by representing the meaning of a word as a function (a binary classifier), instead of a vector.
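The "word as a binary classifier" idea can be sketched directly: instead of a vector, a word is a function from an entity representation to a truth probability. The logistic classifier and all weights below are toy assumptions for illustration, not the paper's trained model.

```python
import math

def make_word_classifier(weights, bias):
    """Return a word meaning: a function mapping an entity vector to P(word is true of it)."""
    def truth_prob(entity):
        score = sum(w * x for w, x in zip(weights, entity)) + bias
        return 1.0 / (1.0 + math.exp(-score))  # logistic binary classifier
    return truth_prob

# A toy "red" classifier: responds to the first entity dimension.
is_red = make_word_classifier([3.0, -1.0], -0.5)

reddish_entity = [1.0, 0.0]   # toy entity representation
bluish_entity  = [0.0, 1.0]
p_red = is_red(reddish_entity)
```

The payoff of this functional view is interpretability: the classifier's decision boundary says which entities a word is true of, rather than leaving meaning implicit in vector geometry.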
Ontology-guided Semantic Composition for Zero-Shot Learning
Zero-shot learning (ZSL) is a popular research problem that aims to predict classes that never appear during training by exploiting inter-class relationships and side information.
Semantic Prediction: Which One Should Come First, Recognition or Prediction?
The ultimate goal of video prediction is not to forecast future pixel values given some previous frames.
Synthetic Dataset for Evaluating Complex Compositional Knowledge for Natural Language Inference
To this end, we modify the original texts using a set of phrase modifiers that correspond to universal quantifiers, existential quantifiers, negation, and other concept modifiers in Natural Logic (NL) (MacCartney, 2009).