Semantic Composition

20 papers with code • 0 benchmarks • 2 datasets

Understanding the meaning of text by composing the meanings of the individual words in the text (Source: https://arxiv.org/pdf/1405.7908.pdf)
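As a minimal illustration of this idea (not taken from any specific paper below), the simplest composition models build a phrase vector by adding or element-wise multiplying the vectors of its words; the embeddings here are random stand-ins for a trained model:

```python
import numpy as np

# Toy word embeddings; in practice these come from a trained model.
rng = np.random.default_rng(0)
vocab = {w: rng.standard_normal(4) for w in ["red", "car", "fast"]}

def compose_additive(words):
    """Additive composition: the phrase vector is the sum of word vectors."""
    return np.sum([vocab[w] for w in words], axis=0)

def compose_multiplicative(words):
    """Multiplicative composition: element-wise product of word vectors."""
    return np.prod([vocab[w] for w in words], axis=0)

phrase = compose_additive(["red", "car"])
```

Additive composition is order-insensitive ("red car" equals "car red"), which is one reason the papers below study richer, structured composition functions.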

Most implemented papers

Table Filling Multi-Task Recurrent Neural Network for Joint Entity and Relation Extraction

pgcool/TF-MTRNN COLING 2016

This paper proposes a context-aware approach to joint entity and word-level relation extraction through semantic composition of words. It introduces a Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) model that reduces entity recognition and relation classification to a table-filling problem and models the interdependencies between the two tasks.

Improving Semantic Composition with Offset Inference

tttthomasssss/acl2017 ACL 2017

Count-based distributional semantic models suffer from sparsity due to unobserved but plausible co-occurrences in any text collection.

Semantic Hilbert Space for Text Representation Learning

wabyking/qnn 26 Feb 2019

To address this issue, we propose a new framework that models different levels of semantic units (e.g. sememe, word, sentence, and semantic abstraction) on a single Semantic Hilbert Space, which naturally admits a non-linear semantic composition by means of a complex-valued vector representation of words.
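A loose sketch of the complex-valued ingredient (the actual model in the paper is considerably richer): each word gets a complex vector, and a sentence is a normalised superposition of its word vectors, so amplitudes and phases interfere when representations are compared. The embeddings below are hypothetical placeholders:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 4
# Hypothetical complex-valued word embeddings (real amplitude + imaginary part).
vocab = {w: rng.standard_normal(d) + 1j * rng.standard_normal(d)
         for w in ["quantum", "text"]}

def superpose(words):
    """Sentence as a normalised superposition of complex word vectors.
    Normalisation keeps the result on the unit sphere, and phase
    interference makes the resulting similarities non-linear in the inputs."""
    v = np.sum([vocab[w] for w in words], axis=0)
    return v / np.linalg.norm(v)

s = superpose(["quantum", "text"])
```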

No Word is an Island -- A Transformation Weighting Model for Semantic Composition

sfb833-a3/commix 11 Jul 2019

Composition models of distributional semantics are used to construct phrase representations from the representations of their words.
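The general shape of transformation-based composition can be sketched as follows; this is only an illustration of the idea, with randomly initialised matrices standing in for parameters that the paper's model would learn:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
# Hypothetical learned parameters: one transformation matrix per role
# (head word vs. modifier) in the phrase.
W_head = rng.standard_normal((d, d))
W_mod = rng.standard_normal((d, d))

def compose(head_vec, mod_vec):
    """Transform each word vector by a role-specific matrix, combine,
    and squash with a non-linearity to get the phrase representation."""
    return np.tanh(W_head @ head_vec + W_mod @ mod_vec)

phrase = compose(rng.standard_normal(d), rng.standard_normal(d))
```

Unlike plain addition, this lets the model treat "house boat" and "boat house" differently, since head and modifier pass through different transformations.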

Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics

guyemerson/pixie ACL 2020

Functional Distributional Semantics provides a linguistically interpretable framework for distributional semantics, by representing the meaning of a word as a function (a binary classifier), instead of a vector.
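The word-as-classifier idea can be sketched in a few lines: a word's meaning is a binary classifier that takes an entity representation (a "pixie" vector) and returns the probability that the word truthfully applies to it. The parameters here are made up for illustration; in the framework they are learned:

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4

def make_word_classifier(weights, bias):
    """A word's meaning as a binary (logistic) classifier over entity
    ('pixie') vectors: returns P(word is true of the entity)."""
    def classifier(pixie):
        return 1.0 / (1.0 + np.exp(-(weights @ pixie + bias)))
    return classifier

# Hypothetical parameters for the word "dog".
dog = make_word_classifier(rng.standard_normal(d), 0.0)
prob = dog(rng.standard_normal(d))
```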

Ontology-guided Semantic Composition for Zero-Shot Learning

China-UK-ZSL/Resources_for_KZSL 30 Jun 2020

Zero-shot learning (ZSL) is a popular research problem that aims to predict classes that never appear during training by exploiting inter-class relationships together with side information.

Semantic Prediction: Which One Should Come First, Recognition or Prediction?

ais-bonn/pred_semantic 6 Oct 2021

The ultimate goal of video prediction is not forecasting future pixel values given some previous frames.

Synthetic Dataset for Evaluating Complex Compositional Knowledge for Natural Language Inference

clulab/releases 11 Jul 2023

To this end, we modify the original texts using a set of phrases: modifiers that correspond to universal quantifiers, existential quantifiers, negation, and other concept modifiers in Natural Logic (NL) (MacCartney, 2009).