Sentence Embedding

59 papers with code • 0 benchmarks • 6 datasets

Sentence embedding maps a sentence to a fixed-size vector representation such that semantically similar sentences lie close together in the vector space; these vectors are used in downstream tasks such as semantic similarity, classification, retrieval, and natural language inference.

Greatest papers with code

ColBERT: Using BERT Sentence Embedding for Humor Detection

huggingface/transformers 27 Apr 2020

In this paper, we propose a novel approach for detecting humor in short texts based on the general linguistic structure of humor.

Humor Detection Sentence Embedding

A Structured Self-attentive Sentence Embedding

facebookresearch/pytext 9 Mar 2017

This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention; a minimal code sketch of this pooling idea follows after this entry.

General Classification Natural Language Inference +2
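A minimal sketch of the self-attentive pooling idea from the entry above, written in PyTorch; the layer sizes, names, and the BiLSTM-style input are illustrative assumptions rather than the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentivePooling(nn.Module):
    """Pools token states H (batch, seq_len, d) into several weighted views,
    following the A = softmax(W_s2 tanh(W_s1 H^T)) style of attention."""
    def __init__(self, d_hidden=256, d_attn=64, n_views=4):
        super().__init__()
        self.w_s1 = nn.Linear(d_hidden, d_attn, bias=False)
        self.w_s2 = nn.Linear(d_attn, n_views, bias=False)

    def forward(self, token_states):
        # token_states: (batch, seq_len, d_hidden), e.g. BiLSTM outputs
        scores = self.w_s2(torch.tanh(self.w_s1(token_states)))  # (batch, seq_len, n_views)
        attn = F.softmax(scores, dim=1)                           # normalize over tokens
        # Sentence embedding: n_views weighted sums of the token states
        return torch.einsum("bsv,bsd->bvd", attn, token_states)   # (batch, n_views, d_hidden)

pooling = SelfAttentivePooling()
sentence_matrix = pooling(torch.randn(2, 10, 256))  # flatten per sentence for a downstream classifier
```

The paper additionally regularizes the attention matrix with a penalty that encourages the different views to attend to different parts of the sentence; that term is omitted here for brevity.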

Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation

UKPLab/sentence-transformers EMNLP 2020

The training is based on the idea that a translated sentence should be mapped to the same location in the vector space as the original sentence; a rough sketch of this objective follows after this entry.

Knowledge Distillation Sentence Embedding
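A rough sketch of the distillation objective described above, assuming a frozen monolingual teacher and a trainable multilingual student; the tensors below are random placeholders standing in for real encoder outputs.

```python
import torch
import torch.nn.functional as F

def distillation_loss(teacher_src_emb, student_src_emb, student_tgt_emb):
    """Multilingual distillation: the student should reproduce the teacher's
    source-sentence embedding both for the source sentence and for its translation."""
    loss_src = F.mse_loss(student_src_emb, teacher_src_emb)
    loss_tgt = F.mse_loss(student_tgt_emb, teacher_src_emb)
    return loss_src + loss_tgt

# Toy usage: random vectors stand in for encoder outputs
teacher = torch.randn(8, 768)                       # frozen teacher, source sentences
student_src = teacher + 0.1 * torch.randn(8, 768)   # student on the source sentences
student_tgt = teacher + 0.1 * torch.randn(8, 768)   # student on their translations
print(distillation_loss(teacher, student_src, student_tgt))
```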

Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks

facebookresearch/InferSent 15 Aug 2016

The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low-level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations; a probing-task sketch follows after this entry.

Sentence Embedding
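To illustrate what an auxiliary prediction (probing) task looks like, here is a hedged sketch using scikit-learn: a simple classifier trained on frozen sentence embeddings to predict a surface property such as sentence length. The embeddings and labels below are synthetic placeholders; with a real encoder, higher probe accuracy suggests the property is recoverable from the embedding.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
sentence_lengths = rng.integers(5, 30, size=1000)    # surface property to probe
embeddings = rng.normal(size=(1000, 256))            # placeholder "frozen" sentence embeddings
labels = (sentence_lengths > 15).astype(int)         # binarized length target

X_train, X_test, y_train, y_test = train_test_split(embeddings, labels, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# With random vectors the probe stays near chance; with real embeddings,
# accuracy above chance indicates the property is linearly encoded.
print(probe.score(X_test, y_test))
```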

Evaluation of sentence embeddings in downstream and linguistic probing tasks

allenai/bilm-tf 16 Jun 2018

Despite the fast developmental pace of new sentence embedding methods, it is still challenging to find comprehensive evaluations of these different techniques.

Language Modelling Sentence Embedding +1

On the Sentence Embeddings from Pre-trained Language Models

InsaneLife/dssm EMNLP 2020

Pre-trained contextual representations like BERT have achieved great success in natural language processing.

Language Modelling Semantic Similarity +2

Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books

soskek/homemade_bookcorpus ICCV 2015

Books are a rich source of both fine-grained information, such as how a character, an object or a scene looks, and high-level semantics, such as what someone is thinking or feeling and how these states evolve through a story.

Sentence Embedding

DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding

taoshen58/DiSAN 14 Sep 2017

Recurrent neural nets (RNN) and convolutional neural nets (CNN) are widely used on NLP tasks to capture the long-term and local dependencies, respectively.

Natural Language Inference Sentence Embedding