Sentence-Embedding
102 papers with code • 1 benchmark • 2 datasets
Benchmarks
These leaderboards are used to track progress in Sentence-Embedding
Libraries
Use these libraries to find Sentence-Embedding models and implementations.

Most implemented papers
A Structured Self-attentive Sentence Embedding
This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention.
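At its core, the paper replaces a single pooled vector with a matrix of attention "hops" over the encoder states, computed as A = softmax(W_s2 tanh(W_s1 H^T)) and M = A H. A minimal sketch, with illustrative dimensions rather than the paper's exact hyperparameters:

```python
import torch
import torch.nn as nn

class StructuredSelfAttention(nn.Module):
    # Structured self-attention pooling: A = softmax(W_s2 tanh(W_s1 H^T)), M = A H.
    # Each "hop" (row of A) attends to a different part of the sentence,
    # which is what makes the resulting embedding interpretable.
    def __init__(self, hidden_dim=256, attn_dim=64, num_hops=4):
        super().__init__()
        self.w_s1 = nn.Linear(hidden_dim, attn_dim, bias=False)
        self.w_s2 = nn.Linear(attn_dim, num_hops, bias=False)

    def forward(self, H):
        # H: (batch, seq_len, hidden_dim), e.g. BiLSTM outputs
        A = torch.softmax(self.w_s2(torch.tanh(self.w_s1(H))), dim=1)
        M = A.transpose(1, 2) @ H  # (batch, num_hops, hidden_dim)
        return M, A  # M is the sentence embedding matrix, A the attention map

H = torch.randn(2, 10, 256)          # stand-in encoder states for two sentences
M, A = StructuredSelfAttention()(H)
print(M.shape)  # torch.Size([2, 4, 256])
```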
Evaluation of sentence embeddings in downstream and linguistic probing tasks
Despite the fast developmental pace of new sentence embedding methods, it is still challenging to find comprehensive evaluations of these different techniques.
Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation
The training is based on the idea that a translated sentence should be mapped to the same location in the vector space as the original sentence.
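A minimal sketch of that distillation objective, assuming a frozen teacher that embeds source sentences and a student that embeds both the source and its translation (tensor names and shapes are illustrative):

```python
import torch
import torch.nn.functional as F

def distillation_loss(teacher_src, student_src, student_tgt):
    # The frozen teacher embeds the source sentence; the student is trained so
    # that (a) its source embedding matches the teacher's, and (b) its
    # embedding of the translation lands at the same point in vector space.
    return (F.mse_loss(student_src, teacher_src)
            + F.mse_loss(student_tgt, teacher_src))

# Stand-in embeddings, shape (batch, dim):
t = torch.randn(8, 768)
loss = distillation_loss(t, t + 0.1, t - 0.1)
```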
Language-agnostic BERT Sentence Embedding
While BERT is an effective method for learning monolingual sentence embeddings for semantic similarity and embedding based transfer learning (Reimers and Gurevych, 2019), BERT based cross-lingual sentence embeddings have yet to be explored.
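As an illustration, the sentence-transformers port of LaBSE (assuming the library and that checkpoint are available) lets you compare a sentence with its translation directly:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/LaBSE")
emb = model.encode(["Hello, world!", "Hallo, Welt!"], convert_to_tensor=True)
print(util.cos_sim(emb[0], emb[1]))  # translation pairs should score high
```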
TSDAE: Using Transformer-based Sequential Denoising Auto-Encoder for Unsupervised Sentence Embedding Learning
Learning sentence embeddings often requires a large amount of labeled data.
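TSDAE sidesteps labels by corrupting the input (by default deleting roughly 60% of tokens) and training an encoder-decoder to reconstruct the original sentence from a single fixed-size vector. A sketch of the corruption step, with the reconstruction objective summarized in comments:

```python
import random

def delete_noise(tokens, del_ratio=0.6):
    # TSDAE-style corruption: independently delete ~60% of the tokens.
    # The encoder must compress the damaged sentence into one vector from
    # which the decoder reconstructs the *original* sentence, forcing that
    # vector to capture sentence-level meaning without any labels.
    kept = [t for t in tokens if random.random() > del_ratio]
    return kept if kept else [random.choice(tokens)]  # never return empty input

print(delete_noise("learning sentence embeddings often requires labeled data".split()))
```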
Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books
Books are a rich source of both fine-grained information (how a character, an object, or a scene looks) and high-level semantics (what someone is thinking or feeling, and how these states evolve through a story).
Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks
The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.
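A probing task in this spirit trains a simple classifier on frozen embeddings to predict a surface property such as sentence length; the arrays below are random stand-ins for real encoder outputs, purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Train a probe on frozen sentence embeddings to predict a surface property
# (here: a sentence-length bin). High probe accuracy would indicate the
# property is linearly recoverable from the embedding.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 128))    # stand-in for real encoder outputs
length_bins = rng.integers(0, 5, size=1000)  # stand-in labels
probe = LogisticRegression(max_iter=1000).fit(embeddings[:800], length_bins[:800])
print("probe accuracy:", probe.score(embeddings[800:], length_bins[800:]))
```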
SBERT-WK: A Sentence Embedding Method by Dissecting BERT-based Word Models
Yet, it is an open problem to generate a high quality sentence representation from BERT-based word models.
ColBERT: Using BERT Sentence Embedding in Parallel Neural Networks for Computational Humor
The proposed method begins by separating the sentences of a given text and using BERT to generate an embedding for each one.
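A sketch of that first stage, using a sentence-transformers encoder as a stand-in for the paper's BERT embeddings (the checkpoint name and example text are illustrative):

```python
from sentence_transformers import SentenceTransformer

# Split a text into sentences and embed each one independently, so parallel
# downstream layers can reason about individual sentences and their combination.
model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative checkpoint
text = "I told my wife she was drawing her eyebrows too high. She looked surprised."
sentences = [s.strip() for s in text.split(".") if s.strip()]
per_sentence = model.encode(sentences)  # one vector per sentence
print(per_sentence.shape)
```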
On the Sentence Embeddings from Pre-trained Language Models
Pre-trained contextual representations like BERT have achieved great success in natural language processing.