Sentence Embedding
132 papers with code • 0 benchmarks • 7 datasets
Benchmarks
These leaderboards are used to track progress in Sentence Embedding.
Libraries
Use these libraries to find Sentence Embedding models and implementations.
Most implemented papers
On the Sentence Embeddings from Pre-trained Language Models
Pre-trained contextual representations like BERT have achieved great success in natural language processing.
Learning Semantic Sentence Embeddings using Sequential Pair-wise Discriminator
One way to ensure this is to add constraints that force true paraphrase embeddings to be close and unrelated paraphrase-candidate sentence embeddings to be far apart.
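The close/far constraint described above can be sketched as a hinge-style margin loss. This is an illustrative toy, not the paper's actual sequential pair-wise discriminator; the function name and margin value are assumptions.

```python
import numpy as np

def margin_pair_loss(anchor, paraphrase, negative, margin=0.5):
    """Hinge-style loss: pull the true paraphrase embedding close to the
    anchor and push an unrelated candidate at least `margin` further away.
    (Illustrative sketch only; the paper's discriminator is more involved.)"""
    d_pos = np.linalg.norm(anchor - paraphrase)  # distance to true paraphrase
    d_neg = np.linalg.norm(anchor - negative)    # distance to unrelated candidate
    return max(0.0, d_pos - d_neg + margin)

# Toy 2-D embeddings: the paraphrase sits near the anchor, the negative far away.
anchor = np.array([1.0, 0.0])
paraphrase = np.array([0.9, 0.1])
negative = np.array([-1.0, 0.5])
print(margin_pair_loss(anchor, paraphrase, negative))  # 0.0: constraint already satisfied
```

The loss is zero once the unrelated candidate is at least `margin` farther from the anchor than the true paraphrase, so gradients act only on pairs that violate the constraint.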
Neural Sentence Embedding using Only In-domain Sentences for Out-of-domain Sentence Detection in Dialog Systems
We then used domain-category analysis as an auxiliary task to train neural sentence embeddings for out-of-domain (OOD) sentence detection.
Context Mover's Distance & Barycenters: Optimal Transport of Contexts for Building Representations
We present a framework for building unsupervised representations of entities and their compositions, where each entity is viewed as a probability distribution rather than a vector embedding.
Learning to Embed Sentences Using Attentive Recursive Trees
Sentence embedding is an effective feature representation for most deep learning-based NLP tasks.
Sentence Embedding Alignment for Lifelong Relation Extraction
We formulate such a challenging problem as lifelong relation extraction and investigate memory-efficient incremental learning methods without catastrophically forgetting knowledge learned from previous tasks.
Discovering the Compositional Structure of Vector Representations with Role Learning Networks
How can neural networks perform so well on compositional tasks even though they lack explicit compositional representations?
A Bilingual Generative Transformer for Semantic Sentence Embedding
Semantic sentence embedding models encode natural language sentences into vectors, such that closeness in embedding space indicates closeness in the semantics between the sentences.
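"Closeness in embedding space" is typically measured with cosine similarity. A minimal sketch, using hypothetical hand-picked vectors rather than any real model's output:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two sentence vectors: 1.0 means identical
    direction, 0.0 means orthogonal (semantically unrelated, ideally)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical pre-computed sentence embeddings (not from a specific model).
cat = np.array([0.8, 0.6, 0.0])
kitten = np.array([0.7, 0.7, 0.1])
stock = np.array([0.0, 0.1, 0.9])

# A semantically close pair should score higher than an unrelated one.
print(cosine_similarity(cat, kitten) > cosine_similarity(cat, stock))  # True
```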
ESimCSE: Enhanced Sample Building Method for Contrastive Learning of Unsupervised Sentence Embedding
Unsup-SimCSE uses dropout as a minimal data augmentation method: the same input sentence is passed through a pre-trained Transformer encoder (with dropout enabled) twice, and the two resulting embeddings form a positive pair.
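The dropout-as-augmentation idea can be sketched without a real Transformer: a random dropout mask stands in for the encoder's internal dropout, so two passes over the same input yield two slightly different "views" that form a positive pair. Everything here (the stand-in encoder output, the dropout rate) is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_view(x, p=0.1):
    """One stochastic forward 'view': a fresh random dropout mask stands in
    for the dropout inside a real Transformer encoder (illustrative only)."""
    mask = rng.random(x.shape) >= p
    return np.where(mask, x / (1.0 - p), 0.0)  # inverted-dropout scaling

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sentence_embedding = rng.standard_normal(64)  # stand-in for an encoder output
z1 = dropout_view(sentence_embedding)         # first pass, dropout on
z2 = dropout_view(sentence_embedding)         # second pass, new dropout mask

# z1 and z2 differ only through dropout noise, so they form a positive pair
# whose similarity the contrastive objective pushes toward 1.
print(cosine(z1, z2))
```

In the real method the contrastive loss then treats these two views as positives and other sentences in the batch as negatives.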
Smoothed Contrastive Learning for Unsupervised Sentence Embedding
Contrastive learning has gradually been applied to learning high-quality unsupervised sentence embeddings.