Sentence Embeddings
219 papers with code • 0 benchmarks • 11 datasets
Benchmarks
These leaderboards are used to track progress in Sentence Embeddings.
Libraries
Use these libraries to find Sentence Embeddings models and implementations.
Datasets
Subtasks
Latest papers
Grammatical information in BERT sentence embeddings as two-dimensional arrays
We show that various architectures can detect patterns in these two-dimensional reshaped sentence embeddings and can learn, from smaller amounts of simpler training data, models that perform well on more complex test data.
How Far Can We Extract Diverse Perspectives from Large Language Models?
In this study, we investigate LLMs' capacity for generating diverse perspectives and rationales on subjective topics, such as social norms and argumentative texts.
Exploring Semi-supervised Hierarchical Stacked Encoder for Legal Judgement Prediction
Predicting the judgment of a legal case from its unannotated case facts is a challenging task.
BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings
Most recent studies have employed large language models (LLMs) to learn sentence embeddings.
Sub-Sentence Encoder: Contrastive Learning of Propositional Semantic Representations
We introduce sub-sentence encoder, a contrastively-learned contextual embedding model for fine-grained semantic representation of text.
AdaSent: Efficient Domain-Adapted Sentence Embeddings for Few-Shot Classification
We propose AdaSent, which decouples sentence embedding pre-training (SEPT) from domain-adaptive pre-training (DAPT) by training a SEPT adapter on the base PLM.
Japanese SimCSE Technical Report
We report the development of Japanese SimCSE, a set of Japanese sentence embedding models fine-tuned with SimCSE.
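For readers unfamiliar with the SimCSE objective referenced here: it treats two dropout-noised encodings of the same sentence as a positive pair and the other in-batch encodings as negatives, applying temperature-scaled cross-entropy over cosine similarities. Below is a minimal NumPy sketch of that loss, not the authors' implementation; the temperature value and toy embeddings are illustrative assumptions.

```python
import numpy as np

def simcse_loss(z1, z2, temperature=0.05):
    """In-batch contrastive (InfoNCE) loss in the style of SimCSE.

    z1, z2: (batch, dim) arrays -- two encodings of the same sentences,
    obtained in practice from two forward passes with different dropout masks.
    """
    # Normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature  # (batch, batch) similarity logits
    # Cross-entropy with the matching index as the label:
    # sim[i, i] (the positive pair) should dominate row i.
    log_prob = sim - np.log(np.exp(sim - sim.max(axis=1, keepdims=True)).sum(
        axis=1, keepdims=True)) - sim.max(axis=1, keepdims=True)
    return -np.mean(np.diag(log_prob))

# Toy check: near-identical views (aligned pairs) should score a lower
# loss than randomly mismatched views.
rng = np.random.default_rng(0)
z = rng.normal(size=(4, 8))
loss_aligned = simcse_loss(z, z + 0.01 * rng.normal(size=(4, 8)))
loss_random = simcse_loss(z, rng.normal(size=(4, 8)))
```

With well-aligned positive pairs the diagonal logits dominate and the loss approaches zero; with random pairings it stays near chance level, which is the behavior the fine-tuning drives toward.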
Bipartite Graph Pre-training for Unsupervised Extractive Summarization with Graph Convolutional Auto-Encoders
Pre-trained sentence representations are crucial for identifying significant sentences in unsupervised document extractive summarization.
This Reads Like That: Deep Learning for Interpretable Natural Language Processing
Prototype learning, a popular machine learning method designed for inherently interpretable decisions, leverages similarities to learned prototypes for classifying new data.
DistillCSE: Distilled Contrastive Learning for Sentence Embeddings
This paper proposes the DistillCSE framework, which performs contrastive learning under the self-training paradigm with knowledge distillation.
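As a generic illustration of distilling one contrastive embedding model into another (a hedged sketch, not the DistillCSE recipe; the temperature and toy data are assumptions), the student can be trained to match the teacher's in-batch similarity distribution via a KL-divergence objective:

```python
import numpy as np

def similarity_dist(emb, temperature=0.05):
    """Row-wise softmax over the cosine-similarity matrix of a batch."""
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    logits = emb @ emb.T / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def distill_kl(teacher_emb, student_emb):
    """KL(teacher || student) between in-batch similarity distributions.

    Minimizing this pushes the student's pairwise similarities toward
    the teacher's, which is the generic shape of a distillation target.
    """
    p = similarity_dist(teacher_emb)
    q = similarity_dist(student_emb)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=1)))

# Toy check: a student identical to the teacher incurs zero KL;
# an unrelated student incurs a positive penalty.
rng = np.random.default_rng(1)
teacher = rng.normal(size=(4, 8))
kl_same = distill_kl(teacher, teacher.copy())
kl_diff = distill_kl(teacher, rng.normal(size=(4, 8)))
```

In practice such a distillation term would be combined with the contrastive loss on the student's own embeddings, but the KL piece above is the part specific to transferring the teacher's similarity structure.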