Sentence Embeddings
219 papers with code • 0 benchmarks • 11 datasets
Benchmarks
These leaderboards are used to track progress in Sentence Embeddings
Libraries
Use these libraries to find Sentence Embeddings models and implementations.
Datasets
Subtasks
Most implemented papers
Language-agnostic BERT Sentence Embedding
While BERT is an effective method for learning monolingual sentence embeddings for semantic similarity and embedding based transfer learning (Reimers and Gurevych, 2019), BERT based cross-lingual sentence embeddings have yet to be explored.
Unsupervised Learning of Sentence Embeddings using Compositional n-Gram Features
The recent tremendous success of unsupervised word embeddings in a multitude of applications raises the question of whether similar methods could be derived to improve embeddings (i.e., semantic representations) of word sequences as well.
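The compositional n-gram idea can be sketched as follows: collect the unigrams and bigrams of a sentence, look up a vector for each, and average them into a fixed-size sentence embedding. The hash-seeded toy embeddings below are a stand-in assumption for the learned vectors in the paper, not its actual model.

```python
import numpy as np

def ngram_features(tokens, n_max=2):
    """Collect unigrams and higher-order n-grams (the compositional features)."""
    feats = list(tokens)
    for n in range(2, n_max + 1):
        feats += [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return feats

def toy_vector(feature, dim=64):
    """Toy deterministic vector from a seeded RNG -- a stand-in for learned embeddings."""
    rng = np.random.default_rng(abs(hash(feature)) % (2**32))
    return rng.standard_normal(dim)

def sentence_embedding(sentence, dim=64):
    """Average the n-gram vectors to obtain a fixed-size sentence embedding."""
    feats = ngram_features(sentence.lower().split())
    return np.mean([toy_vector(f, dim) for f in feats], axis=0)
```

Averaging keeps the representation order-insensitive at the unigram level, while the bigram features inject some local word-order information.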
Learning Natural Language Inference with LSTM
On the SNLI corpus, our model achieves an accuracy of 86.1%, outperforming the state of the art.
BioSentVec: creating sentence embeddings for biomedical texts
Sentence embeddings have become an essential part of today's natural language processing (NLP) systems, especially together with advanced deep learning methods.
Simple and Effective Paraphrastic Similarity from Parallel Translations
We present a model and methodology for learning paraphrastic sentence embeddings directly from bitext, removing the time-consuming intermediate step of creating paraphrase corpora.
ColBERT: Using BERT Sentence Embedding in Parallel Neural Networks for Computational Humor
The proposed method begins by separating the sentences of the given text and using the BERT model to generate an embedding for each one.
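The per-sentence pipeline described above can be sketched with toy components: split the text into sentences, embed each one, pass each embedding through its own parallel branch, and combine the branch outputs into a single score. The toy encoder and random weights here are assumptions for illustration, not the paper's trained model.

```python
import numpy as np

def toy_encoder(sentence, dim=16):
    """Stand-in for a BERT sentence encoder (assumption, not the real model)."""
    rng = np.random.default_rng(sum(ord(c) for c in sentence))
    return rng.standard_normal(dim)

def humor_score(text, dim=16, hidden=4, seed=0):
    """Separate sentences, embed each, run each embedding through its own
    parallel linear branch, then combine branch outputs into one score."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    rng = np.random.default_rng(seed)
    branch_outputs = []
    for s in sentences:
        w = rng.standard_normal((hidden, dim))  # one branch per sentence
        branch_outputs.append(np.tanh(w @ toy_encoder(s, dim)))
    combined = np.concatenate(branch_outputs)
    w_out = rng.standard_normal(combined.shape[0])
    return float(1 / (1 + np.exp(-(w_out @ combined))))  # sigmoid -> (0, 1)
```

Keeping one branch per sentence lets the model weight each sentence's contribution separately before the final combination.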
Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks
The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.
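The auxiliary-prediction-task methodology amounts to fitting a probe from sentence embeddings to a low-level property (e.g. sentence length) and checking how well the property is recoverable. A minimal least-squares sketch, using synthetic embeddings in which one dimension encodes length by construction (an assumption made purely so the toy probe has something to find):

```python
import numpy as np

def fit_linear_probe(X, y):
    """Auxiliary prediction task as a linear probe: least-squares map from
    sentence embeddings X to a surface property y (e.g. sentence length)."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Synthetic setup: 100 "sentence embeddings" of dimension 8, where
# dimension 0 encodes the sentence length exactly (toy assumption).
rng = np.random.default_rng(0)
lengths = rng.integers(3, 20, size=100).astype(float)
X = rng.standard_normal((100, 8))
X[:, 0] = lengths

w = fit_linear_probe(X, lengths)
predictions = X @ w
```

If the probe's predictions match the property well, the embedding is said to encode that property; comparing probe accuracy across embedding methods and dimensionalities is the paper's analysis strategy.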
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents
Moreover, unlike other CNN-based models that analyze sentences locally by sliding windows, our system captures both the dependency information within each sentence and relationships across sentences in the same document.
DisSent: Sentence Representation Learning from Explicit Discourse Relations
Learning effective representations of sentences is one of the core missions of natural language understanding.
Simple Unsupervised Keyphrase Extraction using Sentence Embeddings
EmbedRank achieves higher F-scores than graph-based state of the art systems on standard datasets and is suitable for real-time processing of large amounts of Web data.
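EmbedRank's core idea can be sketched in a few lines: embed the document and each candidate phrase in the same space, then rank candidates by cosine similarity to the document. The hash-seeded toy embedder below is an assumption standing in for a real sentence embedder such as the ones discussed on this page.

```python
import numpy as np

def toy_embed(text, dim=32):
    """Deterministic toy embedding (stand-in for a real sentence embedder)."""
    rng = np.random.default_rng(sum(ord(c) for c in text))
    return rng.standard_normal(dim)

def embedrank(document, candidates, top_k=2, dim=32):
    """Rank candidate phrases by cosine similarity between each phrase
    embedding and the document embedding, keeping the top_k."""
    d = toy_embed(document, dim)
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = sorted(candidates, key=lambda c: cos(toy_embed(c, dim), d), reverse=True)
    return scored[:top_k]
```

Because scoring is just one embedding pass plus cosine similarities, the approach is cheap enough for the real-time, large-scale setting the paper targets.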