132 papers with code • 1 benchmark • 2 datasets


Most implemented papers

A Structured Self-attentive Sentence Embedding

jadore801120/attention-is-all-you-need-pytorch 9 Mar 2017

This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention.
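The paper's annotation mechanism can be sketched in a few lines: an r-hop attention matrix A = softmax(W_s2 · tanh(W_s1 · Hᵀ)) is computed over the token hidden states H, and the structured embedding is M = A · H. The dimensions below are illustrative placeholders, not the paper's experimental settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not the paper's settings):
n, u, d_a, r = 6, 8, 4, 3   # tokens, hidden size, attention size, attention hops

H = rng.standard_normal((n, u))        # token hidden states (e.g. from a BiLSTM)
W_s1 = rng.standard_normal((d_a, u))   # first attention projection
W_s2 = rng.standard_normal((r, d_a))   # one row per attention "hop"

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# A: r x n annotation matrix -- each row is one attention distribution over tokens
A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=1)
# M: r x u structured sentence embedding (r weighted views of the sentence)
M = A @ H
```

Because each hop produces its own distribution over tokens, the rows of A can be inspected directly, which is what makes the embedding interpretable.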

Evaluation of sentence embeddings in downstream and linguistic probing tasks

allenai/bilm-tf 16 Jun 2018

Despite the rapid pace at which new sentence embedding methods appear, comprehensive evaluations comparing these techniques remain hard to find.

Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation

UKPLab/sentence-transformers EMNLP 2020

The training is based on the idea that a translated sentence should be mapped to the same location in the vector space as the original sentence.
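That idea can be sketched as a distillation loss: a multilingual student is trained so that both the source sentence and its translation land near the teacher's embedding of the source. The encoders below are toy stand-ins for illustration only, not the actual sentence-transformers models.

```python
import numpy as np

dim = 4

# Stand-in for a monolingual teacher encoder (assumption, not a real model):
def teacher_encode(sentence):
    rng = np.random.default_rng(abs(hash(sentence)) % (2**32))
    return rng.standard_normal(dim)

# Toy multilingual student: a linear map over the teacher's features.
def student_encode(sentence, W):
    return W @ teacher_encode(sentence)

def distillation_loss(pairs, W):
    """MSE pulling student(source) and student(translation) toward
    teacher(source), mirroring the multilingual distillation objective."""
    loss = 0.0
    for src, tgt in pairs:
        t = teacher_encode(src)
        loss += np.sum((student_encode(src, W) - t) ** 2)
        loss += np.sum((student_encode(tgt, W) - t) ** 2)
    return loss / len(pairs)
```

Minimizing this loss over parallel (source, translation) pairs is what pulls translated sentences to the same region of the vector space as their originals.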

Language-agnostic BERT Sentence Embedding

FreddeFrallan/Multilingual-CLIP ACL 2022

While BERT is an effective method for learning monolingual sentence embeddings for semantic similarity and embedding-based transfer learning (Reimers and Gurevych, 2019), BERT-based cross-lingual sentence embeddings have yet to be explored.

ColBERT: Using BERT Sentence Embedding in Parallel Neural Networks for Computational Humor

Moradnejad/ColBERT-Using-BERT-Sentence-Embedding-for-Humor-Detection 27 Apr 2020

The proposed method begins by splitting the given text into sentences and using the BERT model to generate an embedding for each one.
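That first step might be sketched as follows; `bert_embed` is a hypothetical stand-in for a real BERT encoder, and the regex sentence splitter is an assumption, not the paper's actual preprocessing.

```python
import re

# Hypothetical stand-in for a BERT sentence encoder (not the real model):
def bert_embed(sentence, dim=8):
    # Deterministic toy embedding derived from character codes.
    vec = [0.0] * dim
    for i, ch in enumerate(sentence):
        vec[i % dim] += ord(ch) / 100.0
    return vec

def embed_sentences(text, max_sentences=3, dim=8):
    """Split the text into sentences, embed each one, and pad/truncate to a
    fixed number of sentences so the parallel per-sentence networks receive
    a fixed-size input."""
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    sentences = sentences[:max_sentences]
    embeddings = [bert_embed(s, dim) for s in sentences]
    while len(embeddings) < max_sentences:
        embeddings.append([0.0] * dim)   # zero-pad missing sentence slots
    return embeddings
```

Each per-sentence embedding is then fed to its own parallel branch of the network, with a final layer combining the branches for the humor prediction.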

Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books

soskek/homemade_bookcorpus ICCV 2015

Books are a rich source of both fine-grained information, such as how a character, an object, or a scene looks, and high-level semantics, such as what someone is thinking and feeling and how these states evolve through a story.

Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks

facebookresearch/InferSent 15 Aug 2016

The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.

DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding

taoshen58/DiSAN 14 Sep 2017

Recurrent neural networks (RNNs) and convolutional neural networks (CNNs) are widely used in NLP tasks to capture long-term and local dependencies, respectively.

SBERT-WK: A Sentence Embedding Method by Dissecting BERT-based Word Models

BinWang28/SBERT-WK-Sentence-Embedding 16 Feb 2020

Yet, generating a high-quality sentence representation from BERT-based word models remains an open problem.