About

Benchmarks

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Datasets

Greatest papers with code

ColBERT: Using BERT Sentence Embedding for Humor Detection

27 Apr 2020 huggingface/transformers

In this paper, we propose a novel approach for detecting humor in short texts based on the general linguistic structure of humor.

HUMOR DETECTION, SENTENCE EMBEDDING

A Structured Self-attentive Sentence Embedding

9 Mar 2017 facebookresearch/pytext

This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention.

NATURAL LANGUAGE INFERENCE, SENTENCE EMBEDDING, SENTIMENT ANALYSIS
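
The entry above describes extracting a sentence embedding with self-attention over the encoder's hidden states. Below is a minimal PyTorch sketch of that idea, in which an attention matrix is computed from the hidden states and used to pool them into a fixed-size sentence matrix; the layer sizes and names (hidden_dim, attn_dim, num_hops) are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class StructuredSelfAttention(nn.Module):
    """Pool encoder hidden states into a multi-hop sentence embedding via self-attention."""

    def __init__(self, hidden_dim=256, attn_dim=64, num_hops=4):
        super().__init__()
        self.w_s1 = nn.Linear(hidden_dim, attn_dim, bias=False)
        self.w_s2 = nn.Linear(attn_dim, num_hops, bias=False)

    def forward(self, hidden_states):
        # hidden_states: (batch, seq_len, hidden_dim), e.g. BiLSTM outputs
        scores = self.w_s2(torch.tanh(self.w_s1(hidden_states)))  # (batch, seq_len, num_hops)
        attention = torch.softmax(scores, dim=1)                  # normalize over tokens
        # Sentence embedding: num_hops different weighted sums of the hidden states
        return attention.transpose(1, 2) @ hidden_states          # (batch, num_hops, hidden_dim)

# Example: embed a batch of 2 "sentences" of 10 tokens each
embedder = StructuredSelfAttention()
sentence_matrix = embedder(torch.randn(2, 10, 256))
print(sentence_matrix.shape)  # torch.Size([2, 4, 256])
```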

Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation

EMNLP 2020 UKPLab/sentence-transformers

The training is based on the idea that a translated sentence should be mapped to the same location in the vector space as the original sentence.

KNOWLEDGE DISTILLATION SENTENCE EMBEDDING
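
The entry above states the training idea: a translated sentence should land at the same point in vector space as the original. Below is a minimal PyTorch sketch of that objective, where a frozen monolingual teacher embeds the source sentence and a multilingual student is pulled toward the teacher's vector for both the source and its translation. `teacher_encode` and `student` are hypothetical stand-ins for real sentence encoders (e.g. models from UKPLab/sentence-transformers).

```python
import torch
import torch.nn.functional as F

def distillation_loss(teacher_encode, student, src_sentences, tgt_sentences):
    with torch.no_grad():
        target = teacher_encode(src_sentences)   # (batch, dim); the teacher is frozen
    src_emb = student(src_sentences)             # student on the original sentences
    tgt_emb = student(tgt_sentences)             # student on the translations
    # Pull both the source sentence and its translation toward the teacher's location
    return F.mse_loss(src_emb, target) + F.mse_loss(tgt_emb, target)
```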

Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks

15 Aug 2016 facebookresearch/InferSent

The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.

SENTENCE EMBEDDING
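
The entry above analyzes embeddings through auxiliary (probing) prediction tasks. Below is a minimal scikit-learn sketch of that methodology: freeze a sentence encoder, embed sentences, and train a simple classifier to predict a surface property (here, a sentence-length bin) from the embedding alone; probe accuracy indicates how much of that property the embedding encodes. `encode_sentences` is a hypothetical stand-in for any fixed sentence encoder, and the length bins are arbitrary.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def probe_sentence_length(sentences, encode_sentences):
    X = encode_sentences(sentences)                        # (n_sentences, dim) fixed embeddings
    y = np.digitize([len(s.split()) for s in sentences],   # sentence-length bins as labels
                    bins=[5, 10, 15, 20])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return probe.score(X_te, y_te)                         # probing accuracy on held-out data
```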

Evaluation of sentence embeddings in downstream and linguistic probing tasks

16 Jun 2018 allenai/bilm-tf

Despite the fast developmental pace of new sentence embedding methods, it is still challenging to find comprehensive evaluations of these different techniques.

LANGUAGE MODELLING, SENTENCE EMBEDDING, WORD EMBEDDINGS

On the Sentence Embeddings from Pre-trained Language Models

EMNLP 2020 InsaneLife/dssm

Pre-trained contextual representations like BERT have achieved great success in natural language processing.

LANGUAGE MODELLING, SEMANTIC SIMILARITY, SEMANTIC TEXTUAL SIMILARITY, SENTENCE EMBEDDING

Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books

ICCV 2015 soskek/homemade_bookcorpus

Books are a rich source of both fine-grained information (how a character, an object, or a scene looks) and high-level semantics (what someone is thinking or feeling, and how these states evolve through a story).

SENTENCE EMBEDDING

DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding

14 Sep 2017 taoshen58/DiSAN

Recurrent neural nets (RNNs) and convolutional neural nets (CNNs) are widely used in NLP tasks to capture long-term and local dependencies, respectively.

NATURAL LANGUAGE INFERENCE, SENTENCE EMBEDDING

Hierarchical Attention: What Really Counts in Various NLP Tasks

10 Aug 2018 Disiok/poetry-seq2seq

Ham achieves a state-of-the-art BLEU score of 0.26 on the Chinese poem generation task and a nearly 6.5% averaged improvement over existing machine reading comprehension models such as BIDAF and Match-LSTM.

MACHINE READING COMPREHENSION, MACHINE TRANSLATION, SENTENCE EMBEDDING, TEXT GENERATION