Greatest papers with code

Top-down Tree Long Short-Term Memory Networks

NAACL 2016 XingxingZhang/td-treelstm

Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks.

DEPENDENCY PARSING SENTENCE COMPLETION
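The "more complex computational unit" is the gated LSTM cell. For reference, here is a minimal NumPy sketch of a single generic LSTM step (this is not the paper's top-down tree-structured variant, and the weight names are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One generic LSTM step. W, U, b stack the input, forget,
    output, and candidate parameters along the first axis."""
    z = W @ x + U @ h_prev + b          # pre-activations, (4*d,)
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c_prev + i * np.tanh(g)     # gated memory-cell update
    h = o * np.tanh(c)                  # exposed hidden state
    return h, c

d, x_dim = 8, 5
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * d, x_dim))
U = rng.normal(size=(4 * d, d))
b = np.zeros(4 * d)
h, c = lstm_step(rng.normal(size=x_dim), np.zeros(d), np.zeros(d), W, U, b)
```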

Recurrent Memory Networks for Language Modeling

NAACL 2016 ketranm/RMN

In this paper, we propose the Recurrent Memory Network (RMN), a novel RNN architecture that not only amplifies the power of RNNs but also facilitates our understanding of their internal functioning and allows us to discover underlying patterns in data.

LANGUAGE MODELLING SENTENCE COMPLETION
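The memory component can be pictured as attention over the embeddings of the most recent words, conditioned on the current RNN state. A simplified sketch of that idea (the projection matrices and the combination step are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_block(h_t, recent_embs, W_m, W_h):
    """Attend over the embeddings of the n most recent words,
    conditioned on the current RNN hidden state h_t (simplified)."""
    keys = recent_embs @ W_m.T     # project the history, (n, d)
    scores = keys @ (W_h @ h_t)    # relevance to current state, (n,)
    p = softmax(scores)            # attention distribution over history
    return p @ recent_embs         # weighted history summary, (d,)

d, n = 8, 4
rng = np.random.default_rng(0)
m = memory_block(rng.normal(size=d), rng.normal(size=(n, d)),
                 rng.normal(size=(d, d)), rng.normal(size=(d, d)))
```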

CODAH: An Adversarially Authored Question-Answer Dataset for Common Sense

8 Apr 2019 Websail-NU/AQuA

To produce a more difficult dataset, we introduce a novel procedure for question acquisition in which workers author questions designed to target weaknesses of state-of-the-art neural question answering systems.

Ranked #1 on Common Sense Reasoning on CODAH (using extra training data)

COMMON SENSE REASONING QUESTION ANSWERING SENTENCE COMPLETION

Learning Semantically and Additively Compositional Distributional Representations

ACL 2016 tianran/vecdcs

This paper connects a vector-based composition model to a formal semantics, the Dependency-based Compositional Semantics (DCS).

CLASSIFICATION RELATION CLASSIFICATION SENTENCE COMPLETION
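The additive-composition idea can be illustrated as summing role-transformed word vectors into a single phrase vector. The roles, matrices, and vectors below are random placeholders, not the paper's learned DCS parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 6
word_vec = {"dog": rng.normal(size=d), "barks": rng.normal(size=d)}
# One matrix per dependency role (illustrative roles, random weights).
role_mat = {"SUBJ": rng.normal(size=(d, d)), "HEAD": np.eye(d)}

def compose(pairs):
    """Additive composition: transform each word vector by its
    role-specific matrix, then sum the results into a phrase vector."""
    return sum(role_mat[r] @ word_vec[w] for w, r in pairs)

phrase = compose([("dog", "SUBJ"), ("barks", "HEAD")])
```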

GePpeTto Carves Italian into a Language Model

29 Apr 2020 LoreDema/GePpeTto

We provide a thorough analysis of GePpeTto's quality by means of both an automatic and a human-based evaluation.

LANGUAGE MODELLING SENTENCE COMPLETION
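On the automatic side, such an evaluation typically reports perplexity under the model. A sketch using Hugging Face transformers, where the hub id is an assumption inferred from the repository name above (verify before use):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hub id assumed from the repository name above; adjust if it differs.
MODEL_ID = "LoreDema/GePpeTto"

tok = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
model.eval()

text = "Nel mezzo del cammin di nostra vita"
ids = tok(text, return_tensors="pt").input_ids
with torch.no_grad():
    # Causal-LM loss is the mean negative log-likelihood per token,
    # so exponentiating it gives perplexity.
    loss = model(ids, labels=ids).loss
print("perplexity:", torch.exp(loss).item())
```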

A Deep Architecture for Semantic Matching with Multiple Positional Sentence Representations

26 Nov 2015 jastfkjg/semantic-matching

Our model has several advantages: (1) by using a Bi-LSTM, rich context from the whole sentence is leveraged to capture contextualized local information in each positional sentence representation; (2) by matching with multiple positional sentence representations, the model can flexibly aggregate the important contextualized local information in a sentence to support the matching; (3) experiments on tasks such as question answering and sentence completion demonstrate the superiority of our model.

INFORMATION RETRIEVAL QUESTION ANSWERING SENTENCE COMPLETION
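A condensed PyTorch sketch of this pipeline: Bi-LSTM positional representations for both sentences, a pairwise interaction matrix, k-max pooling, and an MLP scorer. The dimensions, the dot-product interaction, and the pooling granularity are simplifications of the paper's model, not its exact architecture:

```python
import torch
import torch.nn as nn

class PositionalMatcher(nn.Module):
    """Bi-LSTM positional representations -> pairwise similarity
    matrix -> k-max pooling -> MLP matching score (simplified)."""
    def __init__(self, emb_dim=50, hidden=64, k=5):
        super().__init__()
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
        self.k = k
        self.mlp = nn.Sequential(nn.Linear(k, 32), nn.ReLU(),
                                 nn.Linear(32, 1))

    def forward(self, s1, s2):
        h1, _ = self.encoder(s1)                 # (B, L1, 2H)
        h2, _ = self.encoder(s2)                 # (B, L2, 2H)
        sim = torch.bmm(h1, h2.transpose(1, 2))  # positional interactions
        topk, _ = sim.flatten(1).topk(self.k, dim=1)  # k strongest signals
        return self.mlp(topk).squeeze(-1)        # matching score per pair

m = PositionalMatcher()
score = m(torch.randn(2, 7, 50), torch.randn(2, 9, 50))
```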