Document Ranking

42 papers with code • 2 benchmarks • 6 datasets

Sort documents according to some criterion so that the "best" results appear early in the result list displayed to the user (Source: Wikipedia).



Most implemented papers

XLNet: Generalized Autoregressive Pretraining for Language Understanding

zihangdai/xlnet NeurIPS 2019

With the capability of modeling bidirectional contexts, denoising autoencoding based pretraining like BERT achieves better performance than pretraining approaches based on autoregressive language modeling.

CEDR: Contextualized Embeddings for Document Ranking

Georgetown-IR-Lab/cedr 15 Apr 2019

We call this joint approach CEDR (Contextualized Embeddings for Document Ranking).

Neural Vector Spaces for Unsupervised Information Retrieval

cvangysel/cuNVSM 9 Aug 2017

We propose the Neural Vector Space Model (NVSM), a method that learns representations of documents in an unsupervised manner for news article retrieval.

Context Attentive Document Ranking and Query Suggestion

wasiahmad/context_attentive_ir 5 Jun 2019

We present a context-aware neural ranking model to exploit users' on-task search activities and enhance retrieval performance.

ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT

stanford-futuredata/ColBERT 27 Apr 2020

ColBERT introduces a late interaction architecture that independently encodes the query and the document using BERT and then employs a cheap yet powerful interaction step that models their fine-grained similarity.
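The late interaction step described above can be sketched as a "MaxSim" operation: each query token embedding is matched against its most similar document token embedding, and the per-token maxima are summed. A minimal sketch with NumPy, assuming both sets of embeddings are already produced by BERT encoders and L2-normalized (function and argument names here are illustrative, not ColBERT's actual API):

```python
import numpy as np

def late_interaction_score(query_embs, doc_embs):
    """ColBERT-style late interaction (MaxSim) score for one query-document pair.

    query_embs: (num_query_tokens, dim) contextualized query token embeddings
    doc_embs:   (num_doc_tokens, dim) contextualized document token embeddings
    Both are assumed L2-normalized, so dot products are cosine similarities.
    """
    # Similarity of every query token to every document token.
    sim = query_embs @ doc_embs.T          # shape: (num_query_tokens, num_doc_tokens)
    # Each query token keeps only its best-matching document token...
    max_sim = sim.max(axis=1)              # shape: (num_query_tokens,)
    # ...and the final relevance score sums these maxima over query tokens.
    return float(max_sim.sum())
```

Because the query and document are encoded independently, document embeddings can be precomputed and indexed offline; only this cheap interaction runs at query time.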

Learning deep structured semantic models for web search using clickthrough data

PaddlePaddle/PaddleRec CIKM 2013

The proposed deep structured semantic models are discriminatively trained by maximizing the conditional likelihood of the clicked documents given a query using the clickthrough data.
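The training objective above can be illustrated with a small sketch: the posterior of a document given a query is a softmax over smoothed cosine similarities between their semantic vectors, and training minimizes the negative log-likelihood of the clicked document. This is a simplified illustration, not the paper's implementation; `gamma` stands in for the paper's softmax smoothing factor, and the vectors would come from the learned deep networks:

```python
import numpy as np

def dssm_loss(query_vec, doc_vecs, clicked_idx, gamma=10.0):
    """Negative log-likelihood of the clicked document, DSSM-style.

    query_vec:   (dim,) semantic vector of the query
    doc_vecs:    (num_docs, dim) candidate documents -- the clicked one
                 plus randomly sampled unclicked ones
    clicked_idx: index of the clicked document in doc_vecs
    gamma:       smoothing factor applied before the softmax
    """
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    sims = np.array([cosine(query_vec, d) for d in doc_vecs])
    # P(d | q) = softmax over smoothed cosine similarities.
    logits = gamma * sims
    log_probs = logits - np.log(np.sum(np.exp(logits)))
    # Training maximizes the clicked document's conditional likelihood,
    # i.e. minimizes this negative log-likelihood.
    return float(-log_probs[clicked_idx])
```

Minimizing this loss pushes the query vector toward clicked documents and away from the sampled unclicked ones.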

Multi-Stage Document Ranking with BERT

castorini/docTTTTTquery 31 Oct 2019

The advent of deep neural networks pre-trained via language modeling tasks has spurred a number of successful applications in natural language processing.

Document Ranking with a Pretrained Sequence-to-Sequence Model

castorini/pygaggle Findings of the Association for Computational Linguistics 2020

We investigate this observation further by varying target words to probe the model's use of latent knowledge.

Traditional IR rivals neural models on the MS MARCO Document Ranking Leaderboard

oaqa/FlexNeuART 15 Dec 2020

This short document describes a traditional IR system that achieved MRR@100 equal to 0.298 on the MS MARCO Document Ranking leaderboard (on 2020-12-06).
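For reference, MRR@100 (mean reciprocal rank at cutoff 100), the leaderboard metric mentioned above, averages the reciprocal rank of the first relevant document per query, counting zero when no relevant document appears within the cutoff. A minimal sketch (the function name and input format are illustrative, not the official evaluation script):

```python
def mean_reciprocal_rank(ranked_relevance, cutoff=100):
    """MRR@cutoff over a set of queries.

    ranked_relevance: list of per-query binary relevance lists,
    ordered by the system's ranking (1 = relevant, 0 = not).
    """
    total = 0.0
    for rels in ranked_relevance:
        # Reciprocal rank of the first relevant result within the cutoff;
        # contributes 0 if none is found.
        for rank, rel in enumerate(rels[:cutoff], start=1):
            if rel:
                total += 1.0 / rank
                break
    return total / len(ranked_relevance)
```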

Exploring Classic and Neural Lexical Translation Models for Information Retrieval: Interpretability, Effectiveness, and Efficiency Benefits

oaqa/FlexNeuART 12 Feb 2021

We study the utility of the lexical translation model (IBM Model 1) for English text retrieval, in particular, its neural variants that are trained end-to-end.