Search Results for author: Luyu Gao

Found 11 papers, 6 papers with code

Long Document Re-ranking with Modular Re-ranker

no code implementations • 9 May 2022 • Luyu Gao, Jamie Callan

In this paper, we propose instead to model full query-to-document interaction, leveraging the attention operation and a modular Transformer re-ranker framework.

Document Ranking • Re-Ranking
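
A minimal sketch of the kind of full query-to-document attention scoring the abstract describes, not the authors' implementation: every query token attends over all document tokens, and the pooled match signal becomes the relevance score. The embedding dimension, random vectors, and mean pooling are illustrative assumptions.

```python
import numpy as np

def attention_score(query_vecs: np.ndarray, doc_vecs: np.ndarray) -> float:
    """query_vecs: (n_q, d) query token embeddings; doc_vecs: (n_d, d)."""
    d = query_vecs.shape[-1]
    # Scaled dot-product attention: each query token attends to all doc tokens.
    logits = query_vecs @ doc_vecs.T / np.sqrt(d)            # (n_q, n_d)
    weights = np.exp(logits - logits.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)            # softmax over doc tokens
    attended = weights @ doc_vecs                            # (n_q, d) per-query-token doc summary
    # Pool per-token match signals into one relevance score (a simple, assumed choice).
    return float((query_vecs * attended).sum(axis=1).mean())

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 64))      # 4 query tokens, 64-dim toy embeddings
doc = rng.normal(size=(200, 64))  # a "long" document of 200 tokens
print(attention_score(q, doc))
```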

Tevatron: An Efficient and Flexible Toolkit for Dense Retrieval

1 code implementation • 11 Mar 2022 • Luyu Gao, Xueguang Ma, Jimmy Lin, Jamie Callan

In this paper, we present Tevatron, a dense retrieval toolkit optimized for efficiency, flexibility, and code simplicity.
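
The following is not Tevatron's API but a generic sketch of the dense-retrieval workflow such a toolkit packages: encode queries and passages into vectors, then rank passages by inner product. Random embeddings stand in for a trained encoder.

```python
import numpy as np

def retrieve(query_emb: np.ndarray, passage_embs: np.ndarray, k: int = 3):
    """Return indices of the top-k passages by dot-product similarity."""
    scores = passage_embs @ query_emb          # (n_passages,)
    topk = np.argsort(-scores)[:k]
    return [(int(i), float(scores[i])) for i in topk]

rng = np.random.default_rng(1)
corpus = rng.normal(size=(1000, 128))          # 1000 passage embeddings
query = rng.normal(size=128)
print(retrieve(query, corpus))
```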

Condenser: a Pre-training Architecture for Dense Retrieval

1 code implementation • EMNLP 2021 • Luyu Gao, Jamie Callan

Pre-trained Transformer language models (LM) have become go-to text representation encoders.

Language Modelling
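
As a hedged illustration of a Transformer LM used as a text representation encoder (the setting Condenser pre-trains for), the snippet below does CLS pooling with Hugging Face transformers; `bert-base-uncased` is a stand-in checkpoint, and a Condenser model would be loaded the same way.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def encode(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden)
    return hidden[0, 0]                             # CLS token as the text vector

q = encode("what is dense retrieval")
p = encode("Dense retrieval encodes text into vectors.")
print(torch.dot(q, p).item())
```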

COIL: Revisit Exact Lexical Match in Information Retrieval with Contextualized Inverted List

1 code implementation • NAACL 2021 • Luyu Gao, Zhuyun Dai, Jamie Callan

Classical information retrieval systems such as BM25 rely on exact lexical match and carry out search efficiently with an inverted list index.

Information Retrieval
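
A toy simplification of COIL's central idea (not the released code): an inverted index keyed by exact tokens whose postings store contextualized vectors, so candidate matching stays lexical while scoring is semantic. Random vectors stand in for a real encoder's token representations.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(2)
dim = 8
docs = {0: "cheap flight to tokyo", 1: "tokyo travel guide", 2: "cheap hotels"}

# Build: one posting (doc_id, token_vector) per token occurrence.
index = defaultdict(list)
for doc_id, text in docs.items():
    for tok in text.split():
        index[tok].append((doc_id, rng.normal(size=dim)))  # stand-in contextual vector

def score(query: str) -> dict:
    scores = defaultdict(float)
    for tok in query.split():
        q_vec = rng.normal(size=dim)        # stand-in query token vector
        best = {}                           # max similarity per doc for this exact token
        for doc_id, d_vec in index.get(tok, []):
            sim = float(q_vec @ d_vec)
            best[doc_id] = max(best.get(doc_id, sim), sim)
        for doc_id, sim in best.items():
            scores[doc_id] += sim           # sum token-level matches over query tokens
    return dict(scores)

print(score("cheap tokyo"))
```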

Rethink Training of BERT Rerankers in Multi-Stage Retrieval Pipeline

1 code implementation • 21 Jan 2021 • Luyu Gao, Zhuyun Dai, Jamie Callan

Pre-trained deep language models (LM) have advanced the state of the art in text retrieval.
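
A schematic of the multi-stage pipeline the paper studies (its training recipe is not shown here): a cheap first-stage retriever narrows the corpus, then a costlier reranker, standing in for a BERT cross-encoder, reorders the top candidates. Both scoring functions below are toy assumptions.

```python
def first_stage(query, corpus, k=10):
    # Stand-in retriever: count of query terms appearing in the document.
    scored = [(sum(t in doc for t in query.split()), i) for i, doc in enumerate(corpus)]
    return [i for s, i in sorted(scored, reverse=True)[:k]]

def rerank(query, corpus, candidates):
    # Stand-in for a BERT cross-encoder: a toy length-penalized overlap score.
    def deep_score(doc):
        return sum(t in doc for t in query.split()) / (1 + len(doc.split()))
    return sorted(candidates, key=lambda i: deep_score(corpus[i]), reverse=True)

corpus = ["bert rerankers for retrieval", "cats and dogs", "training bert rerankers"]
cands = first_stage("training bert", corpus, k=2)
print(rerank("training bert", corpus, cands))
```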

Improving Target-side Lexical Transfer in Multilingual Neural Machine Translation

no code implementations • Findings of the Association for Computational Linguistics 2020 • Luyu Gao, Xinyi Wang, Graham Neubig

To improve the performance of Neural Machine Translation (NMT) for low-resource languages (LRL), one effective strategy is to leverage parallel data from a related high-resource language (HRL).

Machine Translation • Translation
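
As a loose illustration of that transfer setup, and not the paper's proposed method: concatenate related high-resource (HRL) and low-resource (LRL) parallel data, naively upsampling the LRL side, so a single model trains on both. The data and balancing ratio are toy assumptions.

```python
import random

hrl_pairs = [("hola mundo", "hello world")] * 1000  # HRL, e.g. Spanish-English
lrl_pairs = [("ola mundo", "hello world")] * 50     # LRL, e.g. Galician-English

upsample = len(hrl_pairs) // max(len(lrl_pairs), 1)  # naive balancing ratio
mixed = hrl_pairs + lrl_pairs * upsample             # one shared training stream
random.shuffle(mixed)
print(len(mixed), mixed[0])
```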

Understanding BERT Rankers Under Distillation

no code implementations • 21 Jul 2020 • Luyu Gao, Zhuyun Dai, Jamie Callan

Deep language models such as BERT, pre-trained on large corpora, have given a huge performance boost to state-of-the-art information retrieval ranking systems.

Information Retrieval
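
A generic knowledge-distillation sketch rather than the paper's exact recipe: a student ranker is trained to match a teacher's relevance distribution over candidate documents with a temperature-softened KL loss. The scores and temperature below are illustrative.

```python
import torch
import torch.nn.functional as F

teacher_scores = torch.tensor([3.2, 1.1, -0.5, 0.7])  # teacher logits over 4 candidates
student_scores = torch.tensor([2.0, 1.5, 0.0, 0.2], requires_grad=True)

T = 2.0  # temperature softens both distributions (a common, assumed choice)
loss = F.kl_div(
    F.log_softmax(student_scores / T, dim=-1),  # student log-probabilities
    F.softmax(teacher_scores / T, dim=-1),      # teacher probabilities
    reduction="batchmean",
) * T * T
loss.backward()
print(loss.item(), student_scores.grad)
```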

Complementing Lexical Retrieval with Semantic Residual Embedding

no code implementations • 29 Apr 2020 • Luyu Gao, Zhuyun Dai, Tongfei Chen, Zhen Fan, Benjamin Van Durme, Jamie Callan

This paper presents CLEAR, a retrieval model that seeks to complement classical lexical exact-match models such as BM25 with semantic matching signals from a neural embedding matching model.

Information Retrieval
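
A minimal hybrid-scoring sketch in the spirit of CLEAR, simplified from the abstract rather than taken from the paper: a lexical exact-match score such as BM25 is interpolated with a neural embedding similarity, so the dense model supplies the signal lexical match misses. The interpolation weight `lam` is an assumed parameter.

```python
import numpy as np

def hybrid_score(bm25_score: float, q_emb: np.ndarray, d_emb: np.ndarray,
                 lam: float = 0.5) -> float:
    """lam weights the semantic component; 0.5 is an assumed default."""
    semantic = float(q_emb @ d_emb)       # embedding-match signal
    return bm25_score + lam * semantic    # complement the lexical score

rng = np.random.default_rng(3)
print(hybrid_score(12.4, rng.normal(size=64), rng.normal(size=64)))
```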

Modularized Transformer-based Ranking Framework

no code implementations • EMNLP 2020 • Luyu Gao, Zhuyun Dai, Jamie Callan

Recent innovations in Transformer-based ranking models have advanced the state of the art in information retrieval.

Information Retrieval
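
A schematic of the modular decomposition the title points to, not the authors' code: document representations are computed once, offline, by a representation module, and only a light interaction step runs per query over the cached vectors instead of a full joint query-document Transformer pass. Both functions below are toy stand-ins.

```python
import numpy as np

rng = np.random.default_rng(4)

def represent(text: str) -> np.ndarray:
    # Stand-in representation module; a real system would run a Transformer.
    return rng.normal(size=(len(text.split()), 32))

def interact(q_rep: np.ndarray, d_rep: np.ndarray) -> float:
    # Light interaction module: best match per query token, summed.
    return float((q_rep @ d_rep.T).max(axis=1).sum())

doc_cache = {i: represent(d) for i, d in enumerate(
    ["transformer ranking models", "inverted index search"])}  # built offline
q_rep = represent("transformer ranking")                       # computed online
print(sorted(doc_cache, key=lambda i: interact(q_rep, doc_cache[i]), reverse=True))
```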
