Search Results for author: Lucas Hosseini

Found 8 papers, 5 papers with code

The Faiss library

1 code implementation • 16 Jan 2024 • Matthijs Douze, Alexandr Guzhva, Chengqi Deng, Jeff Johnson, Gergely Szilvasy, Pierre-Emmanuel Mazaré, Maria Lomeli, Lucas Hosseini, Hervé Jégou

The Faiss library is dedicated to vector similarity search, a core functionality of vector databases.
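A minimal sketch of the core Faiss workflow (build an index, add vectors, search); the dimensionality and random data below are illustrative, and `IndexFlatL2` is just the exact-search baseline among the many index types the library offers.

```python
import numpy as np
import faiss  # pip install faiss-cpu

d = 64                                                # vector dimensionality (illustrative)
xb = np.random.random((10_000, d)).astype("float32")  # database vectors
xq = np.random.random((5, d)).astype("float32")       # query vectors

index = faiss.IndexFlatL2(d)   # exact L2 nearest-neighbor search, no compression
index.add(xb)                  # store the database vectors
D, I = index.search(xq, 4)     # distances and ids of the 4 nearest neighbors per query
print(I)
```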

Unsupervised Dense Information Retrieval with Contrastive Learning

6 code implementations • 16 Dec 2021 • Gautier Izacard, Mathilde Caron, Lucas Hosseini, Sebastian Riedel, Piotr Bojanowski, Armand Joulin, Edouard Grave

In this work, we explore the limits of contrastive learning as a way to train unsupervised dense retrievers and show that it leads to strong performance in various retrieval settings.

Contrastive Learning • Cross-Lingual Transfer • +4
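The contrastive training this entry refers to is commonly implemented as an InfoNCE loss over query/positive-passage pairs with in-batch negatives; the sketch below follows that generic recipe, with the temperature and embedding sizes as illustrative stand-ins rather than the paper's exact configuration.

```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(q_emb: torch.Tensor, p_emb: torch.Tensor,
                              temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE with in-batch negatives: row i of p_emb is the positive
    passage for query i; every other row serves as a negative."""
    q = F.normalize(q_emb, dim=-1)
    p = F.normalize(p_emb, dim=-1)
    logits = q @ p.T / temperature                     # (B, B) cosine similarities
    labels = torch.arange(q.size(0), device=q.device)  # diagonal entries are positives
    return F.cross_entropy(logits, labels)

# toy usage with random stand-in embeddings
loss = in_batch_contrastive_loss(torch.randn(8, 128), torch.randn(8, 128))
```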

A Memory Efficient Baseline for Open Domain Question Answering

1 code implementation • 30 Dec 2020 • Gautier Izacard, Fabio Petroni, Lucas Hosseini, Nicola De Cao, Sebastian Riedel, Edouard Grave

Recently, retrieval systems based on dense representations have led to important improvements in open-domain question answering and related tasks.

Dimensionality Reduction • Open-Domain Question Answering • +3
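One standard way to make a dense retrieval index memory-efficient, in line with the dimensionality-reduction tag above, is to chain a PCA projection with product quantization; the Faiss sketch below shows that generic recipe, with all sizes illustrative rather than taken from the paper.

```python
import numpy as np
import faiss

d = 768                                               # typical dense-retriever dimension
xb = np.random.random((10_000, d)).astype("float32")  # passage embeddings (stand-ins)

# PCA down to 128 dims, then product quantization with 16 sub-vectors of 8 bits:
# each vector is stored in 16 bytes instead of 768 * 4 = 3072 bytes.
index = faiss.index_factory(d, "PCA128,PQ16")
index.train(xb)                 # learns the PCA rotation and the PQ codebooks
index.add(xb)
D, I = index.search(xb[:5], 4)  # approximate nearest neighbors on compressed codes
```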

Atlas: Few-shot Learning with Retrieval Augmented Language Models

1 code implementation • 5 Aug 2022 • Gautier Izacard, Patrick Lewis, Maria Lomeli, Lucas Hosseini, Fabio Petroni, Timo Schick, Jane Dwivedi-Yu, Armand Joulin, Sebastian Riedel, Edouard Grave

Retrieval-augmented models are known to excel at knowledge-intensive tasks without needing as many parameters, but it is unclear whether they work in few-shot settings.

Fact Checking • Few-Shot Learning • +6
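The retrieve-then-read pattern behind retrieval-augmented models can be reduced to a small loop: fetch evidence passages for a query, then condition a generator on them. The sketch below is a generic stand-in, not Atlas's actual retriever/reader architecture, and the prompt format is invented for illustration.

```python
def retrieval_augmented_answer(query, retriever, generator, k=5):
    """Generic retrieve-then-read: the retriever supplies k evidence
    passages and the generator conditions on them to answer."""
    passages = retriever(query, k)
    prompt = "\n".join(f"context: {p}" for p in passages)
    prompt += f"\nquestion: {query}\nanswer:"
    return generator(prompt)

# toy stand-ins so the sketch runs end to end
retriever = lambda q, k: [f"passage {i} mentioning '{q}'" for i in range(k)]
generator = lambda prompt: "<model output conditioned on the prompt>"
print(retrieval_augmented_answer("what is the Faiss library for?", retriever, generator))
```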

Improving Wikipedia Verifiability with AI

1 code implementation • 8 Jul 2022 • Fabio Petroni, Samuel Broscheit, Aleksandra Piktus, Patrick Lewis, Gautier Izacard, Lucas Hosseini, Jane Dwivedi-Yu, Maria Lomeli, Timo Schick, Pierre-Emmanuel Mazaré, Armand Joulin, Edouard Grave, Sebastian Riedel

Hence, maintaining and improving the quality of Wikipedia references is an important challenge, and there is a pressing need for better tools to assist humans in this effort.

Citation Recommendation • Fact Checking

Contrastive Pre-training for Zero-Shot Information Retrieval

no code implementations • 29 Sep 2021 • Gautier Izacard, Mathilde Caron, Lucas Hosseini, Sebastian Riedel, Piotr Bojanowski, Armand Joulin, Edouard Grave

By contrast, in many other NLP tasks, conventional self-supervised pre-training based on masking leads to strong generalization with a small number of training examples.

Contrastive Learning • Fact Checking • +3
