1 code implementation • 24 Apr 2022 • Antonio Mallia, Joel Mackenzie, Torsten Suel, Nicola Tonellotto
Neural information retrieval architectures based on transformers such as BERT are able to significantly improve system effectiveness over traditional sparse models such as BM25.
1 code implementation • 24 Apr 2021 • Antonio Mallia, Omar Khattab, Nicola Tonellotto, Torsten Suel
Neural information retrieval systems typically use a cascading pipeline, in which a first-stage model retrieves a candidate set of documents and one or more subsequent stages re-rank this set using contextualized language models such as BERT.
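The cascading pipeline the abstract describes can be sketched in miniature: a cheap sparse scorer (BM25 here) selects a small candidate set, and a second, more expensive stage reorders only those candidates. This is an illustrative toy, not the paper's implementation; the corpus, the `rerank` stand-in (simple term overlap instead of a BERT cross-encoder), and all function names are assumptions for demonstration.

```python
import math
from collections import Counter

# Toy corpus; contents are purely illustrative.
DOCS = [
    "neural information retrieval with transformers",
    "bm25 is a classic sparse ranking model",
    "cascading pipelines rerank candidates with bert",
]

def bm25_score(query, doc, corpus, k1=1.2, b=0.75):
    """Minimal BM25 score of one document for a whitespace-tokenized query."""
    doc_terms = doc.split()
    tf = Counter(doc_terms)
    avgdl = sum(len(d.split()) for d in corpus) / len(corpus)
    score = 0.0
    for term in query.split():
        df = sum(1 for d in corpus if term in d.split())
        if df == 0:
            continue
        idf = math.log(1 + (len(corpus) - df + 0.5) / (df + 0.5))
        denom = tf[term] + k1 * (1 - b + b * len(doc_terms) / avgdl)
        score += idf * tf[term] * (k1 + 1) / denom
    return score

def rerank(query, candidates):
    """Stand-in for a contextualized re-ranker (e.g. BERT): here just
    query-term overlap, so the cascade's structure stays visible."""
    qset = set(query.split())
    return sorted(candidates, key=lambda d: len(qset & set(d.split())),
                  reverse=True)

def cascade(query, corpus, k=2):
    # Stage 1: cheap sparse retrieval narrows the corpus to k candidates.
    candidates = sorted(corpus,
                        key=lambda d: bm25_score(query, d, corpus),
                        reverse=True)[:k]
    # Stage 2: the expensive model scores only this small candidate set.
    return rerank(query, candidates)

print(cascade("bm25 sparse retrieval", DOCS))
```

The design point the papers build on is visible even in this sketch: the second stage's cost is paid only for `k` documents, so its per-document expense can be far higher than the first stage's.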