Search Results for author: Aleksandra Piktus

Found 10 papers, 7 with code

The Web Is Your Oyster -- Knowledge-Intensive NLP against a Very Large Web Corpus

2 code implementations • 18 Dec 2021 • Aleksandra Piktus, Fabio Petroni, Vladimir Karpukhin, Dmytro Okhonko, Samuel Broscheit, Gautier Izacard, Patrick Lewis, Barlas Oğuz, Edouard Grave, Wen-tau Yih, Sebastian Riedel

In order to address increasing demands of real-world applications, the research for knowledge-intensive NLP (KI-NLP) should advance by capturing the challenges of a truly open-domain environment: web-scale knowledge, lack of structure, inconsistent quality and noise.

Common Sense Reasoning

Domain-matched Pre-training Tasks for Dense Retrieval

1 code implementation • 28 Jul 2021 • Barlas Oğuz, Kushal Lakhotia, Anchit Gupta, Patrick Lewis, Vladimir Karpukhin, Aleksandra Piktus, Xilun Chen, Sebastian Riedel, Wen-tau Yih, Sonal Gupta, Yashar Mehdad

Pre-training on larger datasets with ever increasing model size is now a proven recipe for increased performance across almost all NLP tasks.

Ranked #1 on Passage Retrieval on Natural Questions (using extra training data)

Passage Retrieval
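For readers unfamiliar with the setting this paper targets, the sketch below illustrates the standard bi-encoder dense retrieval setup that such pre-training work builds on, using the public DPR checkpoints from Hugging Face transformers. It is a minimal illustration with a hypothetical two-passage toy corpus, not the domain-matched pre-training procedure the paper itself proposes.

```python
import torch
from transformers import (
    DPRContextEncoder,
    DPRContextEncoderTokenizer,
    DPRQuestionEncoder,
    DPRQuestionEncoderTokenizer,
)

# Hypothetical toy corpus; a real system would index millions of passages.
passages = [
    "Dense retrieval encodes queries and passages into one vector space.",
    "The Eiffel Tower is located in Paris, France.",
]

q_tok = DPRQuestionEncoderTokenizer.from_pretrained(
    "facebook/dpr-question_encoder-single-nq-base")
q_enc = DPRQuestionEncoder.from_pretrained(
    "facebook/dpr-question_encoder-single-nq-base")
c_tok = DPRContextEncoderTokenizer.from_pretrained(
    "facebook/dpr-ctx_encoder-single-nq-base")
c_enc = DPRContextEncoder.from_pretrained(
    "facebook/dpr-ctx_encoder-single-nq-base")

with torch.no_grad():
    # Separate encoders embed the question and the passages.
    q_emb = q_enc(**q_tok("Where is the Eiffel Tower?",
                          return_tensors="pt")).pooler_output
    p_emb = c_enc(**c_tok(passages, return_tensors="pt",
                          padding=True, truncation=True)).pooler_output

# Relevance is the dot product between question and passage embeddings.
scores = (q_emb @ p_emb.T).squeeze(0)
print(passages[scores.argmax().item()])
```

In practice the passage embeddings are precomputed and stored in an approximate nearest-neighbor index, so only the question is encoded at query time.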

Generating Fact Checking Briefs

no code implementations • EMNLP 2020 • Angela Fan, Aleksandra Piktus, Fabio Petroni, Guillaume Wenzek, Marzieh Saeidi, Andreas Vlachos, Antoine Bordes, Sebastian Riedel

Fact checking at scale is difficult -- while the number of active fact checking websites is growing, it remains too small for the needs of the contemporary media ecosystem.

Fact Checking • Question Answering

How Context Affects Language Models' Factual Predictions

no code implementations • AKBC 2020 • Fabio Petroni, Patrick Lewis, Aleksandra Piktus, Tim Rocktäschel, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel

When pre-trained on large unsupervised textual corpora, language models are able to store and retrieve factual knowledge to some extent, making it possible to use them directly for zero-shot cloze-style question answering.

Information Retrieval • Language Modelling +2
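The zero-shot cloze-style probing mentioned in the abstract can be sketched in a few lines with a masked language model. The example below is a minimal illustration in the spirit of LAMA-style probing, assuming the standard bert-base-uncased checkpoint and a hypothetical context sentence; the paper's actual experiments compare several kinds of retrieved and adversarial context.

```python
from transformers import pipeline

# Masked-LM cloze probe: the model fills [MASK] with its top prediction.
fill = pipeline("fill-mask", model="bert-base-uncased")

cloze = "Dante was born in [MASK]."
# Hypothetical retrieved context prepended to the query, mimicking the
# paper's question of how context shifts factual predictions.
context = "Dante Alighieri was born in Florence, in what is now Italy."

for prompt in (cloze, context + " " + cloze):
    top = fill(prompt, top_k=1)[0]
    print(f"{prompt!r} -> {top['token_str']} (p={top['score']:.3f})")
```

Running the probe with and without the context sentence shows how strongly the prediction depends on what the model is given at inference time.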
