Search Results for author: Moshe Berchansky

Found 3 papers, 2 papers with code

Optimizing Retrieval-augmented Reader Models via Token Elimination

1 code implementation · 20 Oct 2023 · Moshe Berchansky, Peter Izsak, Avi Caciularu, Ido Dagan, Moshe Wasserblat

Fusion-in-Decoder (FiD) is an effective retrieval-augmented language model applied across a variety of open-domain tasks, such as question answering and fact checking.

Answer Generation · Fact Checking · +3

How to Train BERT with an Academic Budget

4 code implementations · EMNLP 2021 · Peter Izsak, Moshe Berchansky, Omer Levy

While large language models à la BERT are used ubiquitously in NLP, pretraining them is considered a luxury that only a few well-funded industry labs can afford.

Language Modelling · Linguistic Acceptability · +4
