Search Results for author: Sebastian Hofstätter

Found 3 papers, 1 paper with code

Improving Transformer-Kernel Ranking Model Using Conformer and Query Term Independence

no code implementations · 19 Apr 2021 · Bhaskar Mitra, Sebastian Hofstätter, Hamed Zamani, Nick Craswell

The Transformer-Kernel (TK) model has demonstrated strong reranking performance on the TREC Deep Learning benchmark -- and can be considered an efficient (though slightly less effective) alternative to other Transformer-based architectures that employ (i) large-scale pretraining (high training cost), (ii) joint encoding of query and document (high inference cost), and (iii) a larger number of Transformer layers (both high training and high inference costs).

Document Ranking · Retrieval

Conformer-Kernel with Query Term Independence at TREC 2020 Deep Learning Track

no code implementations · 14 Nov 2020 · Bhaskar Mitra, Sebastian Hofstätter, Hamed Zamani, Nick Craswell

We benchmark Conformer-Kernel models under the strict blind evaluation setting of the TREC 2020 Deep Learning track.

Retrieval

Conformer-Kernel with Query Term Independence for Document Retrieval

1 code implementation · 20 Jul 2020 · Bhaskar Mitra, Sebastian Hofstätter, Hamed Zamani, Nick Craswell

In this work, we extend the TK architecture to the full retrieval setting by incorporating the query term independence assumption.
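The core of the query term independence (QTI) assumption described above is that the query-document relevance score decomposes into a sum of per-term scores, so each term's contribution can be precomputed offline and served from an inverted index rather than requiring a joint query-document pass at query time. A minimal sketch of that decomposition, using a hypothetical term-frequency scorer as a stand-in for the learned Conformer-Kernel per-term model:

```python
# Sketch of the query term independence (QTI) assumption:
# score(q, d) = sum over terms t in q of score(t, d).
# `term_score` here is a hypothetical placeholder; in the actual paper
# the per-term score comes from the neural Conformer-Kernel model.

def term_score(term: str, doc: str) -> float:
    # Placeholder scorer: raw term frequency in the document.
    return float(doc.split().count(term))

def qti_score(query: str, doc: str) -> float:
    # Under QTI, the query score is a sum of independent per-term
    # scores -- no joint encoding of the full query and document.
    return sum(term_score(t, doc) for t in query.split())

doc = "the transformer kernel model ranks documents with kernel pooling"
print(qti_score("kernel model", doc))  # → 3.0 (kernel: 2.0, model: 1.0)
```

Because each `term_score(t, d)` depends on one term only, scores for every (term, document) pair can be computed once at indexing time, which is what makes the full-retrieval setting tractable.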

Retrieval
