Search Results for author: Mikhail Arkhipov

Found 5 papers, 3 papers with code

Neural Entity Linking: A Survey of Models Based on Deep Learning

no code implementations • 31 May 2020 • Ozge Sevgili, Artem Shelmanov, Mikhail Arkhipov, Alexander Panchenko, Chris Biemann

This survey presents a comprehensive description of recent neural entity linking (EL) systems developed since 2015 as a result of the "deep learning revolution" in natural language processing.

Entity Embeddings • Entity Linking

Tuning Multilingual Transformers for Language-Specific Named Entity Recognition

1 code implementation • WS 2019 • Mikhail Arkhipov, Maria Trofimova, Yuri Kuratov, Alexey Sorokin

Our paper addresses the problem of multilingual named entity recognition for four languages: Russian, Bulgarian, Czech, and Polish.

Multilingual Named Entity Recognition • NER +1

Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language

1 code implementation • 17 May 2019 • Yuri Kuratov, Mikhail Arkhipov

This work shows that transfer learning from a multilingual model to a monolingual model yields significant performance gains on tasks such as reading comprehension, paraphrase detection, and sentiment analysis.

 Ranked #1 on Question Answering on SQuAD1.1 (Hardware Burden metric)

Natural Language Inference • Paraphrase Identification +4
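The multilingual-to-monolingual transfer above is often seeded by reusing subword embeddings: tokens that appear in both vocabularies copy their multilingual vectors, while tokens new to the monolingual vocabulary are initialized randomly. A minimal sketch of that idea (all names and sizes here are illustrative, not the paper's actual code):

```python
import numpy as np

def init_monolingual_embeddings(multi_vocab, multi_emb, mono_vocab, seed=0):
    """Seed a monolingual embedding table from a multilingual one.

    Shared subword tokens copy their multilingual vectors; tokens absent
    from the multilingual vocabulary get small random initializations.
    """
    rng = np.random.default_rng(seed)
    dim = multi_emb.shape[1]
    multi_index = {tok: i for i, tok in enumerate(multi_vocab)}
    # Start from small random vectors, then overwrite the overlap.
    mono_emb = rng.normal(scale=0.02, size=(len(mono_vocab), dim))
    for i, tok in enumerate(mono_vocab):
        j = multi_index.get(tok)
        if j is not None:
            mono_emb[i] = multi_emb[j]
    return mono_emb

# Toy example: two of three monolingual tokens exist in the multilingual vocab.
multi_vocab = ["[CLS]", "the", "##ing", "и"]
multi_emb = np.arange(12, dtype=float).reshape(4, 3)
mono_emb = init_monolingual_embeddings(
    multi_vocab, multi_emb, ["[CLS]", "и", "##ость"]
)
```

After this initialization, the monolingual model would be further pre-trained on target-language text before fine-tuning on downstream tasks.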
