Search Results for author: Antti Virtanen

Found 2 papers, 1 paper with code

WikiBERT models: deep transfer learning for many languages

no code implementations • NoDaLiDa 2021 • Sampo Pyysalo, Jenna Kanerva, Antti Virtanen, Filip Ginter

In this paper, we introduce a simple, fully automated pipeline for creating language-specific BERT models from Wikipedia data and release 42 new such models, most for languages that have so far lacked dedicated deep neural language models.

Transfer Learning
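
A minimal sketch of using one of the WikiBERT models for feature extraction with the Hugging Face transformers library, assuming the checkpoints are published on the Hub under the TurkuNLP organization; the model identifier below is an assumption for illustration, not taken from the paper.

```python
# Extract contextual embeddings from a WikiBERT model.
# "TurkuNLP/wikibert-base-fi-cased" is an assumed Hub identifier.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "TurkuNLP/wikibert-base-fi-cased"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Esimerkkilause suomeksi.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```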

Multilingual is not enough: BERT for Finnish

1 code implementation • 15 Dec 2019 • Antti Virtanen, Jenna Kanerva, Rami Ilo, Jouni Luoma, Juhani Luotolahti, Tapio Salakoski, Filip Ginter, Sampo Pyysalo

Deep learning-based language models pretrained on large unannotated text corpora have been shown to enable efficient transfer learning for natural language processing, with recent approaches such as the transformer-based BERT model advancing the state of the art across a variety of tasks.

Dependency Parsing • named-entity-recognition • +4
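
A minimal transfer-learning sketch with the Finnish BERT model from this paper, assuming it is available on the Hugging Face Hub as TurkuNLP/bert-base-finnish-cased-v1 (an assumed identifier); the token-classification head and label count stand in for a task like the paper's named-entity-recognition evaluation.

```python
# Fine-tuning setup: Finnish BERT with a fresh token-classification head.
# The model identifier and num_labels are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "TurkuNLP/bert-base-finnish-cased-v1"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=9 is a placeholder for a CoNLL-style NER tag set.
model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=9)

inputs = tokenizer("Helsinki on Suomen pääkaupunki.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (batch, seq_len, num_labels)
print(logits.argmax(dim=-1))
```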
