Multilingual Named Entity Recognition Using Pretrained Embeddings, Attention Mechanism and NCRF

WS 2019  ·  Anton A. Emelyanov, Ekaterina Artemova ·

In this paper we tackle the multilingual named entity recognition task. We use the BERT language model to produce embeddings for a bidirectional recurrent network with attention and an NCRF layer on top. We apply multilingual BERT only as an embedder, without any fine-tuning. We test our model on the dataset of the BSNLP shared task, which consists of texts in the Bulgarian, Czech, Polish and Russian languages.
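The NCRF output layer scores whole tag sequences rather than individual tokens, and decoding picks the best sequence with the Viterbi algorithm. As a rough illustration (not the paper's implementation), the sketch below runs Viterbi over toy per-token emission scores and tag-transition scores; in the actual model these scores would come from the recurrent network and the learned CRF transition matrix.

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence under a linear-chain CRF.

    emissions:   (seq_len, num_tags) per-token tag scores
    transitions: (num_tags, num_tags) score of moving from tag i to tag j
    """
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()          # best score of a path ending in each tag
    backpointers = []
    for t in range(1, seq_len):
        # total[i, j] = best path ending in tag i, then transitioning to tag j
        total = score[:, None] + transitions + emissions[t][None, :]
        backpointers.append(total.argmax(axis=0))
        score = total.max(axis=0)
    # trace back from the best final tag
    best_tag = int(score.argmax())
    path = [best_tag]
    for bp in reversed(backpointers):
        best_tag = int(bp[best_tag])
        path.append(best_tag)
    return path[::-1]

# Toy example: 3 tokens, 2 tags, transitions favoring staying on the same tag.
emissions = np.array([[5.0, 1.0], [1.0, 2.0], [1.0, 4.0]])
transitions = np.array([[2.0, -1.0], [-1.0, 2.0]])
print(viterbi_decode(emissions, transitions))  # → [0, 1, 1]
```

Because the transition scores reward consistent tag runs, the decoder can override a weak per-token score, which is why CRF layers help with multi-token entity spans in NER.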

