Zero-Resource Cross-Domain Named Entity Recognition

WS 2020 · Zihan Liu, Genta Indra Winata, Pascale Fung

Existing models for cross-domain named entity recognition (NER) rely on large amounts of unlabeled corpora or labeled NER training data in target domains. However, collecting data for low-resource target domains is not only expensive but also time-consuming. Hence, we propose a cross-domain NER model that does not use any external resources. We first introduce a Multi-Task Learning (MTL) objective that detects whether tokens are named entities or not. We then introduce a framework called Mixture of Entity Experts (MoEE) to improve robustness for zero-resource domain adaptation. Finally, experimental results show that our model outperforms strong unsupervised cross-domain sequence labeling models, and its performance is close to that of the state-of-the-art model, which leverages extensive resources.
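The two components compose naturally in a single tagger: a shared encoder feeds both an auxiliary binary entity-detection head (the MTL objective) and a set of per-entity-type experts whose outputs are mixed by a learned gate (MoEE). Below is a minimal PyTorch sketch under stated assumptions; the class name MoEETagger, all dimensions, and the choice of linear experts are illustrative, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class MoEETagger(nn.Module):
    """Hypothetical sketch: a BiLSTM tagger combining the MTL
    entity-detection objective with a Mixture of Entity Experts
    (MoEE) layer. Layer choices are illustrative assumptions."""

    def __init__(self, vocab_size, emb_dim, hidden_dim, num_tags, num_experts):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim // 2,
                              bidirectional=True, batch_first=True)
        # MTL head: binary prediction of whether each token is part
        # of any named entity.
        self.entity_detector = nn.Linear(hidden_dim, 2)
        # One expert per entity type (plus a non-entity expert),
        # each a linear transform of the shared BiLSTM features.
        self.experts = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(num_experts)])
        # Gate producing per-token mixing weights over the experts.
        self.gate = nn.Linear(hidden_dim, num_experts)
        self.tag_out = nn.Linear(hidden_dim, num_tags)

    def forward(self, token_ids):
        h, _ = self.bilstm(self.embed(token_ids))      # (B, T, hidden)
        entity_logits = self.entity_detector(h)        # auxiliary MTL output
        expert_outs = torch.stack([e(h) for e in self.experts], dim=2)
        weights = torch.softmax(self.gate(h), dim=-1)  # (B, T, num_experts)
        mixed = (weights.unsqueeze(-1) * expert_outs).sum(dim=2)
        tag_logits = self.tag_out(mixed)               # main NER output
        return tag_logits, entity_logits
```

In training, the two heads would share the encoder and be optimized jointly, e.g. by summing a cross-entropy loss over the NER tags with one over the binary entity labels; at inference only the tag head is used.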


Datasets


CoNLL04
Task:        Cross-Domain Named Entity Recognition
Dataset:     CoNLL04
Model:       BiLSTM w/ MTL and MoEE
Metric:      F1 = 70.04
Global Rank: #1

Methods


BiLSTM, Multi-Task Learning (MTL), Mixture of Entity Experts (MoEE)