Unsupervised Domain Adaptation of Contextualized Embeddings for Sequence Labeling

IJCNLP 2019 · Xiaochuang Han · Jacob Eisenstein

Contextualized word embeddings such as ELMo and BERT provide a foundation for strong performance across a wide range of natural language processing tasks by pretraining on large corpora of unlabeled text. However, the applicability of this approach is unknown when the target domain varies substantially from the pretraining corpus...
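
The domain-adaptive fine-tuning the paper describes amounts to continuing BERT's masked language modeling objective on unlabeled target-domain text before training the downstream sequence labeler. Below is a minimal sketch of that step, assuming the Hugging Face transformers and datasets libraries rather than the authors' released code; the file name target_domain.txt, the choice of checkpoint, and all hyperparameters are illustrative placeholders, not the paper's exact configuration.

```python
# Sketch: continue masked language modeling (MLM) on unlabeled target-domain
# text. Assumes Hugging Face transformers/datasets; names and hyperparameters
# here are illustrative, not taken from the paper.
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")

# Unlabeled text from the target domain (e.g., Early Modern English or tweets),
# one example per line; "target_domain.txt" is a hypothetical placeholder file.
dataset = load_dataset("text", data_files={"train": "target_domain.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens; the model is trained to reconstruct them.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="adapted-bert",
        num_train_epochs=3,
        per_device_train_batch_size=16,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()

# Save the adapted encoder; its weights can then initialize a
# token-classification model for the sequence-labeling task.
model.save_pretrained("adapted-bert")
tokenizer.save_pretrained("adapted-bert")
```

After adaptation, the saved encoder would be loaded into a token-classification head (e.g., BertForTokenClassification) and fine-tuned on the labeled source-domain data, so that only unlabeled text is ever required from the target domain.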

