Language-agnostic BERT Sentence Embedding

3 Jul 2020 · Fangxiaoyu Feng, Yinfei Yang, Daniel Cer, Naveen Arivazhagan, Wei Wang

We adapt multilingual BERT to produce language-agnostic sentence embeddings for 109 languages. The state-of-the-art for numerous monolingual and multilingual NLP tasks is masked language model (MLM) pretraining followed by task-specific fine-tuning...
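
For readers who want to try the released embeddings, below is a minimal sketch of cross-lingual sentence similarity with LaBSE. It assumes the sentence-transformers package and the `sentence-transformers/LaBSE` checkpoint published on Hugging Face (the checkpoint name and library are assumptions not stated in this abstract):

```python
# A minimal sketch: embed parallel sentences in several languages with LaBSE
# and compare them. Assumes `pip install sentence-transformers` and the
# community "sentence-transformers/LaBSE" checkpoint (an assumption).
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/LaBSE")

# Roughly parallel sentences in English, German, and Chinese.
sentences = [
    "The cat sits on the mat.",
    "Die Katze sitzt auf der Matte.",
    "猫坐在垫子上。",
]

# With L2-normalized embeddings, the dot product equals cosine similarity.
embeddings = model.encode(sentences, normalize_embeddings=True)
similarity = embeddings @ embeddings.T
print(np.round(similarity, 3))  # off-diagonal entries should be high
```

Because the vectors are length-normalized, a plain dot product between two sentence embeddings serves as their cosine similarity, which is what makes language-agnostic nearest-neighbor retrieval straightforward.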


