no code implementations • 2 Jan 2021 • Wei Zhu, Daniel Cheung
In this work, we present Lex-BERT, which incorporates lexicon information into Chinese BERT for named entity recognition (NER) tasks in a natural manner.
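One natural way to inject lexicon information at the input level, in the spirit of Lex-BERT, is to surround lexicon-matched spans with entity-type markers so the model sees word boundaries and types without any architectural change. The sketch below is a minimal illustration under that assumption; the marker format (`[org]`/`[/org]`) and greedy longest-match strategy are illustrative choices, not necessarily the paper's exact scheme.

```python
def insert_lexicon_markers(chars, lexicon):
    """Insert type markers around lexicon matches in a character sequence.

    chars: list of characters (the usual unit for Chinese BERT input).
    lexicon: dict mapping word -> entity type tag, e.g. {"北京": "loc"}.
    Returns a new token list with [tag] ... [/tag] wrapped around matches.
    """
    out = []
    i, n = 0, len(chars)
    max_len = max((len(w) for w in lexicon), default=0)
    while i < n:
        match = None
        # Greedy longest-match against the lexicon starting at position i.
        for L in range(min(max_len, n - i), 0, -1):
            word = "".join(chars[i:i + L])
            if word in lexicon:
                match = (lexicon[word], L)
                break
        if match:
            tag, L = match
            out.append(f"[{tag}]")
            out.extend(chars[i:i + L])
            out.append(f"[/{tag}]")
            i += L
        else:
            out.append(chars[i])
            i += 1
    return out

tokens = insert_lexicon_markers(list("北京大学在北京"),
                                {"北京大学": "org", "北京": "loc"})
```

The marked sequence can then be fed to an unmodified BERT encoder, with the marker strings registered as special tokens in the vocabulary.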
no code implementations • 29 Dec 2020 • Wei Zhu, Daniel Cheung
In this work, we present CMV-BERT, which improves the pretraining of a language model via two ingredients: (a) contrastive learning, which is well studied in computer vision, and (b) multiple vocabularies, one fine-grained and the other coarse-grained.
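A standard way to combine these two ingredients is to treat the encodings of the same sentence under the fine-grained and coarse-grained vocabularies as a positive pair and apply an InfoNCE-style contrastive loss. The snippet below is a hedged sketch of such a loss; the temperature value and the use of in-batch negatives are common defaults, not details confirmed by the abstract.

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss over two views of a batch of sentences.

    z1, z2: (batch, dim) L2-normalized sentence embeddings produced with the
    fine-grained and coarse-grained vocabularies; row i of z1 and row i of z2
    encode the same sentence and form a positive pair, while other rows in
    the batch serve as negatives.
    """
    sim = z1 @ z2.T / temperature                    # pairwise similarities
    sim = sim - sim.max(axis=1, keepdims=True)       # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Cross-entropy with the positive pairs on the diagonal.
    return -np.mean(np.diag(log_prob))
```

Pulling matched views together this way encourages the model to produce vocabulary-invariant sentence representations, which is one plausible reading of how the two tokenizations help pretraining.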