no code implementations • 17 May 2022 • Beiduo Chen, Wu Guo, Quan Liu, Kun Tao
Multilingual BERT (mBERT), a language model pre-trained on large multilingual corpora, has impressive zero-shot cross-lingual transfer capabilities and performs surprisingly well on zero-shot POS tagging and Named Entity Recognition (NER), as well as on cross-lingual model transfer.
1 code implementation • COLING 2020 • Siyu Long, Ran Wang, Kun Tao, Jiali Zeng, Xin-yu Dai
Machine reading comprehension (MRC) is the task of answering questions based on a given context.
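To make the extractive MRC setup concrete, here is a minimal toy sketch (not the paper's model): given a context passage and a question, it picks the single context token closest to the question's content words as the "answer span". The stopword list and the proximity heuristic are illustrative assumptions, not anything from the work above.

```python
# Toy illustration of extractive MRC: select the context token that sits
# closest to the question words appearing in the context. A real MRC model
# would instead predict an answer span with a trained neural network.

STOPWORDS = {"a", "an", "the", "was", "is", "by", "in", "as", "of", "to"}

def norm(w: str) -> str:
    """Lowercase a token and strip surrounding punctuation."""
    return w.strip("?.,").lower()

def answer(context: str, question: str) -> str:
    ctx = [norm(w) for w in context.split()]
    q = {norm(w) for w in question.split()} - STOPWORDS
    # positions of question words that literally appear in the context
    anchors = [j for j, w in enumerate(ctx) if w in q]
    best, best_score = "", 0.0
    for i, w in enumerate(ctx):
        if w in q or w in STOPWORDS:
            continue  # the answer should contribute new information
        # closer proximity to question words -> higher score
        score = sum(1.0 / (1 + abs(i - j)) for j in anchors)
        if score > best_score:
            best, best_score = w, score
    return best

context = "BERT was released by Google in 2018 as a pretrained language model."
print(answer(context, "Who released BERT?"))  # -> google
```

Even this crude heuristic shows the shape of the task (context, question, answer span); the papers listed here study how models can also expose the reasoning behind such answers.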
no code implementations • 2 Apr 2020 • Ran Wang, Kun Tao, Dingjie Song, Zhilong Zhang, Xiao Ma, Xi'ao Su, Xin-yu Dai
Existing question answering systems can only predict answers without explicit reasoning processes, which hinders their explainability and leads us to overestimate their ability to understand and reason over natural language.