1 code implementation • CVPR 2022 • Jinyu Chen, Chen Gao, Erli Meng, Qiong Zhang, Si Liu
However, the crucial navigation clues (i.e., object-level environment layout) for the embodied navigation task are discarded, since the maintained vector is essentially unstructured.
no code implementations • COLING 2020 • Xue Mengge, Bowen Yu, Tingwen Liu, Yue Zhang, Erli Meng, Bin Wang
Incorporating lexicons into character-level Chinese NER via lattices has proven effective for exploiting rich word boundary information.
no code implementations • NAACL 2021 • Zhen Ke, Liang Shi, Songtao Sun, Erli Meng, Bin Wang, Xipeng Qiu
Recent research shows that pre-trained models (PTMs) are beneficial to Chinese Word Segmentation (CWS).
no code implementations • 13 Apr 2020 • Zhen Ke, Liang Shi, Erli Meng, Bin Wang, Xipeng Qiu, Xuanjing Huang
Besides, the pre-trained BERT language model has also been introduced into the MCCWS task in a multi-task learning framework.