no code implementations • EMNLP 2021 • Chi Hu, Chenglong Wang, Xiangnan Ma, Xia Meng, Yinqiao Li, Tong Xiao, Jingbo Zhu, Changliang Li
This paper addresses the efficiency challenge of Neural Architecture Search (NAS) by formulating the task as a ranking problem.
no code implementations • COLING 2020 • Qiang Wang, Changliang Li, Yue Zhang, Tong Xiao, Jingbo Zhu
In this way, in addition to the topmost encoder layer (referred to as the primary view), we also incorporate an intermediate encoder layer as the auxiliary view.
1 code implementation • ACL 2020 • Bei Li, Hui Liu, Ziyang Wang, Yufan Jiang, Tong Xiao, Jingbo Zhu, Tongran Liu, Changliang Li
In encoder-decoder neural models, multiple encoders are generally used to represent contextual information in addition to the individual sentence.
no code implementations • ACL 2020 • Yinqiao Li, Chi Hu, Yuhao Zhang, Nuo Xu, Yufan Jiang, Tong Xiao, Jingbo Zhu, Tongran Liu, Changliang Li
Neural architecture search (NAS) has advanced significantly in recent years, but most NAS systems restrict the search to learning the architecture of a recurrent or convolutional cell.
no code implementations • 2 Feb 2020 • Jingdong Li, Hui Zhang, Xueliang Zhang, Changliang Li
We show that our model improves performance compared with existing convolutional recurrent networks.
no code implementations • WS 2019 • Xinze Guo, Chang Liu, Xiaolong Li, Yiran Wang, Guoliang Li, Feng Wang, Zhitao Xu, Liuyi Yang, Li Ma, Changliang Li
This paper describes the Kingsoft AI Lab's submission to the WMT2019 news translation shared task.
2 code implementations • ACL 2019 • Qiang Wang, Bei Li, Tong Xiao, Jingbo Zhu, Changliang Li, Derek F. Wong, Lidia S. Chao
Transformer is the state-of-the-art model in recent machine translation evaluations.
no code implementations • WS 2019 • Meiling Wang, Min Xiao, Changliang Li, Yu Guo, Zhixin Zhao, Xiaonan Liu
Chinese idioms (Cheng Yu) embody five thousand years of Chinese history and culture, and they contain a large number of the scientific achievements of ancient China.
no code implementations • 17 Apr 2019 • Jia Li, Xiao Sun, Xing Wei, Changliang Li, Jian-Hua Tao
In recent years, the generation of conversation content based on deep neural networks has attracted much research attention.
no code implementations • 17 Apr 2019 • Jia Li, Xing Wei, Guoqiang Yang, Xiao Sun, Changliang Li
A multiscale shared convolution structure is adopted in the discriminator network to further supervise training the generator.
no code implementations • EMNLP 2018 • Changliang Li, Liang Li, Ji Qi
In this work, we propose a novel self-attentive model with gate mechanism to fully utilize the semantic correlation between slot and intent.
no code implementations • WS 2018 • Changliang Li, Ji Qi
Chinese grammatical error diagnosis system is a very important tool, which can help Chinese learners automatically diagnose grammatical errors in many scenarios.
no code implementations • IJCNLP 2017 • Changliang Li, Cunliang Kong
Multi-choice question answering in exams is a typical QA task.
no code implementations • IJCNLP 2017 • Changliang Li, Xiuying Wang
To the best of our knowledge, this is the largest Chinese spoken dialogue corpus, as well as the first with slot information.