1 code implementation • ACL 2022 • Yu Bao, Hao Zhou, ShuJian Huang, Dongqi Wang, Lihua Qian, Xinyu Dai, Jiajun Chen, Lei Li
Recently, parallel text generation has received widespread attention due to its advantage in generation efficiency.
no code implementations • 9 Aug 2022 • Fan Hu, Dongqi Wang, Huazhen Huang, Yishen Hu, Peng Yin
Based on these findings, we utilized a Monte Carlo-based reinforcement learning generative model to generate novel multi-property compounds with both in vitro and in vivo efficacy, thus bridging the gap between target-based and cell-based drug discovery.
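The idea of Monte Carlo-based reinforcement learning for generation can be sketched with a toy REINFORCE-style loop: sample candidate sequences from the current policy, score each with a reward, and reinforce the choices that led to high-reward samples. The vocabulary, reward function, and update rule below are illustrative assumptions, not the authors' actual model.

```python
import math
import random

VOCAB = ["A", "B", "C"]
SEQ_LEN = 4

def sample_sequence(logits, rng):
    """Draw one sequence token-by-token from per-position softmax logits."""
    seq = []
    for row in logits:
        weights = [math.exp(x) for x in row]
        total = sum(weights)
        r = rng.random() * total
        for tok, w in zip(VOCAB, weights):
            r -= w
            if r <= 0:
                seq.append(tok)
                break
    return seq

def reward(seq):
    # Stand-in "property" score: fraction of desirable "A" tokens.
    return seq.count("A") / len(seq)

def train(steps=1000, lr=0.2, seed=0):
    rng = random.Random(seed)
    logits = [[0.0] * len(VOCAB) for _ in range(SEQ_LEN)]
    baseline = 0.0
    for _ in range(steps):
        seq = sample_sequence(logits, rng)   # Monte Carlo rollout
        r = reward(seq)
        baseline += 0.1 * (r - baseline)     # running-average baseline
        for pos, tok in enumerate(seq):      # REINFORCE-style update
            logits[pos][VOCAB.index(tok)] += lr * (r - baseline)
    return logits
```

In a real molecular setting the vocabulary would be chemical tokens (e.g. SMILES symbols) and the reward would come from property predictors; the running-average baseline simply reduces the variance of the Monte Carlo gradient estimate.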
2 code implementations • 13 Apr 2022 • Mingshuo Nie, Dongming Chen, Dongqi Wang
In this survey, we provide a comprehensive overview of RL and graph mining methods and generalize these methods to Graph Reinforcement Learning (GRL) as a unified formulation.
no code implementations • 21 Feb 2022 • Dongqi Wang, Shengyu Zhang, Zhipeng Di, Xin Lin, Weihua Zhou, Fei Wu
A common problem in both pruning and distillation is to determine the compressed architecture, i.e., the exact number of filters per layer and the layer configuration, in order to preserve most of the original model capacity.
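To make the decision space concrete, a naive baseline is to shrink every layer by the same ratio; the point of searching for a compressed architecture is precisely that such uniform plans are generally suboptimal. The function below is a hypothetical illustration, not the paper's method.

```python
def compress_plan(filters_per_layer, keep_ratio):
    """Naive uniform compression plan: keep the same fraction of
    filters in every layer, never dropping below one filter.
    Architecture search replaces this with per-layer counts chosen
    to preserve model capacity."""
    return [max(1, round(n * keep_ratio)) for n in filters_per_layer]

plan = compress_plan([64, 128, 256, 512], 0.5)
```

A search-based approach would instead treat each entry of the returned list as a free variable, scored by how well the resulting sub-network preserves the original model's behavior.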
1 code implementation • 23 Sep 2021 • Dongqi Wang, Haoran Wei, Zhirui Zhang, ShuJian Huang, Jun Xie, Jiajun Chen
We study the problem of online learning with human feedback in human-in-the-loop machine translation, in which human translators revise the machine-generated translations, and the corrected translations are then used to improve the neural machine translation (NMT) system.
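The human-in-the-loop cycle can be sketched as: the system translates, a human revises, and the revision immediately updates the system. In the paper this means fine-tuning an NMT model online; below, a word-level lookup table stands in for the model, purely as an illustrative assumption.

```python
class OnlineTranslator:
    """Toy stand-in for an online-updated NMT system."""

    def __init__(self, lexicon):
        self.lexicon = dict(lexicon)   # source word -> target word

    def translate(self, src):
        # Unknown words are passed through unchanged.
        return [self.lexicon.get(w, w) for w in src.split()]

    def learn_from_revision(self, src, revised):
        # Align word-by-word and absorb the human's corrections,
        # so the next translation of the same source improves.
        for s, t in zip(src.split(), revised.split()):
            self.lexicon[s] = t

mt = OnlineTranslator({"Haus": "house"})
hyp = " ".join(mt.translate("das Haus"))      # "das house": "das" unknown
mt.learn_from_revision("das Haus", "the house")
fixed = " ".join(mt.translate("das Haus"))    # now "the house"
```

The essential property this sketch shares with the real setting is immediacy: each human correction is applied before the next sentence is translated, rather than being batched for later retraining.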
no code implementations • 29 May 2021 • Fan Hu, Lei Wang, Yishen Hu, Dongqi Wang, Weijie Wang, Jianbing Jiang, Nan Li, Peng Yin
The identification of protein-ligand interaction plays a key role in biochemical research and drug discovery.
1 code implementation • NAACL 2021 • Yu Bao, ShuJian Huang, Tong Xiao, Dongqi Wang, Xinyu Dai, Jiajun Chen
The non-autoregressive Transformer is a promising text generation model.
Ranked #7 on Machine Translation on WMT2014 German-English
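The contrast behind non-autoregressive generation can be shown with a toy decoder: an autoregressive model emits tokens left-to-right, each conditioned on the previous ones, while a non-autoregressive model fills every position in a single parallel step. The score table and vocabulary below are toy assumptions standing in for a real model's predictions.

```python
def nar_decode(scores, vocab):
    """Non-autoregressive: all positions are filled in one parallel step,
    each from its own independent per-position distribution."""
    return [vocab[max(range(len(vocab)), key=lambda i: row[i])]
            for row in scores]

def ar_decode(step_fn, length):
    """Autoregressive: token t is chosen conditioned on tokens 0..t-1."""
    out = []
    for _ in range(length):
        out.append(step_fn(out))
    return out

vocab = ["the", "cat", "sat"]
scores = [[0.9, 0.05, 0.05],   # position 0 prefers "the"
          [0.1, 0.8, 0.1],     # position 1 prefers "cat"
          [0.1, 0.1, 0.8]]     # position 2 prefers "sat"
parallel = nar_decode(scores, vocab)   # one step for the whole sentence
```

The efficiency gain is that `nar_decode` needs one model call for the whole output, whereas `ar_decode` needs one per token; the cost is that independent per-position choices can be mutually inconsistent, which is the problem glancing-style training targets.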