1 code implementation • ACL 2022 • Yu Bao, Hao Zhou, ShuJian Huang, Dongqi Wang, Lihua Qian, Xinyu Dai, Jiajun Chen, Lei Li
Recently, parallel text generation has received widespread attention due to its success in improving generation efficiency.
no code implementations • 20 Feb 2023 • Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Mingxuan Wang
In this paper, we introduce DINOISER to facilitate diffusion models for sequence generation by manipulating noises.
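The abstract's idea of "manipulating noises" can be illustrated with a minimal PyTorch sketch, assuming a diffusion model over continuous token embeddings: keep the sampled noise scale away from zero, since tiny perturbations are nearly uninformative for discrete tokens. The function name and the `sigma_min`/`sigma_max` values are hypothetical, not the authors' released code.

```python
import math
import torch

def noised_embeddings(emb, sigma_min=0.5, sigma_max=10.0):
    """Corrupt continuous token embeddings with Gaussian noise whose scale
    is sampled away from zero (sigma_min/sigma_max are illustrative):
    very small noise scales barely change which discrete token an embedding
    decodes to, so training is concentrated on larger, more useful scales."""
    batch = emb.shape[0]
    # One noise scale per sequence, drawn log-uniformly from the clipped range.
    log_sigma = torch.empty(batch, 1, 1).uniform_(math.log(sigma_min),
                                                  math.log(sigma_max))
    sigma = log_sigma.exp()
    return emb + sigma * torch.randn_like(emb), sigma

# e.g. emb = torch.randn(8, 16, 64)  # (batch, length, embedding dim)
```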
no code implementations • 12 Sep 2022 • Wei Zhang, Yu Bao
Deep linear and nonlinear learning methods have become vital machine learning tools for investigating hierarchical features, such as functional connectivity in the human brain, via functional Magnetic Resonance Imaging signals; however, these methods suffer from three major shortcomings.
no code implementations • 20 May 2022 • Wei Zhang, Yu Bao
In this work, we propose, analyze, and generalize a stochastic strategy, implemented as an operator for first-order gradient descent algorithms, that helps the optimizer efficiently escape stationary and saddle points, increasing target accuracy and reducing time consumption.
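The general idea of a stochastic escaping operator attached to first-order gradient descent can be illustrated with a minimal NumPy sketch (perturbed gradient descent, not the paper's exact operator); `noise_scale` and `grad_tol` are hypothetical parameters.

```python
import numpy as np

def perturbed_gradient_descent(grad_fn, x0, lr=0.01, noise_scale=0.1,
                               grad_tol=1e-3, steps=200, seed=0):
    """First-order descent with a stochastic escaping operator: when the
    gradient is nearly zero (a stationary or saddle point), inject isotropic
    Gaussian noise to kick the iterate off the plateau."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad_fn(x)
        if np.linalg.norm(g) < grad_tol:
            x = x + noise_scale * rng.standard_normal(x.shape)  # escape kick
        else:
            x = x - lr * g  # ordinary gradient step
    return x

# Example: f(x, y) = x^2 - y^2 has a saddle at the origin; plain gradient
# descent started there never moves, while the kick lets it escape.
saddle_grad = lambda p: np.array([2 * p[0], -2 * p[1]])
print(perturbed_gradient_descent(saddle_grad, [0.0, 0.0]))
```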
no code implementations • 20 May 2022 • Wei Zhang, Yu Bao
Moreover, the theoretical analyses indicate that DELMAR converges to a unique fixed point and can approximate the original input as accurately as DNNs do.
no code implementations • 20 May 2022 • Wei Zhang, Yu Bao
First, the proposed DEMAND employs a non-fully-connected, multilayer-stacked architecture that is easier to optimize than canonical DNNs; furthermore, this efficient architecture lets DEMAND avoid overfitting and recognize individual or minor features from a small dataset, such as the data of a single subject; finally, a novel rank-estimator technique is introduced to tune all of DEMAND's hyperparameters automatically.
no code implementations • 15 Dec 2021 • Min-Gang Zhou, Xiao-Yu Cao, Yu-Shuo Lu, Yang Wang, Yu Bao, Zhao-Ying Jia, Yao Fu, Hua-Lei Yin, Zeng-Bing Chen
The information transmitted via the proposed game also exceeds the classical limit.
1 code implementation • NAACL 2021 • Yu Bao, ShuJian Huang, Tong Xiao, Dongqi Wang, Xinyu Dai, Jiajun Chen
The non-autoregressive Transformer is a promising text generation model.
Ranked #7 on Machine Translation on WMT2014 German-English
no code implementations • 1 Jan 2021 • Lihua Qian, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, Lei Li
Although non-autoregressive models with one-iteration generation achieve remarkable inference speed-ups, they still fall behind their autoregressive counterparts in prediction accuracy.
no code implementations • ACL 2021 • Lihua Qian, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, Lei Li
With GLM, we develop Glancing Transformer (GLAT) for machine translation.
Ranked #67 on Machine Translation on WMT2014 English-German
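GLAT's glancing training can be sketched in PyTorch as follows, assuming a non-autoregressive decoder: run a first decoding pass, count how many positions are wrong, reveal a proportional number of ground-truth tokens as decoder inputs, and train the model to predict the rest. This sketches the sampling step only; names and shapes are illustrative, not the released GLAT code.

```python
import torch

def glancing_inputs(decoder_pred, target, target_emb, mask_emb, ratio=0.5):
    """One glancing-sampling step.
    decoder_pred: (B, T) token ids from a first decoding pass
    target:       (B, T) reference token ids
    target_emb:   (B, T, D) embeddings of the reference tokens
    mask_emb:     (D,) embedding of the [MASK] placeholder
    ratio:        fraction of wrong positions whose ground truth is revealed
    """
    wrong = (decoder_pred != target).sum(dim=1, keepdim=True)  # (B, 1)
    n_reveal = (wrong.float() * ratio).long()                  # tokens to glance
    # Randomly select n_reveal positions per sequence to reveal.
    scores = torch.rand(target.shape)                          # (B, T)
    sorted_scores = scores.sort(dim=1, descending=True).values
    thresholds = sorted_scores.gather(
        1, n_reveal.clamp(max=target.shape[1] - 1))            # (B, 1)
    reveal = scores > thresholds                               # (B, T) bool
    # Revealed positions see the reference token; the rest see [MASK].
    inputs = torch.where(reveal.unsqueeze(-1), target_emb,
                         mask_emb.view(1, 1, -1).expand_as(target_emb))
    return inputs, reveal
```

As training progresses and the first pass makes fewer errors, fewer tokens are revealed, so the model is gradually weaned onto fully parallel prediction.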
no code implementations • ACL 2020 • Jiahuan Li, Yu Bao, ShuJian Huang, Xinyu Dai, Jiajun Chen
Definition generation, which aims to automatically generate dictionary definitions for words, has recently been proposed to assist the construction of dictionaries and help people understand unfamiliar texts.
no code implementations • 25 Nov 2019 • Yu Bao, Hao Zhou, Jiangtao Feng, Mingxuan Wang, ShuJian Huang, Jiajun Chen, Lei Li
Non-autoregressive models are promising on various text generation tasks.
no code implementations • 25 Sep 2019 • Yu Bao, Hao Zhou, Jiangtao Feng, Mingxuan Wang, ShuJian Huang, Jiajun Chen, Lei Li
However, position modeling of output words is an essential problem in non-autoregressive text generation.
1 code implementation • ACL 2019 • Yu Bao, Hao Zhou, ShuJian Huang, Lei Li, Lili Mou, Olga Vechtomova, Xinyu Dai, Jiajun Chen
In this paper, we propose to generate sentences from disentangled syntactic and semantic spaces.
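The core idea of disentangled generation can be sketched as a variational encoder that produces two separate latent vectors, one intended for syntax and one for semantics. Below is a minimal PyTorch sketch of that encoder; the real model additionally uses multi-task and adversarial losses to enforce the separation, and all dimensions here are illustrative.

```python
import torch
import torch.nn as nn

class DisentangledEncoder(nn.Module):
    """Encode a sentence into two latent spaces: z_syn (syntax) and
    z_sem (semantics). A sketch of the idea only; auxiliary losses that
    actually enforce the disentanglement are omitted."""
    def __init__(self, vocab=10000, dim=256, z_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.to_syntax = nn.Linear(dim, 2 * z_dim)     # mean and log-variance
        self.to_semantics = nn.Linear(dim, 2 * z_dim)

    def forward(self, tokens):                         # tokens: (B, T) ids
        _, h = self.rnn(self.embed(tokens))            # h: (1, B, dim)
        h = h.squeeze(0)
        mu_syn, logvar_syn = self.to_syntax(h).chunk(2, dim=-1)
        mu_sem, logvar_sem = self.to_semantics(h).chunk(2, dim=-1)
        # Reparameterization trick, one sample per latent space.
        z_syn = mu_syn + (0.5 * logvar_syn).exp() * torch.randn_like(mu_syn)
        z_sem = mu_sem + (0.5 * logvar_sem).exp() * torch.randn_like(mu_sem)
        return z_syn, z_sem

# e.g. z_syn, z_sem = DisentangledEncoder()(torch.randint(0, 10000, (4, 12)))
```

A decoder conditioned on both vectors can then transfer style by pairing the semantic vector of one sentence with the syntactic vector of another.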