Search Results for author: Yu Bao

Found 15 papers, 4 papers with code

$\textit{latent}$-GLAT: Glancing at Latent Variables for Parallel Text Generation

1 code implementation ACL 2022 Yu Bao, Hao Zhou, Shujian Huang, Dongqi Wang, Lihua Qian, Xinyu Dai, Jiajun Chen, Lei Li

Recently, parallel text generation has received widespread attention due to its success in improving generation efficiency.

Text Generation

DINOISER: Diffused Conditional Sequence Learning by Manipulating Noises

no code implementations 20 Feb 2023 Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Mingxuan Wang

In this paper, we introduce DINOISER to facilitate diffusion models for sequence generation by manipulating noises.

SENDER: SEmi-Nonlinear Deep Efficient Reconstructor for Extraction Canonical, Meta, and Sub Functional Connectivity in the Human Brain

no code implementations 12 Sep 2022 Wei Zhang, Yu Bao

Deep linear and nonlinear learning methods have become vital machine learning tools for investigating hierarchical features such as functional connectivity in the human brain via functional Magnetic Resonance signals; however, they have three major shortcomings.

SADAM: Stochastic Adam, A Stochastic Operator for First-Order Gradient-based Optimizer

no code implementations 20 May 2022 Wei Zhang, Yu Bao

In this work, to help the optimizer escape stationary and saddle points efficiently, we propose, analyze, and generalize a stochastic strategy, applied as an operator to a first-order gradient descent algorithm, that increases target accuracy and reduces time consumption.

DELMAR: Deep Linear Matrix Approximately Reconstruction to Extract Hierarchical Functional Connectivity in the Human Brain

no code implementations 20 May 2022 Wei Zhang, Yu Bao

Moreover, the theoretical analyses indicate that DELMAR can converge to a unique fixed point and even approximate the original input as accurately as DNNs do.

DEMAND: Deep Matrix Approximately Nonlinear Decomposition to Identify Meta, Canonical, and Sub-Spatial Pattern of functional Magnetic Resonance Imaging in the Human Brain

no code implementations 20 May 2022 Wei Zhang, Yu Bao

First, the proposed DEMAND employs a non-fully connected, multilayer-stacked architecture that is easier to optimize than canonical DNNs; furthermore, thanks to this efficient architecture, training DEMAND avoids overfitting and enables the recognition of individual/minor features even from a small dataset, such as an individual's data; finally, a novel rank estimator technique is introduced to tune all hyperparameters of DEMAND automatically.

Dictionary Learning

Non-iterative Parallel Text Generation via Glancing Transformer

no code implementations 1 Jan 2021 Lihua Qian, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, Lei Li

Although non-autoregressive models with one-iteration generation achieve remarkable inference speed-ups, they still fall behind their autoregressive counterparts in prediction accuracy.

Language Modelling, Text Generation

Explicit Semantic Decomposition for Definition Generation

no code implementations ACL 2020 Jiahuan Li, Yu Bao, Shujian Huang, Xinyu Dai, Jiajun Chen

Definition generation, which aims to automatically generate dictionary definitions for words, has recently been proposed to assist the construction of dictionaries and help people understand unfamiliar texts.
