Although non-autoregressive models with one-iteration generation achieve remarkable inference speed-up, they still fall behind their autoregressive counterparts in prediction accuracy.
With the glancing language model (GLM), we develop the Glancing Transformer (GLAT) for machine translation.
Definition generation, which aims to automatically generate dictionary definitions for words, has recently been proposed to assist the construction of dictionaries and help people understand unfamiliar texts.
Non-autoregressive models are promising for various text generation tasks.
In this paper, we propose to generate sentences from disentangled syntactic and semantic spaces.