Search Results for author: Ying Qin

Found 27 papers, 2 papers with code

Neighbors Are Not Strangers: Improving Non-Autoregressive Translation under Low-Frequency Lexical Constraints

1 code implementation • NAACL 2022 • Chun Zeng, Jiangjie Chen, Tianyi Zhuang, Rui Xu, Hao Yang, Ying Qin, Shimin Tao, Yanghua Xiao

To this end, we propose a plug-in algorithm for this line of work, i.e., Aligned Constrained Training (ACT), which alleviates this problem by familiarizing the model with the source-side context of the constraints.

Translation
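
A hedged sketch of the core idea: tying a target-side lexical constraint to its aligned source span, so the model sees the constraint together with its source-side context during training. The function name, alignment format, and example below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the idea behind Aligned Constrained Training (ACT):
# target-side constraints are paired with their aligned source spans so the
# model is trained on the constraint in its source-side context.

def aligned_constraint_examples(src_tokens, tgt_tokens, alignment, constraint_span):
    """Return a (source context, target constraint) pair for constrained training.

    alignment: list of (src_idx, tgt_idx) word-alignment links.
    constraint_span: (start, end) indices of the constraint in tgt_tokens.
    """
    start, end = constraint_span
    # Collect source positions aligned to the constrained target tokens.
    src_positions = sorted({s for s, t in alignment if start <= t < end})
    if not src_positions:
        return None  # unaligned constraint: fall back to unconstrained training
    # The source-side context the model is "familiarized" with.
    src_context = [src_tokens[i] for i in src_positions]
    tgt_constraint = tgt_tokens[start:end]
    return src_context, tgt_constraint


# Example: constrain the (low-frequency) translation "neural" -> "neuronale".
src = ["the", "neural", "model"]
tgt = ["das", "neuronale", "Modell"]
links = [(0, 0), (1, 1), (2, 2)]
print(aligned_constraint_examples(src, tgt, links, (1, 2)))
# -> (['neural'], ['neuronale'])
```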

Explicitly Increasing Input Information Density for Vision Transformers on Small Datasets

1 code implementation • 25 Oct 2022 • Xiangyu Chen, Ying Qin, Wenju Xu, Andrés M. Bur, Cuncong Zhong, Guanghui Wang

To boost the performance of vision Transformers on small datasets, this paper proposes to explicitly increase the input information density in the frequency domain.
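
A minimal sketch of the idea, assumed rather than the paper's exact pipeline: a 2D DCT concentrates most of an image's energy in its low-frequency coefficients, so keeping only that block yields a smaller but information-denser input map. The `keep` size and grayscale input are illustrative choices.

```python
# Sketch: raise input information density in the frequency domain by keeping
# only the energy-rich low-frequency block of a 2D DCT.
import numpy as np
from scipy.fft import dctn

def dense_frequency_input(img, keep=32):
    """img: (H, W) grayscale array; keep: side length of the retained block."""
    coeffs = dctn(img, norm="ortho")   # 2D DCT-II of the image
    low = coeffs[:keep, :keep]         # low-frequency block holds most energy
    return low                         # denser (keep x keep) input map

img = np.random.rand(224, 224).astype(np.float32)
dense = dense_frequency_input(img, keep=32)
print(dense.shape)  # (32, 32): far fewer inputs per unit of information
```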

Applying the Information Bottleneck Principle to Prosodic Representation Learning

no code implementations • 5 Aug 2021 • Guangyan Zhang, Ying Qin, Daxin Tan, Tan Lee

This paper describes a novel design of a neural network-based speech generation model for learning prosodic representation. The problem of representation learning is formulated according to the information bottleneck (IB) principle.

Representation Learning
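
As a rough illustration of the IB principle in representation learning, the common variational bound trades a task loss (keeping the representation z informative about the target) against a KL term to a prior (compressing z). The Gaussian posterior, the β weight, and the tensor shapes below are assumptions; the paper's actual model and loss may differ.

```python
# Hedged sketch of a variational information-bottleneck (VIB) objective.
# The encoder predicts a Gaussian q(z|x); the KL term to a standard normal
# prior limits how much information z carries about the input, while the
# task loss keeps z predictive of the target (here, prosody).
import torch

def vib_loss(pred_loss, mu, logvar, beta=1e-3):
    """pred_loss: task loss keeping z informative; mu, logvar: q(z|x) params."""
    # KL( N(mu, sigma^2) || N(0, I) ): the bottleneck (compression) term.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1).mean()
    return pred_loss + beta * kl

mu = torch.zeros(8, 16, requires_grad=True)
logvar = torch.zeros(8, 16, requires_grad=True)
loss = vib_loss(torch.tensor(1.0), mu, logvar)
loss.backward()  # gradients flow through the compression term
```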

A study on the efficacy of model pre-training in developing neural text-to-speech system

no code implementations • 8 Oct 2021 • Guangyan Zhang, Yichong Leng, Daxin Tan, Ying Qin, Kaitao Song, Xu Tan, Sheng Zhao, Tan Lee

However, in terms of ultimately achieved system performance for target speaker(s), the actual benefits of model pre-training are uncertain and unstable, depending very much on the quantity and text content of training data.

Computational Efficiency

Efficient Transfer Learning for Quality Estimation with Bottleneck Adapter Layer

no code implementations • EAMT 2020 • Hao Yang, Minghan Wang, Ning Xie, Ying Qin, Yao Deng

Compared with the commonly used NuQE baseline, BAL-QE achieves performance gains of 47% (En-Ru) and 75% (En-De).

NMT • Transfer Learning
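
For context, a bottleneck adapter in the usual sense: a small trainable down-project/up-project module with a residual connection, inserted into a frozen pretrained backbone so that transfer learning updates only the adapter. The dimensions and placement below are assumptions, not BAL-QE's exact configuration.

```python
# Sketch of a bottleneck adapter layer for parameter-efficient transfer.
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    def __init__(self, d_model=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)  # compress to bottleneck
        self.up = nn.Linear(bottleneck, d_model)    # expand back to d_model
        self.act = nn.ReLU()

    def forward(self, x):
        # Residual connection keeps the frozen backbone's features intact.
        return x + self.up(self.act(self.down(x)))

h = torch.randn(4, 10, 768)          # (batch, seq, hidden) from a frozen encoder
print(BottleneckAdapter()(h).shape)  # torch.Size([4, 10, 768])
```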

HW-TSC’s Submissions to the WMT21 Biomedical Translation Task

no code implementations • WMT (EMNLP) 2021 • Hao Yang, Zhanglin Wu, Zhengzhe Yu, Xiaoyu Chen, Daimeng Wei, Zongyao Li, Hengchao Shang, Minghan Wang, Jiaxin Guo, Lizhi Lei, Chuanfei Xu, Min Zhang, Ying Qin

This paper describes the submission of Huawei Translation Service Center (HW-TSC) to the WMT21 biomedical translation task in two language pairs: Chinese↔English and German↔English (our registered team name is HuaweiTSC).

Translation
