Search Results for author: Zhixing Tan

Found 20 papers, 7 papers with code

Self-Supervised Quality Estimation for Machine Translation

no code implementations • EMNLP 2021 • Yuanhang Zheng, Zhixing Tan, Meng Zhang, Mieradilijiang Maimaiti, Huanbo Luan, Maosong Sun, Qun Liu, Yang Liu

Quality estimation (QE) of machine translation (MT) aims to evaluate the quality of machine-translated sentences without references and is important in practical applications of MT.

Machine Translation • Translation

MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators

1 code implementation • ACL 2022 • Zhixing Tan, Xiangwen Zhang, Shuo Wang, Yang Liu

Prompting has recently been shown as a promising approach for applying pre-trained language models to perform downstream tasks.

Machine Translation • Translation

Graph Piece: Efficiently Generating High-Quality Molecular Graphs with Substructures

1 code implementation • 29 Jun 2021 • Xiangzhe Kong, Zhixing Tan, Yang Liu

Molecule generation, which requires generating valid molecules with desired properties, is a fundamental but challenging task.

Drug Discovery • Graph Generation • +1

Language Models are Good Translators

no code implementations • 25 Jun 2021 • Shuo Wang, Zhaopeng Tu, Zhixing Tan, Wenxuan Wang, Maosong Sun, Yang Liu

Inspired by the recent progress of large-scale pre-trained language models on machine translation in a limited scenario, we first demonstrate that a single language model (LM4MT) can achieve performance comparable to strong encoder-decoder NMT models on standard machine translation benchmarks, using the same training data and a similar number of model parameters.

Language Modelling • Machine Translation • +1

CPM-2: Large-scale Cost-effective Pre-trained Language Models

2 code implementations • 20 Jun 2021 • Zhengyan Zhang, Yuxian Gu, Xu Han, Shengqi Chen, Chaojun Xiao, Zhenbo Sun, Yuan Yao, Fanchao Qi, Jian Guan, Pei Ke, Yanzheng Cai, Guoyang Zeng, Zhixing Tan, Zhiyuan Liu, Minlie Huang, Wentao Han, Yang Liu, Xiaoyan Zhu, Maosong Sun

We present a suite of cost-effective techniques for the use of PLMs to deal with the efficiency issues of pre-training, fine-tuning, and inference.

On the Language Coverage Bias for Neural Machine Translation

no code implementations • Findings (ACL) 2021 • Shuo Wang, Zhaopeng Tu, Zhixing Tan, Shuming Shi, Maosong Sun, Yang Liu

Language coverage bias, which indicates the content-dependent differences between sentence pairs originating from the source and target languages, is important for neural machine translation (NMT) because the target-original training data is not well exploited in current practice.

Data Augmentation • Machine Translation • +1

Dynamic Multi-Branch Layers for On-Device Neural Machine Translation

1 code implementation • 14 May 2021 • Zhixing Tan, Zeyuan Yang, Meng Zhang, Qun Liu, Maosong Sun, Yang Liu

With the rapid development of artificial intelligence (AI), there is a trend in moving AI applications, such as neural machine translation (NMT), from cloud to mobile devices.

Machine Translation • Translation

Neural Machine Translation: A Review of Methods, Resources, and Tools

no code implementations • 31 Dec 2020 • Zhixing Tan, Shuo Wang, Zonghan Yang, Gang Chen, Xuancheng Huang, Maosong Sun, Yang Liu

Machine translation (MT) is an important sub-field of natural language processing that aims to translate natural languages using computers.

Data Augmentation • Machine Translation • +1

Modeling Voting for System Combination in Machine Translation

1 code implementation • 14 Jul 2020 • Xuancheng Huang, Jiacheng Zhang, Zhixing Tan, Derek F. Wong, Huanbo Luan, Jingfang Xu, Maosong Sun, Yang Liu

System combination is an important technique for combining the hypotheses of different machine translation systems to improve translation performance.

Machine Translation • Translation

Towards Linear Time Neural Machine Translation with Capsule Networks

no code implementations • IJCNLP 2019 • Mingxuan Wang, Jun Xie, Zhixing Tan, Jinsong Su, Deyi Xiong, Lei Li

In this study, we first investigate a novel capsule network with dynamic routing for linear time Neural Machine Translation (NMT), referred to as CapsNMT.

Machine Translation • Translation

Deep Semantic Role Labeling with Self-Attention

1 code implementation • 5 Dec 2017 • Zhixing Tan, Mingxuan Wang, Jun Xie, Yidong Chen, Xiaodong Shi

Semantic Role Labeling (SRL) is believed to be a crucial step towards natural language understanding and has been widely studied.

Natural Language Understanding • Semantic Role Labeling

XMU Neural Machine Translation Systems for WAT 2017

no code implementations • WS 2017 • Boli Wang, Zhixing Tan, Jinming Hu, Yidong Chen, Xiaodong Shi

This paper describes the Neural Machine Translation systems of Xiamen University for the shared translation tasks of WAT 2017.

Machine Translation • Translation • +1

Lattice-Based Recurrent Neural Network Encoders for Neural Machine Translation

no code implementations • 25 Sep 2016 • Jinsong Su, Zhixing Tan, Deyi Xiong, Rongrong Ji, Xiaodong Shi, Yang Liu

Neural machine translation (NMT) heavily relies on word-level modelling to learn semantic representations of input sentences.

Machine Translation • Translation
