Search Results for author: Zhixing Tan

Found 25 papers, 8 papers with code

Self-Supervised Quality Estimation for Machine Translation

no code implementations · EMNLP 2021 · Yuanhang Zheng, Zhixing Tan, Meng Zhang, Mieradilijiang Maimaiti, Huanbo Luan, Maosong Sun, Qun Liu, Yang Liu

Quality estimation (QE) of machine translation (MT) aims to evaluate the quality of machine-translated sentences without references and is important in practical applications of MT.

Machine Translation · Sentence · +1

MatPlotAgent: Method and Evaluation for LLM-Based Agentic Scientific Data Visualization

no code implementations · 18 Feb 2024 · Zhiyu Yang, Zihan Zhou, Shuo Wang, Xin Cong, Xu Han, Yukun Yan, Zhenghao Liu, Zhixing Tan, Pengyuan Liu, Dong Yu, Zhiyuan Liu, Xiaodong Shi, Maosong Sun

Scientific data visualization plays a crucial role in research by enabling the direct display of complex information and assisting researchers in identifying implicit patterns.

Code Generation · Data Visualization

Risk Taxonomy, Mitigation, and Assessment Benchmarks of Large Language Model Systems

no code implementations · 11 Jan 2024 · Tianyu Cui, Yanling Wang, Chuanpu Fu, Yong Xiao, Sijia Li, Xinhao Deng, Yunpeng Liu, Qinglin Zhang, Ziyi Qiu, Peiyang Li, Zhixing Tan, Junwu Xiong, Xinyu Kong, Zujie Wen, Ke Xu, Qi Li

Based on this, we propose a comprehensive taxonomy, which systematically analyzes potential risks associated with each module of an LLM system and discusses the corresponding mitigation strategies.

Language Modelling · Large Language Model

Privacy-Preserving Prompt Tuning for Large Language Model Services

no code implementations · 10 May 2023 · Yansong Li, Zhixing Tan, Yang Liu

Based on prompt tuning, we propose Privacy-Preserving Prompt Tuning (RAPT), a framework that provides privacy guarantees for LLM services.

Language Modelling · Large Language Model · +1

Black-box Prompt Tuning with Subspace Learning

no code implementations · 4 May 2023 · Yuanhang Zheng, Zhixing Tan, Peng Li, Yang Liu

Black-box prompt tuning uses derivative-free optimization algorithms to learn prompts in low-dimensional subspaces instead of back-propagating through the networks of Large Language Models (LLMs).

Meta-Learning
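
The setup described above can be illustrated with a minimal sketch: a soft prompt is parameterized by a low-dimensional vector, mapped into the prompt-embedding space through a fixed random projection, and optimized with a gradient-free search. The `llm_score` oracle, the projection, and the simple (1+1) random search below are illustrative assumptions, not the paper's method (which additionally learns the subspaces).

```python
# Minimal sketch (not the paper's implementation) of black-box prompt tuning in a
# low-dimensional subspace. `llm_score` is a hypothetical black-box oracle that
# would return a scalar reward (e.g., dev-set accuracy) for a given soft prompt.
import numpy as np

PROMPT_LEN, EMBED_DIM, SUBSPACE_DIM = 20, 1024, 16

rng = np.random.default_rng(0)
# Fixed random projection from the low-dimensional subspace to the prompt space.
projection = rng.normal(scale=0.02, size=(SUBSPACE_DIM, PROMPT_LEN * EMBED_DIM))

def to_prompt(z):
    """Map a low-dimensional vector z to a soft prompt of shape (PROMPT_LEN, EMBED_DIM)."""
    return (z @ projection).reshape(PROMPT_LEN, EMBED_DIM)

def llm_score(prompt):
    """Hypothetical black-box reward; in practice this would query the LLM service."""
    return -np.linalg.norm(prompt)  # placeholder objective for the sketch

# Simple (1+1) random search: no gradients through the LLM are required.
z_best = np.zeros(SUBSPACE_DIM)
best = llm_score(to_prompt(z_best))
for _ in range(200):
    z_cand = z_best + rng.normal(scale=0.1, size=SUBSPACE_DIM)
    cand = llm_score(to_prompt(z_cand))
    if cand > best:
        z_best, best = z_cand, cand
```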

A Template-based Method for Constrained Neural Machine Translation

1 code implementation · 23 May 2022 · Shuo Wang, Peng Li, Zhixing Tan, Zhaopeng Tu, Maosong Sun, Yang Liu

In this work, we propose a template-based method that can yield results with high translation quality and match accuracy, and the inference speed of our method is comparable with that of unconstrained NMT models.

Machine Translation · NMT · +1

MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators

1 code implementation · ACL 2022 · Zhixing Tan, Xiangwen Zhang, Shuo Wang, Yang Liu

Prompting has recently been shown to be a promising approach for applying pre-trained language models to downstream tasks.

Machine Translation · Translation

Language Models are Good Translators

no code implementations · 25 Jun 2021 · Shuo Wang, Zhaopeng Tu, Zhixing Tan, Wenxuan Wang, Maosong Sun, Yang Liu

Inspired by the recent progress of large-scale pre-trained language models on machine translation in a limited scenario, we first demonstrate that a single language model (LM4MT) can achieve comparable performance with strong encoder-decoder NMT models on standard machine translation benchmarks, using the same training data and a similar number of model parameters.

Language Modelling · Machine Translation · +2
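
As a rough illustration of the idea of using a single language model as a translator, the sketch below concatenates the source sentence and a target-language cue into one sequence and lets a causal LM continue it. The prompt format and the `gpt2` stand-in checkpoint are assumptions made only for illustration; LM4MT is trained on parallel data, its exact input format may differ, and an untuned `gpt2` will not translate well.

```python
# Minimal sketch of casting translation as pure language modeling: no
# encoder-decoder split, just one sequence that a causal LM continues.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # stand-in checkpoint (assumption)
model = AutoModelForCausalLM.from_pretrained("gpt2")

def translate(source: str, max_new_tokens: int = 64) -> str:
    # "<source> \n <target-language cue>" style prompting; the format is illustrative.
    prompt = f"German: {source}\nEnglish:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the newly generated continuation as the candidate translation.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

print(translate("Guten Morgen."))
```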

CPM-2: Large-scale Cost-effective Pre-trained Language Models

2 code implementations · 20 Jun 2021 · Zhengyan Zhang, Yuxian Gu, Xu Han, Shengqi Chen, Chaojun Xiao, Zhenbo Sun, Yuan Yao, Fanchao Qi, Jian Guan, Pei Ke, Yanzheng Cai, Guoyang Zeng, Zhixing Tan, Zhiyuan Liu, Minlie Huang, Wentao Han, Yang Liu, Xiaoyan Zhu, Maosong Sun

We present a suite of cost-effective techniques for the use of PLMs to deal with the efficiency issues of pre-training, fine-tuning, and inference.

On the Language Coverage Bias for Neural Machine Translation

no code implementations · Findings (ACL) 2021 · Shuo Wang, Zhaopeng Tu, Zhixing Tan, Shuming Shi, Maosong Sun, Yang Liu

Language coverage bias, which indicates the content-dependent differences between sentence pairs originating from the source and target languages, is important for neural machine translation (NMT) because the target-original training data is not well exploited in current practice.

Data Augmentation · Machine Translation · +3

Dynamic Multi-Branch Layers for On-Device Neural Machine Translation

1 code implementation · 14 May 2021 · Zhixing Tan, Zeyuan Yang, Meng Zhang, Qun Liu, Maosong Sun, Yang Liu

With the rapid development of artificial intelligence (AI), there is a trend toward moving AI applications, such as neural machine translation (NMT), from the cloud to mobile devices.

Machine Translation · NMT · +1

Neural Machine Translation: A Review of Methods, Resources, and Tools

no code implementations · 31 Dec 2020 · Zhixing Tan, Shuo Wang, Zonghan Yang, Gang Chen, Xuancheng Huang, Maosong Sun, Yang Liu

Machine translation (MT) is an important sub-field of natural language processing that aims to translate natural languages using computers.

Data Augmentation · Machine Translation · +2

Modeling Voting for System Combination in Machine Translation

1 code implementation · 14 Jul 2020 · Xuancheng Huang, Jiacheng Zhang, Zhixing Tan, Derek F. Wong, Huanbo Luan, Jingfang Xu, Maosong Sun, Yang Liu

System combination is an important technique for combining the hypotheses of different machine translation systems to improve translation performance.

Machine Translation · Translation
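
To make the notion of voting over hypotheses concrete, here is a toy consensus heuristic: each system's hypothesis is scored by its average unigram overlap with the other systems' hypotheses, and the most agreed-upon one is selected. This is only a hand-rolled baseline to convey the intuition; the paper proposes a learned model of voting, not this heuristic.

```python
# Toy voting-style system combination: pick the hypothesis that agrees most
# with the others, measured here by unigram F1. Illustrative only.
from collections import Counter

def unigram_f1(a: str, b: str) -> float:
    ca, cb = Counter(a.split()), Counter(b.split())
    overlap = sum((ca & cb).values())
    if overlap == 0:
        return 0.0
    p, r = overlap / sum(cb.values()), overlap / sum(ca.values())
    return 2 * p * r / (p + r)

def combine(hypotheses: list[str]) -> str:
    def consensus(i: int) -> float:
        others = [h for j, h in enumerate(hypotheses) if j != i]
        return sum(unigram_f1(hypotheses[i], o) for o in others) / max(len(others), 1)
    best = max(range(len(hypotheses)), key=consensus)
    return hypotheses[best]

print(combine([
    "the cat sat on the mat",
    "a cat sat on the mat",
    "the cat is sitting on a mat",
]))
```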

Towards Linear Time Neural Machine Translation with Capsule Networks

no code implementations · IJCNLP 2019 · Mingxuan Wang, Jun Xie, Zhixing Tan, Jinsong Su, Deyi Xiong, Lei Li

In this study, we first investigate a novel capsule network with dynamic routing for linear-time Neural Machine Translation (NMT), referred to as CapsNMT.

Machine Translation · NMT · +2
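
For readers unfamiliar with the routing mechanism the abstract mentions, below is a minimal NumPy sketch of the standard dynamic routing ("routing-by-agreement") procedure from the capsule-network literature; it is a generic illustration and not necessarily the exact aggregation used in CapsNMT.

```python
# Generic dynamic-routing sketch: iteratively compute coupling coefficients,
# aggregate input-capsule predictions, squash, and update logits by agreement.
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    norm_sq = np.sum(v * v, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * v / np.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, num_iters=3):
    """u_hat: predictions from input capsules, shape (num_in, num_out, dim_out)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                              # routing logits
    for _ in range(num_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)     # coupling coefficients
        s = np.einsum("io,iod->od", c, u_hat)                    # weighted sum per output capsule
        v = squash(s)                                            # output capsules
        b = b + np.einsum("iod,od->io", u_hat, v)                # agreement update
    return v

u_hat = np.random.default_rng(0).normal(size=(10, 4, 8))
print(dynamic_routing(u_hat).shape)  # (4, 8)
```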

Deep Semantic Role Labeling with Self-Attention

1 code implementation · 5 Dec 2017 · Zhixing Tan, Mingxuan Wang, Jun Xie, Yidong Chen, Xiaodong Shi

Semantic Role Labeling (SRL) is believed to be a crucial step towards natural language understanding and has been widely studied.

Natural Language Understanding · Semantic Role Labeling

XMU Neural Machine Translation Systems for WAT 2017

no code implementations · WS 2017 · Boli Wang, Zhixing Tan, Jinming Hu, Yidong Chen, Xiaodong Shi

This paper describes the Neural Machine Translation systems of Xiamen University for the shared translation tasks of WAT 2017.

Machine Translation · Translation · +1

Lattice-Based Recurrent Neural Network Encoders for Neural Machine Translation

no code implementations · 25 Sep 2016 · Jinsong Su, Zhixing Tan, Deyi Xiong, Rongrong Ji, Xiaodong Shi, Yang Liu

Neural machine translation (NMT) heavily relies on word-level modelling to learn semantic representations of input sentences.

Machine Translation · NMT · +2
