Search Results for author: Liangyou Li

Found 35 papers, 1 paper with code

Multilingual Speech Translation with Unified Transformer: Huawei Noah’s Ark Lab at IWSLT 2021

no code implementations ACL (IWSLT) 2021 Xingshan Zeng, Liangyou Li, Qun Liu

We use a unified transformer architecture for our MultiST model so that data from different modalities (i.e., speech and text) and different tasks (i.e., Speech Recognition, Machine Translation, and Speech Translation) can be exploited to enhance the model's ability.

Data Augmentation · Machine Translation · +3

Triangular Transfer: Freezing the Pivot for Triangular Machine Translation

no code implementations ACL 2022 Meng Zhang, Liangyou Li, Qun Liu

Triangular machine translation is a special case of low-resource machine translation where the language pair of interest has limited parallel data, but both languages have abundant parallel data with a pivot language.

Language Modelling · Machine Translation · +2

Adversarial Parameter Defense by Multi-Step Risk Minimization

no code implementations 7 Sep 2021 Zhiyuan Zhang, Ruixuan Luo, Xuancheng Ren, Qi Su, Liangyou Li, Xu Sun

To enhance the robustness of neural networks, we propose an adversarial parameter defense algorithm that minimizes the average risk over multiple adversarial parameter corruptions.
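The idea of averaging risk over corrupted parameter copies can be sketched in a few lines. This is a minimal illustrative toy (linear regression, Gaussian corruptions, hand-picked step size), not the authors' implementation: the gradient is averaged over several randomly corrupted copies of the parameters, and training descends on that average.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: loss(w) = mean((X @ w - y)^2)
X = rng.normal(size=(64, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w

def loss(w):
    r = X @ w - y
    return float(np.mean(r * r))

def avg_corrupted_loss_grad(w, n_corruptions=8, sigma=0.1):
    """Gradient of the loss averaged over random parameter
    corruptions w + delta, with delta ~ N(0, sigma^2)."""
    grads = []
    for _ in range(n_corruptions):
        delta = rng.normal(scale=sigma, size=w.shape)
        wc = w + delta
        grads.append(2.0 * X.T @ (X @ wc - y) / len(y))
    return np.mean(grads, axis=0)

w = np.zeros(4)
for _ in range(200):
    w -= 0.05 * avg_corrupted_loss_grad(w)
```

Minimizing the averaged risk drives the parameters toward a point whose loss stays low even when the parameters are perturbed, rather than toward a sharp minimum.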

Uncertainty-Aware Balancing for Multilingual and Multi-Domain Neural Machine Translation Training

no code implementations EMNLP 2021 Minghao Wu, Yitong Li, Meng Zhang, Liangyou Li, Gholamreza Haffari, Qun Liu

In this work, we propose an approach, MultiUAT, that dynamically adjusts the training data usage based on the model's uncertainty on a small set of trusted clean data for multi-corpus machine translation.

Machine Translation · Translation
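As a rough sketch of uncertainty-based data balancing (the softmax weighting and all names here are assumptions for illustration, not MultiUAT itself): each corpus gets an uncertainty score from the model on a small trusted set, and those scores are turned into sampling probabilities so that more-uncertain corpora are sampled more often.

```python
import numpy as np

def sampling_weights(uncertainties, temperature=1.0):
    """Turn per-corpus uncertainty scores (e.g., the model's loss on a
    small trusted clean set from each corpus) into sampling
    probabilities via a numerically stable softmax."""
    u = np.asarray(uncertainties, dtype=float) / temperature
    e = np.exp(u - u.max())
    return e / e.sum()

# Illustrative uncertainty scores for three training corpora
probs = sampling_weights([0.5, 2.0, 1.0])
```

Raising the temperature flattens the distribution toward uniform sampling; lowering it concentrates training on the most uncertain corpus.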

RealTranS: End-to-End Simultaneous Speech Translation with Convolutional Weighted-Shrinking Transformer

no code implementations Findings (ACL) 2021 Xingshan Zeng, Liangyou Li, Qun Liu

To bridge the modality gap between speech and text, RealTranS gradually downsamples the input speech with interleaved convolution and unidirectional Transformer layers for acoustic modeling, and then maps speech features into text space with a weighted-shrinking operation and a semantic encoder.

Translation
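The gradual-downsampling idea behind RealTranS's acoustic encoder can be illustrated with a minimal strided 1-D convolution. The kernel, stride, and layer count below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def strided_conv1d(x, kernel, stride=2):
    """Valid 1-D convolution with stride: each application roughly
    halves the sequence length, narrowing the length gap between
    acoustic frames and text tokens."""
    k = len(kernel)
    out_len = (len(x) - k) // stride + 1
    return np.array([np.dot(x[i * stride : i * stride + k], kernel)
                     for i in range(out_len)])

frames = np.ones(1000)              # e.g., 1000 acoustic frames
kernel = np.array([0.25, 0.5, 0.25])  # simple smoothing kernel
h = frames
for _ in range(3):                  # three downsampling layers
    h = strided_conv1d(h, kernel)   # ~8x total length reduction
```

In the actual model these convolutions are interleaved with unidirectional Transformer layers before the weighted-shrinking step maps the features into text space.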

Learning Multilingual Representation for Natural Language Understanding with Enhanced Cross-Lingual Supervision

no code implementations 9 Jun 2021 Yinpeng Guo, Liangyou Li, Xin Jiang, Qun Liu

Recently, pre-training multilingual language models has shown great potential in learning multilingual representation, a crucial topic in natural language processing.

Natural Language Understanding

An Approach to Improve Robustness of NLP Systems against ASR Errors

no code implementations 25 Mar 2021 Tong Cui, Jinghui Xiao, Liangyou Li, Xin Jiang, Qun Liu

Speech-enabled systems typically first convert audio to text through an automatic speech recognition (ASR) model and then feed the text to downstream natural language processing (NLP) modules.

Automatic Speech Recognition · Data Augmentation · +2

Dependency Graph-to-String Statistical Machine Translation

no code implementations 20 Mar 2021 Liangyou Li, Andy Way, Qun Liu

We present graph-based translation models that translate source graphs into target strings.

Machine Translation · Translation

Future-Guided Incremental Transformer for Simultaneous Translation

no code implementations 23 Dec 2020 Shaolei Zhang, Yang Feng, Liangyou Li

Simultaneous translation (ST) begins producing translations while still reading the source sentence, and is used in many online scenarios.

Knowledge Distillation · Translation

Document Graph for Neural Machine Translation

no code implementations EMNLP 2021 Mingzhou Xu, Liangyou Li, Derek F. Wong, Qun Liu, Lidia S. Chao

Previous works have shown that contextual information can improve the performance of neural machine translation (NMT).

Machine Translation · Translation

Exploring the Vulnerability of Deep Neural Networks: A Study of Parameter Corruption

no code implementations 10 Jun 2020 Xu Sun, Zhiyuan Zhang, Xuancheng Ren, Ruixuan Luo, Liangyou Li

We argue that the vulnerability of model parameters is of crucial value to the study of model robustness and generalization, but little research has been devoted to understanding this matter.
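A minimal probe of parameter vulnerability, under illustrative assumptions (a toy convex classifier, a fixed corruption radius, and the gradient direction standing in for a worst-case corruption; none of this is the paper's actual setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear classifier with logistic loss
X = rng.normal(size=(128, 8))
w_true = rng.normal(size=8)
y = (X @ w_true > 0).astype(float)

def loss(w):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    eps = 1e-12
    return float(-np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))

def grad(w):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

w = w_true.copy()            # stand-in for trained parameters
radius = 0.5                 # corruption budget (same norm for both probes)

# Random corruption vs. gradient-direction corruption of equal norm
delta_rand = rng.normal(size=8)
delta_rand *= radius / np.linalg.norm(delta_rand)
g = grad(w)
delta_adv = radius * g / np.linalg.norm(g)

rand_rise = loss(w + delta_rand) - loss(w)
adv_rise = loss(w + delta_adv) - loss(w)
```

Comparing the loss increase under equal-norm random versus gradient-aligned corruptions gives a simple picture of how much more damaging a directed parameter corruption can be than random noise.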

Pretrained Language Models for Document-Level Neural Machine Translation

no code implementations 8 Nov 2019 Liangyou Li, Xin Jiang, Qun Liu

Previous work on document-level NMT usually focuses on limited contexts because of degraded performance on larger contexts.

Machine Translation · Pretrained Language Models · +1

A General Framework for Adaptation of Neural Machine Translation to Simultaneous Translation

no code implementations AACL 2020 Yun Chen, Liangyou Li, Xin Jiang, Xiao Chen, Qun Liu

Despite the success of neural machine translation (NMT), simultaneous neural machine translation (SNMT), the task of translating in real time before a full sentence has been observed, remains challenging due to the syntactic structure difference and simultaneity requirements.

Machine Translation · Translation

Huawei's NMT Systems for the WMT 2019 Biomedical Translation Task

no code implementations WS 2019 Wei Peng, Jianfeng Liu, Liangyou Li, Qun Liu

This paper describes Huawei's neural machine translation systems for the WMT 2019 biomedical translation shared task.

Domain Adaptation · Machine Translation · +2

Semantics-Enhanced Task-Oriented Dialogue Translation: A Case Study on Hotel Booking

no code implementations IJCNLP 2017 Longyue Wang, Jinhua Du, Liangyou Li, Zhaopeng Tu, Andy Way, Qun Liu

We showcase TODAY, a semantics-enhanced task-oriented dialogue translation system, whose novelties are: (i) task-oriented named entity (NE) definition and a hybrid strategy for NE recognition and translation; and (ii) a novel grounded semantic method for dialogue understanding and task-order management.

Dialogue Understanding · Machine Translation · +2

Context-Aware Graph Segmentation for Graph-Based Translation

no code implementations EACL 2017 Liangyou Li, Andy Way, Qun Liu

In this paper, we present an improved graph-based translation model which segments an input graph into node-induced subgraphs by taking source context into consideration.

Translation

Topic-Informed Neural Machine Translation

no code implementations COLING 2016 Jian Zhang, Liangyou Li, Andy Way, Qun Liu

In recent years, neural machine translation (NMT) has demonstrated state-of-the-art machine translation (MT) performance.

Machine Translation · Topic Models · +1
