Search Results for author: Zhuoyuan Mao

Found 17 papers, 8 papers with code

Meta Ensemble for Japanese-Chinese Neural Machine Translation: Kyoto-U+ECNU Participation to WAT 2020

no code implementations • AACL (WAT) 2020 • Zhuoyuan Mao, Yibin Shen, Chenhui Chu, Sadao Kurohashi, Cheqing Jin

This paper describes the Japanese-Chinese Neural Machine Translation (NMT) system submitted by the joint team of Kyoto University and East China Normal University (Kyoto-U+ECNU) to WAT 2020 (Nakazawa et al., 2020).

Tasks: Denoising, Machine Translation, +2

Tuning LLMs with Contrastive Alignment Instructions for Machine Translation in Unseen, Low-resource Languages

no code implementations • 11 Jan 2024 • Zhuoyuan Mao, Yen Yu

This article introduces contrastive alignment instructions (AlignInstruct) to address two challenges in machine translation (MT) on large language models (LLMs).

Tasks: Machine Translation, Translation

Variable-length Neural Interlingua Representations for Zero-shot Neural Machine Translation

no code implementations • 17 May 2023 • Zhuoyuan Mao, Haiyue Song, Raj Dabre, Chenhui Chu, Sadao Kurohashi

The language independence of encoded representations within multilingual neural machine translation (MNMT) models is crucial for their generalization ability on zero-shot translation.

Tasks: Machine Translation, Translation

GPT-RE: In-context Learning for Relation Extraction using Large Language Models

1 code implementation • 3 May 2023 • Zhen Wan, Fei Cheng, Zhuoyuan Mao, Qianying Liu, Haiyue Song, Jiwei Li, Sadao Kurohashi

In spite of the potential for ground-breaking achievements offered by large language models (LLMs) (e.g., GPT-3), they still lag significantly behind fully-supervised baselines (e.g., fine-tuned BERT) in relation extraction (RE).

Tasks: In-Context Learning, Relation, +2
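The GPT-RE entry only hints at how in-context learning is applied to relation extraction. As a rough, hypothetical illustration (not the paper's actual prompt format, label set, or demonstration-retrieval strategy), the sketch below assembles a few-shot RE prompt from labelled demonstrations; the entity/relation formatting and example labels are assumptions made for this sketch.

```python
# Hypothetical few-shot prompt builder for relation extraction (RE) with an LLM.
# This is NOT the prompt format used in GPT-RE; it only illustrates the general
# idea of in-context learning for RE: show labelled demonstrations, then ask the
# model to classify the relation in a new sentence.

from typing import List, Tuple

# Each demonstration: (sentence, head entity, tail entity, relation label).
Demo = Tuple[str, str, str, str]

def build_re_prompt(demos: List[Demo], sentence: str, head: str, tail: str) -> str:
    lines = ["Classify the relation between the two marked entities."]
    for demo_sent, demo_head, demo_tail, label in demos:
        lines.append(f"Sentence: {demo_sent}")
        lines.append(f"Entities: ({demo_head}, {demo_tail})")
        lines.append(f"Relation: {label}")
        lines.append("")
    lines.append(f"Sentence: {sentence}")
    lines.append(f"Entities: ({head}, {tail})")
    lines.append("Relation:")  # the LLM is expected to complete this line
    return "\n".join(lines)

if __name__ == "__main__":
    demos = [
        ("Kyoto University is located in Kyoto.", "Kyoto University", "Kyoto", "located_in"),
        ("Marie Curie was born in Warsaw.", "Marie Curie", "Warsaw", "born_in"),
    ]
    print(build_re_prompt(demos, "ECNU is located in Shanghai.", "ECNU", "Shanghai"))
```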

LEALLA: Learning Lightweight Language-agnostic Sentence Embeddings with Knowledge Distillation

no code implementations • 16 Feb 2023 • Zhuoyuan Mao, Tetsuji Nakagawa

Large-scale language-agnostic sentence embedding models such as LaBSE (Feng et al., 2022) obtain state-of-the-art performance for parallel sentence alignment.

Tasks: Knowledge Distillation, Sentence, +2
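The LEALLA entry concerns distilling a large language-agnostic embedding model such as LaBSE into a lightweight student. Purely as an assumption-laden sketch (the paper's actual objectives, encoders, and dimensions are not reproduced here), the code below shows the generic feature-distillation idea: train a small student so its sentence embeddings match those of a frozen teacher.

```python
# Generic feature-distillation sketch (assumed setup, not LEALLA's exact losses):
# a small student encoder is trained so its sentence embeddings match those of a
# frozen teacher. Both encoders are stand-in linear layers over pre-computed
# sentence features, just to keep the example self-contained and runnable.

import torch
import torch.nn as nn
import torch.nn.functional as F

teacher_dim, student_dim, feat_dim = 768, 128, 300

teacher = nn.Linear(feat_dim, teacher_dim)     # stand-in for a frozen teacher encoder
student = nn.Linear(feat_dim, student_dim)     # lightweight student encoder
project = nn.Linear(student_dim, teacher_dim)  # maps student space into teacher space

for p in teacher.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.Adam(
    list(student.parameters()) + list(project.parameters()), lr=1e-3
)

features = torch.randn(32, feat_dim)  # a batch of (fake) sentence features

with torch.no_grad():
    t_emb = F.normalize(teacher(features), dim=-1)

s_emb = F.normalize(project(student(features)), dim=-1)
loss = F.mse_loss(s_emb, t_emb)  # distillation: pull student embeddings toward the teacher's

optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"distillation loss: {loss.item():.4f}")
```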

Textual Enhanced Contrastive Learning for Solving Math Word Problems

1 code implementation • 29 Nov 2022 • Yibin Shen, Qianying Liu, Zhuoyuan Mao, Fei Cheng, Sadao Kurohashi

Solving math word problems is a task that requires analysing the relations between quantities and an accurate understanding of contextual natural language information.

Tasks: Contrastive Learning, Math

Seeking Diverse Reasoning Logic: Controlled Equation Expression Generation for Solving Math Word Problems

1 code implementation • 21 Sep 2022 • Yibin Shen, Qianying Liu, Zhuoyuan Mao, Zhen Wan, Fei Cheng, Sadao Kurohashi

To solve math word problems, human students leverage diverse reasoning logic that leads to different possible equation solutions.

Tasks: Math

EMS: Efficient and Effective Massively Multilingual Sentence Representation Learning

1 code implementation • 31 May 2022 • Zhuoyuan Mao, Chenhui Chu, Sadao Kurohashi

Massively multilingual sentence representation models, e.g., LASER, SBERT-distill, and LaBSE, help significantly improve cross-lingual downstream tasks.

Tasks: Contrastive Learning, Genre classification, +4
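The EMS entry is tagged with contrastive learning over multilingual sentence representations, a common ingredient of models like LaBSE. As a hedged illustration only (not EMS's actual architecture, objectives, or hyperparameters), the sketch below computes an InfoNCE-style loss that treats translation pairs as positives and the other in-batch sentences as negatives.

```python
# InfoNCE-style contrastive loss over a batch of (source, target) translation pairs.
# This shows the general cross-lingual contrastive idea only; EMS's real training
# objectives and encoders are described in the paper, not reproduced here.

import torch
import torch.nn.functional as F

def contrastive_loss(src_emb: torch.Tensor, tgt_emb: torch.Tensor,
                     temperature: float = 0.05) -> torch.Tensor:
    # Normalize so the dot product equals cosine similarity.
    src = F.normalize(src_emb, dim=-1)
    tgt = F.normalize(tgt_emb, dim=-1)
    logits = src @ tgt.t() / temperature   # (batch, batch) similarity matrix
    labels = torch.arange(src.size(0))     # the i-th source matches the i-th target
    # Symmetric loss: retrieve target given source and source given target.
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))

# Toy usage with random stand-in "sentence embeddings".
src_emb = torch.randn(16, 256)
tgt_emb = torch.randn(16, 256)
print(contrastive_loss(src_emb, tgt_emb))
```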

Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision

no code implementations • 18 May 2022 • Zhen Wan, Fei Cheng, Qianying Liu, Zhuoyuan Mao, Haiyue Song, Sadao Kurohashi

Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks.

Tasks: Contrastive Learning, Relation, +1

When do Contrastive Word Alignments Improve Many-to-many Neural Machine Translation?

no code implementations • Findings (NAACL) 2022 • Zhuoyuan Mao, Chenhui Chu, Raj Dabre, Haiyue Song, Zhen Wan, Sadao Kurohashi

Meanwhile, the contrastive objective can implicitly utilize automatically learned word alignment, which has not been explored in many-to-many NMT.

Tasks: Machine Translation, NMT, +4

Linguistically-driven Multi-task Pre-training for Low-resource Neural Machine Translation

1 code implementation • 20 Jan 2022 • Zhuoyuan Mao, Chenhui Chu, Sadao Kurohashi

In the present study, we propose novel sequence-to-sequence pre-training objectives for low-resource neural machine translation (NMT): Japanese-specific sequence to sequence (JASS) for language pairs involving Japanese as the source or target language, and English-specific sequence to sequence (ENSS) for language pairs involving English.

Tasks: Low-Resource Neural Machine Translation, NMT, +1

Lightweight Cross-Lingual Sentence Representation Learning

1 code implementation • ACL 2021 • Zhuoyuan Mao, Prakhar Gupta, Pei Wang, Chenhui Chu, Martin Jaggi, Sadao Kurohashi

Large-scale models for learning fixed-dimensional cross-lingual sentence representations like LASER (Artetxe and Schwenk, 2019b) lead to significant improvement in performance on downstream tasks.

Tasks: Contrastive Learning, Document Classification, +4

JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation

1 code implementation • LREC 2020 • Zhuoyuan Mao, Fabien Cromieres, Raj Dabre, Haiyue Song, Sadao Kurohashi

Monolingual pre-training approaches such as MASS (MAsked Sequence to Sequence) are extremely effective in boosting NMT quality for languages with small parallel corpora.

Tasks: Machine Translation, NMT, +2
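The JASS entry builds on MASS (MAsked Sequence to Sequence) pre-training. As a simplified, assumption-laden sketch of the general MASS idea (not the JASS-specific objectives, which additionally exploit Japanese linguistic units), the code below masks a contiguous token span in a monolingual sentence and uses the original span as the decoder target.

```python
# Simplified MASS-style example construction: the encoder sees a sentence with a
# contiguous span replaced by [MASK] tokens, and the decoder is trained to
# reconstruct the masked span. This is a rough sketch of the general idea only.

import random
from typing import List, Tuple

MASK = "[MASK]"

def make_mass_example(tokens: List[str], mask_ratio: float = 0.5) -> Tuple[List[str], List[str]]:
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = random.randrange(0, len(tokens) - span_len + 1)
    encoder_input = tokens[:start] + [MASK] * span_len + tokens[start + span_len:]
    decoder_target = tokens[start:start + span_len]
    return encoder_input, decoder_target

if __name__ == "__main__":
    random.seed(0)
    sentence = "monolingual pre-training helps low-resource translation".split()
    enc_in, dec_out = make_mass_example(sentence)
    print("encoder input :", enc_in)
    print("decoder target:", dec_out)
```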

Pre-training via Leveraging Assisting Languages and Data Selection for Neural Machine Translation

no code implementations • 23 Jan 2020 • Haiyue Song, Raj Dabre, Zhuoyuan Mao, Fei Cheng, Sadao Kurohashi, Eiichiro Sumita

To this end, we propose to exploit monolingual corpora of other languages to complement the scarcity of monolingual corpora for the language of interest (LOI).

Tasks: Machine Translation, NMT, +1
