Search Results for author: Haipeng Sun

Found 14 papers, 4 papers with code

OPERA: Operation-Pivoted Discrete Reasoning over Text

1 code implementation · NAACL 2022 · Yongwei Zhou, Junwei Bao, Chaoqun Duan, Haipeng Sun, Jiahui Liang, Yifan Wang, Jing Zhao, Youzheng Wu, Xiaodong He, Tiejun Zhao

To inherit the advantages of these two types of methods, we propose OPERA, an operation-pivoted discrete reasoning framework, where lightweight symbolic operations (compared with logical forms) are utilized as neural modules to facilitate reasoning ability and interpretability.
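
As a rough sketch of the operation-pivoted idea (not the OPERA implementation; the operation names and modules below are hypothetical placeholders for its learned neural modules):

    from typing import Callable, Dict, List

    def op_count(numbers: List[float]) -> float:
        # e.g. "how many field goals ..." -> count the extracted numbers
        return float(len(numbers))

    def op_diff(numbers: List[float]) -> float:
        # e.g. "how many more yards ..." -> difference of the two largest numbers
        a, b = sorted(numbers, reverse=True)[:2]
        return a - b

    # Hypothetical operation set; OPERA's actual operations and neural modules differ.
    OPERATIONS: Dict[str, Callable[[List[float]], float]] = {"COUNT": op_count, "DIFF": op_diff}

    def operation_pivoted_answer(predicted_op: str, extracted_numbers: List[float]) -> float:
        """Route the question to the module selected by a (model-predicted) symbolic operation."""
        return OPERATIONS[predicted_op](extracted_numbers)

    print(operation_pivoted_answer("DIFF", [14.0, 3.0, 7.0]))  # 7.0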

Machine Reading Comprehension · Semantic Parsing

MoNET: Tackle State Momentum via Noise-Enhanced Training for Dialogue State Tracking

no code implementations · 10 Nov 2022 · Haoning Zhang, Junwei Bao, Haipeng Sun, Youzheng Wu, Wenye Li, Shuguang Cui, Xiaodong He

Then, the noised previous state is used as the input to learn to predict the current state, improving the model's ability to update and correct slot values.
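
A minimal sketch of this noise-enhanced idea (not MoNET's actual noising scheme; the slot names and candidate-value pools below are illustrative assumptions):

    import random

    # Hypothetical slots and candidate values.
    CANDIDATE_VALUES = {
        "restaurant-area": ["north", "south", "centre", "east", "west"],
        "restaurant-pricerange": ["cheap", "moderate", "expensive"],
    }

    def noise_previous_state(prev_state: dict, noise_prob: float = 0.2) -> dict:
        """Randomly replace some slot values so the model must learn to correct them."""
        noised = dict(prev_state)
        for slot, value in prev_state.items():
            if slot in CANDIDATE_VALUES and random.random() < noise_prob:
                alternatives = [v for v in CANDIDATE_VALUES[slot] if v != value]
                noised[slot] = random.choice(alternatives)
        return noised

    prev = {"restaurant-area": "centre", "restaurant-pricerange": "cheap"}
    noised_prev = noise_previous_state(prev)
    # Training input: (dialogue context, noised_prev); target: the true current state.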

Dialogue State Tracking

Mars: Modeling Context & State Representations with Contrastive Learning for End-to-End Task-Oriented Dialog

1 code implementation · 17 Oct 2022 · Haipeng Sun, Junwei Bao, Youzheng Wu, Xiaodong He

Traditional end-to-end task-oriented dialog systems first convert the dialog context into a belief state and an action state before generating the system response.
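
As a rough illustration of the contrastive objective suggested by the title (not the Mars training code; the encoder outputs and the generic in-batch InfoNCE loss below are assumptions):

    import torch
    import torch.nn.functional as F

    def contrastive_loss(context_repr: torch.Tensor, state_repr: torch.Tensor,
                         temperature: float = 0.1) -> torch.Tensor:
        """In-batch InfoNCE: each context is pulled toward its own state representation."""
        c = F.normalize(context_repr, dim=-1)
        s = F.normalize(state_repr, dim=-1)
        logits = c @ s.t() / temperature          # [batch, batch] similarity matrix
        targets = torch.arange(c.size(0))         # positive pair = matching batch index
        return F.cross_entropy(logits, targets)

    loss = contrastive_loss(torch.randn(8, 256), torch.randn(8, 256))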

Contrastive Learning

CSS: Combining Self-training and Self-supervised Learning for Few-shot Dialogue State Tracking

no code implementations · 11 Oct 2022 · Haoning Zhang, Junwei Bao, Haipeng Sun, Huaishao Luo, Wenye Li, Shuguang Cui

The unlabeled data of the DST task is incorporated into the self-training iterations, where the pseudo labels are predicted by a DST model trained on limited labeled data in advance.
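
A generic self-training loop of this kind (not the CSS procedure itself; a toy scikit-learn classifier and random data stand in for the DST model and dialogue data) might look like:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X_labeled, y_labeled = rng.normal(size=(20, 5)), rng.integers(0, 2, 20)  # small labeled set
    X_unlabeled = rng.normal(size=(200, 5))                                  # unlabeled pool

    model = LogisticRegression().fit(X_labeled, y_labeled)
    for _ in range(3):  # a few self-training iterations
        probs = model.predict_proba(X_unlabeled)
        confident = probs.max(axis=1) > 0.9                      # keep confident pseudo labels
        pseudo_X, pseudo_y = X_unlabeled[confident], probs[confident].argmax(axis=1)
        model = LogisticRegression().fit(np.vstack([X_labeled, pseudo_X]),
                                         np.concatenate([y_labeled, pseudo_y]))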

Dialogue State Tracking · Machine Reading Comprehension +2

BORT: Back and Denoising Reconstruction for End-to-End Task-Oriented Dialog

1 code implementation · Findings (NAACL) 2022 · Haipeng Sun, Junwei Bao, Youzheng Wu, Xiaodong He

To enhance the model's denoising capability and reduce the impact of error propagation, denoising reconstruction is used to reconstruct the corrupted dialog state and response.
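
A minimal sketch of such a denoising-reconstruction training pair (not BORT's actual corruption scheme; the deletion/shuffling noise below is an assumption):

    import random

    def corrupt(tokens, drop_prob=0.1, shuffle_window=3):
        """Token deletion plus local reordering; an assumed, generic corruption."""
        kept = [t for t in tokens if random.random() > drop_prob]
        for i in range(0, len(kept), shuffle_window):
            chunk = kept[i:i + shuffle_window]
            random.shuffle(chunk)
            kept[i:i + shuffle_window] = chunk
        return kept

    state = "restaurant area = centre , pricerange = cheap".split()
    corrupted = corrupt(state)
    # Denoising-reconstruction pair: input = corrupted, reconstruction target = state.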

Denoising

Self-Training for Unsupervised Neural Machine Translation in Unbalanced Training Data Scenarios

no code implementations · NAACL 2021 · Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao

Unsupervised neural machine translation (UNMT) that relies solely on massive monolingual corpora has achieved remarkable results in several translation tasks.

Machine Translation · Translation

English-Myanmar Supervised and Unsupervised NMT: NICT's Machine Translation Systems at WAT-2019

no code implementations · WS 2019 · Rui Wang, Haipeng Sun, Kehai Chen, Chenchen Ding, Masao Utiyama, Eiichiro Sumita

This paper presents NICT's participation (team ID: NICT) in the 6th Workshop on Asian Translation (WAT-2019) shared translation task, specifically the Myanmar (Burmese)-English task in both translation directions.

Language Modelling · Machine Translation +2

Revisiting Simple Domain Adaptation Methods in Unsupervised Neural Machine Translation

no code implementations · 26 Aug 2019 · Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao, Chenhui Chu

However, domain adaptation has not been well studied for unsupervised neural machine translation (UNMT), although UNMT has recently achieved remarkable results in several domain-specific language pairs.

Domain Adaptation · Machine Translation +1

Unsupervised Bilingual Word Embedding Agreement for Unsupervised Neural Machine Translation

no code implementations · ACL 2019 · Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao

In previous methods, unsupervised bilingual word embeddings (UBWE) are first trained using non-parallel monolingual corpora, and then the pre-trained UBWE is used to initialize the word embeddings in the encoder and decoder of the UNMT model.
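
A minimal sketch of that initialization step (not the paper's code; the vocabulary and vectors below are placeholders for UBWE output):

    import torch
    import torch.nn as nn

    vocab = {"<unk>": 0, "hello": 1, "bonjour": 2}                 # placeholder shared vocabulary
    pretrained = {w: torch.randn(300) for w in vocab}              # stands in for UBWE vectors

    embedding = nn.Embedding(len(vocab), 300)
    with torch.no_grad():
        for word, idx in vocab.items():
            embedding.weight[idx] = pretrained[word]               # initialize from UBWE
    # The UNMT encoder and decoder would then use/share this embedding table.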

Denoising · Machine Translation +1
