Search Results for author: Baikjin Jung

Found 8 papers, 2 papers with code

POSTECH-ETRI’s Submission to the WMT2020 APE Shared Task: Automatic Post-Editing with Cross-lingual Language Model

no code implementations WMT (EMNLP) 2020 Jihyung Lee, WonKee Lee, Jaehun Shin, Baikjin Jung, Young-Kil Kim, Jong-Hyeok Lee

This paper describes POSTECH-ETRI’s submission to WMT2020 for the shared task on automatic post-editing (APE) for 2 language pairs: English-German (En-De) and English-Chinese (En-Zh).

Automatic Post-Editing Language Modelling +2

Noising Scheme for Data Augmentation in Automatic Post-Editing

no code implementations WMT (EMNLP) 2020 WonKee Lee, Jaehun Shin, Baikjin Jung, Jihyung Lee, Jong-Hyeok Lee

In our experiment, we implemented a noising module that simulates four types of post-editing errors, and we introduced this module into a Transformer-based multi-source APE model.

Automatic Post-Editing Data Augmentation +1

Quality Estimation Using Dual Encoders with Transfer Learning

no code implementations WMT (EMNLP) 2021 Dam Heo, WonKee Lee, Baikjin Jung, Jong-Hyeok Lee

This paper describes POSTECH’s quality estimation systems submitted to Task 2 of the WMT 2021 quality estimation shared task: Word and Sentence-Level Post-editing Effort.

Machine Translation Sentence +3

Denoising Table-Text Retrieval for Open-Domain Question Answering

1 code implementation 26 Mar 2024 Deokhyung Kang, Baikjin Jung, Yunsu Kim, Gary Geunbae Lee

Previous studies in table-text open-domain question answering have two common challenges: firstly, their retrievers can be affected by false-positive labels in training datasets; secondly, they may struggle to provide appropriate evidence for questions that require reasoning across the table.

Denoising Open-Domain Question Answering +2

Towards Semi-Supervised Learning of Automatic Post-Editing: Data-Synthesis by Infilling Mask with Erroneous Tokens

no code implementations 8 Apr 2022 WonKee Lee, Seong-Hwan Heo, Baikjin Jung, Jong-Hyeok Lee

Semi-supervised learning that leverages synthetic training data has been widely adopted in the field of Automatic Post-Editing (APE) to overcome the lack of human-annotated training data.

Automatic Post-Editing Language Modelling +1
