Search Results for author: Cheonbok Park

Found 6 papers, 1 paper with code

DaLC: Domain Adaptation Learning Curve Prediction for Neural Machine Translation

no code implementations · Findings (ACL) 2022 · Cheonbok Park, Hantae Kim, Ioan Calapodescu, Hyunchang Cho, Vassilina Nikoulina

Domain Adaptation (DA) of a Neural Machine Translation (NMT) model often relies on a pre-trained general NMT model that is adapted to the new domain using a sample of in-domain parallel data.

Domain Adaptation · Machine Translation · +1
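
The adaptation step the snippet describes is, in its simplest form, continued training on the in-domain sample. The sketch below shows that generic fine-tuning recipe, not the DaLC method itself; the batch keys, model interface, and hyperparameters are illustrative assumptions.

```python
import torch
from torch.utils.data import DataLoader

def finetune_on_domain(model, in_domain_pairs, epochs=3, lr=5e-5):
    # in_domain_pairs is assumed to yield dicts of tokenized source ("src")
    # and target ("tgt") tensors; the model is assumed to follow the common
    # convention of returning an object with a .loss when labels are given.
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    loader = DataLoader(in_domain_pairs, batch_size=32, shuffle=True)
    model.train()
    for _ in range(epochs):
        for batch in loader:
            loss = model(input_ids=batch["src"], labels=batch["tgt"]).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
    return model
```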

Reversible Instance Normalization for Accurate Time-Series Forecasting against Distribution Shift

no code implementations · ICLR 2022 · Taesung Kim, Jinhee Kim, Yunwon Tae, Cheonbok Park, Jang-Ho Choi, Jaegul Choo

Reversible instance normalization consists of a normalization step and a denormalization step: the former normalizes the input to fix its distribution in terms of the mean and variance, while the latter returns the output to the original distribution.

Time Series · Time Series Forecasting
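
A minimal sketch of this normalize/denormalize pair is below, assuming per-instance statistics over the time dimension; the paper's RevIN additionally learns affine parameters, which are omitted here.

```python
import torch

def revin_normalize(x, eps=1e-5):
    # x: (batch, time, features); statistics are computed per instance
    # over the time dimension, as in instance normalization.
    mean = x.mean(dim=1, keepdim=True)
    std = torch.sqrt(x.var(dim=1, keepdim=True, unbiased=False) + eps)
    return (x - mean) / std, (mean, std)

def revin_denormalize(y, stats):
    # Return the forecast to the original input distribution.
    mean, std = stats
    return y * std + mean
```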

An Empirical Experiment on Deep Learning Models for Predicting Traffic Data

no code implementations · 12 May 2021 · Hyunwook Lee, Cheonbok Park, Seungmin Jin, Hyeshin Chu, Jaegul Choo, Sungahn Ko

For example, it is difficult to determine which models provide state-of-the-art performance, as recently proposed models have often been evaluated on different datasets and in different experimental environments.

Unsupervised Neural Machine Translation for Low-Resource Domains via Meta-Learning

no code implementations · ACL 2021 · Cheonbok Park, Yunwon Tae, Taehee Kim, Soyoung Yang, Mohammad Azam Khan, Eunjeong Park, Jaegul Choo

To address this issue, this paper presents a novel meta-learning algorithm for unsupervised neural machine translation (UNMT) that trains the model to adapt to another domain by utilizing only a small amount of training data.

Meta-Learning · Transfer Learning · +2
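
The snippet's meta-learning setup can be illustrated with a first-order MAML-style update, sketched below under the assumptions that each domain supplies support/query batches and that the model returns a .loss; this is a generic illustration, not the paper's exact algorithm.

```python
import copy
import torch

def meta_train_step(model, meta_opt, domain_tasks, inner_lr=1e-3):
    # First-order MAML sketch: adapt a clone of the shared initialization
    # on each domain's support batch, then apply the clone's query-batch
    # gradients back to the shared weights.
    meta_opt.zero_grad()
    for support_batch, query_batch in domain_tasks:
        learner = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        learner(**support_batch).loss.backward()   # inner adaptation step
        inner_opt.step()
        inner_opt.zero_grad()
        learner(**query_batch).loss.backward()     # evaluate adapted clone
        for p, q in zip(model.parameters(), learner.parameters()):
            if q.grad is not None:
                p.grad = q.grad.clone() if p.grad is None else p.grad + q.grad
    meta_opt.step()
```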

ST-GRAT: A Novel Spatio-temporal Graph Attention Network for Accurately Forecasting Dynamically Changing Road Speed

1 code implementation · 29 Nov 2019 · Cheonbok Park, Chunggi Lee, Hyojin Bahng, Yunwon Tae, Kihwan Kim, Seungmin Jin, Sungahn Ko, Jaegul Choo

Predicting road traffic speed is a challenging task due to different road types, abrupt speed changes, and spatial dependencies between roads; it requires modeling dynamically changing spatial dependencies among roads and temporal patterns over long input sequences.

Graph Attention
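
In the spirit of the modeling described above, a generic single-head graph-attention layer over a road graph might look like the sketch below; this is not ST-GRAT itself, and the feature sizes and adjacency handling are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RoadGraphAttention(nn.Module):
    # Single-head attention over road (node) features; adj is assumed
    # to include self-loops so every row has at least one neighbor.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (num_roads, in_dim); adj: (num_roads, num_roads) 0/1 mask.
        h = self.proj(x)
        n = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        scores = F.leaky_relu(self.attn(pairs).squeeze(-1))
        scores = scores.masked_fill(adj == 0, float("-inf"))
        return torch.softmax(scores, dim=-1) @ h
```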

SANVis: Visual Analytics for Understanding Self-Attention Networks

no code implementations · 13 Sep 2019 · Cheonbok Park, Inyoup Na, Yongjang Jo, Sungbok Shin, Jaehyo Yoo, Bum Chul Kwon, Jian Zhao, Hyungjong Noh, Yeonsoo Lee, Jaegul Choo

Attention networks, deep neural network architectures inspired by the human attention mechanism, have seen significant success in image captioning, machine translation, and many other applications.

Image Captioning · Machine Translation · +1
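
Attention weights of the kind SANVis visualizes can be extracted from a standard Transformer, for example via the output_attentions flag in Hugging Face Transformers; the model choice below is an illustrative assumption.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative model choice; any self-attention model would do.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("Attention weights can be visualized.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one tensor per layer,
# each shaped (batch, num_heads, seq_len, seq_len).
layer0_head0 = outputs.attentions[0][0, 0]
print(layer0_head0.shape)  # e.g. torch.Size([9, 9])
```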
