Search Results for author: Heeyoul Choi

Found 26 papers, 1 paper with code

Advancing AI with Integrity: Ethical Challenges and Solutions in Neural Machine Translation

no code implementations1 Apr 2024 Richard Kimera, Yun-Seon Kim, Heeyoul Choi

Additionally, the paper probes the distribution of responsibility between AI systems and humans, underscoring the essential role of human oversight in upholding NMT ethical standards.

Fairness Machine Translation +2

Fast Training of NMT Model with Data Sorting

no code implementations16 Aug 2023 Daniela N. Rim, Kimera Richard, Heeyoul Choi

The Transformer model has revolutionized Natural Language Processing tasks such as Neural Machine Translation, and many efforts have been made to study the Transformer architecture, improving its efficiency and accuracy.

Machine Translation NMT +2
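A common way to realize training-time savings from data sorting is to group sentences of similar length into the same batch so that padding is minimized. The sketch below is a minimal illustration of length-based bucketing over a tokenized parallel corpus; the function and variable names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of length-based batching to reduce padding (illustrative only).
from typing import List, Tuple

def make_sorted_batches(pairs: List[Tuple[List[int], List[int]]],
                        batch_size: int) -> List[List[Tuple[List[int], List[int]]]]:
    """Group (source, target) token-id pairs into batches of similar length."""
    # Sort by source length (ties broken by target length) so each batch
    # contains sequences of nearly equal length and needs little padding.
    ordered = sorted(pairs, key=lambda p: (len(p[0]), len(p[1])))
    return [ordered[i:i + batch_size] for i in range(0, len(ordered), batch_size)]

# Example: four toy sentence pairs, batch size 2.
corpus = [([1, 2, 3, 4], [5, 6, 7]), ([1, 2], [3]), ([1], [2, 3]), ([1, 2, 3], [4, 5])]
for batch in make_sorted_batches(corpus, batch_size=2):
    print([len(src) for src, _ in batch])
```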

Shared Latent Space by Both Languages in Non-Autoregressive Neural Machine Translation

no code implementations2 May 2023 DongNyeong Heo, Heeyoul Choi

Latent variable modeling in non-autoregressive neural machine translation (NAT) is a promising approach to mitigate the multimodality problem.

Machine Translation Translation

Graph Attention Multi-Agent Fleet Autonomy for Advanced Air Mobility

no code implementations14 Feb 2023 Malintha Fernando, Ransalu Senanayake, Heeyoul Choi, Martin Swany

Autonomous mobility is emerging as a new disruptive mode of urban transportation for moving cargo and passengers.

Decision Making Graph Attention +1

Advanced Scaling Methods for VNF deployment with Reinforcement Learning

no code implementations19 Jan 2023 Namjin Seo, DongNyeong Heo, Heeyoul Choi

Network function virtualization (NFV) and software-defined network (SDN) have become emerging network paradigms, allowing virtualized network function (VNF) deployment at a low cost.

Reinforcement Learning (RL)

Building a Parallel Corpus and Training Translation Models Between Luganda and English

no code implementations7 Jan 2023 Richard Kimera, Daniela N. Rim, Heeyoul Choi

Neural machine translation (NMT) has achieved great success with large datasets, so it is largely premised on high-resource languages.

Machine Translation NMT +1

Partitioning Image Representation in Contrastive Learning

no code implementations20 Mar 2022 Hyunsub Lee, Heeyoul Choi

In this paper, we introduce a new representation, partitioned representation, which can learn both common and unique features of the anchor and positive samples in contrastive learning.

Contrastive Learning Data Augmentation +1
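As a rough illustration of splitting a representation into a shared part and a sample-specific part, the sketch below partitions an encoder output into two halves and applies a contrastive agreement loss only to the "common" half. The split ratio, loss, and names are assumptions made for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def partitioned_contrastive_loss(z_anchor: torch.Tensor, z_positive: torch.Tensor,
                                 common_dim: int, temperature: float = 0.1) -> torch.Tensor:
    """Pull together only the 'common' partition of anchor/positive embeddings."""
    # Split each embedding into a common part and a unique part.
    c_a, _u_a = z_anchor[:, :common_dim], z_anchor[:, common_dim:]
    c_p, _u_p = z_positive[:, :common_dim], z_positive[:, common_dim:]

    # InfoNCE-style loss on the common partitions only; the unique partitions
    # are left free to encode sample-specific factors (e.g., augmentation details).
    c_a = F.normalize(c_a, dim=1)
    c_p = F.normalize(c_p, dim=1)
    logits = c_a @ c_p.t() / temperature          # (B, B) similarity matrix
    labels = torch.arange(c_a.size(0))            # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage with random 128-d embeddings, first 64 dims treated as "common".
loss = partitioned_contrastive_loss(torch.randn(8, 128), torch.randn(8, 128), common_dim=64)
```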

End-to-End Training for Back-Translation with Categorical Reparameterization Trick

no code implementations17 Feb 2022 DongNyeong Heo, Heeyoul Choi

Back-translation is an effective semi-supervised learning framework in neural machine translation (NMT).

Machine Translation NMT +2
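The categorical reparameterization trick named in the title is commonly realized with Gumbel-Softmax, which lets gradients flow through the discrete word choices of the backward model. The sketch below shows the generic trick with PyTorch's built-in gumbel_softmax; how it is wired into the back-translation pipeline here is an assumption, not taken from the paper.

```python
import torch
import torch.nn.functional as F

vocab_size, hidden = 1000, 256
logits = torch.randn(4, vocab_size, requires_grad=True)    # backward model's word scores
embedding = torch.nn.Embedding(vocab_size, hidden)

# Gumbel-Softmax: sample a (nearly) one-hot word distribution that is differentiable.
# hard=True gives a one-hot sample in the forward pass with a soft, straight-through gradient.
y = F.gumbel_softmax(logits, tau=1.0, hard=True)            # (4, vocab_size)

# The sampled "words" are fed to the forward model as a weighted embedding, so a
# reconstruction loss can backpropagate end-to-end into the backward model.
pseudo_source_emb = y @ embedding.weight                     # (4, hidden)
loss = pseudo_source_emb.pow(2).mean()                       # stand-in for the real NMT loss
loss.backward()
print(logits.grad.shape)
```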

Systematic Review for AI-based Language Learning Tools

no code implementations29 Oct 2021 Jin Ha Woo, Heeyoul Choi

The Second Language Acquisition field has been significantly impacted by a greater emphasis on individualized learning and rapid developments in artificial intelligence (AI).

Language Acquisition

Sequential Deep Learning Architectures for Anomaly Detection in Virtual Network Function Chains

no code implementations29 Sep 2021 Chungjun Lee, Jibum Hong, DongNyeong Heo, Heeyoul Choi

Therefore, we propose several sequential deep learning models to learn the time-series and sequential patterns of the virtual network functions (VNFs) in chains of variable length.

Anomaly Detection Time Series +1
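A minimal way to handle chains of variable length with a sequential model is to pack the padded sequences before an LSTM and score each chain from the final hidden state. The sketch below assumes per-VNF feature vectors and a binary anomaly label; these are illustrative assumptions, not the paper's exact setup.

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence

class ChainAnomalyDetector(nn.Module):
    """Score a variable-length chain of VNF feature vectors for anomalies."""
    def __init__(self, feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, x: torch.Tensor, lengths: torch.Tensor) -> torch.Tensor:
        # Pack so padded time steps are ignored by the LSTM.
        packed = pack_padded_sequence(x, lengths.cpu(), batch_first=True, enforce_sorted=False)
        _, (h_n, _) = self.lstm(packed)
        return torch.sigmoid(self.head(h_n[-1])).squeeze(-1)  # anomaly probability per chain

# Toy batch: 3 chains, up to 5 VNFs, 16 features per VNF.
model = ChainAnomalyDetector(feat_dim=16)
scores = model(torch.randn(3, 5, 16), lengths=torch.tensor([5, 3, 4]))
```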

Adversarial Training with Contrastive Learning in NLP

no code implementations19 Sep 2021 Daniela N. Rim, DongNyeong Heo, Heeyoul Choi

In this work, we propose adversarial training with contrastive learning (ATCL) to adversarially train a language processing task using the benefits of contrastive learning.

Contrastive Learning Language Modelling +3
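One common way to combine adversarial training with contrastive learning is to perturb the input embeddings along the loss gradient and then pull the clean and perturbed representations together with a contrastive objective. The sketch below shows that generic pattern; it is only an assumption about how such a method could look, not the authors' implementation of ATCL.

```python
import torch
import torch.nn.functional as F

def adversarial_contrastive_step(encoder, emb: torch.Tensor, task_loss_fn,
                                 epsilon: float = 1e-2, tau: float = 0.1) -> torch.Tensor:
    """Clean-vs-adversarial contrastive loss on sentence representations."""
    emb = emb.detach().requires_grad_(True)
    z_clean = encoder(emb)
    task_loss = task_loss_fn(z_clean)
    grad, = torch.autograd.grad(task_loss, emb, retain_graph=True)

    # FGSM-style perturbation of the word embeddings.
    emb_adv = emb + epsilon * grad.sign()
    z_adv = encoder(emb_adv)

    # InfoNCE: each clean representation should match its own adversarial view.
    z1, z2 = F.normalize(z_clean, dim=1), F.normalize(z_adv, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return task_loss + F.cross_entropy(logits, labels)

# Toy usage: a mean-pooling "encoder" over a (batch, seq, dim) embedding tensor.
encoder = lambda e: e.mean(dim=1)
loss = adversarial_contrastive_step(encoder, torch.randn(4, 10, 32), lambda z: z.pow(2).mean())
```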

Deep Neural Networks and End-to-End Learning for Audio Compression

no code implementations25 May 2021 Daniela N. Rim, Inseon Jang, Heeyoul Choi

We apply a reparametrization trick for the Bernoulli distribution for the discrete representations, which allows smooth backpropagation.

Audio Compression
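A Bernoulli reparameterization of discrete codes is typically implemented with a relaxed (Gumbel-Sigmoid-style) sample or a straight-through estimator so that binary representations remain differentiable. The sketch below shows a straight-through Bernoulli layer as one minimal, assumed realization; it is not claimed to be the paper's exact formulation.

```python
import torch

def straight_through_bernoulli(probs: torch.Tensor) -> torch.Tensor:
    """Sample hard 0/1 codes while letting gradients flow through the probabilities."""
    hard = torch.bernoulli(probs)                  # discrete sample in the forward pass
    # Straight-through: the forward value is `hard`, the gradient is that of `probs`.
    return hard + probs - probs.detach()

# Toy usage: turn encoder outputs into binary codes for the bottleneck.
logits = torch.randn(2, 8, requires_grad=True)
codes = straight_through_bernoulli(torch.sigmoid(logits))
codes.sum().backward()                             # gradients reach `logits` despite hard codes
print(codes, logits.grad is not None)
```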

Reinforcement Learning of Graph Neural Networks for Service Function Chaining

no code implementations17 Nov 2020 DongNyeong Heo, Doyoung Lee, Hee-Gon Kim, Suhyun Park, Heeyoul Choi

In the management of computer network systems, the service function chaining (SFC) modules play an important role by generating efficient paths for network traffic through physical servers with virtualized network functions (VNF).

Management reinforcement-learning +2

Graph Neural Network based Service Function Chaining for Automatic Network Control

no code implementations11 Sep 2020 DongNyeong Heo, Stanislav Lange, Hee-Gon Kim, Heeyoul Choi

Moreover, the GNN-based model can be applied to a new network topology without re-design or re-training.

Active Search for Nearest Neighbors

no code implementations1 Dec 2019 Hayoung Um, Heeyoul Choi

In pattern recognition and machine learning, finding the nearest neighbors of a given point is a fundamental task.

Network Intrusion Detection based on LSTM and Feature Embedding

no code implementations26 Nov 2019 Hyeokmin Gwon, Chungjun Lee, Rakun Keum, Heeyoul Choi

A growing number of network devices and services has led to increasing demand for protective measures, as hackers launch attacks to paralyze victim systems or steal information from them.

Binary Classification Network Intrusion Detection +2

Self-Knowledge Distillation in Natural Language Processing

no code implementations RANLP 2019 Sangchul Hahn, Heeyoul Choi

Since deep learning became a key player in natural language processing (NLP), many deep learning models have been showing remarkable performances in a variety of NLP tasks, and in some cases, they are even outperforming humans.

Language Modelling Machine Translation +2

Alpha-Integration Pooling for Convolutional Neural Networks

no code implementations8 Nov 2018 Hayoung Eom, Heeyoul Choi

$\alpha$I-pooling is a general pooling method that includes max-pooling and arithmetic average-pooling as special cases, depending on the parameter $\alpha$.

Philosophy
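$\alpha$-integration defines a family of means via $f_\alpha(x) = x^{(1-\alpha)/2}$ (and $\log x$ at $\alpha = 1$), so a single parameter $\alpha$ interpolates between arithmetic averaging and, in the limit, max pooling. The sketch below computes such an $\alpha$-mean over nonnegative activations (e.g., a ReLU pooling window); treating it as a drop-in pooling operation is an illustration under these assumptions, not the paper's exact implementation.

```python
import numpy as np

def alpha_pool(x: np.ndarray, alpha: float, eps: float = 1e-8) -> float:
    """alpha-integration mean over nonnegative activations (e.g., a pooling window)."""
    x = np.asarray(x, dtype=np.float64) + eps
    if np.isclose(alpha, 1.0):
        # alpha = 1 corresponds to the geometric mean (f(x) = log x).
        return float(np.exp(np.mean(np.log(x))))
    p = (1.0 - alpha) / 2.0
    # m = f^{-1}( mean(f(x)) ) with f(x) = x^((1-alpha)/2).
    return float(np.mean(x ** p) ** (1.0 / p))

window = [0.2, 0.5, 0.9, 0.1]
print(alpha_pool(window, alpha=-1.0))   # arithmetic average-pooling
print(alpha_pool(window, alpha=-50.0))  # approaches max-pooling as alpha -> -inf
```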

Disentangling Latent Factors of Variational Auto-Encoder with Whitening

no code implementations8 Nov 2018 Sangchul Hahn, Heeyoul Choi

The proposed method is applied to the latent variables of variational auto-encoder (VAE), although it can be applied to any generative models with latent variables.

Image Generation
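A minimal way to whiten latent variables is to decorrelate a batch of latent codes with a ZCA transform (or, equivalently in spirit, to penalize off-diagonal covariance). The sketch below whitens a batch of VAE latent samples; applying it as a post-hoc transform on the latents is an assumption for illustration, not necessarily the paper's procedure.

```python
import torch

def zca_whiten(z: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """ZCA-whiten a batch of latent vectors so their covariance is (approximately) identity."""
    z_centered = z - z.mean(dim=0, keepdim=True)
    cov = z_centered.t() @ z_centered / (z.size(0) - 1)
    eigvals, eigvecs = torch.linalg.eigh(cov)
    inv_sqrt = eigvecs @ torch.diag((eigvals + eps).rsqrt()) @ eigvecs.t()
    return z_centered @ inv_sqrt

# Toy usage on a batch of 256 correlated latent codes of dimension 10.
z = torch.randn(256, 10) @ torch.randn(10, 10)
z_white = zca_whiten(z)
print(torch.allclose(z_white.t() @ z_white / 255, torch.eye(10), atol=1e-2))
```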

Gradient Acceleration in Activation Functions

no code implementations27 Sep 2018 Sangchul Hahn, Heeyoul Choi

Based on this explanation, we propose a new technique for activation functions, {\em gradient acceleration in activation function (GAAF)}, that accelerates gradients to flow even in the saturation area.
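The idea is to add to a saturating activation a term whose value is negligible but whose slope is not, so gradients keep flowing where the activation is flat. The sketch below adds a small-amplitude sawtooth to the sigmoid as one way to realize this; the exact shape and placement of the added term are assumptions here, not the paper's definition of GAAF.

```python
import torch

def gaaf_sigmoid(x: torch.Tensor, k: float = 100.0) -> torch.Tensor:
    """Sigmoid plus a tiny sawtooth: near-zero change in value, unit-order slope everywhere."""
    # frac(k*x)/k has amplitude 1/k (negligible output change) but derivative ~1
    # almost everywhere (floor has zero gradient), so gradients survive saturation.
    sawtooth = (k * x - torch.floor(k * x)) / k
    return torch.sigmoid(x) + sawtooth

x = torch.tensor([-10.0, 0.0, 10.0], requires_grad=True)
y = gaaf_sigmoid(x)
y.sum().backward()
print(y, x.grad)    # gradients are non-vanishing even deep in the saturation regions
```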

Understanding Dropout as an Optimization Trick

no code implementations26 Jun 2018 Sangchul Hahn, Heeyoul Choi

Based on this explanation, we propose a new technique for activation functions, {\em gradient acceleration in activation function (GAAF)}, that accelerates gradients to flow even in the saturation area.

Image Classification

Persistent Hidden States and Nonlinear Transformation for Long Short-Term Memory

no code implementations22 Jun 2018 Heeyoul Choi

LSTM transforms the input and the previous hidden state into the next state with an affine transformation, multiplication operations, and a nonlinear activation function, which makes a good data representation for a given task.

Machine Translation speech-recognition +1
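For reference, the standard LSTM update that the snippet describes combines an affine transformation of the input and previous hidden state, elementwise (multiplicative) gating, and a nonlinear activation. A minimal cell is sketched below; these are the standard equations, not the paper's proposed modification.

```python
import torch

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One standard LSTM step: affine transform -> gates -> multiplicative update."""
    hidden = h_prev.size(-1)
    gates = x @ W + h_prev @ U + b                       # single affine transformation
    i, f, o, g = gates.split(hidden, dim=-1)             # input/forget/output gates, candidate
    i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
    c = f * c_prev + i * g                               # multiplicative memory update
    h = o * torch.tanh(c)                                # nonlinear activation of the new state
    return h, c

# Toy step with input size 8 and hidden size 16 (so the gate matrix has 4 * 16 columns).
x, h0, c0 = torch.randn(1, 8), torch.zeros(1, 16), torch.zeros(1, 16)
W, U, b = torch.randn(8, 64), torch.randn(16, 64), torch.zeros(64)
h1, c1 = lstm_cell(x, h0, c0, W, U, b)
```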

Fine-Grained Attention Mechanism for Neural Machine Translation

no code implementations30 Mar 2018 Heeyoul Choi, Kyunghyun Cho, Yoshua Bengio

Neural machine translation (NMT) has been a new paradigm in machine translation, and the attention mechanism has become the dominant approach with the state-of-the-art records in many language pairs.

Machine Translation NMT +1

A Neural Knowledge Language Model

no code implementations1 Aug 2016 Sungjin Ahn, Heeyoul Choi, Tanel Pärnamaa, Yoshua Bengio

Current language models have a significant limitation in the ability to encode and decode factual knowledge.

Language Modelling

Context-Dependent Word Representation for Neural Machine Translation

1 code implementation3 Jul 2016 Heeyoul Choi, Kyunghyun Cho, Yoshua Bengio

Based on this observation, in this paper we propose to contextualize the word embedding vectors using a nonlinear bag-of-words representation of the source sentence.

Machine Translation Sentence +1
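One simple way to contextualize word embeddings with a nonlinear bag-of-words of the source sentence is to gate each embedding by a nonlinear function of the sentence's averaged embedding. The sketch below is an assumed, minimal realization of that idea rather than the paper's exact model; the class and layer names are placeholders.

```python
import torch
import torch.nn as nn

class ContextualizedEmbedding(nn.Module):
    """Modulate each word embedding with a nonlinear bag-of-words of the sentence."""
    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.context = nn.Linear(dim, dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:    # (B, T)
        e = self.embed(token_ids)                                   # (B, T, D)
        bow = torch.tanh(self.context(e.mean(dim=1)))               # nonlinear bag-of-words (B, D)
        return e * torch.sigmoid(bow).unsqueeze(1)                  # gate each word by the context

# Toy usage on a batch of two 5-token sentences.
layer = ContextualizedEmbedding(vocab_size=100, dim=32)
out = layer(torch.randint(0, 100, (2, 5)))
```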
