Search Results for author: Dong Bok Lee

Found 9 papers, 7 papers with code

Self-Supervised Dataset Distillation for Transfer Learning

2 code implementations · 10 Oct 2023 · Dong Bok Lee, Seanie Lee, Joonho Ko, Kenji Kawaguchi, Juho Lee, Sung Ju Hwang

To achieve this, we also introduce, for the outer optimization, the MSE between the representations of the inner model and those of the self-supervised target model on the original full dataset.
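The paper's bilevel pipeline isn't reproduced here, but the outer objective it describes, an MSE between the inner model's representations and a frozen self-supervised target model's representations on the full dataset, can be sketched minimally. All names below (`encode`, `outer_mse`, the linear weights standing in for networks) are hypothetical stand-ins, not the authors' code:

```python
import numpy as np

def encode(x, W):
    """Toy linear encoder standing in for a representation network."""
    return x @ W

def outer_mse(W_inner, W_target, X_full):
    """MSE between the inner model's representations and those of the
    frozen self-supervised target model on the original full dataset
    (the outer-loop objective described in the excerpt)."""
    z_inner = encode(X_full, W_inner)
    z_target = encode(X_full, W_target)  # target model is fixed
    return float(np.mean((z_inner - z_target) ** 2))

# Toy usage: random data and random "networks".
rng = np.random.default_rng(0)
X = rng.normal(size=(16, 8))
W_i = rng.normal(size=(8, 4))
W_t = rng.normal(size=(8, 4))
loss = outer_mse(W_i, W_t, X)
```

In the actual method this scalar would be minimized with respect to the distilled dataset that the inner model was trained on, not the weights directly.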

Bilevel Optimization · Meta-Learning +3

Learning Transferable Adversarial Robust Representations via Multi-view Consistency

no code implementations · 19 Oct 2022 · Minseon Kim, Hyeonjeong Ha, Dong Bok Lee, Sung Ju Hwang

Despite their success on few-shot learning problems, most meta-learned models focus only on achieving good performance on clean examples, and thus easily break down when given adversarially perturbed samples.

Adversarial Attack · Adversarial Robustness +4

Dataset Condensation with Latent Space Knowledge Factorization and Sharing

1 code implementation · 21 Aug 2022 · Hae Beom Lee, Dong Bok Lee, Sung Ju Hwang

In this paper, we introduce a novel approach for systematically and efficiently solving the dataset condensation problem by exploiting the regularity in a given dataset.

Dataset Condensation

MOG: Molecular Out-of-distribution Generation with Energy-based Models

no code implementations · 29 Sep 2021 · Seul Lee, Dong Bok Lee, Sung Ju Hwang

To validate the ability to explore the chemical space beyond the known molecular distribution, we use MOG to generate molecules with high absolute docking scores, i.e., the affinity score obtained from a physical binding simulation between a target protein and a given molecule.

Drug Discovery · Out of Distribution (OOD) Detection

Meta-StyleSpeech: Multi-Speaker Adaptive Text-to-Speech Generation

2 code implementations · 6 Jun 2021 · Dongchan Min, Dong Bok Lee, Eunho Yang, Sung Ju Hwang

In this work, we propose StyleSpeech, a new TTS model which not only synthesizes high-quality speech but also effectively adapts to new speakers.

Meta-GMVAE: Mixture of Gaussian VAE for Unsupervised Meta-Learning

1 code implementation · ICLR 2021 · Dong Bok Lee, Dongchan Min, Seanie Lee, Sung Ju Hwang

Then, the learned model can be used for downstream few-shot classification tasks, where we obtain task-specific parameters by performing semi-supervised EM on the latent representations of the support and query set, and predict labels of the query set by computing aggregated posteriors.

Meta-Learning · Unsupervised Few-Shot Image Classification +2

Contrastive Learning with Adversarial Perturbations for Conditional Text Generation

1 code implementation · ICLR 2021 · Seanie Lee, Dong Bok Lee, Sung Ju Hwang

In this work, we propose to mitigate the conditional text generation problem by contrasting positive pairs with negative pairs, such that the model is exposed to various valid or incorrect perturbations of the inputs, for improved generalization.
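The paper's exact objective isn't reproduced here; a minimal InfoNCE-style sketch of the core idea, scoring one positive pair against several negatives so the model learns to separate valid from incorrect perturbations, is shown below. The function name and temperature value are hypothetical:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """Contrastive loss for one anchor: pull the positive pair together
    and push the negative pairs apart (InfoNCE-style; a generic stand-in
    for the paper's contrastive objective)."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / tau
    logits -= logits.max()               # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])             # loss is low when the positive wins
```

In the paper, the positives and negatives are adversarially constructed perturbations of the encoder inputs rather than arbitrary vectors, which is what exposes the model to both valid and incorrect variations.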

Conditional Text Generation · Contrastive Learning +5

Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction

1 code implementation · NeurIPS 2020 · Jinheon Baek, Dong Bok Lee, Sung Ju Hwang

For transductive link prediction, we further propose a stochastic embedding layer to model uncertainty in the link prediction between unseen entities.

Graph Construction · Knowledge Graph Completion +2

Generating Diverse and Consistent QA pairs from Contexts with Information-Maximizing Hierarchical Conditional VAEs

1 code implementation · ACL 2020 · Dong Bok Lee, Seanie Lee, Woo Tae Jeong, Donghwan Kim, Sung Ju Hwang

We validate our Information Maximizing Hierarchical Conditional Variational AutoEncoder (Info-HCVAE) on several benchmark datasets by evaluating the performance of the QA model (BERT-base) using only the generated QA pairs (QA-based evaluation) or by using both the generated and human-labeled pairs (semi-supervised learning) for training, against state-of-the-art baseline models.

Question-Answer Generation · Question Answering +1
