Search Results for author: Jaesik Yoon

Found 12 papers, 2 papers with code

Learning Attentive Meta-Transfer

no code implementations ICML 2020 Jaesik Yoon, Gautam Singh, Sungjin Ahn

Meta-transfer learning seeks to improve the efficiency of learning a new task via both meta-learning and transfer-learning in a setting with a stream of evolving tasks.

Meta-Learning Transfer Learning

Spatially-Aware Transformer for Embodied Agents

no code implementations 23 Feb 2024 Junmo Cho, Jaesik Yoon, Sungjin Ahn

Adopting this approach, we demonstrate that memory utilization efficiency can be improved, leading to enhanced accuracy in various place-centric downstream tasks.

reinforcement-learning

An Investigation into Pre-Training Object-Centric Representations for Reinforcement Learning

no code implementations 9 Feb 2023 Jaesik Yoon, Yi-Fu Wu, Heechul Bae, Sungjin Ahn

In this paper, we investigate the effectiveness of OCR pre-training for image-based reinforcement learning via empirical experiments.

Object Optical Character Recognition (OCR) +5

TransDreamer: Reinforcement Learning with Transformer World Models

no code implementations 19 Feb 2022 Chang Chen, Yi-Fu Wu, Jaesik Yoon, Sungjin Ahn

We then share this world model with a transformer-based policy network and obtain stability in training a transformer-based RL agent.

Model-based Reinforcement Learning reinforcement-learning +1

Generative Video Transformer: Can Objects be the Words?

no code implementations 20 Jul 2021 Yi-Fu Wu, Jaesik Yoon, Sungjin Ahn

We compare our model with previous RNN-based approaches as well as other possible video transformer baselines.

Scene Understanding Video Generation

Robustifying Sequential Neural Processes

no code implementations 29 Jun 2020 Jaesik Yoon, Gautam Singh, Sungjin Ahn

When tasks change over time, meta-transfer learning seeks to improve the efficiency of learning a new task via both meta-learning and transfer-learning.

Meta-Learning Transfer Learning

Attentive Sequential Neural Processes

no code implementations 25 Sep 2019 Jaesik Yoon, Gautam Singh, Sungjin Ahn

In this paper, we propose the Attentive Sequential Neural Processes (ASNP) that resolve the underfitting in SNP by introducing a novel imaginary context as a latent variable and by applying attention over the imaginary context.

regression
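The ASNP abstract describes applying attention over an "imaginary context". A minimal NumPy sketch of the attention step alone, under assumed toy dimensions (the imaginary-context inputs and representations here are random placeholders, not the paper's learned latent variables):

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_attention(queries, keys, values):
    """Scaled dot-product attention: each target query attends over context."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # (n_tgt, n_ctx)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ values                          # (n_tgt, d_v)

# Hypothetical "imaginary context": stand-ins for context points the model
# imagines at the current time step (random here, latent samples in ASNP).
imaginary_x = rng.normal(size=(5, 4))   # imagined context inputs
imaginary_r = rng.normal(size=(5, 8))   # imagined context representations
target_x = rng.normal(size=(3, 4))      # target inputs acting as queries

target_r = cross_attention(target_x, imaginary_x, imaginary_r)
print(target_r.shape)  # one attended representation per target
```

Attending over a context in this way is the same mechanism Attentive Neural Processes use to combat underfitting; ASNP's contribution is making that context an imagined latent in the sequential setting.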

Sequential Neural Processes

1 code implementation NeurIPS 2019 Gautam Singh, Jaesik Yoon, Youngsung Son, Sungjin Ahn

In this paper, we propose Sequential Neural Processes (SNP) which incorporates a temporal state-transition model of stochastic processes and thus extends its modeling capabilities to dynamic stochastic processes.

Gaussian Processes
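The SNP abstract hinges on adding a temporal state-transition model so the stochastic process itself can evolve. An illustrative sketch of that idea with assumed toy weights (`A`, `W` are random placeholders, not the paper's learned networks): the latent state moves through a transition at each step, and a decoder conditioned on that state yields different predictions for the same input over time.

```python
import numpy as np

rng = np.random.default_rng(0)

dim_z, dim_x = 4, 2
A = rng.normal(scale=0.5, size=(dim_z, dim_z))    # hypothetical transition weights
W = rng.normal(scale=0.5, size=(dim_x + dim_z,))  # hypothetical decoder weights

def transition(z):
    """Stochastic state transition: z_t = tanh(A z_{t-1}) + noise."""
    return np.tanh(A @ z) + 0.1 * rng.normal(size=z.shape)

def decode(x, z):
    """Decode a prediction for input x under the current latent process z_t."""
    return np.concatenate([x, z]) @ W

z = np.zeros(dim_z)
x = np.array([0.5, -0.3])
preds = []
for t in range(5):
    z = transition(z)           # temporal dynamics of the stochastic process
    preds.append(decode(x, z))  # same x, but the underlying process has moved
```

A static Neural Process would map `x` to one distribution; threading `z` through a transition is what lets SNP model a process that drifts across time steps.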

One-Shot Learning for Text-to-SQL Generation

no code implementations 26 Apr 2019 Dongjun Lee, Jaesik Yoon, Jongyun Song, Sang-gil Lee, Sungroh Yoon

We show that our model outperforms state-of-the-art approaches for various text-to-SQL datasets in two aspects: 1) the SQL generation accuracy for the trained templates, and 2) the adaptability to the unseen SQL templates based on a single example without any additional training.

One-Shot Learning Text-To-SQL

Bayesian Model-Agnostic Meta-Learning

2 code implementations NeurIPS 2018 Taesup Kim, Jaesik Yoon, Ousmane Dia, Sungwoong Kim, Yoshua Bengio, Sungjin Ahn

Learning to infer Bayesian posterior from a few-shot dataset is an important step towards robust meta-learning due to the model uncertainty inherent in the problem.

Active Learning Image Classification +2
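BMAML maintains a particle approximation of the task posterior and updates the particles with Stein Variational Gradient Descent (SVGD). Below is a generic 1-D SVGD step on a standard-normal toy target, not the paper's full chaser-loss meta-training loop; the bandwidth and step size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x):
    """Score of the toy target N(0, 1): d/dx log p(x) = -x."""
    return -x

def svgd_step(particles, step_size=0.1, h=1.0):
    """One SVGD update: attraction along the score plus kernel repulsion."""
    diff = particles[:, None] - particles[None, :]  # x_j - x_i
    k = np.exp(-diff**2 / (2 * h**2))               # RBF kernel k(x_j, x_i)
    grad_k = -diff / h**2 * k                       # d k(x_j, x_i) / d x_j
    # phi(x_i) = mean_j [ k(x_j, x_i) * score(x_j) + grad_k(x_j, x_i) ]
    phi = (k * grad_log_p(particles)[:, None] + grad_k).mean(axis=0)
    return particles + step_size * phi

particles = rng.normal(loc=5.0, size=10)  # start far from the target
for _ in range(500):
    particles = svgd_step(particles)

print(particles.mean())  # particles drift toward the target mean of 0
```

The repulsive `grad_k` term keeps the particles spread out rather than collapsing to the mode, which is what makes the approximation Bayesian rather than a point estimate as in plain MAML.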
