Search Results for author: Jungtaek Kim

Found 20 papers, 7 papers with code

Noise-Adaptive Confidence Sets for Linear Bandits and Application to Bayesian Optimization

no code implementations12 Feb 2024 Kwang-Sung Jun, Jungtaek Kim

First, we propose a novel confidence set that is "semi-adaptive" to the unknown sub-Gaussian parameter $\sigma_*^2$, in the sense that the (normalized) confidence width scales with $\sqrt{d\sigma_*^2 + \sigma_0^2}$, where $d$ is the dimension and $\sigma_0^2$ is a specified (known) sub-Gaussian parameter that can be much larger than $\sigma_*^2$.

Bayesian Optimization Decision Making +1
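The stated width scaling is easy to sanity-check numerically. The sketch below is illustrative only: `non_adaptive_width` is a baseline for comparison (the width obtained by plugging in the conservative $\sigma_0^2$ throughout), not a quantity defined in the paper.

```python
import math

def semi_adaptive_width(d, sigma_star_sq, sigma0_sq):
    """Scaling of the normalized confidence width stated in the abstract:
    sqrt(d * sigma_*^2 + sigma_0^2)."""
    return math.sqrt(d * sigma_star_sq + sigma0_sq)

def non_adaptive_width(d, sigma0_sq):
    """Illustrative baseline: the width if the conservative (known)
    sigma_0^2 is used in place of the unknown sigma_*^2."""
    return math.sqrt(d * sigma0_sq)
```

When $\sigma_*^2 \ll \sigma_0^2$ and $d$ is large, the semi-adaptive width is far smaller than the conservative baseline.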

Beyond Regrets: Geometric Metrics for Bayesian Optimization

no code implementations3 Jan 2024 Jungtaek Kim

Conventional metrics rely only on function evaluations, so they do not consider geometric relationships between query points and global solutions, or among the query points themselves.

Bayesian Optimization Experimental Design
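As an illustration of what a geometric metric could look like, the hypothetical example below (not the paper's actual definition) measures the fraction of query points that land within a given radius of some global solution:

```python
import numpy as np

def precision_like_metric(queries, global_optima, radius=0.1):
    """Fraction of query points within `radius` of some global optimum.

    A hypothetical geometric metric in the spirit of the abstract:
    it depends on query positions, not on function evaluations.
    """
    # Pairwise distances between queries (n x d) and optima (m x d).
    d = np.linalg.norm(queries[:, None, :] - global_optima[None, :, :], axis=-1)
    return float((d.min(axis=1) <= radius).mean())
```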

Generalized Neural Sorting Networks with Error-Free Differentiable Swap Functions

no code implementations11 Oct 2023 Jungtaek Kim, Jeongbeen Yoon, Minsu Cho

Sorting is a fundamental operation in computer systems and has long been a significant research topic.
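For context, a generic differentiable (soft) swap function can be written as a sigmoid-weighted mix of the two inputs. Note that this generic version introduces approximation error between the soft and hard outputs, which is precisely what the paper's error-free swap functions avoid:

```python
import math

def soft_swap(a, b, temperature=1.0):
    """Differentiable approximation of (min, max) via a sigmoid-weighted mix.

    A generic soft swap for illustration; it is NOT the paper's error-free
    construction. As temperature -> 0 it approaches the hard swap.
    """
    s = 1.0 / (1.0 + math.exp(-(b - a) / temperature))  # ~1 when b > a
    lo = s * a + (1 - s) * b
    hi = s * b + (1 - s) * a
    return lo, hi
```

The pair always preserves the sum `a + b`, and sharpening the temperature recovers the exact `(min, max)`.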

Density Ratio Estimation-based Bayesian Optimization with Semi-Supervised Learning

no code implementations24 May 2023 Jungtaek Kim

Beyond probabilistic regression-based Bayesian optimization, density ratio estimation-based Bayesian optimization has been suggested, which estimates the density ratio between the group of points relatively close to a global optimum and the group relatively far from it.

Bayesian Optimization Density Ratio Estimation +2
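A minimal sketch of the density-ratio idea, assuming minimization: split the observations at a quantile into "good" and "bad" groups, then score candidates by relative closeness to the good group. Centroid distances here are a crude stand-in for a learned classifier or density-ratio model; all names and details are illustrative.

```python
import numpy as np

def density_ratio_acquisition(X, y, candidates, gamma=0.25):
    """Score candidates by closeness to the 'good' group of observations.

    Illustrative only: a real density-ratio BO method would fit a model
    (e.g. a classifier) to the two groups instead of using centroids.
    """
    tau = np.quantile(y, gamma)           # threshold between groups
    good, bad = X[y <= tau], X[y > tau]   # minimization convention
    d_good = np.linalg.norm(candidates - good.mean(axis=0), axis=1)
    d_bad = np.linalg.norm(candidates - bad.mean(axis=0), axis=1)
    return d_bad - d_good                 # higher = closer to good group
```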

Sequential Brick Assembly with Efficient Constraint Satisfaction

no code implementations3 Oct 2022 Seokjun Ahn, Jungtaek Kim, Minsu Cho, Jaesik Park

The assembly problem is challenging since the number of possible structures grows exponentially with the number of available bricks, making the physical constraints across bricks difficult to satisfy.

Bayesian Optimization Position

Learning to Assemble Geometric Shapes

1 code implementation24 May 2022 Jinhwi Lee, Jungtaek Kim, Hyunsoo Chung, Jaesik Park, Minsu Cho

Assembling parts into an object is a combinatorial problem that arises in a variety of contexts in the real world and involves numerous applications in science and engineering.

On Uncertainty Estimation by Tree-based Surrogate Models in Sequential Model-based Optimization

no code implementations22 Feb 2022 Jungtaek Kim, Seungjin Choi

To solve a black-box optimization problem, sequential model-based optimization selects each candidate point in turn by constructing a surrogate model from the history of evaluations.
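The loop described here can be sketched with a toy surrogate: nearest-neighbour prediction plus a distance-based uncertainty bonus, with a lower-confidence-bound acquisition. This is illustrative only; the paper studies tree-based surrogates, which this sketch does not implement.

```python
import random

def smbo_minimize(f, candidates, n_init=3, n_iter=10, seed=0):
    """Bare-bones sequential model-based optimization over a 1-D candidate set.

    Surrogate: value of the nearest evaluated point; acquisition: that value
    minus a distance-based uncertainty bonus (lower confidence bound).
    """
    rng = random.Random(seed)
    history = [(x, f(x)) for x in rng.sample(candidates, n_init)]

    def lcb(x):
        # Nearest-neighbour prediction, with an exploration bonus that
        # grows with distance from the evaluated points.
        d, y = min((abs(x - xi), yi) for xi, yi in history)
        return y - d

    for _ in range(n_iter):
        x_next = min(candidates, key=lcb)
        history.append((x_next, f(x_next)))
    return min(history, key=lambda p: p[1])
```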

On Evaluation Metrics for Graph Generative Models

1 code implementation ICLR 2022 Rylee Thompson, Boris Knyazev, Elahe Ghalebi, Jungtaek Kim, Graham W. Taylor

While we focus on applying these metrics to GGM evaluation, in practice they make it easy to compute the dissimilarity between any two sets of graphs, regardless of domain.

Computational Efficiency Image Generation +1

Brick-by-Brick: Combinatorial Construction with Deep Reinforcement Learning

no code implementations NeurIPS 2021 Hyunsoo Chung, Jungtaek Kim, Boris Knyazev, Jinhwi Lee, Graham W. Taylor, Jaesik Park, Minsu Cho

Discovering a solution in a combinatorial space is prevalent in many real-world problems but is also challenging due to diverse complex constraints and the vast number of possible combinations.

Object reinforcement-learning +1

Combinatorial Bayesian Optimization with Random Mapping Functions to Convex Polytopes

no code implementations26 Nov 2020 Jungtaek Kim, Seungjin Choi, Minsu Cho

The main idea is to use a random mapping that embeds the combinatorial space into a convex polytope in a continuous space, on which the essential optimization process is performed to determine a solution to the black-box optimization problem in the combinatorial space.

Bayesian Optimization
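One plausible instantiation of such a random mapping, for illustration only (the paper's exact construction and decoding rule may differ): a random linear map whose image of a continuous box is a convex polytope, with a continuous point decoded to a binary vector by thresholding.

```python
import numpy as np

def random_mapping(n_vars, d_embed, seed=0):
    """Random linear map between {0,1}^n_vars and a d_embed-dim continuous space.

    The image of the box [-1, 1]^d_embed under A is a convex polytope
    (a zonotope); `decode` maps a continuous point z back to a binary
    vector by thresholding A^T z. Illustrative construction only.
    """
    A = np.random.default_rng(seed).standard_normal((d_embed, n_vars))

    def decode(z):
        return (A.T @ z > 0).astype(int)

    return A, decode
```

A continuous optimizer (e.g. Bayesian optimization with a standard kernel) can then operate on `z`, decoding each query back to the combinatorial space.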

Fragment Relation Networks for Geometric Shape Assembly

no code implementations NeurIPS Workshop LMCA 2020 Jinhwi Lee, Jungtaek Kim, Hyunsoo Chung, Jaesik Park, Minsu Cho

Our model processes the candidate fragments in a permutation-equivariant manner and can generalize to cases with an arbitrary number of fragments and even with a different target object.

Object Relation

Bootstrapping Neural Processes

1 code implementation NeurIPS 2020 Juho Lee, Yoonho Lee, Jungtaek Kim, Eunho Yang, Sung Ju Hwang, Yee Whye Teh

While this "data-driven" way of learning stochastic processes has proven able to handle various types of data, NPs still rely on the assumption that uncertainty in a stochastic process is modeled by a single latent variable, which potentially limits their flexibility.

Combinatorial 3D Shape Generation via Sequential Assembly

3 code implementations16 Apr 2020 Jungtaek Kim, Hyunsoo Chung, Jinhwi Lee, Minsu Cho, Jaesik Park

To alleviate the difficulty induced by the huge number of feasible combinations, we propose a combinatorial 3D shape generation framework.

3D Shape Generation Bayesian Optimization

Bayesian Optimization with Approximate Set Kernels

no code implementations23 May 2019 Jungtaek Kim, Michael McCourt, Tackgeun You, Saehoon Kim, Seungjin Choi

We propose a practical Bayesian optimization method over sets, to minimize a black-box function that takes a set as a single input.

Bayesian Optimization
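For context, the double-sum set kernel below is the standard exact construction that such methods approximate for efficiency; the RBF base kernel and `gamma` are illustrative choices, and the approximation itself is not shown.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """RBF (squared-exponential) base kernel between two vectors."""
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

def set_kernel(S1, S2, gamma=1.0):
    """Double-sum set kernel: average pairwise base-kernel value between
    all elements of two sets. Exact version; its cost is |S1| * |S2|
    base-kernel evaluations, which motivates approximation.
    """
    return float(np.mean([rbf(x, y, gamma) for x in S1 for y in S2]))
```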

Practical Bayesian Optimization with Threshold-Guided Marginal Likelihood Maximization

1 code implementation18 May 2019 Jungtaek Kim, Seungjin Choi

We propose a practical Bayesian optimization method using Gaussian process regression, whose marginal likelihood is maximized with the number of model selection steps guided by a pre-defined threshold.

Bayesian Optimization Model Selection +1

MxML: Mixture of Meta-Learners for Few-Shot Classification

no code implementations11 Apr 2019 Minseop Park, Jungtaek Kim, Saehoon Kim, Yanbin Liu, Seungjin Choi

A meta-model is trained on a distribution of similar tasks such that it learns an algorithm that can quickly adapt to a novel task with only a handful of labeled examples.

Classification General Classification +1

On Local Optimizers of Acquisition Functions in Bayesian Optimization

no code implementations24 Jan 2019 Jungtaek Kim, Seungjin Choi

In practice, however, local optimizers of an acquisition function are also used, since searching for the global optimizer is often a non-trivial or time-consuming task.

Bayesian Optimization valid

Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

9 code implementations1 Oct 2018 Juho Lee, Yoonho Lee, Jungtaek Kim, Adam R. Kosiorek, Seungjin Choi, Yee Whye Teh

Many machine learning tasks such as multiple instance learning, 3D shape recognition, and few-shot image classification are defined on sets of instances.

3D Shape Recognition Few-Shot Image Classification +1
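A projection-free, single-head simplification of attention-based set pooling, in the spirit of the paper's pooling-by-multihead-attention (PMA) block but far from the full Set Transformer: a seed query attends over the set, so the output is invariant to permuting the set's elements.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(X, seed_query):
    """Pool a set X (n x d) into one d-dim vector via attention from a
    seed query. Permutation-invariant: reordering rows of X leaves the
    output unchanged. Simplified sketch (no projections, single head).
    """
    w = softmax(X @ seed_query)  # (n,) attention weights over elements
    return w @ X                 # (d,) weighted sum of elements
```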

Learning to Warm-Start Bayesian Hyperparameter Optimization

no code implementations17 Oct 2017 Jungtaek Kim, Saehoon Kim, Seungjin Choi

A simple alternative to manual search is random or grid search over a space of hyperparameters, which still requires extensive evaluations of validation error to find the best configuration.

Bayesian Optimization Hyperparameter Optimization +1
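Random search, as mentioned, is straightforward to implement; in the sketch below, `sample` and `f` are placeholders for a configuration sampler and a validation-error evaluator.

```python
import random

def random_search(f, sample, n_trials=20, seed=0):
    """Random search over a hyperparameter space: draw configurations with
    `sample(rng)`, evaluate the (expensive) validation error `f` on each,
    and keep the best. Every trial costs one full evaluation of f.
    """
    rng = random.Random(seed)
    evals = [(cfg, f(cfg)) for cfg in (sample(rng) for _ in range(n_trials))]
    return min(evals, key=lambda p: p[1])
```

The cost highlighted in the abstract is visible here: the quality of the result is bounded by how many evaluations of `f` one can afford, which is what warm-starting aims to reduce.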
