1 code implementation • 12 Feb 2024 • Kwang-Sung Jun, Jungtaek Kim

First, we propose a novel confidence set that is `semi-adaptive' to the unknown sub-Gaussian parameter $\sigma_*^2$, in the sense that the (normalized) confidence width scales with $\sqrt{d\sigma_*^2 + \sigma_0^2}$, where $d$ is the dimension and $\sigma_0^2$ is the known, user-specified sub-Gaussian parameter, which can be much larger than $\sigma_*^2$.
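As a toy numeric illustration of the scaling above (not the paper's exact bound), one can compare the semi-adaptive width $\sqrt{d\sigma_*^2 + \sigma_0^2}$ against a fully non-adaptive width that depends only on the specified parameter, here taken as $\sqrt{d\sigma_0^2}$ purely for illustration:

```python
import math

def semi_adaptive_width(d, sigma_star_sq, sigma0_sq):
    # Normalized confidence width scaling ~ sqrt(d * sigma_*^2 + sigma_0^2)
    return math.sqrt(d * sigma_star_sq + sigma0_sq)

def non_adaptive_width(d, sigma0_sq):
    # An illustrative non-adaptive width scaling ~ sqrt(d * sigma_0^2),
    # which uses only the specified (possibly loose) parameter
    return math.sqrt(d * sigma0_sq)

# When sigma_0^2 is much larger than sigma_*^2, the semi-adaptive width
# is much tighter because sigma_0^2 is not multiplied by the dimension d
d, sigma_star_sq, sigma0_sq = 10, 0.01, 1.0
print(semi_adaptive_width(d, sigma_star_sq, sigma0_sq))  # ~1.05
print(non_adaptive_width(d, sigma0_sq))                  # ~3.16
```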

no code implementations • 3 Jan 2024 • Jungtaek Kim

These metrics rely only on function evaluations, so they consider neither geometric relationships between query points and global solutions nor relationships among the query points themselves.

1 code implementation • NeurIPS 2023 • Jungtaek Kim, Mingxuan Li, Oliver Hinder, Paul W. Leu

To design and understand these nanophotonic structures, electrodynamic simulations are essential.

no code implementations • 11 Oct 2023 • Jungtaek Kim, Jeongbeen Yoon, Minsu Cho

Sorting is a fundamental operation in all computer systems and has long been a significant research topic.

no code implementations • 24 May 2023 • Jungtaek Kim

Beyond probabilistic regression-based Bayesian optimization, density ratio estimation-based Bayesian optimization has been suggested; it estimates the ratio between the densities of points relatively close to and relatively far from a global optimum.
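A minimal sketch of this idea (a TPE-style density-ratio step on a toy 1-D objective, with illustrative choices of quantile and bandwidth, not the authors' exact method) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return (x - 0.3) ** 2  # toy black-box objective, minimum at 0.3

# History of evaluations
X = rng.uniform(0.0, 1.0, 50)
y = f(X)

# Split points into a group relatively close to the optimum (low values)
# and a group relatively far from it, using a quantile threshold
gamma = 0.2
tau = np.quantile(y, gamma)
good, bad = X[y <= tau], X[y > tau]

def kde(x, data, bw=0.1):
    # Simple Gaussian kernel density estimate
    z = (x[:, None] - data[None, :]) / bw
    return np.mean(np.exp(-0.5 * z ** 2), axis=1) / (bw * np.sqrt(2 * np.pi))

# Select the candidate maximizing the density ratio good(x) / bad(x)
cand = rng.uniform(0.0, 1.0, 200)
ratio = kde(cand, good) / (kde(cand, bad) + 1e-12)
x_next = cand[np.argmax(ratio)]
```

The next query point lands where the "close" density dominates the "far" density, i.e., near the current best region.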

no code implementations • 3 Oct 2022 • Seokjun Ahn, Jungtaek Kim, Minsu Cho, Jaesik Park

The assembly problem is challenging because the number of possible structures grows exponentially with the number of available bricks, which complicates satisfying the physical constraints across bricks.

1 code implementation • 24 May 2022 • Jinhwi Lee, Jungtaek Kim, Hyunsoo Chung, Jaesik Park, Minsu Cho

Assembling parts into an object is a combinatorial problem that arises in a variety of real-world contexts and has numerous applications in science and engineering.

no code implementations • 22 Feb 2022 • Jungtaek Kim, Seungjin Choi

Sequential model-based optimization solves a black-box optimization problem by repeatedly selecting a candidate point using a surrogate model constructed from the history of evaluations.
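The loop described above can be sketched as follows, using a deliberately simple quadratic least-squares surrogate and a greedy selection rule (real methods use richer surrogates, such as Gaussian processes, and acquisition functions that add exploration):

```python
import numpy as np

def black_box(x):
    # Toy expensive objective; in practice this is the black box
    return np.sin(3 * x) + x ** 2

rng = np.random.default_rng(1)
X = list(rng.uniform(-1.0, 1.0, 3))   # initial evaluation history
Y = [black_box(x) for x in X]

for _ in range(10):
    # Surrogate: quadratic fit to the history of evaluations
    coeffs = np.polyfit(X, Y, deg=2)
    cand = np.linspace(-1.0, 1.0, 201)
    # Candidate selection: minimize the surrogate prediction (greedy)
    x_next = cand[np.argmin(np.polyval(coeffs, cand))]
    X.append(x_next)
    Y.append(black_box(x_next))

best = min(Y)
```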

1 code implementation • ICLR 2022 • Rylee Thompson, Boris Knyazev, Elahe Ghalebi, Jungtaek Kim, Graham W. Taylor

While we focus on applying these metrics to GGM evaluation, in practice they make it easy to compute the dissimilarity between any two sets of graphs, regardless of domain.

no code implementations • NeurIPS 2021 • Hyunsoo Chung, Jungtaek Kim, Boris Knyazev, Jinhwi Lee, Graham W. Taylor, Jaesik Park, Minsu Cho

Discovering a solution in a combinatorial space is prevalent in many real-world problems, but it is also challenging due to diverse complex constraints and the vast number of possible combinations.

no code implementations • 26 Nov 2020 • Jungtaek Kim, Seungjin Choi, Minsu Cho

The main idea is to use a random mapping that embeds the combinatorial space into a convex polytope in a continuous space, on which all essential processing is performed to determine a solution to the black-box optimization problem in the combinatorial space.
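A rough sketch of the embedding idea (with a plain random search over the polytope standing in for the actual continuous optimizer; the mapping and objective here are illustrative, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 4, 2  # 2^4 binary configurations, embedded into 2-D
configs = np.array([[(i >> k) & 1 for k in range(n)] for i in range(2 ** n)])

# Random linear mapping from the combinatorial space into a continuous
# space; the images of all configurations span a convex polytope
A = rng.normal(size=(n, d))
Z = configs @ A  # embedded points (vertices of the polytope)

def to_config(z):
    # Map a continuous point back to the nearest embedded configuration
    return configs[np.argmin(np.linalg.norm(Z - z, axis=1))]

def f(c):
    # Toy combinatorial objective: prefer configurations with more ones
    return -c.sum()

# Search over the polytope via random convex combinations of the vertices
best_c, best_y = None, np.inf
for _ in range(200):
    w = rng.dirichlet(np.ones(len(Z)))
    z = w @ Z                # random point inside the convex hull
    c = to_config(z)
    if f(c) < best_y:
        best_c, best_y = c, f(c)
```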

no code implementations • NeurIPS Workshop LMCA 2020 • Jinhwi Lee, Jungtaek Kim, Hyunsoo Chung, Jaesik Park, Minsu Cho

Our model processes the candidate fragments in a permutation-equivariant manner and can generalize to cases with an arbitrary number of fragments and even with a different target object.

1 code implementation • NeurIPS 2020 • Juho Lee, Yoonho Lee, Jungtaek Kim, Eunho Yang, Sung Ju Hwang, Yee Whye Teh

While this "data-driven" way of learning stochastic processes has proven able to handle various types of data, NPs still rely on the assumption that uncertainty in a stochastic process is modeled by a single latent variable, which potentially limits their flexibility.

3 code implementations • 16 Apr 2020 • Jungtaek Kim, Hyunsoo Chung, Jinhwi Lee, Minsu Cho, Jaesik Park

To alleviate the consequences of the huge number of feasible combinations, we propose a combinatorial 3D shape generation framework.

no code implementations • 23 May 2019 • Jungtaek Kim, Michael McCourt, Tackgeun You, Saehoon Kim, Seungjin Choi

We propose a practical Bayesian optimization method over sets, to minimize a black-box function that takes a set as a single input.

1 code implementation • 18 May 2019 • Jungtaek Kim, Seungjin Choi

We propose a practical Bayesian optimization method using Gaussian process regression, whose marginal likelihood is maximized with the number of model-selection steps guided by a pre-defined threshold.
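The model-selection step at the heart of this approach — choosing kernel hyperparameters by maximizing the Gaussian process marginal likelihood — can be sketched as follows (a grid over the length scale is used here for simplicity; the kernel, data, and grid are illustrative):

```python
import numpy as np

def log_marginal_likelihood(X, y, length_scale, noise=1e-3):
    # GP log marginal likelihood with an RBF kernel and noise jitter:
    # -0.5 y^T K^{-1} y - 0.5 log|K| - 0.5 n log(2 pi)
    sq = (X[:, None] - X[None, :]) ** 2
    K = np.exp(-0.5 * sq / length_scale ** 2) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(X) * np.log(2 * np.pi))

X = np.array([0.0, 0.5, 1.0, 1.5])
y = np.sin(X)

# Model selection: pick the length scale maximizing the marginal likelihood
grid = [0.1, 0.5, 1.0, 2.0]
best_ls = max(grid, key=lambda ls: log_marginal_likelihood(X, y, ls))
```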

no code implementations • 11 Apr 2019 • Minseop Park, Jungtaek Kim, Saehoon Kim, Yanbin Liu, Seungjin Choi

A meta-model is trained on a distribution of similar tasks such that it learns an algorithm that can quickly adapt to a novel task with only a handful of labeled examples.

no code implementations • 24 Jan 2019 • Jungtaek Kim, Seungjin Choi

In practice, however, local optimizers of an acquisition function are also used, since searching for the global optimizer is often non-trivial or time-consuming.
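A common concrete form of this is multi-start local optimization: run a cheap local routine (here, simple gradient ascent on a toy multimodal acquisition function) from several starting points and keep the best local optimizer found, instead of an exhaustive global search. A minimal sketch:

```python
import numpy as np

def acquisition(x):
    # Toy multimodal acquisition function to be maximized
    return np.sin(5 * x) * np.exp(-x ** 2)

def grad(x, eps=1e-6):
    # Central-difference numerical gradient
    return (acquisition(x + eps) - acquisition(x - eps)) / (2 * eps)

starts = np.linspace(-2.0, 2.0, 9)
best_x, best_v = None, -np.inf
for x0 in starts:
    x = x0
    for _ in range(200):
        # Local optimizer: clipped gradient ascent from this start
        x = np.clip(x + 0.05 * grad(x), -2.0, 2.0)
    for cand in (x0, x):
        if acquisition(cand) > best_v:
            best_x, best_v = cand, acquisition(cand)
```

Each start converges only to a nearby local maximizer, but taking the best over several starts often recovers a near-global optimizer of the acquisition function at a fraction of the cost.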

9 code implementations • 1 Oct 2018 • Juho Lee, Yoonho Lee, Jungtaek Kim, Adam R. Kosiorek, Seungjin Choi, Yee Whye Teh

Many machine learning tasks such as multiple instance learning, 3D shape recognition, and few-shot image classification are defined on sets of instances.

no code implementations • 17 Oct 2017 • Jungtaek Kim, Saehoon Kim, Seungjin Choi

A simple alternative to manual search is random or grid search over a space of hyperparameters, which still requires extensive evaluations of validation error to find the best configuration.
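Random search over hyperparameters reduces to sampling configurations and keeping the one with the lowest validation error, as in this sketch (the error function and hyperparameter ranges are hypothetical):

```python
import random

def validation_error(lr, depth):
    # Hypothetical validation error as a function of two hyperparameters
    return (lr - 0.1) ** 2 + 0.01 * (depth - 5) ** 2

random.seed(0)

# Sample 100 random configurations and evaluate each one
configs = [(random.uniform(1e-4, 1.0), random.randint(1, 10))
           for _ in range(100)]
best = min(configs, key=lambda cfg: validation_error(*cfg))
```

Every sampled configuration incurs one full validation-error evaluation, which is exactly the cost that Bayesian optimization tries to reduce.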
