2 code implementations • CVPR 2019 • Sungsoo Ahn, Shell Xu Hu, Andreas Damianou, Neil D. Lawrence, Zhenwen Dai
We further demonstrate the strength of our method on knowledge transfer across heterogeneous network architectures by transferring knowledge from a convolutional neural network (CNN) to a multi-layer perceptron (MLP) on CIFAR-10.
2 code implementations • 6 Jul 2020 • Junhyun Nam, Hyuntak Cha, Sungsoo Ahn, Jaeho Lee, Jinwoo Shin
Neural networks often learn to make predictions that rely excessively on spurious correlations in the dataset, which causes the model to be biased.
Ranked #1 on Out-of-Distribution Generalization on ImageNet-W
1 code implementation • ICLR 2021 • Jaeho Lee, Sejun Park, Sangwoo Mo, Sungsoo Ahn, Jinwoo Shin
Recent discoveries on neural network pruning reveal that, with a carefully chosen layerwise sparsity, a simple magnitude-based pruning achieves state-of-the-art tradeoff between sparsity and performance.
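As a concrete illustration of magnitude-based pruning (a toy NumPy sketch; the global-threshold variant below is an assumption for illustration, whereas the paper's focus is on how the layerwise sparsities are chosen):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return weights * (np.abs(weights) > threshold)

w = np.array([[0.5, -0.1], [0.05, 2.0]])
pruned = magnitude_prune(w, 0.5)  # drops the two smallest-magnitude weights
```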
1 code implementation • ICML 2020 • Sungsoo Ahn, Younggyo Seo, Jinwoo Shin
Designing efficient algorithms for combinatorial optimization appears ubiquitously in various scientific fields.
2 code implementations • NeurIPS 2020 • Sungsoo Ahn, Junsu Kim, Hankook Lee, Jinwoo Shin
De novo molecular design attempts to search over the chemical space for molecules with the desired property.
1 code implementation • 9 Jun 2021 • Junsu Kim, Sungsoo Ahn, Hankook Lee, Jinwoo Shin
Our main idea is based on a self-improving procedure that trains the model to imitate successful trajectories found by itself.
Ranked #4 on Multi-step retrosynthesis on USPTO-190
1 code implementation • 28 Feb 2023 • Sungbin Shin, Yohan Jo, Sungsoo Ahn, Namhoon Lee
Concept bottleneck models (CBMs) are a class of interpretable neural network models that predict the target response of a given input based on its high-level concepts.
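Schematically, a CBM routes every prediction through an interpretable concept layer; a minimal sketch, assuming linear maps and a sigmoid concept head (not the paper's actual architecture):

```python
import numpy as np

def cbm_predict(x, concept_w, label_w):
    """Predict a label from input `x` only through concept probabilities.

    Because the label head sees nothing but `concepts`, the
    intermediate state can be inspected or intervened on.
    """
    concepts = 1.0 / (1.0 + np.exp(-(concept_w @ x)))  # concept probabilities
    return label_w @ concepts, concepts

pred, concepts = cbm_predict(np.zeros(3), np.zeros((2, 3)), np.ones(2))
```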
1 code implementation • 20 Mar 2023 • Junsu Kim, Younggyo Seo, Sungsoo Ahn, Kyunghwan Son, Jinwoo Shin
Recently, graph-based planning algorithms have gained much attention for solving goal-conditioned reinforcement learning (RL) tasks: they provide a sequence of subgoals to reach the target goal, and the agents learn to execute subgoal-conditioned policies.
1 code implementation • 22 Jun 2022 • Nayeong Kim, Sehyun Hwang, Sungsoo Ahn, Jaesik Park, Suha Kwak
We propose a new method for training debiased classifiers without spurious attribute labels.
2 code implementations • 4 Oct 2023 • Minsu Kim, Taeyoung Yun, Emmanuel Bengio, Dinghuai Zhang, Yoshua Bengio, Sungsoo Ahn, Jinkyoo Park
Generative Flow Networks (GFlowNets) are amortized sampling methods that learn a distribution over discrete objects proportional to their rewards.
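On a space small enough to enumerate, the distribution a GFlowNet is trained to match can be written down directly (the sketch below only illustrates that target; GFlowNets exist to amortize sampling from it when enumeration is intractable, and the names here are illustrative):

```python
def reward_proportional_dist(objects, reward):
    """Distribution over discrete objects proportional to their rewards."""
    rewards = {x: reward(x) for x in objects}
    z = sum(rewards.values())  # total reward, i.e. the normalizing constant
    return {x: r / z for x, r in rewards.items()}

dist = reward_proportional_dist(
    ["a", "b", "c"], lambda x: {"a": 1.0, "b": 2.0, "c": 1.0}[x]
)
```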
1 code implementation • 4 Dec 2023 • Yunhui Jang, Seul Lee, Sungsoo Ahn
Recently, there has been a surge of interest in employing neural networks for graph generation, a fundamental statistical learning problem with critical applications like molecule design and community analysis.
1 code implementation • NeurIPS 2023 • Minsu Kim, Federico Berto, Sungsoo Ahn, Jinkyoo Park
The subsequent stage involves bootstrapping, which augments the training dataset with self-generated data labeled by a proxy score function.
1 code implementation • 30 May 2023 • Yunhui Jang, Dongwoo Kim, Sungsoo Ahn
Generating graphs from a target distribution is a significant challenge across many domains, including drug discovery and social network analysis.
1 code implementation • 5 Feb 2024 • Seongsu Kim, Sungsoo Ahn
This work studies machine learning for electron density prediction, which is fundamental for understanding chemical systems and density functional theory (DFT) simulations.
no code implementations • ICML 2018 • Sungsoo Ahn, Michael Chertkov, Adrian Weller, Jinwoo Shin
Probabilistic graphical models are a key tool in machine learning applications.
no code implementations • 5 Jan 2018 • Sungsoo Ahn, Michael Chertkov, Jinwoo Shin, Adrian Weller
Recently, so-called gauge transformations were used to improve variational lower bounds on $Z$.
no code implementations • NeurIPS 2017 • Sungsoo Ahn, Michael Chertkov, Jinwoo Shin
Computing the partition function is the most important statistical inference task arising in applications of Graphical Models (GMs).
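For reference, the partition function of a small binary graphical model can be computed by brute force; the exponential cost of this exact computation is what motivates the approximation schemes studied in these papers (the factor-table representation below is an assumption for illustration):

```python
import itertools

def partition_function(n_vars, factors):
    """Sum the product of factor values over all 2^n binary assignments.

    `factors` is a list of (scope, table) pairs, where `scope` indexes
    variables and `table` maps assignments of the scope to potentials.
    """
    z = 0.0
    for x in itertools.product([0, 1], repeat=n_vars):
        p = 1.0
        for scope, table in factors:
            p *= table[tuple(x[i] for i in scope)]
        z += p
    return z

# Two independent variables: Z factorizes as (1 + 2) * (1 + 1) = 6.
z = partition_function(2, [((0,), {(0,): 1.0, (1,): 2.0}),
                           ((1,), {(0,): 1.0, (1,): 1.0})])
```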
no code implementations • 29 May 2016 • Sungsoo Ahn, Michael Chertkov, Jinwoo Shin
Furthermore, we also design an efficient rejection-free MCMC scheme for approximating the full series.
no code implementations • NeurIPS 2015 • Sungsoo Ahn, Sejun Park, Michael Chertkov, Jinwoo Shin
Max-product Belief Propagation (BP) is a popular message-passing algorithm for computing a Maximum-A-Posteriori (MAP) assignment over a distribution represented by a Graphical Model (GM).
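On a chain, max-product BP reduces to the familiar Viterbi recursion, where it computes the exact MAP assignment; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def chain_map(unaries, pairwise):
    """Exact MAP assignment on a chain MRF via max-product messages.

    `unaries`: per-node potential vectors; `pairwise`: per-edge
    potential matrices. The forward pass takes max-products, the
    backward pass follows the stored argmax pointers.
    """
    msgs, back = [np.asarray(unaries[0], float)], []
    for i in range(1, len(unaries)):
        scores = (msgs[-1][:, None] * np.asarray(pairwise[i - 1])
                  * np.asarray(unaries[i])[None, :])
        back.append(scores.argmax(axis=0))
        msgs.append(scores.max(axis=0))
    states = [int(msgs[-1].argmax())]
    for ptr in reversed(back):
        states.append(int(ptr[states[-1]]))
    return states[::-1]

# Both nodes settle on state 1 once the edge potential rewards agreement.
assignment = chain_map([[1.0, 2.0], [3.0, 1.0]], [[[1.0, 1.0], [1.0, 4.0]]])
```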
no code implementations • 3 May 2021 • Hankook Lee, Sungsoo Ahn, Seung-Woo Seo, You Young Song, Eunho Yang, Sung-Ju Hwang, Jinwoo Shin
Retrosynthesis, whose goal is to find a set of reactants for synthesizing a target product, is an emerging research area in deep learning.
no code implementations • 22 Jul 2021 • Sihyun Yu, Sangwoo Mo, Sungsoo Ahn, Jinwoo Shin
Abstract reasoning, i.e., inferring complicated patterns from given observations, is a central building block of artificial general intelligence.
no code implementations • ICLR 2022 • Jaehyung Kim, Dongyeop Kang, Sungsoo Ahn, Jinwoo Shin
Remarkably, our method is more effective in the challenging low-data and class-imbalanced regimes, and the learned augmentation policy transfers well to different tasks and models.
no code implementations • ICLR 2022 • Sungsoo Ahn, Binghong Chen, Tianzhe Wang, Le Song
In this paper, we explore the problem of generating molecules using deep neural networks, which has recently gained much interest in chemistry.
no code implementations • NeurIPS 2021 • Sihyun Yu, Sungsoo Ahn, Le Song, Jinwoo Shin
We consider the problem of searching an input maximizing a black-box objective function given a static dataset of input-output queries.
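A minimal sketch of the offline setting: fit a surrogate model on the static dataset, then return the candidate the surrogate scores highest (the ridge-regression surrogate and names are assumptions for illustration, not the paper's method):

```python
import numpy as np

def offline_argmax(X, y, candidates, ridge=1e-3):
    """Pick the candidate input maximizing a surrogate fit offline.

    The true objective is never queried; all information comes from
    the static dataset (X, y).
    """
    X, y = np.asarray(X, float), np.asarray(y, float)
    w = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
    candidates = np.asarray(candidates, float)
    return candidates[int((candidates @ w).argmax())]

best = offline_argmax([[0.0], [1.0], [2.0]], [0.0, 1.0, 2.0],
                      [[0.0], [5.0], [3.0]])
```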
no code implementations • 15 Oct 2022 • Jiye Kim, Seungbeom Lee, Dongwoo Kim, Sungsoo Ahn, Jaesik Park
Designing a neural network architecture for molecular representation is crucial for AI-driven drug discovery and molecule design.
no code implementations • 2 Jun 2023 • Jaeseung Heo, Seungbeom Lee, Sungsoo Ahn, Dongwoo Kim
Graph-based models have become increasingly important in various domains, but the limited size and diversity of existing graph datasets often constrain their performance.
no code implementations • 2 Jun 2023 • Hyeonah Kim, Minsu Kim, Sungsoo Ahn, Jinkyoo Park
Deep reinforcement learning (DRL) has significantly advanced the field of combinatorial optimization (CO).
no code implementations • 5 Oct 2023 • Hyosoon Jang, Minsu Kim, Sungsoo Ahn
In particular, we focus on improving GFlowNets with partial inference: training flow functions with evaluations of intermediate states or transitions.
no code implementations • 11 Oct 2023 • Seonghyun Park, Narae Ryu, Gahee Kim, Dongyeop Woo, Se-Young Yun, Sungsoo Ahn
In this work, we propose to resolve such a redundancy via the non-backtracking graph neural network (NBA-GNN) that updates a message without incorporating the message from the previously visited node.
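The non-backtracking update keeps messages on directed edges and excludes the edge that would send information straight back; a toy sketch with sum aggregation (the aggregation and names are assumptions, not the NBA-GNN layer itself):

```python
from collections import defaultdict

def non_backtracking_messages(edges, features, steps=1):
    """Propagate messages on directed edges, never backtracking.

    The message on edge (u, v) aggregates messages on edges (w, u)
    with w != v, so information never immediately returns to the
    node it just came from.
    """
    msg = {(u, v): features[u] for u, v in edges}
    in_edges = defaultdict(list)
    for u, v in edges:
        in_edges[v].append(u)
    for _ in range(steps):
        msg = {(u, v): features[u]
               + sum(msg[(w, u)] for w in in_edges[u] if w != v)
               for u, v in edges}
    return msg

# Path 0 - 1 - 2 with both edge directions.
msgs = non_backtracking_messages([(0, 1), (1, 0), (1, 2), (2, 1)],
                                 {0: 1.0, 1: 10.0, 2: 100.0})
```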
no code implementations • 9 Dec 2023 • YeonJoon Jung, Sungsoo Ahn
In this work, we introduce a new graph neural network layer called Triplet Edge Attention (TEA), an edge-aware graph attention layer.
no code implementations • NeurIPS 2023 • Hyosoon Jang, Seonghyun Park, Sangwoo Mo, Sungsoo Ahn
This paper studies structured node classification on graphs, where the predictions should consider dependencies between the node labels.
no code implementations • 5 Feb 2024 • Hyomin Kim, Yunhui Jang, Jaeho Lee, Sungsoo Ahn
In this paper, we study hybrid neural representations for spherical data, a domain of increasing relevance in scientific research.