no code implementations • 20 Feb 2024 • Jinsung Jeon, Hyundong Jin, Jonghyun Choi, Sanghyun Hong, Dongeun Lee, Kookjin Lee, Noseong Park
Extensively evaluating methods with seven image recognition benchmarks, we show that the proposed PAC-FNO improves the performance of existing baseline models by up to 77.1% on images of various resolutions and with various types of natural variation in the images at inference.
1 code implementation • 19 Dec 2023 • Youn-Yeol Yu, Jeongwhan Choi, Woojin Cho, Kookjin Lee, Nayong Kim, Kiseok Chang, Chang-Seung Woo, Ilho Kim, Seok-Woo Lee, Joon-Young Yang, Sooyoung Yoon, Noseong Park
These methods are typically designed to i) reduce the computational cost in solving physical dynamics and/or ii) propose techniques to enhance the solution accuracy in fluid and rigid body dynamics.
no code implementations • 16 Dec 2023 • Woojin Cho, Seunghyeon Cho, Hyundong Jin, Jinsung Jeon, Kookjin Lee, Sanghyun Hong, Dongeun Lee, Jonghyun Choi, Noseong Park
Neural ordinary differential equations (NODEs), one of the most influential works in differential equation-based deep learning, continuously generalize residual networks and opened a new field.
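The ResNet connection behind NODEs can be illustrated with a minimal sketch (the toy vector field `f` below is a hypothetical stand-in for a learned layer, not the paper's model): one forward-Euler step of dx/dt = f(x) with step size 1 is exactly a residual block.

```python
import numpy as np

# Hypothetical toy vector field standing in for a learned layer f_theta.
def f(x):
    return np.tanh(x)

def residual_block(x):
    # ResNet update: x_{l+1} = x_l + f(x_l)
    return x + f(x)

def euler_flow(x, h, n_steps):
    # forward-Euler integration of dx/dt = f(x);
    # with h = 1 and a single step, this coincides with a residual block
    for _ in range(n_steps):
        x = x + h * f(x)
    return x

x0 = np.array([0.5, -1.0])
one_res = residual_block(x0)
one_euler = euler_flow(x0, h=1.0, n_steps=1)
```

Smaller step sizes with more steps give the continuous-depth limit that NODEs learn directly.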
no code implementations • 7 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Jayoung Kim, Yehjin Shin, Kookjin Lee, Nathaniel Trask, Noseong Park
Transformers, renowned for their self-attention mechanism, have achieved state-of-the-art performance across various tasks in natural language processing, computer vision, time-series modeling, etc.
1 code implementation • NeurIPS 2023 • Anthony Gruber, Kookjin Lee, Nathaniel Trask
Recent works have shown that physics-inspired architectures allow the training of deep graph neural networks (GNNs) without oversmoothing.
no code implementations • 22 Nov 2022 • Jaehoon Lee, Chan Kim, Gyumin Lee, Haksoo Lim, Jeongwhan Choi, Kookjin Lee, Dongeun Lee, Sanghyun Hong, Noseong Park
Forecasting future outcomes from recent time series data is not easy, especially when the future data are different from the past (i.e., the time series is under temporal drift).
no code implementations • 10 Oct 2022 • Fan Wu, Sanghyun Hong, Donsub Rim, Noseong Park, Kookjin Lee
However, parameterization of dynamics using a neural network makes it difficult for humans to identify causal structures in the data.
no code implementations • 1 Oct 2022 • Kookjin Lee, Nathaniel Trask
In this study, we propose parameter-varying neural ordinary differential equations (NODEs) where the evolution of model parameters is represented by partition-of-unity networks (POUNets), a mixture of experts architecture.
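The partition-of-unity idea can be sketched as follows (the gating form, the local linear experts, and all names are illustrative assumptions, not the paper's architecture): nonnegative gating functions that sum to one blend local expert models into a single prediction.

```python
import numpy as np

def pou_weights(x, centers, temp=1.0):
    # softmax over negative squared distances to expert centers;
    # the weights are nonnegative and sum to one (a partition of unity)
    d2 = (x[:, None] - centers[None, :]) ** 2
    logits = -d2 / temp
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

def pounet(x, centers, expert_coeffs):
    # each expert k is a local linear model a_k * x + b_k;
    # the output blends the experts with the partition weights
    w = pou_weights(x, centers)                                       # (n, K)
    experts = expert_coeffs[:, 0] * x[:, None] + expert_coeffs[:, 1]  # (n, K)
    return (w * experts).sum(axis=1)

x = np.linspace(-1.0, 1.0, 5)
centers = np.array([-0.5, 0.5])
coeffs = np.array([[1.0, 0.0], [-1.0, 0.0]])
y = pounet(x, centers, coeffs)
```

In the paper's setting such a mixture would parameterize how the NODE's parameters vary, rather than the output directly.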
1 code implementation • 13 Jul 2022 • Suneghyeon Cho, Sanghyun Hong, Kookjin Lee, Noseong Park
In this work, we propose adaptive momentum estimation neural ODEs (AdamNODEs) that adaptively control the acceleration of the classical momentum-based approach.
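The momentum-based view of NODEs can be sketched as an augmented ODE with a velocity state (the toy field `f` and the fixed damping `gamma` are assumptions for illustration; AdamNODEs additionally make this acceleration adaptive):

```python
import numpy as np

def f(x):
    # hypothetical toy vector field standing in for a learned network
    return -x

def momentum_node(x0, gamma=1.0, h=0.01, n_steps=500):
    # heavy-ball ODE:  dx/dt = v,  dv/dt = f(x) - gamma * v
    # integrated with explicit Euler steps
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(n_steps):
        x, v = x + h * v, v + h * (f(x) - gamma * v)
    return x

x_final = momentum_node(np.array([1.0]))
```

With this damped field the state decays toward the origin, illustrating how the velocity variable smooths the trajectory relative to a first-order NODE.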
no code implementations • 7 Feb 2022 • Nathaniel Trask, Carianne Martinez, Kookjin Lee, Brad Boyce
We introduce physics-informed multimodal autoencoders (PIMA) - a variational inference framework for discovering shared information in multimodal scientific datasets representative of high-throughput testing.
2 code implementations • 11 Nov 2021 • Jeehyun Hwang, Jeongwhan Choi, Hwangyong Choi, Kookjin Lee, Dongeun Lee, Noseong Park
On the other hand, neural ordinary differential equations (NODEs) learn a latent governing ODE from data.
no code implementations • 29 Sep 2021 • Jungeun Kim, Seunghyun Hwang, Jeehyun Hwang, Kookjin Lee, Dongeun Lee, Noseong Park
In other words, the knowledge contained by the learned governing equation can be injected into the neural network which approximates the PDE solution function.
no code implementations • 11 Sep 2021 • Kookjin Lee, Nathaniel Trask, Panos Stinis
Discovery of dynamical systems from data forms the foundation for data-driven modeling and recently, structure-preserving geometric perspectives have been shown to provide improved forecasting, stability, and physical realizability guarantees.
no code implementations • 7 Jul 2021 • Nat Trask, Mamikon Gulian, Andy Huang, Kookjin Lee
We enrich POU-Nets with a Gaussian noise model to obtain a probabilistic generalization amenable to gradient-based minimization of a maximum likelihood loss.
no code implementations • NeurIPS 2021 • Kookjin Lee, Nathaniel A. Trask, Panos Stinis
Forecasting of time-series data requires imposition of inductive biases to obtain predictive extrapolation, and recent works have imposed Hamiltonian/Lagrangian form to preserve structure for systems with reversible dynamics.
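The Hamiltonian inductive bias amounts to integrating dq/dt = ∂H/∂p, dp/dt = -∂H/∂q with a structure-preserving scheme. A minimal symplectic-Euler sketch for a harmonic oscillator (a textbook example, not the paper's learned model) shows the near-conservation of energy such structure buys:

```python
import numpy as np

def hamiltonian(q, p):
    # harmonic oscillator: H(q, p) = (q^2 + p^2) / 2
    return 0.5 * (q**2 + p**2)

def symplectic_euler(q, p, h=0.01, n_steps=1000):
    # integrate dq/dt = dH/dp = p,  dp/dt = -dH/dq = -q
    for _ in range(n_steps):
        p = p - h * q   # update momentum with the current position
        q = q + h * p   # update position with the *new* momentum
    return q, p

q0, p0 = 1.0, 0.0
q1, p1 = symplectic_euler(q0, p0)
energy_drift = abs(hamiltonian(q1, p1) - hamiltonian(q0, p0))
```

Unlike plain forward Euler, whose energy grows without bound, the symplectic update keeps the energy drift bounded over long horizons, which is the kind of guarantee structure-preserving forecasting aims for.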
no code implementations • 27 Jan 2021 • Kookjin Lee, Nathaniel A. Trask, Ravi G. Patel, Mamikon A. Gulian, Eric C. Cyr
Approximation theorists have established best-in-class optimal approximation rates of deep neural networks by utilizing their ability to simultaneously emulate partitions of unity and monomials.
no code implementations • 1 Jan 2021 • Jungeun Kim, Seunghyun Hwang, Jihyun Hwang, Kookjin Lee, Dongeun Lee, Noseong Park
Neural ordinary differential equations (neural ODEs) introduced an approach that approximates a neural network as a system of ODEs by treating its layer index as a continuous variable and discretizing its hidden dimension.
1 code implementation • 4 Dec 2020 • Jungeun Kim, Kookjin Lee, Dongeun Lee, Sheo Yon Jin, Noseong Park
We present a method for learning dynamics of complex physical processes described by time-dependent nonlinear partial differential equations (PDEs).
no code implementations • 28 Oct 2020 • Kookjin Lee, Eric J. Parish
This work proposes an extension of neural ordinary differential equations (NODEs) by introducing an additional set of ODE input parameters to NODEs.
no code implementations • 21 Sep 2019 • Kookjin Lee, Kevin Carlberg
In contrast to existing methods for latent dynamics learning, this is the only method that both employs a nonlinear embedding and computes dynamics for the latent state that guarantee the satisfaction of prescribed physical properties.
Computational Physics
no code implementations • 11 Jun 2019 • Duanshun Li, Jing Liu, Noseong Park, Dongeun Lee, Giridhar Ramachandran, Ali Seyedmazloom, Kookjin Lee, Chen Feng, Vadim Sokolov, Rajesh Ganesan
The 0-1 knapsack problem is of fundamental importance in computer science, business, operations research, etc.
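For reference, the standard exact solution (the classic dynamic program, not the learning-based method of the paper) runs in O(n * capacity):

```python
def knapsack_01(values, weights, capacity):
    # best[c] = maximum value achievable with total weight at most c
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate capacities downward so each item is used at most once
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# textbook instance: picking items 2 and 3 (weights 20 + 30) yields 220
result = knapsack_01([60, 100, 120], [10, 20, 30], 50)  # -> 220
```

The pseudo-polynomial dependence on `capacity` is what motivates approximate and learned solvers for large instances.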
no code implementations • 26 Jul 2017 • Noseong Park, Ankesh Anand, Joel Ruben Antony Moniz, Kookjin Lee, Tanmoy Chakraborty, Jaegul Choo, Hongkyu Park, Young-Min Kim
MMGAN finds two manifolds representing the vector representations of real and fake images.