2 code implementations • 13 Sep 2024 • Seonkyu Lim, Jeongwhan Choi, Noseong Park, Sang-Ha Yoon, ShinHyuck Kang, Young-Min Kim, Hyunjoong Kang
However, DFMs face two main challenges: i) they fail to capture economic uncertainties such as sudden recessions or booms, and ii) they are limited in capturing irregular dynamics from mixed-frequency data.
1 code implementation • 17 Jul 2024 • Jeongeun Lee, SeongKu Kang, Won-Yong Shin, Jeongwhan Choi, Noseong Park, Dongha Lee
Cross-domain recommendation (CDR) extends conventional recommender systems by leveraging user-item interactions from dense domains to mitigate data sparsity and the cold start problem.
1 code implementation • 6 Jun 2024 • Jeongwhan Choi, Sumin Park, Hyowon Wi, Sung-Bae Cho, Noseong Park
Recent research on graph neural networks (GNNs) has identified a critical issue known as "over-squashing," which results from bottlenecks in the graph structure and impedes the propagation of long-range information.
Ranked #1 on Graph Classification on REDDIT-BINARY
2 code implementations • 8 May 2024 • Seoyoung Hong, Jeongwhan Choi, Yeon-Chang Lee, Srijan Kumar, Noseong Park
However, existing methods still have room to improve the trade-offs among accuracy, efficiency, and robustness.
Ranked #1 on Recommendation Systems on Yelp2018 (HR@10 metric)
no code implementations • 1 May 2024 • Chaejeong Lee, Jeongwhan Choi, Hyowon Wi, Sung-Bae Cho, Noseong Park
In this paper, we propose a novel Stochastic sampling for i) COntrastive views and ii) hard NEgative samples (SCONE) to overcome these issues.
no code implementations • 6 Jan 2024 • Jeongwhan Choi, Duksan Ryu
We propose a novel approach called QoS-aware graph contrastive learning (QAGCL) for web service recommendation.
no code implementations • 27 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Chaejeong Lee, Sung-Bae Cho, Dongha Lee, Noseong Park
In this paper, inspired by the reaction-diffusion equation, we propose a novel CL method for recommender systems called the reaction-diffusion graph contrastive learning model (RDGCL).
1 code implementation • 19 Dec 2023 • Youn-Yeol Yu, Jeongwhan Choi, Woojin Cho, Kookjin Lee, Nayong Kim, Kiseok Chang, Chang-Seung Woo, Ilho Kim, Seok-Woo Lee, Joon-Young Yang, Sooyoung Yoon, Noseong Park
These methods are typically designed to i) reduce the computational cost of solving physical dynamics and/or ii) enhance the solution accuracy in fluid and rigid-body dynamics.
Ranked #1 on Physical Simulations on Deformable Plate
2 code implementations • 16 Dec 2023 • Yehjin Shin, Jeongwhan Choi, Hyowon Wi, Noseong Park
We show, for the first time, that the same problem occurs in the sequential recommendation (SR) domain.
Ranked #1 on Sequential Recommendation on LastFM
no code implementations • 12 Dec 2023 • Jayoung Kim, Yehjin Shin, Jeongwhan Choi, Hyowon Wi, Noseong Park
Structured data, which constitutes a significant portion of existing data types, has been a long-standing research topic in the field of machine learning.
no code implementations • 7 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Jayoung Kim, Yehjin Shin, Kookjin Lee, Nathaniel Trask, Noseong Park
We propose graph-filter-based self-attention (GFSA) to learn a general yet effective attention mechanism, at a complexity only slightly higher than that of the original self-attention mechanism.
Ranked #1 on Speech Recognition on LibriSpeech 100h test-other
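The graph-filter view of self-attention can be sketched on toy matrices (an illustrative sketch, not the paper's exact GFSA formulation; the polynomial weights `w0`, `w1`, `w2` are hypothetical stand-ins for learned coefficients): a row-stochastic attention matrix acts as a low-pass graph filter, and replacing a single application of it with a small matrix polynomial lets higher-frequency components through.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                # 4 tokens, 8-dim features

scores = X @ X.T / np.sqrt(X.shape[1])     # scaled dot-product scores
A = np.exp(scores)
A = A / A.sum(axis=1, keepdims=True)       # softmax -> row-stochastic attention

# Plain attention: one application of the (low-pass) filter A.
low_pass = A @ X

# Polynomial graph filter in A; weights would be learned in practice.
w0, w1, w2 = 0.5, 0.4, 0.1
poly = w0 * np.eye(4) + w1 * A + w2 * (A @ A)
filtered = poly @ X

print(low_pass.shape, filtered.shape)      # (4, 8) (4, 8)
```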
no code implementations • 8 Nov 2023 • Seonkyu Lim, Jaehyeon Park, Seojin Kim, Hyowon Wi, Haksoo Lim, Jinsung Jeon, Jeongwhan Choi, Noseong Park
Long-term time series forecasting (LTSF) is a challenging task that has been investigated in various domains such as finance investment, health care, traffic, and weather forecasting.
2 code implementations • 20 Mar 2023 • Jeongwhan Choi, Noseong Park
A prevalent approach in the field is to combine graph convolutional networks and recurrent neural networks for the spatio-temporal processing.
Ranked #5 on Traffic Prediction on PeMSD3
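The prevalent GCN-plus-RNN pattern for spatio-temporal processing can be sketched minimally (illustrative only; the weights `W_g` and `W_h` are random stand-ins for learned parameters, not the paper's model): a graph convolution mixes node signals spatially at each timestep, and a simple recurrent update carries the hidden state forward in time.

```python
import numpy as np

rng = np.random.default_rng(1)
N, F, T = 5, 3, 4                          # nodes, features, timesteps

A = (rng.random((N, N)) < 0.4).astype(float)
A = np.maximum(A, A.T)                     # symmetric adjacency
A_hat = A + np.eye(N)                      # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalization

W_g = rng.normal(size=(F, F))              # graph-conv weight (hypothetical)
W_h = rng.normal(size=(F, F))              # recurrent weight (hypothetical)

H = np.zeros((N, F))                       # hidden state per node
for t in range(T):
    X_t = rng.normal(size=(N, F))          # node signals at timestep t
    spatial = np.tanh(A_norm @ X_t @ W_g)  # spatial mixing via GCN
    H = np.tanh(spatial + H @ W_h)         # temporal update via simple RNN

print(H.shape)                             # (5, 3)
```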
2 code implementations • Knowledge and Information Systems 2023 • Hwangyong Choi, Jeongwhan Choi, Jeehyun Hwang, Kookjin Lee, Dongeun Lee, Noseong Park
Owing to the remarkable development of deep learning technology, there have been a series of efforts to build deep learning-based climate models.
1 code implementation • 25 Nov 2022 • Jeongwhan Choi, Seoyoung Hong, Noseong Park, Sung-Bae Cho
In particular, diffusion equations have been widely used to design the core processing layers of GNNs, which are therefore inevitably vulnerable to the notorious oversmoothing problem.
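Why diffusion-based layers oversmooth can be demonstrated on a toy graph (a generic random-walk diffusion for illustration, not the paper's model): repeatedly applying a diffusion operator drives all node features toward a common value, erasing the distinctions between nodes.

```python
import numpy as np

# Small connected graph with self-loops; P is one random-walk diffusion step.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)
P = A_hat / A_hat.sum(axis=1, keepdims=True)   # row-normalized diffusion

X = np.array([[1.0], [0.0], [0.5], [2.0]])     # distinct node features
spread_before = X.max() - X.min()

for _ in range(50):                             # many diffusion steps
    X = P @ X

spread_after = X.max() - X.min()                # features have collapsed
print(round(spread_before, 3), round(spread_after, 9))
```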
no code implementations • 22 Nov 2022 • Jaehoon Lee, Chan Kim, Gyumin Lee, Haksoo Lim, Jeongwhan Choi, Kookjin Lee, Dongeun Lee, Sanghyun Hong, Noseong Park
Forecasting future outcomes from recent time series data is not easy, especially when the future data differ from the past (i.e., the time series is under temporal drift).
1 code implementation • 17 Nov 2022 • Jeongwhan Choi, Seoyoung Hong, Noseong Park, Sung-Bae Cho
Various methods have been proposed for collaborative filtering, ranging from matrix factorization to graph convolutional methods.
Ranked #1 on Collaborative Filtering on Amazon-Book
2 code implementations • 30 Aug 2022 • Seoyoung Hong, Heejoo Shin, Jeongwhan Choi, Noseong Park
In addition, owing to the continuous and bijective characteristics of NODEs, we design a one-shot price-optimization method for a pre-trained prediction model, which requires only one iteration to find the optimal solution.
1 code implementation • 7 Dec 2021 • Jeongwhan Choi, Hwangyong Choi, Jeehyun Hwang, Noseong Park
A prevalent approach in the field is to combine graph convolutional networks and recurrent neural networks for the spatio-temporal processing.
2 code implementations • 14 Nov 2021 • Taeyong Kong, Taeri Kim, Jinsung Jeon, Jeongwhan Choi, Yeon-Chang Lee, Noseong Park, Sang-Wook Kim
To our knowledge, we are the first to design such a hybrid method and to report the correlation between graph centrality and the linearity/non-linearity of nodes.
2 code implementations • 11 Nov 2021 • Jeehyun Hwang, Jeongwhan Choi, Hwangyong Choi, Kookjin Lee, Dongeun Lee, Noseong Park
On the other hand, neural ordinary differential equations (NODEs) learn a latent governing ODE from data.
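The NODE idea can be sketched as follows (a toy forward-Euler integration with random weights standing in for a trained network; production NODEs use adaptive solvers such as Dormand–Prince): a network `f(h, t)` defines the derivative dh/dt, and the forward pass numerically integrates it over time.

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(3, 3)) * 0.1          # stand-in for learned parameters

def f(h, t):
    # Learned vector field dh/dt = f(h, t); here a toy tanh network.
    return np.tanh(h @ W)

def node_forward(h0, t0=0.0, t1=1.0, steps=100):
    # Forward pass of a NODE = numerical integration of f (forward Euler).
    h, dt = h0, (t1 - t0) / steps
    for i in range(steps):
        h = h + dt * f(h, t0 + i * dt)
    return h

h0 = rng.normal(size=(1, 3))               # initial latent state
h1 = node_forward(h0)                      # state after integrating to t=1
print(h0.shape, h1.shape)                  # (1, 3) (1, 3)
```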
2 code implementations • 8 Aug 2021 • Jeongwhan Choi, Jinsung Jeon, Noseong Park
In this work, we extend them based on neural ordinary differential equations (NODEs), since the linear GCN concept can be interpreted as a differential equation, and present Learnable-Time ODE-based Collaborative Filtering (LT-OCF).
Ranked #1 on Recommendation Systems on Amazon-book
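The linear-GCN-as-ODE interpretation behind LT-OCF can be illustrated on toy matrices (a sketch of the general equivalence, not the paper's implementation): LightGCN-style propagation E_{k+1} = A_norm E_k is the unit-step Euler discretization of dE/dt = (A_norm - I) E, so layer depth becomes a continuous, and potentially learnable, integration time.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 6
A = (rng.random((N, N)) < 0.5).astype(float)
A = np.maximum(A, A.T) + np.eye(N)          # symmetric, with self-loops
D = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_norm = D @ A @ D                          # normalized propagation matrix

E0 = rng.normal(size=(N, 4))                # toy embeddings

# Discrete view: two linear propagation layers.
E_discrete = A_norm @ (A_norm @ E0)

# Continuous view: Euler steps of dE/dt = (A_norm - I) E with step size 1,
# which recovers the discrete layers exactly.
E, dt = E0.copy(), 1.0
for _ in range(2):
    E = E + dt * ((A_norm - np.eye(N)) @ E)

print(np.allclose(E, E_discrete))           # True
```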