1 code implementation • 22 Jun 2022 • Hongjoon Ahn, Yongyi Yang, Quan Gan, Taesup Moon, David Wipf
Moreover, the complexity of this trade-off is compounded in the heterogeneous graph case due to the disparate heterophily relationships between nodes of different types.
no code implementations • ICCV 2021 • Hongjoon Ahn, Jihwan Kwak, Subin Lim, Hyeonsu Bang, Hyojun Kim, Taesup Moon
To that end, our analysis shows that computing the softmax probabilities by combining the output scores of all old and new classes could be the main cause of the bias.
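A minimal sketch of the idea hinted at in this snippet: separating the softmax normalization over old and new class scores instead of combining them. The function names and exact loss composition below are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def combined_softmax_loss(logits, target):
    # Standard approach: one softmax over all old + new class scores.
    # When training only on new-task data, this pushes old-class scores
    # down and biases predictions toward new classes.
    return F.cross_entropy(logits, target)

def separated_softmax_loss(logits, target, n_old):
    # Hypothetical alternative: normalize old and new class scores
    # separately so new-task gradients do not suppress old-class scores.
    old_logits, new_logits = logits[:, :n_old], logits[:, n_old:]
    is_new = target >= n_old
    loss = 0.0
    if is_new.any():
        loss = loss + F.cross_entropy(new_logits[is_new], target[is_new] - n_old)
    if (~is_new).any():
        loss = loss + F.cross_entropy(old_logits[~is_new], target[~is_new])
    return loss
```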
no code implementations • NeurIPS 2020 • Sangwon Jung, Hongjoon Ahn, Sungmin Cha, Taesup Moon
We propose a novel regularization-based continual learning method, dubbed Adaptive Group Sparsity based Continual Learning (AGS-CL), which uses two group-sparsity-based penalties.
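To make the building block concrete, here is a generic group-lasso penalty over node-wise weight groups, which is the kind of group-sparsity term the abstract refers to. The grouping (one group per output node) and the regularization strength are illustrative assumptions, and the snippet omits the task-importance weighting that AGS-CL adds on top.

```python
import torch

def group_sparsity_penalty(weight, mu=0.1):
    # Group-lasso style penalty: sum of L2 norms of each node's
    # incoming weight vector (one group per output node / row).
    # mu is an arbitrary illustrative regularization strength.
    return mu * weight.norm(dim=1).sum()

# As described in the abstract, AGS-CL combines two penalties of this
# form: roughly, one that keeps unimportant nodes sparse and available
# for future tasks, and one that discourages drift in nodes deemed
# important for past tasks.
```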
1 code implementation • NeurIPS 2019 • Hongjoon Ahn, Sungmin Cha, DongGyu Lee, Taesup Moon
We introduce a new neural network-based continual learning algorithm, dubbed Uncertainty-regularized Continual Learning (UCL), which builds on the traditional Bayesian online learning framework with variational inference.
Ranked #11 on Continual Learning on ASC (19 tasks)
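A rough sketch of the general uncertainty-based regularization idea, assuming a mean-field Gaussian variational posterior with per-weight means and standard deviations; the actual UCL penalty is defined differently in detail, so treat the names and scaling below as assumptions.

```python
import torch

def uncertainty_regularizer(mu, prev_mu, prev_sigma, base_strength=1.0):
    # Per-weight quadratic penalty whose strength grows as the previously
    # learned posterior std-dev (prev_sigma) shrinks: weights the old
    # posterior was certain about are pulled more strongly toward their
    # old means, while uncertain weights stay free to adapt.
    strength = base_strength / (prev_sigma ** 2 + 1e-8)
    return 0.5 * (strength * (mu - prev_mu) ** 2).sum()
```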
no code implementations • 24 Feb 2019 • Hongjoon Ahn, Taesup Moon
We propose a novel iterative channel estimation (ICE) algorithm that essentially removes the critical known-noisy-channel assumption for the universal discrete denoising problem.
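The snippet below only illustrates the general alternating idea of iterative channel estimation under an assumed setup: a channel-aware discrete denoiser `denoise_fn` (e.g. DUDE-like) is treated as a black box, and the channel transition matrix is re-estimated from its output by counting. It is not the paper's ICE algorithm.

```python
import numpy as np

def iterative_channel_estimation(noisy, denoise_fn, n_symbols, n_iters=10):
    # Alternate between (1) denoising with the current channel estimate
    # and (2) re-estimating the channel transition matrix by counting how
    # often each estimated clean symbol appears as each noisy symbol.
    channel = np.full((n_symbols, n_symbols), 1.0 / n_symbols)  # uniform init
    estimate = noisy
    for _ in range(n_iters):
        estimate = denoise_fn(noisy, channel)
        counts = np.ones((n_symbols, n_symbols))  # Laplace smoothing
        for x_hat, z in zip(estimate, noisy):
            counts[x_hat, z] += 1
        channel = counts / counts.sum(axis=1, keepdims=True)
    return channel, estimate
```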