1 code implementation • 2 Feb 2024 • Hyunjin Seo, Jihun Yun, Eunho Yang
Since the pioneering work on the lottery ticket hypothesis for graph neural networks (GNNs) by Chen et al. (2021), finding graph lottery tickets (GLTs) has become a pivotal focus in the GNN community, inspiring researchers to discover ever-sparser GLTs that achieve performance comparable to the original dense networks.
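As a rough illustration of the setting, here is a minimal magnitude-pruning sketch, assuming a GLT jointly sparsifies a GNN weight matrix and the graph adjacency; the masking rule and pruning fractions are illustrative assumptions, not the paper's algorithm:

```python
# Minimal sketch: magnitude-based masks over both GNN weights and the
# (dense, weighted) adjacency, in the spirit of graph lottery tickets.
import numpy as np

def magnitude_mask(x: np.ndarray, sparsity: float) -> np.ndarray:
    """0/1 mask zeroing out the smallest-magnitude `sparsity` fraction."""
    k = int(x.size * sparsity)
    if k == 0:
        return np.ones_like(x)
    threshold = np.partition(np.abs(x).ravel(), k - 1)[k - 1]
    return (np.abs(x) > threshold).astype(x.dtype)

rng = np.random.default_rng(0)
weight = rng.standard_normal((64, 32))   # a GNN layer's weight matrix
adjacency = rng.random((100, 100))       # a weighted adjacency matrix

sparse_weight = weight * magnitude_mask(weight, 0.5)           # the "ticket":
sparse_adjacency = adjacency * magnitude_mask(adjacency, 0.5)  # jointly sparse
```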
no code implementations • NeurIPS 2021 • Jihun Yun, Aurelie C. Lozano, Eunho Yang
We consider the training of structured neural networks where the regularizer can be non-smooth and possibly non-convex.
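For concreteness, a minimal sketch of one proximal step for a representative non-smooth structured regularizer (group lasso); the paper's framework covers a much broader, possibly non-convex family, so this is only an instance, not the method itself:

```python
# One prox-SGD step with a group-lasso regularizer: gradient step on the
# smooth loss, then row-wise group soft-thresholding.
import numpy as np

def prox_group_lasso(z: np.ndarray, step: float, lam: float) -> np.ndarray:
    """Row-wise group soft-thresholding: prox of lam * sum_g ||w_g||_2."""
    norms = np.linalg.norm(z, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
    return scale * z

rng = np.random.default_rng(0)
w = rng.standard_normal((5, 3))       # 5 groups (rows) of 3 parameters each
grad = rng.standard_normal((5, 3))    # stochastic gradient of the smooth loss
w = prox_group_lasso(w - 0.1 * grad, step=0.1, lam=0.5)  # one proximal step
```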
no code implementations • ICCV 2021 • Jung Hyun Lee, Jihun Yun, Sung Ju Hwang, Eunho Yang
Network quantization, which reduces the bit-lengths of network weights and activations, has emerged as a key technique for deploying models on resource-limited devices.
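A minimal sketch of the basic operation, assuming simple symmetric uniform quantization; the paper studies more sophisticated, learned schemes, so this only illustrates what "reducing bit-lengths" means:

```python
# Map float weights onto a b-bit signed uniform grid and back, then
# measure the resulting quantization error.
import numpy as np

def quantize(w: np.ndarray, bits: int) -> np.ndarray:
    """Symmetric uniform quantization of w onto 2**bits - 1 signed levels."""
    levels = 2 ** (bits - 1) - 1              # e.g. 127 for 8 bits
    scale = np.abs(w).max() / levels
    return np.clip(np.round(w / scale), -levels, levels) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(256)
w_q4 = quantize(w, bits=4)                    # 4-bit weights
print(np.abs(w - w_q4).mean())                # mean quantization error
```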
no code implementations • 1 Jan 2021 • Jung Hyun Lee, Jihun Yun, Sung Ju Hwang, Eunho Yang
As a natural extension of DropBits, we further introduce a method for learning heterogeneous quantization levels, using DropBits to find the proper bit-length for each layer.
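As a hedged sketch of the flavor of this idea, the snippet below randomly drops low-order bits per layer, dropout-style, so each layer ends up with its own effective bit-length; the concrete drop rule is an assumption for illustration, not the paper's DropBits mechanism:

```python
# Dropout-style relaxation of per-layer bit-width (illustrative rule only).
import numpy as np

rng = np.random.default_rng(0)

def quantize(w: np.ndarray, bits: int) -> np.ndarray:
    """Symmetric uniform quantization of w onto 2**bits - 1 signed levels."""
    levels = 2 ** (bits - 1) - 1
    scale = np.abs(w).max() / levels
    return np.clip(np.round(w / scale), -levels, levels) * scale

def random_bit_drop(w: np.ndarray, max_bits: int = 8, drop_prob: float = 0.3):
    """Randomly shrink the effective bit-width for one forward pass."""
    bits = max_bits
    while bits > 2 and rng.random() < drop_prob:
        bits -= 1                               # drop one bit with prob drop_prob
    return quantize(w, bits), bits

for i, w in enumerate(rng.standard_normal((3, 64))):   # three layers
    _, bits = random_bit_drop(w)
    print(f"layer {i}: effective bit-length = {bits}")
```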
no code implementations • 15 Jul 2020 • Jihun Yun, Aurelie C. Lozano, Eunho Yang
We propose a unified framework for stochastic proximal gradient descent, which we term ProxGen, that allows for arbitrary positive preconditioners and lower semi-continuous regularizers.
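A minimal sketch of one such update, assuming an Adam-style diagonal preconditioner and an L1 regularizer, whose preconditioned prox is coordinate-wise soft-thresholding; the constants are illustrative, and other preconditioners and regularizers fit the same template:

```python
# Preconditioned proximal-gradient step: gradient step scaled by a
# diagonal preconditioner, then soft-thresholding with a matching
# per-coordinate threshold.
import numpy as np

def prox_step(w, grad, second_moment, lr=1e-3, lam=1e-3, eps=1e-8):
    precond = np.sqrt(second_moment) + eps                  # diagonal preconditioner
    z = w - lr * grad / precond                             # preconditioned step
    thresh = lr * lam / precond                             # per-coordinate threshold
    return np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0) # soft-threshold (L1 prox)

rng = np.random.default_rng(0)
w, g = rng.standard_normal(10), rng.standard_normal(10)
v = 0.999 * g**2            # running second-moment estimate (illustrative)
w = prox_step(w, g, v)
```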
no code implementations • 29 Nov 2019 • Jung Hyun Lee, Jihun Yun, Sung Ju Hwang, Eunho Yang
As a natural extension of DropBits, we further introduce a method for learning heterogeneous quantization levels, using DropBits to find the proper bit-length for each layer.
no code implementations • 26 May 2019 • Jihun Yun, Aurelie C. Lozano, Eunho Yang
Extensive experiments reveal that block-diagonal approaches achieve state-of-the-art results on several deep learning tasks and can outperform adaptive diagonal methods, vanilla SGD, and a recently proposed modified version of full-matrix adaptation.
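A minimal sketch of the block-diagonal idea, assuming small fixed-size parameter blocks and a per-block full-matrix Adagrad-style preconditioner; the block size and the eigendecomposition-based inverse square root are illustrative choices, sitting between diagonal and full-matrix adaptation in cost:

```python
# Per-block full-matrix Adagrad: accumulate g g^T within each small block
# and precondition that block's update by the inverse square root.
import numpy as np

def block_adagrad_step(w, grad, G_blocks, block_size=4, lr=0.1, eps=1e-6):
    w = w.copy()
    for i, start in enumerate(range(0, w.size, block_size)):
        g = grad[start:start + block_size]
        G_blocks[i] += np.outer(g, g)           # accumulate per-block outer products
        # inverse square root via eigendecomposition of the small block
        vals, vecs = np.linalg.eigh(G_blocks[i] + eps * np.eye(g.size))
        inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
        w[start:start + block_size] -= lr * inv_sqrt @ g
    return w

rng = np.random.default_rng(0)
w = rng.standard_normal(8)
G = [np.zeros((4, 4)) for _ in range(2)]        # one accumulator per block
w = block_adagrad_step(w, rng.standard_normal(8), G)
```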