Search Results for author: Jihoon Ko

Found 10 papers, 8 papers with code

TensorCodec: Compact Lossy Compression of Tensors without Strong Data Assumptions

1 code implementation • 19 Sep 2023 • Taehyung Kwon, Jihoon Ko, Jinhong Jung, Kijung Shin

While many tensor compression algorithms are available, most of them rely on strong assumptions about the data, such as its order, sparsity, rank, and smoothness.

NeuKron: Constant-Size Lossy Compression of Sparse Reorderable Matrices and Tensors

1 code implementation • 9 Feb 2023 • Taehyung Kwon, Jihoon Ko, Jinhong Jung, Kijung Shin

The updates take time linear in the number of non-zeros in the input matrix, and the approximation of each entry can be retrieved in logarithmic time.
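
NeuKron builds on Kronecker-product structure, which is where logarithmic-time entry retrieval comes from: in a Kronecker power of a small seed matrix, an entry is the product of one seed entry per digit of its row and column indices. The sketch below shows that indexing idea with plain Kronecker powers; it is a toy illustration, not NeuKron's actual neural model, and the seed values are arbitrary.

```python
import numpy as np

def kronecker_power_entry(seed: np.ndarray, i: int, j: int, depth: int) -> float:
    """Entry (i, j) of the depth-fold Kronecker power of a small seed matrix.

    The entry is a product of `depth` seed entries, one per base-`rows` digit
    of i and base-`cols` digit of j, so retrieval costs O(depth) = O(log n)
    for an n x n matrix with n = seed.shape[0] ** depth.
    """
    rows, cols = seed.shape
    value = 1.0
    for _ in range(depth):
        value *= seed[i % rows, j % cols]
        i //= rows
        j //= cols
    return value

seed = np.array([[0.9, 0.5],
                 [0.5, 0.1]])
# An 8 x 8 matrix (depth 3): each entry is recovered with 3 multiplications.
print(kronecker_power_entry(seed, i=5, j=2, depth=3))
```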

BeGin: Extensive Benchmark Scenarios and An Easy-to-use Framework for Graph Continual Learning

1 code implementation • 26 Nov 2022 • Jihoon Ko, Shinhwan Kang, Taehyung Kwon, Heechan Moon, Kijung Shin

Compared to CL methods for other types of data, however, CL methods for graph data (graph CL) are relatively underexplored because of (a) the lack of standard experimental settings, especially regarding how to deal with the dependency between instances, (b) the lack of benchmark datasets and scenarios, and (c) high complexity in implementation and evaluation due to the dependency.

Continual Learning

Deep-Learning-Based Precipitation Nowcasting with Ground Weather Station Data and Radar Data

1 code implementation • 20 Oct 2022 • Jihoon Ko, Kyuhan Lee, Hyunjin Hwang, Kijung Shin

Recently, many deep-learning techniques have been applied to various weather-related prediction tasks, including precipitation nowcasting (i.e., predicting precipitation levels and locations in the near future).

Effective Training Strategies for Deep-learning-based Precipitation Nowcasting and Estimation

no code implementations • 17 Feb 2022 • Jihoon Ko, Kyuhan Lee, Hyunjin Hwang, Seok-Geun Oh, Seok-Woo Son, Kijung Shin

It is highlighted that our pre-training scheme and new loss function improve the critical success index (CSI) of nowcasting of heavy rainfall (at least 10 mm/hr) by up to 95.7% and 43.6%, respectively, at a 5-hr lead time.
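
For reference, the critical success index mentioned here is the standard forecast verification score hits / (hits + misses + false alarms). Below is a minimal sketch using the 10 mm/hr heavy-rainfall threshold from the abstract; the function name and inputs are illustrative, not the paper's code.

```python
def critical_success_index(pred_mm_per_hr, obs_mm_per_hr, threshold=10.0):
    """CSI = hits / (hits + misses + false alarms) for rainfall events
    at or above a threshold (10 mm/hr = the heavy-rainfall case above)."""
    hits = misses = false_alarms = 0
    for p, o in zip(pred_mm_per_hr, obs_mm_per_hr):
        pred_event, obs_event = p >= threshold, o >= threshold
        if pred_event and obs_event:
            hits += 1
        elif obs_event:
            misses += 1
        elif pred_event:
            false_alarms += 1
    denom = hits + misses + false_alarms
    return hits / denom if denom else float("nan")
```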

Learning to Pool in Graph Neural Networks for Extrapolation

no code implementations • 11 Jun 2021 • Jihoon Ko, Taehyung Kwon, Kijung Shin, Juho Lee

However, according to a recent study, a careful choice of pooling functions, which are used for the aggregation and readout operations in GNNs, is crucial for enabling GNNs to extrapolate.
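
To make "pooling functions" concrete, here is a minimal sketch of a graph-level readout over node embeddings. The sum, mean, and max choices shown are standard options, not necessarily the ones the paper recommends; the behavior comment reflects the general intuition that sum scales with graph size while mean and max stay bounded.

```python
import torch

def readout(node_embeddings: torch.Tensor, mode: str = "sum") -> torch.Tensor:
    """Pool node embeddings (shape [num_nodes, dim]) into one graph vector.

    The pooling choice matters for extrapolation: a sum readout grows with
    graph size, while mean and max stay bounded, so they generalize
    differently to graphs larger than those seen during training.
    """
    if mode == "sum":
        return node_embeddings.sum(dim=0)
    if mode == "mean":
        return node_embeddings.mean(dim=0)
    if mode == "max":
        return node_embeddings.max(dim=0).values
    raise ValueError(f"unknown pooling mode: {mode}")
```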

Evolution of Real-world Hypergraphs: Patterns and Models without Oracles

1 code implementation • 28 Aug 2020 • Yunbum Kook, Jihoon Ko, Kijung Shin

What kind of macroscopic structural and dynamical patterns can we observe in real-world hypergraphs?

Social and Information Networks

SSumM: Sparse Summarization of Massive Graphs

2 code implementations • 1 Jun 2020 • Kyuhan Lee, Hyeonsoo Jo, Jihoon Ko, Sungsu Lim, Kijung Shin

SSumM not only merges nodes together but also sparsifies the summary graph, and the two strategies are carefully balanced based on the minimum description length principle.

Databases · Social and Information Networks · H.2.8
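
The abstract above mentions balancing node merging and sparsification via the minimum description length principle. The sketch below illustrates the general MDL trade-off: a candidate operation is kept only if the bits needed to encode the summary plus the corrections shrink. The cost model here is a crude stand-in, not SSumM's actual objective.

```python
import math

def edge_encoding_cost(num_edges: int, num_nodes: int) -> float:
    """Bits to encode num_edges edges among num_nodes nodes
    (log2 of the number of possible node pairs per edge; a crude encoder)."""
    pairs = num_nodes * (num_nodes - 1) / 2
    return num_edges * math.log2(pairs) if num_edges and pairs >= 2 else 0.0

def description_length(summary_edges: int, summary_nodes: int,
                       correction_edges: int, original_nodes: int) -> float:
    # Total cost = summary graph + edge corrections needed to recover the input.
    return (edge_encoding_cost(summary_edges, summary_nodes)
            + edge_encoding_cost(correction_edges, original_nodes))

# Accept a node merge (or the dropping of a sparse superedge) only if it
# lowers the total description length. Numbers below are made up.
before = description_length(500, 100, 40, 1000)
after = description_length(480, 99, 55, 1000)
print("accept" if after < before else "reject")
```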

Hypergraph Motifs: Concepts, Algorithms, and Discoveries

2 code implementations • 4 Mar 2020 • Geon Lee, Jihoon Ko, Kijung Shin

(Q3) how can we identify the domains that hypergraphs come from?

Social and Information Networks · Databases · Data Structures and Algorithms · H.2.8

MONSTOR: An Inductive Approach for Estimating and Maximizing Influence over Unseen Networks

1 code implementation • 24 Jan 2020 • Jihoon Ko, Kyuhan Lee, Kijung Shin, Noseong Park

In this work, we present an inductive machine learning method, called Monte Carlo Simulator (MONSTOR), for estimating the influence of given seed nodes in social networks unseen during training.
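
For context, the quantity MONSTOR estimates is conventionally obtained by repeated Monte Carlo simulation of a diffusion process, which is expensive; MONSTOR replaces such simulations with a learned, inductive estimator. The sketch below shows the baseline Monte Carlo estimate under the independent cascade model; the diffusion model, graph format, and toy values are illustrative assumptions, not MONSTOR itself.

```python
import random

def monte_carlo_influence(graph, seeds, num_simulations=1000, rng_seed=0):
    """Estimate the expected influence (number of activated nodes) of `seeds`
    under the independent cascade model by repeated simulation.

    `graph` maps each node to a list of (neighbor, activation_probability)
    pairs. This is the costly computation an inductive estimator learns to
    approximate.
    """
    rng = random.Random(rng_seed)
    total = 0
    for _ in range(num_simulations):
        active = set(seeds)
        frontier = list(seeds)
        while frontier:
            nxt = []
            for u in frontier:
                for v, p in graph.get(u, []):
                    # Each newly activated node gets one chance to activate
                    # each inactive neighbor, with the edge's probability.
                    if v not in active and rng.random() < p:
                        active.add(v)
                        nxt.append(v)
            frontier = nxt
        total += len(active)
    return total / num_simulations

toy_graph = {"a": [("b", 0.5), ("c", 0.2)], "b": [("c", 0.4)], "c": []}
print(monte_carlo_influence(toy_graph, seeds=["a"]))
```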
