Search Results for author: Byung-Hoon Kim

Found 11 papers, 5 papers with code

Learning Dynamic Graph Representation of Brain Connectome with Spatio-Temporal Attention

2 code implementations · NeurIPS 2021 · Byung-Hoon Kim, Jong Chul Ye, Jae-Jin Kim

Here, we propose STAGIN, a method for learning dynamic graph representation of the brain connectome with spatio-temporal attention.
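STAGIN's architecture is not reproduced here, but the core idea the abstract describes — building a sequence of windowed functional-connectivity graphs from an ROI timeseries and attention-pooling per-window graph embeddings over time — can be sketched in a few lines. The window length, stride, crude embeddings, and all function names below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def windowed_fc(timeseries, win=30, stride=10):
    """Sliding-window correlation matrices from an ROI timeseries.
    timeseries: (T, N) array of T timepoints for N brain regions."""
    T, N = timeseries.shape
    graphs = []
    for start in range(0, T - win + 1, stride):
        window = timeseries[start:start + win]          # (win, N)
        graphs.append(np.corrcoef(window.T))            # (N, N) FC graph
    return np.stack(graphs)                             # (W, N, N)

def temporal_attention_readout(graph_embeddings):
    """Softmax attention over per-window graph embeddings (toy scores)."""
    scores = graph_embeddings.mean(axis=1)              # (W,) crude scores
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights @ graph_embeddings                   # (D,) pooled vector

rng = np.random.default_rng(0)
ts = rng.standard_normal((120, 8))                      # 120 timepoints, 8 ROIs
graphs = windowed_fc(ts)                                # (10, 8, 8)
embeds = graphs.mean(axis=2)                            # toy (10, 8) embeddings
pooled = temporal_attention_readout(embeds)             # single (8,) vector
print(graphs.shape, pooled.shape)
```

The attention weights give one scalar per time-window, so the pooled vector emphasizes windows the (here, trivial) scoring function rates highly.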

Understanding Graph Isomorphism Network for rs-fMRI Functional Connectivity Analysis

1 code implementation · 10 Jan 2020 · Byung-Hoon Kim, Jong Chul Ye

This understanding enables us to exploit CNN-based saliency map techniques for the GNN, which we tailor to the proposed GIN with one-hot encoding, to visualize the important regions of the brain.

General Classification · Graph Classification
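The GIN layer this paper analyzes aggregates a node's own features with the sum of its neighbors' before a learned transform, h_v' = MLP((1 + eps) · h_v + Σ neighbor features). A minimal numpy sketch with one-hot node features (the MLP is collapsed to a single ReLU layer, and the identity weights are purely illustrative):

```python
import numpy as np

def gin_layer(H, A, W, eps=0.0):
    """One GIN layer: h_v' = ReLU(((1 + eps) * h_v + sum of neighbor h_u) @ W).
    H: (N, D) node features, A: (N, N) adjacency, W: (D, D_out) weights."""
    agg = (1.0 + eps) * H + A @ H        # self + neighbor-sum aggregation
    return np.maximum(agg @ W, 0.0)      # linear layer + ReLU stands in for an MLP

N = 4
H = np.eye(N)                            # one-hot node features
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
W = np.eye(N)                            # identity weights keep the sketch readable
H1 = gin_layer(H, A, W)                  # here equals I + A
graph_embedding = H1.sum(axis=0)         # sum readout over nodes
print(H1.shape, graph_embedding)
```

With one-hot inputs and identity weights, each output row is the node's own indicator plus its neighborhood indicator, which is what makes per-node saliency attribution tractable.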

PyNET-CA: Enhanced PyNET with Channel Attention for End-to-End Mobile Image Signal Processing

1 code implementation · 7 Apr 2021 · Byung-Hoon Kim, Joonyoung Song, Jong Chul Ye, JaeHyun Baek

Reconstructing an RGB image from RAW data obtained with a mobile device involves a number of image signal processing (ISP) tasks, such as demosaicing and denoising.

Demosaicking · Denoising
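PyNET-CA learns the full ISP pipeline end to end; for context, the classical baseline for just the demosaicing step — bilinear interpolation over an RGGB Bayer mosaic — can be sketched as follows. The pattern layout and helper names are assumptions for illustration, unrelated to the paper's code:

```python
import numpy as np

def conv2_same(x, k):
    """'Same' 2-D convolution with zero padding (small helper)."""
    ph, pw = k.shape[0] // 2, k.shape[1] // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = (xp[i:i + k.shape[0], j:j + k.shape[1]] * k).sum()
    return out

def bilinear_demosaic(raw):
    """Bilinear demosaicing of an RGGB Bayer mosaic (toy baseline, not PyNET-CA).
    raw: (H, W) single-channel mosaic; returns (H, W, 3) RGB."""
    H, W = raw.shape
    masks = np.zeros((H, W, 3))
    masks[0::2, 0::2, 0] = 1                 # R at even rows, even cols
    masks[0::2, 1::2, 1] = 1                 # G on R rows
    masks[1::2, 0::2, 1] = 1                 # G on B rows
    masks[1::2, 1::2, 2] = 1                 # B at odd rows, odd cols

    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    rgb = np.zeros((H, W, 3))
    for c in range(3):
        num = conv2_same(raw * masks[..., c], kernel)   # weighted known samples
        den = conv2_same(masks[..., c], kernel)         # normalization weights
        rgb[..., c] = num / np.maximum(den, 1e-8)
    return rgb

raw = np.tile(np.array([[0.8, 0.5], [0.5, 0.2]]), (2, 2))  # flat RGGB scene
rgb = bilinear_demosaic(raw)
print(rgb.shape)
```

On a flat scene the interpolation recovers each channel's constant value exactly; textured scenes produce the artifacts that motivate learned end-to-end ISPs like the one this paper proposes.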

A Generative Self-Supervised Framework using Functional Connectivity in fMRI Data

no code implementations · 4 Dec 2023 · JungWon Choi, Seongho Keum, Eunggu Yun, Byung-Hoon Kim, Juho Lee

Deep neural networks trained on Functional Connectivity (FC) networks extracted from functional Magnetic Resonance Imaging (fMRI) data have gained popularity due to the increasing availability of data and advances in model architectures, including Graph Neural Networks (GNNs).

Self-Supervised Learning

Joint-Embedding Masked Autoencoder for Self-supervised Learning of Dynamic Functional Connectivity from the Human Brain

no code implementations · 11 Mar 2024 · JungWon Choi, Hyungi Lee, Byung-Hoon Kim, Juho Lee

Although generative self-supervised learning techniques, especially masked autoencoders, have shown promising results in representation learning across various domains, their application to dynamic graphs for dynamic functional connectivity remains underexplored, as capturing high-level semantic representations there is challenging.

Representation Learning · Self-Supervised Learning
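The masked-autoencoding setup described above can be illustrated with a toy example: hide a random subset of time-windows of dynamic FC, predict them, and score the reconstruction only on the masked positions. The mean-of-visible "decoder" below is a deliberate placeholder for the paper's joint-embedding model, and all sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n_win, dim = 12, 36                     # 12 time-windows, 36-dim flattened FC each
dyn_fc = rng.standard_normal((n_win, dim))

mask = rng.random(n_win) < 0.5          # hide roughly half of the windows
corrupted = np.where(mask[:, None], 0.0, dyn_fc)

# Placeholder "decoder": predict each masked window as the mean of the
# visible windows (a real model would use a learned graph/sequence encoder).
prediction = corrupted.copy()
prediction[mask] = dyn_fc[~mask].mean(axis=0)

# As in masked autoencoding, the loss is computed on masked positions only.
masked_mse = ((prediction[mask] - dyn_fc[mask]) ** 2).mean()
print(round(float(masked_mse), 3))
```

Restricting the loss to masked positions is what forces the encoder to infer hidden windows from temporal context rather than copy its input.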

ERD: A Framework for Improving LLM Reasoning for Cognitive Distortion Classification

no code implementations · 21 Mar 2024 · Sehee Lim, Yejin Kim, Chi-Hyun Choi, Jy-yong Sohn, Byung-Hoon Kim

Improving the accessibility of psychotherapy with the aid of Large Language Models (LLMs) has been garnering significant attention in recent years.

Specificity
