Search Results for author: Eunggu Yun

Found 7 papers, 3 papers with code

On-Off Pattern Encoding and Path-Count Encoding as Deep Neural Network Representations

no code implementations17 Jan 2024 Euna Jung, Jaekeol Choi, Eunggu Yun, Wonjong Rhee

Specifically, we consider the On-Off pattern and PathCount for investigating how information is stored in deep representations.

Image Classification
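As a rough illustration of the idea (not the paper's implementation), an On-Off pattern can be read as a binary mask recording which units of a layer fire for a given input:

```python
import numpy as np

def on_off_pattern(activations):
    """Binarize a layer's activations: 1 where a unit is on (positive), 0 where off.

    Illustrative reading of the On-Off pattern encoding described in the
    abstract, not the authors' official code.
    """
    return (np.asarray(activations) > 0).astype(int)

acts = np.array([0.0, 1.3, -0.2, 0.7])  # hypothetical pre-activation values
pattern = on_off_pattern(acts)          # binary on/off mask for this input
```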

A Generative Self-Supervised Framework using Functional Connectivity in fMRI Data

no code implementations4 Dec 2023 JungWon Choi, Seongho Keum, Eunggu Yun, Byung-Hoon Kim, Juho Lee

Deep neural networks trained on Functional Connectivity (FC) networks extracted from functional Magnetic Resonance Imaging (fMRI) data have gained popularity due to the increasing availability of data and advances in model architectures, including Graph Neural Networks (GNNs).

Self-Supervised Learning
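FC networks are commonly constructed as pairwise correlations between regional BOLD time series; the sketch below assumes that generic construction, not the paper's exact preprocessing pipeline:

```python
import numpy as np

def functional_connectivity(bold):
    """Build a functional connectivity matrix from regional BOLD signals.

    bold: (n_regions, n_timepoints) array. FC is taken here as the Pearson
    correlation between every pair of regional time series -- a common
    construction, assumed for illustration only.
    """
    return np.corrcoef(bold)

rng = np.random.default_rng(0)
signals = rng.standard_normal((4, 100))  # 4 toy brain regions, 100 timepoints
fc = functional_connectivity(signals)    # symmetric matrix, ones on the diagonal
```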

Probabilistic Imputation for Time-series Classification with Missing Data

1 code implementation13 Aug 2023 SeungHyun Kim, Hyunsu Kim, Eunggu Yun, Hwangrae Lee, Jaehun Lee, Juho Lee

In this paper, we propose a novel probabilistic framework for classifying multivariate time series with missing values.

Imputation · Time Series +1
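The general idea behind probabilistic (multiple) imputation can be sketched as sampling several plausible completions of the missing values and averaging the classifier's predictions over them; the sampler and classifier below are hypothetical placeholders, not the paper's framework:

```python
import numpy as np

def predict_with_multiple_imputations(x, classify, impute_draw, n_draws=10):
    """Average class probabilities over several sampled imputations.

    x: series with np.nan marking missing entries.
    impute_draw: samples one plausible completion of x.
    Toy illustration of the multiple-imputation idea only.
    """
    probs = [classify(impute_draw(x)) for _ in range(n_draws)]
    return np.mean(probs, axis=0)

# hypothetical demo components
rng = np.random.default_rng(0)
x = np.array([1.0, np.nan, 3.0])
sample_fill = lambda v: np.where(np.isnan(v), rng.standard_normal(), v)
soft_classifier = lambda v: np.array([0.5, 0.5])  # dummy 2-class probabilities
avg_probs = predict_with_multiple_imputations(x, soft_classifier, sample_fill)
```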

Traversing Between Modes in Function Space for Fast Ensembling

1 code implementation20 Jun 2023 Eunggu Yun, Hyungi Lee, Giung Nam, Juho Lee

While this provides a way to efficiently train ensembles, for inference, multiple forward passes must still be executed using all the ensemble parameters, which often becomes a serious bottleneck for real-world deployment.
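The inference bottleneck described above is the standard one: an ensemble of K members costs K forward passes per input, as in this generic sketch (not the paper's proposed method):

```python
import numpy as np

def ensemble_predict(x, members):
    """Naive ensemble inference: one forward pass per member, then average.

    With K members this costs K forward passes per input -- the bottleneck
    the paper targets. Generic sketch, not the paper's fast-ensembling method.
    """
    return np.mean([f(x) for f in members], axis=0)

# three hypothetical "models" standing in for trained ensemble members
members = [lambda x, w=w: w * x for w in (1.0, 2.0, 3.0)]
result = ensemble_predict(2.0, members)  # averages the three predictions
```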

Martingale Posterior Neural Processes

no code implementations19 Apr 2023 Hyungi Lee, Eunggu Yun, Giung Nam, Edwin Fong, Juho Lee

Based on this result, instead of assuming any form of the latent variables, we equip a NP with a predictive distribution implicitly defined with neural networks and use the corresponding martingale posteriors as the source of uncertainty.

Bayesian Inference · Gaussian Processes
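A martingale posterior quantifies uncertainty by predictive resampling: repeatedly extend the observed data with draws from a one-step-ahead predictive and record the statistic of interest. The toy sketch below uses the empirical predictive (Bayesian-bootstrap style) for the mean of a scalar dataset; it only illustrates the general construction, not the paper's neural-process model:

```python
import numpy as np

def martingale_posterior_mean(data, n_future, n_chains, rng):
    """Predictive-resampling sketch of a martingale posterior for the mean.

    Each chain extends the data by sampling future points from the empirical
    predictive (resampling seen values), then records the mean of the
    completed sequence; the spread across chains plays the role of posterior
    uncertainty. Toy illustration only.
    """
    means = []
    for _ in range(n_chains):
        seq = list(data)
        for _ in range(n_future):
            seq.append(seq[rng.integers(len(seq))])  # draw from predictive
        means.append(np.mean(seq))
    return np.array(means)

rng = np.random.default_rng(0)
draws = martingale_posterior_mean(np.array([0.0, 1.0, 2.0]),
                                  n_future=50, n_chains=20, rng=rng)
```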

Scale Mixtures of Neural Network Gaussian Processes

1 code implementation ICLR 2022 Hyungi Lee, Eunggu Yun, Hongseok Yang, Juho Lee

We show that simply introducing a scale prior on the last-layer parameters can turn infinitely-wide neural networks of any architecture into a richer class of stochastic processes.

Gaussian Processes
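The effect of a scale prior on the last layer can be mimicked at the level of samples: multiply zero-mean GP draws by a random scale. With an inverse-gamma prior on the squared scale the marginals become Student-t, a heavier-tailed process than the plain NNGP. This is a toy sketch under that assumed prior, not the paper's derivation:

```python
import numpy as np

def scale_mixture_samples(kernel_matrix, n_samples, rng, a=2.0, b=2.0):
    """Draw from a scale mixture of a zero-mean Gaussian process.

    Each draw multiplies a GP sample by sigma, where sigma^2 ~ InvGamma(a, b)
    (sampled as the reciprocal of a Gamma(a, rate=b) variable). Illustrative
    only; a and b are hypothetical hyperparameters.
    """
    d = kernel_matrix.shape[0]
    gp = rng.multivariate_normal(np.zeros(d), kernel_matrix, size=n_samples)
    sigma2 = 1.0 / rng.gamma(shape=a, scale=1.0 / b, size=(n_samples, 1))
    return np.sqrt(sigma2) * gp  # heavier-tailed than the base GP

rng = np.random.default_rng(0)
samples = scale_mixture_samples(np.eye(3), n_samples=5, rng=rng)
```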
