Search Results for author: Daehoon Gwak

Found 7 papers, 3 papers with code

Self-Supervised Contrastive Learning for Long-term Forecasting

1 code implementation • 3 Feb 2024 • Junwoo Park, Daehoon Gwak, Jaegul Choo, Edward Choi

To this end, our contrastive loss incorporates global autocorrelation held in the whole time series, which facilitates the construction of positive and negative pairs in a self-supervised manner.

Contrastive Learning • Time Series +1
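The abstract above describes building positive and negative pairs from the global autocorrelation of the whole series. A minimal sketch of that idea (not the paper's actual loss; the threshold and pairing rule are illustrative assumptions): windows separated by a lag at which the global autocorrelation is strong are treated as positives, otherwise as negatives.

```python
import numpy as np

def global_autocorrelation(x, max_lag):
    # Autocorrelation of the whole (mean-centred) series for lags 0..max_lag.
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

def pair_label(acf, lag, threshold=0.5):
    # Illustrative rule: two windows `lag` steps apart form a positive pair
    # when the global autocorrelation at that lag is strong.
    return "positive" if acf[abs(lag)] >= threshold else "negative"

# Toy seasonal series with period 24.
t = np.arange(480)
rng = np.random.default_rng(0)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)
acf = global_autocorrelation(series, max_lag=48)
print(pair_label(acf, lag=24))  # windows one full period apart
print(pair_label(acf, lag=12))  # windows half a period apart
```

With a clean period of 24, windows one period apart come out positive and windows half a period apart come out negative, which is the self-supervised signal the abstract refers to.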

Natural Attribute-based Shift Detection

no code implementations • 18 Oct 2021 • Jeonghoon Park, Jimin Hong, Radhika Dua, Daehoon Gwak, Yixuan Li, Jaegul Choo, Edward Choi

Despite the impressive performance of deep networks in vision, language, and healthcare, unpredictable behavior on samples drawn from a distribution different from the training distribution causes severe problems in deployment.

Attribute • Out of Distribution (OOD) Detection

Decoupled Kernel Neural Processes: Neural Network-Parameterized Stochastic Processes using Explicit Data-driven Kernel

no code implementations • 29 Sep 2021 • Daehoon Gwak, Gyubok Lee, Jaehoon Lee, Jaesik Choi, Jaegul Choo, Edward Choi

To address this, we introduce a new class of neural stochastic processes, Decoupled Kernel Neural Processes (DKNPs), which explicitly learn separate mean and kernel functions to directly model the covariance between output variables in a data-driven manner.

Gaussian Processes
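The abstract's core idea, a separate mean function and a data-driven kernel defining a Gaussian over outputs, can be sketched roughly as follows. This is a shape-level illustration with random, untrained networks, not the paper's architecture; all names and sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(d_in, d_h, d_out):
    # Random, untrained parameters: enough to show the shapes involved.
    w1 = rng.normal(size=(d_in, d_h))
    w2 = rng.normal(size=(d_h, d_out))
    return lambda x: np.tanh(x @ w1) @ w2

mean_fn = make_mlp(1, 16, 1)    # explicit mean head
embed_fn = make_mlp(1, 16, 8)   # kernel-embedding head

x = np.linspace(-1.0, 1.0, 50)[:, None]   # query inputs
mean = mean_fn(x)[:, 0]
z = embed_fn(x)
# Data-driven covariance over the outputs; PSD by construction.
cov = z @ z.T + 1e-3 * np.eye(len(x))

# As in a GP, outputs follow a multivariate Gaussian, but here both the
# mean and the kernel come from neural networks rather than a fixed prior.
sample = rng.multivariate_normal(mean, cov)
```

The `z @ z.T` construction guarantees a valid (positive semi-definite) covariance while letting the kernel be learned from data, which is the "explicit data-driven kernel" the title refers to.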

Standardized Max Logits: A Simple yet Effective Approach for Identifying Unexpected Road Obstacles in Urban-Scene Segmentation

1 code implementation • ICCV 2021 • Sanghun Jung, Jungsoo Lee, Daehoon Gwak, Sungha Choi, Jaegul Choo

However, the distributions of max logits differ significantly across predicted classes, which degrades the performance of identifying unexpected objects in urban-scene segmentation.

Anomaly Detection • Scene Segmentation +1
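The fix the title names, standardizing each pixel's max logit with the statistics of its predicted class so scores are comparable across classes, can be sketched as below. The toy statistics are made up for illustration; in practice they would be computed on training data.

```python
import numpy as np

def standardized_max_logits(logits, class_means, class_stds):
    # logits: (H, W, C) per-pixel class scores.
    pred = logits.argmax(-1)    # predicted class per pixel
    max_logit = logits.max(-1)  # raw max logit per pixel
    # Standardize each pixel's max logit with its predicted class's
    # statistics, making anomaly scores comparable across classes.
    return (max_logit - class_means[pred]) / class_stds[pred]

# Made-up per-class statistics for 3 classes, just for illustration.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 4, 3))
means = np.array([0.5, 2.0, 1.0])
stds = np.array([0.2, 0.8, 0.4])
sml = standardized_max_logits(logits, means, stds)
print(sml.shape)  # (4, 4); lower values indicate more anomalous pixels
```

Without standardization, a class whose max logits run systematically low would flood the anomaly map with false positives; the per-class z-score removes that bias.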

Neural Ordinary Differential Equations for Intervention Modeling

1 code implementation • 16 Oct 2020 • Daehoon Gwak, Gyuhyeon Sim, Michael Poli, Stefano Massaroli, Jaegul Choo, Edward Choi

By interpreting the forward dynamics of a neural network's latent representation as an ordinary differential equation, the Neural Ordinary Differential Equation (Neural ODE) has emerged as an effective framework for modeling system dynamics in the continuous time domain.

Time Series • Time Series Analysis
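The Neural ODE view in the abstract, latent dynamics governed by a parameterized vector field dz/dt = f(z, t) integrated in continuous time, can be illustrated with a minimal fixed-step solver. This is a sketch of the general framework only (linear field, Euler integration), not the intervention-modeling method of the paper.

```python
import numpy as np

def f(z, t, W):
    # Parameterized vector field dz/dt = f(z, t); here a simple linear field.
    return z @ W

def odeint_euler(z0, ts, W, steps_per_interval=10):
    # Fixed-step Euler solver standing in for an adaptive ODE solver.
    z, out = z0, [z0]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = (t1 - t0) / steps_per_interval
        for k in range(steps_per_interval):
            z = z + h * f(z, t0 + k * h, W)
        out.append(z)
    return np.stack(out)

# Rotation field: the latent state circles the origin in continuous time.
W = np.array([[0.0, -1.0], [1.0, 0.0]])
ts = np.linspace(0.0, np.pi, 50)
traj = odeint_euler(np.array([1.0, 0.0]), ts, W)
print(traj[-1])  # ends near (-1, 0) after rotating for time pi
```

Because the state is defined at every continuous time point, the same trajectory can be queried at irregular timestamps, which is what makes the framework attractive for time-series modeling.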
