Search Results for author: Jaesik Choi

Found 24 papers, 3 papers with code

Conditional Temporal Neural Processes with Covariance Loss

1 code implementation • ICML 2021 • Boseon Yoo, Jiwoo Lee, Janghoon Ju, Seijun Chung, Soyeon Kim, Jaesik Choi

We introduce a novel loss function, Covariance Loss, which is conceptually equivalent to conditional neural processes and takes the form of a regularizer, so it is applicable to many kinds of neural networks.
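
As a rough, hedged illustration of how a covariance-style regularizer could be attached to an arbitrary network (the helper below and its weight lam are assumptions for exposition, not the paper's exact Covariance Loss):

    import torch

    def covariance_penalty(features, targets):
        # Hypothetical covariance-matching regularizer (a sketch only):
        # penalize the gap between the sample covariance structure of the
        # learned features and that of the targets.
        f = features - features.mean(dim=0, keepdim=True)   # (batch, d)
        t = targets - targets.mean(dim=0, keepdim=True)     # (batch, k)
        n = features.shape[0]
        cov_f = f @ f.T / n   # (batch, batch) Gram matrix of features
        cov_t = t @ t.T / n   # (batch, batch) Gram matrix of targets
        return ((cov_f - cov_t) ** 2).mean()

    # total_loss = task_loss + lam * covariance_penalty(features, targets)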

Time Series Forecasting • Traffic Prediction

Automatic Correction of Internal Units in Generative Neural Networks

no code implementations • CVPR 2021 • Ali Tousi, Haedong Jeong, Jiyeon Han, Hwanil Choi, Jaesik Choi

Generative Adversarial Networks (GANs) have shown satisfactory performance in synthetic image generation by devising complex network structures and adversarial training schemes.

Image Generation

Learning Compositional Sparse Gaussian Processes with a Shrinkage Prior

no code implementations • 21 Dec 2020 • Anh Tong, Toan Tran, Hung Bui, Jaesik Choi

Choosing a proper set of kernel functions is an important problem in learning Gaussian Process (GP) models, since each kernel structure implies a different model complexity and fit to the data.
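
For background on the kernel-search problem, base kernels are typically combined by addition and multiplication; the scikit-learn sketch below illustrates such a composition in general and is not tied to this paper's shrinkage prior.

    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import (
        RBF, ExpSineSquared, DotProduct, WhiteKernel)

    # Compositional kernel: smooth trend + periodic component scaled by a
    # linear kernel, plus observation noise. Each structure implies a
    # different model complexity and fit to the data.
    kernel = (RBF(length_scale=10.0)
              + ExpSineSquared(length_scale=1.0, periodicity=12.0) * DotProduct()
              + WhiteKernel(noise_level=0.1))

    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    # gp.fit(X_train, y_train); gp.predict(X_test, return_std=True)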

Gaussian Processes • Time Series

Interpreting Deep Neural Networks with Relative Sectional Propagation by Analyzing Comparative Gradients and Hostile Activations

no code implementations • 7 Dec 2020 • Woo-Jeoung Nam, Jaesik Choi, Seong-Whan Lee

As a result, it is possible to assign bi-polar relevance scores to the target (positive) and hostile (negative) attributions while keeping each attribution aligned with its importance.

Characterizing Deep Gaussian Processes via Nonlinear Recurrence Systems

no code implementations • 19 Oct 2020 • Anh Tong, Jaesik Choi

Recent advances in Deep Gaussian Processes (DGPs) show the potential for more expressive representations than those of traditional Gaussian Processes (GPs).

Gaussian Processes

Improved Predictive Deep Temporal Neural Networks with Trend Filtering

no code implementations • 16 Oct 2020 • YoungJin Park, Deokjun Eom, Byoungki Seo, Jaesik Choi

We reveal that the predictive performance of deep temporal neural networks improves when the training data is temporally preprocessed with trend filtering.
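
A minimal sketch of the idea, assuming a quadratic (HP-filter-style) trend filter as the preprocessing step; the filter family and hyperparameters used in the paper may differ.

    import numpy as np

    def trend_filter(y, lam=100.0):
        # Solve (I + lam * D2^T D2) trend = y, where D2 is the
        # second-difference operator; larger lam yields a smoother trend.
        n = len(y)
        d2 = np.zeros((n - 2, n))
        for i in range(n - 2):
            d2[i, i:i + 3] = [1.0, -2.0, 1.0]
        return np.linalg.solve(np.eye(n) + lam * d2.T @ d2, y)

    # Train the forecaster on the filtered series instead of the raw one.
    y = np.sin(np.linspace(0, 10, 200)) + 0.3 * np.random.randn(200)
    y_trend = trend_filter(y)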

Time Series

Interpretation of Deep Temporal Representations by Selective Visualization of Internally Activated Nodes

no code implementations • 27 Apr 2020 • Sohee Cho, Ginkyeng Lee, Wonjoon Chang, Jaesik Choi

Recently, deep neural networks have demonstrated competitive performance on classification and regression tasks for many kinds of temporal or sequential data.

General Classification

An Efficient Explorative Sampling Considering the Generative Boundaries of Deep Generative Neural Networks

no code implementations • 12 Dec 2019 • Giyoung Jeon, Haedong Jeong, Jaesik Choi

Despite recent advances in generative networks, identifying the image generation mechanism remains challenging.

Image Generation

Why Do Masked Neural Language Models Still Need Common Sense Knowledge?

no code implementations • 8 Nov 2019 • Sunjae Kwon, Cheongwoong Kang, Jiyeon Han, Jaesik Choi

From the test, we observed that MNLMs partially understand various types of common sense knowledge but do not accurately understand the semantic meaning of relations.

Common Sense Reasoning • Question Answering

A Single Multi-Task Deep Neural Network with Post-Processing for Object Detection with Reasoning and Robotic Grasp Detection

no code implementations • 16 Sep 2019 • Dongwon Park, Yonghyeok Seo, Dongju Shin, Jaesik Choi, Se Young Chun

Recently, robotic grasp detection (GD) and object detection (OD) with reasoning have been investigated using deep neural networks (DNNs).

Object Detection

Relative Attributing Propagation: Interpreting the Comparative Contributions of Individual Units in Deep Neural Networks

1 code implementation • 1 Apr 2019 • Woo-Jeoung Nam, Shir Gur, Jaesik Choi, Lior Wolf, Seong-Whan Lee

As Deep Neural Networks (DNNs) have demonstrated superhuman performance in a variety of fields, there is an increasing interest in understanding the complex internal mechanisms of DNNs.

Deep Reinforcement Learning in Continuous Action Spaces: a Case Study in the Game of Simulated Curling

1 code implementation • ICML 2018 • Kyowoon Lee, Sol-A Kim, Jaesik Choi, Seong-Whan Lee

Many real-world applications of reinforcement learning require an agent to select optimal actions from continuous spaces.

Parametric Information Bottleneck to Optimize Stochastic Neural Networks

no code implementations • ICLR 2018 • Thanh T. Nguyen, Jaesik Choi

Here, we propose Parametric Information Bottleneck (PIB) for a neural network by utilizing (only) its model parameters explicitly to approximate the compression and the relevance.

Layer-wise Learning of Stochastic Neural Networks with Information Bottleneck

no code implementations • 4 Dec 2017 • Thanh T. Nguyen, Jaesik Choi

Information Bottleneck (IB) is a generalization of rate-distortion theory that naturally incorporates compression and relevance trade-offs for learning.
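
For reference, the standard IB objective compresses the input X into a stochastic representation T while preserving information about the target Y, with a trade-off parameter beta:

    \min_{p(t \mid x)} \; I(X; T) - \beta \, I(T; Y)

where I(·;·) denotes mutual information.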

Adversarial Robustness

Grouped Convolutional Neural Networks for Multivariate Time Series

no code implementations • 29 Mar 2017 • Subin Yi, Janghoon Ju, Man-Ki Yoon, Jaesik Choi

In experiments with two real-world datasets, we demonstrate that our group CNNs outperform existing CNN-based regression methods.
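
As a generic illustration of grouped convolutions over a multivariate series (the variable grouping below is an assumption; the paper's actual architecture may differ):

    import torch
    import torch.nn as nn

    # 8 input variables split into 4 groups: each group of filters only sees
    # the 2 channels assigned to its group.
    grouped_conv = nn.Conv1d(in_channels=8, out_channels=16, kernel_size=5,
                             padding=2, groups=4)

    x = torch.randn(32, 8, 100)    # (batch, variables, time steps)
    features = grouped_conv(x)     # (32, 16, 100)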

Anomaly Detection • Time Series

Discovering Latent Covariance Structures for Multiple Time Series

no code implementations • 28 Mar 2017 • Anh Tong, Jaesik Choi

In this paper, we present a new GP model which naturally handles multiple time series by placing an Indian Buffet Process (IBP) prior on the presence of shared kernels.
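
Schematically, and under standard IBP notation rather than necessarily the paper's exact parameterization, a binary matrix Z drawn from the IBP selects which shared kernels are active for each series d:

    Z \sim \mathrm{IBP}(\alpha), \qquad
    k_d(x, x') = \sum_{m} Z_{dm}\, k_m(x, x'), \qquad
    f_d \sim \mathcal{GP}(0,\, k_d)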

Time Series • Time Series Analysis

Make Hawkes Processes Explainable by Decomposing Self-Triggering Kernels

no code implementations • 27 Mar 2017 • Rafael Lima, Jaesik Choi

We demonstrate that the new automatic kernel decomposition procedure outperforms the existing methods on the prediction of discrete events in real-world data.
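
For context, a Hawkes process is defined by a self-triggering conditional intensity; writing the triggering kernel as a sum of simpler components (an assumed additive form, not necessarily the paper's exact decomposition) gives:

    \lambda(t) = \mu + \sum_{t_i < t} \phi(t - t_i), \qquad
    \phi(\tau) = \sum_{j=1}^{J} \phi_j(\tau)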

Automatic Generation of Probabilistic Programming from Time Series Data

no code implementations • 4 Jul 2016 • Anh Tong, Jaesik Choi

In this paper, we provide a new perspective on building expressive probabilistic programs from continuous time series data when the structure of the model is not given.

Probabilistic Programming • Time Series

Searching for Topological Symmetry in Data Haystack

no code implementations • 11 Mar 2016 • Kallol Roy, Anh Tong, Jaesik Choi

To compute the symmetry in a grid structure, we introduce three legal grid moves, (i) Commutation, (ii) Cyclic Permutation, and (iii) Stabilization, on sets of local grid squares (grid blocks).

Global Deconvolutional Networks for Semantic Segmentation

no code implementations • 12 Feb 2016 • Vladimir Nekrasov, Janghoon Ju, Jaesik Choi

Semantic image segmentation is a principal problem in computer vision, where the aim is to correctly classify each individual pixel of an image into a semantic label.

Autonomous Driving • Image Classification • +2

The Automatic Statistician: A Relational Perspective

no code implementations • 26 Nov 2015 • Yunseong Hwang, Anh Tong, Jaesik Choi

Gaussian Processes (GPs) provide a general and analytically tractable way of modeling complex time-varying, nonparametric functions.
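
The analytical tractability refers to the closed-form GP posterior: with training inputs X, noisy targets y, noise variance \sigma^2, kernel matrices K, and test inputs X_*, the predictive distribution is

    f_* \mid X, y, X_* \sim \mathcal{N}\!\left( K_{*X} (K_{XX} + \sigma^2 I)^{-1} y,\;\; K_{**} - K_{*X} (K_{XX} + \sigma^2 I)^{-1} K_{X*} \right)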

Gaussian Processes • Time Series
