no code implementations • 7 Mar 2018 • Minkyu Choi, Takazumi Matsumoto, Minju Jung, Jun Tani
The current paper presents how a predictive-coding-type deep recurrent neural network can generate vision-based goal-directed plans from prior learning experience, examining experimental results obtained with a real arm robot.
no code implementations • 16 Nov 2017 • Heechul Jung, Jeongwoo Ju, Minju Jung, Junmo Kim
Expanding the domain that a deep neural network has already learned, without accessing old-domain data, is a challenging task because deep neural networks forget previously learned information when trained on data from a new domain.
no code implementations • 24 May 2017 • Minju Jung, Haanvid Lee, Jun Tani
In this paper, inspired by the normalization and detrending methods, we propose adaptive detrending (AD) for temporal normalization in order to accelerate the training of ConvRNNs, especially for convolutional gated recurrent unit (ConvGRU).
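The paper's exact adaptive detrending update is not given in this excerpt; as an illustrative sketch only, a detrending step can be written as tracking a per-unit trend with an exponential moving average and subtracting it from the activation (the name `adaptive_detrend`, the `alpha` value, and the update rule below are assumptions, not the paper's definition):

```python
import numpy as np

def adaptive_detrend(h_t, trend, alpha=0.1):
    """Illustrative sketch only: track a per-unit trend with an
    exponential moving average and subtract it from the activation."""
    trend = (1.0 - alpha) * trend + alpha * h_t
    return h_t - trend, trend

rng = np.random.default_rng(0)
trend = np.zeros(4)
outs = []
for t in range(200):
    h_t = rng.normal(0.0, 1.0, 4) + 0.01 * t   # activations with a drifting mean
    y_t, trend = adaptive_detrend(h_t, trend)
    outs.append(y_t)
outs = np.array(outs)                           # drift largely removed
```

In a ConvGRU the same subtraction would be applied channel-wise to the recurrent feature maps at each time step.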
no code implementations • 5 Feb 2016 • Haanvid Lee, Minju Jung, Jun Tani
The analysis of the internal representation obtained through the learning with the dataset clarifies what sorts of functional hierarchy can be developed by extracting the essential compositionality underlying the dataset.
no code implementations • 1 Jul 2016 • Heechul Jung, Jeongwoo Ju, Minju Jung, Junmo Kim
Surprisingly, our method is highly effective at forgetting less of the source-domain information, and we demonstrate its effectiveness through several experiments.
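This excerpt does not spell out the method's objective; a common shape for such "forget less" objectives, sketched here purely as an assumption (the function name, `lam`, and the L2 penalty are illustrative, not the paper's formulation), is new-task cross-entropy plus a term keeping features close to those of the frozen source network:

```python
import numpy as np

def less_forgetting_loss(new_feats, src_feats, logits, labels, lam=0.5):
    """Hypothetical sketch: new-task cross-entropy plus an L2 term that
    keeps features close to the frozen source network's features."""
    z = logits - logits.max(axis=1, keepdims=True)        # stable softmax
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(labels)), labels].mean()
    preserve = ((new_feats - src_feats) ** 2).mean()      # feature drift penalty
    return ce + lam * preserve
```

With identical old and new features the penalty vanishes and only the new-task loss remains; as the features drift, the second term grows and discourages forgetting.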
no code implementations • 9 Jul 2015 • Jungsik Hwang, Minju Jung, Naveen Madapana, Jinhyung Kim, Minkyu Choi, Jun Tani
The current study examines how adequate coordination among different cognitive processes, including visual recognition, attention switching, and action preparation and generation, can be developed through robot learning, by introducing a novel model, the Visuo-Motor Deep Dynamic Neural Network (VMDNN).
no code implementations • 12 Mar 2019 • Minju Jung, Takazumi Matsumoto, Jun Tani
Furthermore, our analysis of comparative experiments indicated that the introduction of visual working memory and of an inference mechanism using variational Bayes predictive coding significantly improves performance in planning adequate goal-directed actions.
1 code implementation • 1 Jan 2021 • Minju Jung, Hyounguk Shon, Eojindl Yi, SungHyun Baek, Junmo Kim
For the pruning-and-retraining phase, we examine whether the pruned-and-retrained network indeed benefits from the pretrained network.
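The excerpt does not state which pruning criterion the paper studies; as a generic background sketch (not necessarily the paper's criterion or schedule), the pruning step of a prune-and-retrain pipeline is often simple magnitude pruning, zeroing the smallest-magnitude weights:

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Generic magnitude-pruning sketch: zero the smallest |w| entries.
    Returns the pruned weights and the boolean keep-mask."""
    k = int(w.size * sparsity)                # number of weights to remove
    if k == 0:
        return w.copy(), np.ones_like(w, dtype=bool)
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    mask = np.abs(w) > thresh                 # keep strictly larger magnitudes
    return w * mask, mask

w = np.array([[1.0, -2.0], [0.1, 3.0]])
pruned, mask = magnitude_prune(w, 0.5)        # keeps the two largest: -2.0, 3.0
```

Retraining then updates only the surviving weights, with `mask` applied to the gradients so pruned connections stay zero.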
no code implementations • NeurIPS Workshop SVRHM 2020 • Victor Boutin, Aimen Zerroug, Minju Jung, Thomas Serre
Our ability to generalize beyond training data to novel, out-of-distribution, image degradations is a hallmark of primate vision.
1 code implementation • 6 May 2021 • Matthew Ricci, Minju Jung, Yuwei Zhang, Mathieu Chalvidal, Aneri Soni, Thomas Serre
Here, we present a single approach to both of these problems in the form of "KuraNet", a deep-learning-based system of coupled oscillators that can learn to synchronize across a distribution of disordered network conditions.
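KuraNet itself is a trained deep model, but the classical Kuramoto dynamics it builds on can be simulated in a few lines; the sketch below is background on those dynamics, not an implementation of KuraNet (the constants `K=2.0`, `dt`, and the step counts are illustrative):

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt=0.01):
    """One Euler step of the classic Kuramoto model:
    d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    return theta + dt * (omega + (K / theta.size) * coupling)

def order_parameter(theta):
    """Magnitude of the mean phase vector: 1.0 means full synchrony."""
    return np.abs(np.exp(1j * theta).mean())

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 50)       # random initial phases
omega = rng.normal(0.0, 0.1, 50)                # heterogeneous natural frequencies
for _ in range(5000):
    theta = kuramoto_step(theta, omega, K=2.0)  # coupling well above critical
print(order_parameter(theta))                   # approaches 1.0 under strong coupling
```

With coupling strength well above the critical value, the order parameter climbs toward 1.0 despite the heterogeneous natural frequencies; disordered conditions of this kind are what KuraNet is trained to synchronize across.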