1 code implementation • NeurIPS 2021 • Guhyun Kim, Doo Seok Jeong
Backward propagation of errors (backpropagation) is a method to minimize objective functions (e.g., loss functions) of deep neural networks by identifying optimal sets of weights and biases.
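To make the idea concrete, here is a minimal sketch of backpropagation on a toy one-hidden-layer network (a hypothetical setup for illustration, not the method proposed in the paper): the error at the output is propagated backward to obtain gradients, and gradient descent updates the weights and biases to reduce the squared loss.

```python
import numpy as np

# Toy example (illustrative only): fit y = 2x with one tanh hidden layer.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(1, 4)) * 0.5   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)) * 0.5   # hidden -> output weights
b2 = np.zeros(1)

x = np.linspace(-1, 1, 32).reshape(-1, 1)
y = 2.0 * x
lr = 0.1

for _ in range(5000):
    # Forward pass
    h = np.tanh(x @ W1 + b1)          # hidden activations
    y_hat = h @ W2 + b2               # network output
    err = y_hat - y                   # output error

    # Backward pass: propagate the error to each parameter
    dW2 = h.T @ err / len(x)
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)    # chain rule through tanh
    dW1 = x.T @ dh / len(x)
    db1 = dh.mean(axis=0)

    # Gradient-descent update of weights and biases
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

loss = float(np.mean(err**2))
```

After training, the mean squared loss is close to zero, showing how backpropagation identifies weights and biases that minimize the objective.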
no code implementations • 26 Nov 2019 • Vladimir Kornijcuk, Dohun Kim, Guhyun Kim, Doo Seok Jeong
We propose a model for synaptic plasticity based on a calcium signaling cascade.
no code implementations • 23 Nov 2017 • Guhyun Kim, Vladimir Kornijcuk, Dohun Kim, Inho Kim, Jaewook Kim, Hyo Cheon Woo, Ji Hun Kim, Cheol Seong Hwang, Doo Seok Jeong
Despite remarkable progress in machine learning techniques, state-of-the-art machine learning algorithms often prevent machines from learning in real time (online learning), due in part to the computational complexity of parameter optimization.