1 code implementation • NeurIPS 2021 • Guhyun Kim, Doo Seok Jeong
Backward propagation of errors (backpropagation) is a method for minimizing the objective functions (e.g., loss functions) of deep neural networks by identifying optimal sets of weights and biases.
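As a toy illustration of the backpropagation being discussed (not code from the paper; the network shape, data, and learning rate below are assumptions), the sketch trains a one-hidden-layer network by propagating the loss gradient back to every weight and bias:

```python
import numpy as np

# Minimal backpropagation sketch: a two-layer network with MSE loss.
# All shapes, data, and hyperparameters here are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))            # 8 samples, 3 input features
y = rng.normal(size=(8, 1))            # regression targets
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 0.1

for _ in range(100):
    # Forward pass
    h = np.tanh(X @ W1 + b1)           # hidden activations
    y_hat = h @ W2 + b2                # predictions
    loss = np.mean((y_hat - y) ** 2)   # objective (loss) to minimize

    # Backward pass: propagate the error from the loss to each parameter
    g_y = 2 * (y_hat - y) / len(X)     # dL/dy_hat
    g_W2 = h.T @ g_y
    g_b2 = g_y.sum(axis=0)
    g_h = (g_y @ W2.T) * (1 - h ** 2)  # chain rule through tanh
    g_W1 = X.T @ g_h
    g_b1 = g_h.sum(axis=0)

    # Gradient-descent step toward weights/biases that minimize the loss
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2
```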
3 code implementations • 27 Sep 2021 • Jason K. Eshraghian, Max Ward, Emre Neftci, Xinxin Wang, Gregor Lenz, Girish Dwivedi, Mohammed Bennamoun, Doo Seok Jeong, Wei D. Lu
This paper serves as a tutorial and perspective showing how to apply the lessons learnt from several decades of research in deep learning, gradient descent, backpropagation and neuroscience to biologically plausible spiking neural networks.
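For context (a minimal sketch, not the paper's implementation), the leaky integrate-and-fire (LIF) neuron below is the basic spiking unit such tutorials build on; the decay factor, threshold, and input current are assumed values. Its hard spiking threshold is non-differentiable, which is why transplanting gradient descent and backpropagation into spiking networks is nontrivial:

```python
import numpy as np

# Leaky integrate-and-fire (LIF) neuron; beta, threshold, and the
# input current trace are illustrative assumptions.
beta, threshold = 0.9, 1.0     # membrane decay factor and spike threshold
v, spikes = 0.0, []
current = np.concatenate([np.zeros(10), 0.3 * np.ones(40)])  # step input

for i_t in current:
    v = beta * v + i_t         # leaky integration of the input current
    if v >= threshold:         # membrane potential crosses the threshold
        spikes.append(1)
        v -= threshold         # soft reset after emitting a spike
    else:
        spikes.append(0)

print(spikes)                  # binary spike train over time
```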
no code implementations • 26 Nov 2019 • Vladimir Kornijcuk, Dohun Kim, Guhyun Kim, Doo Seok Jeong
We propose a model for synaptic plasticity based on a calcium signaling cascade.
no code implementations • 23 Nov 2017 • Guhyun Kim, Vladimir Kornijcuk, Dohun Kim, Inho Kim, Jaewook Kim, Hyo Cheon Woo, Ji Hun Kim, Cheol Seong Hwang, Doo Seok Jeong
Despite remarkable progress in machine learning techniques, state-of-the-art machine learning algorithms often prevent machines from learning in real time (online learning), due in part to the computational complexity of parameter optimization.