no code implementations • 2 Aug 2021 • Susanna Lange, Kyle Helfrich, Qiang Ye
Batch normalization (BN) is a popular and ubiquitous method in deep learning that has been shown to decrease training time and improve generalization performance of neural networks.
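For context, a minimal NumPy sketch of the standard batch-normalization forward pass (training-time batch statistics only; function and variable names are illustrative, not code from the paper):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    # x: (batch, features). Normalize each feature over the mini-batch,
    # then apply a learnable scale (gamma) and shift (beta).
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Toy usage: one mini-batch of 4 examples with 3 features.
x = np.random.randn(4, 3)
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.std(axis=0))  # ~0 mean, ~1 std per feature
```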
1 code implementation • 18 Nov 2019 • Kyle Helfrich, Qiang Ye
Several variants of recurrent neural networks (RNNs) with orthogonal or unitary recurrent matrices have recently been developed to mitigate the vanishing/exploding gradient problem and to model long-term dependencies of sequences.
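The property these orthogonal/unitary variants rely on is that an orthogonal recurrent matrix preserves vector norms, so repeated application over a long sequence neither shrinks nor blows up hidden states or backpropagated gradients. A small illustrative sketch of that property (not any particular paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
W, _ = np.linalg.qr(rng.standard_normal((64, 64)))  # random orthogonal matrix

h = rng.standard_normal(64)
initial_norm = np.linalg.norm(h)
for _ in range(1000):            # repeated application, as over a long sequence
    h = W @ h
print(initial_norm, np.linalg.norm(h))  # norm is preserved, so no vanishing/exploding
```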
2 code implementations • ICML 2018 • Kyle Helfrich, Devin Willmott, Qiang Ye
Recurrent Neural Networks (RNNs) are designed to handle sequential data but suffer from vanishing or exploding gradients.
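One standard way to keep the recurrent matrix orthogonal, and hence keep gradients well scaled, is to parameterize it through a Cayley transform of a skew-symmetric matrix. The sketch below shows that parameterization only; it is an illustrative example, not necessarily the exact (scaled) formulation used in the paper:

```python
import numpy as np

def cayley_orthogonal(A):
    # Cayley transform: for skew-symmetric A, (I - A)^{-1}(I + A) is orthogonal.
    n = A.shape[0]
    I = np.eye(n)
    return np.linalg.solve(I - A, I + A)

rng = np.random.default_rng(1)
M = rng.standard_normal((8, 8))
A = (M - M.T) / 2                       # skew-symmetric parameter matrix
W = cayley_orthogonal(A)
print(np.allclose(W.T @ W, np.eye(8)))  # True: W is orthogonal
```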