Search Results for author: Kyle Helfrich

Found 3 papers, 2 papers with code

Batch Normalization Preconditioning for Neural Network Training

no code implementations • 2 Aug 2021 • Susanna Lange, Kyle Helfrich, Qiang Ye

Batch normalization (BN) is a ubiquitous method in deep learning that has been shown to decrease training time and improve the generalization performance of neural networks.
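
For context, here is a minimal NumPy sketch of the standard batch normalization computation: each feature is normalized by its batch mean and variance, then scaled and shifted by the learnable parameters gamma and beta. This illustrates BN itself, not the preconditioning method the paper proposes on top of it.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch axis (axis 0), then apply
    # the learnable scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))  # batch of 4 samples, 3 features
y = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(y.mean(axis=0))  # ~0 per feature
print(y.std(axis=0))   # ~1 per feature
```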

Eigenvalue Normalized Recurrent Neural Networks for Short Term Memory

1 code implementation • 18 Nov 2019 • Kyle Helfrich, Qiang Ye

Several variants of recurrent neural networks (RNNs) with orthogonal or unitary recurrent matrices have recently been developed to mitigate the vanishing/exploding gradient problem and to model long-term dependencies in sequences.
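
A short NumPy sketch of the property these architectures rely on: repeated multiplication by a generic recurrent matrix lets a vector's norm drift toward zero or infinity (the vanishing/exploding gradient problem), while an orthogonal matrix preserves the norm exactly. This is a generic illustration of the motivation, not the eigenvalue normalization scheme the paper introduces.

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps = 32, 100

W = rng.standard_normal((n, n)) / np.sqrt(n)      # generic recurrent matrix
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthogonal matrix via QR

v = rng.standard_normal(n)
v_gen, v_orth = v.copy(), v.copy()
for _ in range(steps):
    v_gen = W @ v_gen    # norm typically shrinks or blows up over time
    v_orth = Q @ v_orth  # norm is preserved at every step

print(np.linalg.norm(v))       # initial norm
print(np.linalg.norm(v_gen))   # usually far from the initial norm
print(np.linalg.norm(v_orth))  # equal to the initial norm
```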

Orthogonal Recurrent Neural Networks with Scaled Cayley Transform

2 code implementations • ICML 2018 • Kyle Helfrich, Devin Willmott, Qiang Ye

Recurrent Neural Networks (RNNs) are designed to handle sequential data but suffer from vanishing or exploding gradients.
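
The scaled Cayley transform named in the title parameterizes an orthogonal recurrent matrix as W = (I + A)^{-1} (I - A) D, where A is skew-symmetric and D is a diagonal scaling matrix with entries ±1; since a skew-symmetric A has purely imaginary eigenvalues, I + A is always invertible. Below is a minimal NumPy sketch of this construction; A and d stand in for the trainable parameters and the code is an illustration, not the authors' implementation.

```python
import numpy as np

def scaled_cayley(A, d):
    # W = (I + A)^{-1} (I - A) D with A skew-symmetric and D = diag(d),
    # d in {+1, -1}^n, so W is orthogonal by construction.
    n = A.shape[0]
    I = np.eye(n)
    return np.linalg.solve(I + A, (I - A) @ np.diag(d))

rng = np.random.default_rng(0)
n = 8
M = rng.standard_normal((n, n))
A = (M - M.T) / 2                    # skew-symmetric parameterization
d = rng.choice([-1.0, 1.0], size=n)  # +/-1 diagonal scaling
W = scaled_cayley(A, d)

print(np.allclose(W.T @ W, np.eye(n)))  # True: W is orthogonal
```

Training such a network updates the unconstrained skew-symmetric parameters of A rather than W directly, so orthogonality is maintained exactly without re-projection.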
