Search Results for author: Tom Madams

Found 1 paper, 1 paper with code

Meta-Learning Bidirectional Update Rules

1 code implementation · 10 Apr 2021 · Mark Sandler, Max Vladymyrov, Andrey Zhmoginov, Nolan Miller, Andrew Jackson, Tom Madams, Blaise Aguera y Arcas

We show that classical gradient-based backpropagation in neural networks can be seen as a special case of a two-state network where one state is used for activations and another for gradients, with update rules derived from the chain rule.
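The two-state view described above can be illustrated with a minimal sketch (not the paper's actual meta-learned rules): one state `a` carries activations forward, a second state `g` carries gradients backward, and the backward update is exactly the chain rule. All variable names and the tiny network below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: classical backprop written as a two-state network.
# State "a" = activations (forward pass), state "g" = gradients (backward pass).
# The backward update g_prev = (g @ W.T) * sigma'(z) is the chain rule.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny network: 3 inputs -> 4 hidden units -> 2 outputs (hypothetical sizes).
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 2))
x = rng.normal(size=(1, 3))
y = np.array([[1.0, 0.0]])

# Forward: propagate the activation state a through the layers.
z1 = x @ W1
a1 = sigmoid(z1)
z2 = a1 @ W2
a2 = sigmoid(z2)

# Backward: propagate the gradient state g with chain-rule-derived updates,
# using squared-error loss L = 0.5 * ||a2 - y||^2.
g2 = (a2 - y) * a2 * (1.0 - a2)      # dL/dz2
g1 = (g2 @ W2.T) * a1 * (1.0 - a1)   # dL/dz1 via the chain rule

# Weight gradients come from pairing the two states at each layer.
dW2 = a1.T @ g2
dW1 = x.T @ g1
```

A small gradient step using `dW1`/`dW2` decreases the loss, confirming the backward state carries correct chain-rule gradients; a learned (meta-learned) update rule would replace these hand-derived backward equations.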

Meta-Learning
