Search Results for author: Zanlin Ni

Found 3 papers, 2 papers with code

Revisiting Locally Supervised Learning: an Alternative to End-to-end Training

1 code implementation • 26 Jan 2021 • Yulin Wang, Zanlin Ni, Shiji Song, Le Yang, Gao Huang

Due to the need to store intermediate activations for back-propagation, end-to-end (E2E) training of deep networks usually suffers from a high GPU memory footprint.
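Locally supervised training sidesteps this cost by splitting the network into modules and updating each one with its own local loss, so activations need not be retained across module boundaries. Below is a minimal PyTorch-style sketch of that general idea; the two-module split, the auxiliary heads, and all hyperparameters are illustrative assumptions, not the paper's exact setup.

```python
import torch
import torch.nn as nn

# Hypothetical split of a network into locally trained modules,
# each paired with a small auxiliary classifier head.
modules = nn.ModuleList([
    nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU()),
    nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU()),
])
heads = nn.ModuleList([
    nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10)),
    nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 10)),
])
opts = [torch.optim.SGD(list(m.parameters()) + list(h.parameters()), lr=0.1)
        for m, h in zip(modules, heads)]
criterion = nn.CrossEntropyLoss()

def local_step(x, y):
    """One training step: each module is updated by its own local loss;
    detach() stops gradients (and activation retention) at module boundaries."""
    h = x
    for module, head, opt in zip(modules, heads, opts):
        h = module(h)                 # forward through this module only
        loss = criterion(head(h), y)  # local supervision signal
        opt.zero_grad()
        loss.backward()               # back-prop confined to this module
        opt.step()
        h = h.detach()                # no gradient flows to earlier modules
    return loss.item()

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
local_step(x, y)
```

Because `backward()` only ever spans one module, the activation memory held at any time is bounded by the largest module rather than the whole network.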

Revisiting Locally Supervised Training of Deep Neural Networks

no code implementations • ICLR 2021 • Yulin Wang, Zanlin Ni, Shiji Song, Le Yang, Gao Huang

As the InfoPro loss is difficult to compute in its original form, we derive a feasible upper bound as a surrogate optimization objective, yielding a simple but effective algorithm.
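As a rough illustration of how such a tractable surrogate is commonly instantiated for a local module: intractable mutual-information terms are replaced by a reconstruction decoder (a variational stand-in for the information about the input retained in the features) and an auxiliary classifier (a stand-in for task-relevant information). The sketch below is assumption-laden, not the paper's exact derivation; the architectures, names, and weights `lam_rec` / `lam_cls` are hypothetical.

```python
import torch
import torch.nn as nn

# Illustrative only: a local module's surrogate loss combining a
# reconstruction term and an auxiliary classification term.
decoder = nn.Conv2d(64, 3, 3, padding=1)                       # h -> x_hat
aux_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                         nn.Linear(64, 10))                    # h -> logits
mse, ce = nn.MSELoss(), nn.CrossEntropyLoss()

def surrogate_loss(h, x, y, lam_rec=1.0, lam_cls=1.0):
    """Tractable stand-in for an information-theoretic local objective:
    reconstruction error encourages h to keep input information, while
    cross-entropy encourages it to keep task-relevant information."""
    return lam_rec * mse(decoder(h), x) + lam_cls * ce(aux_head(h), y)

h = torch.randn(8, 64, 32, 32)   # local module's output features
x = torch.randn(8, 3, 32, 32)    # original input (same spatial size here)
y = torch.randint(0, 10, (8,))
loss = surrogate_loss(h, x, y)
```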
