The Reversible Residual Network: Backpropagation Without Storing Activations

NeurIPS 2017 · Aidan N. Gomez, Mengye Ren, Raquel Urtasun, Roger B. Grosse

Deep residual networks (ResNets) have significantly pushed forward the state-of-the-art on image classification, increasing in performance as networks grow both deeper and wider. However, memory consumption becomes a bottleneck, as one needs to store the activations in order to calculate gradients using backpropagation...
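The memory saving comes from making each residual block invertible: the paper's reversible block splits the input into two halves and couples them as y1 = x1 + F(x2), y2 = x2 + G(y1), so the inputs can be recomputed exactly from the outputs during the backward pass instead of being stored. A minimal sketch of this coupling, using toy stand-in functions for the residual branches F and G (the real ones are convolutional subnetworks):

```python
def rev_block_forward(x1, x2, F, G):
    # Forward pass of a reversible residual block:
    #   y1 = x1 + F(x2);  y2 = x2 + G(y1)
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def rev_block_inverse(y1, y2, F, G):
    # Exact inverse, run during backprop to reconstruct the inputs
    # so the activations never need to be stored:
    #   x2 = y2 - G(y1);  x1 = y1 - F(x2)
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

if __name__ == "__main__":
    # Toy residual branches standing in for the paper's conv subnetworks.
    F = lambda v: 3.0 * v + 1.0
    G = lambda v: 2.0 * v - 5.0
    y1, y2 = rev_block_forward(10.0, 4.0, F, G)
    x1, x2 = rev_block_inverse(y1, y2, F, G)
    print(x1, x2)  # recovers the original inputs (10.0, 4.0) exactly
```

Note the inversion reuses F and G in the forward direction only; no branch itself needs to be invertible, which is why arbitrary residual functions fit this scheme.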

