Search Results for author: Lionel Eyraud-Dubois

Found 3 papers, 0 papers with code

Survey on Large Scale Neural Network Training

no code implementations · 21 Feb 2022 · Julia Gusak, Daria Cherniuk, Alena Shilova, Alexander Katrutsa, Daniel Bershatsky, Xunyi Zhao, Lionel Eyraud-Dubois, Oleg Shlyazhko, Denis Dimitrov, Ivan Oseledets, Olivier Beaumont

Modern Deep Neural Networks (DNNs) require significant memory to store weights, activations, and other intermediate tensors during training.
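As a rough illustration of that breakdown (not taken from the survey itself), the PyTorch sketch below tallies parameter bytes against the activation bytes produced in one forward pass; the model and tensor sizes are arbitrary placeholders.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 1024))

# Memory held by the weights themselves.
param_bytes = sum(p.numel() * p.element_size() for p in model.parameters())

# Sum the sizes of layer outputs seen in one forward pass, as a proxy for
# the activations that back-propagation would need to keep around.
activation_bytes = 0

def record(module, inputs, output):
    global activation_bytes
    activation_bytes += output.numel() * output.element_size()

hooks = [m.register_forward_hook(record) for m in model]

x = torch.randn(64, 1024)  # batch of 64
model(x)                   # forward pass triggers the hooks

print(f"parameters:  {param_bytes / 2**20:.1f} MiB")
print(f"activations: {activation_bytes / 2**20:.1f} MiB")

for h in hooks:
    h.remove()
```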

Efficient Combination of Rematerialization and Offloading for Training DNNs

no code implementations · NeurIPS 2021 · Olivier Beaumont, Lionel Eyraud-Dubois, Alena Shilova

Rematerialization and offloading are two well-known strategies for saving memory during the training phase of deep neural networks, allowing data scientists to consider larger models, batch sizes, or higher-resolution data.
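To make the two strategies concrete, here is a minimal PyTorch sketch of each building block, not the paper's combined scheduling algorithm: rematerialization via torch.utils.checkpoint, and offloading via a toy autograd.Function (the name OffloadSquare is hypothetical) that parks the tensor needed by the backward pass in host memory.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Rematerialization: discard this block's activations after the forward
# pass and recompute them on demand during backward.
block = nn.Sequential(nn.Linear(512, 512), nn.ReLU())
x = torch.randn(32, 512, requires_grad=True)
y = checkpoint(block, x, use_reentrant=False)

# Offloading: keep the tensor needed by backward in CPU memory instead of
# accelerator memory, fetching it back only when the gradient is computed.
class OffloadSquare(torch.autograd.Function):
    """Computes y = x * x, but stores x on the CPU between passes."""

    @staticmethod
    def forward(ctx, t):
        ctx.saved_cpu = t.detach().to("cpu")  # evict to host memory
        ctx.device = t.device
        return t * t

    @staticmethod
    def backward(ctx, grad_out):
        t = ctx.saved_cpu.to(ctx.device)      # bring back: d(x*x)/dx = 2x
        return 2 * t * grad_out

z = OffloadSquare.apply(y)
z.sum().backward()
print(x.grad.shape)  # gradients flow through both mechanisms
```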

Optimal checkpointing for heterogeneous chains: how to train deep neural networks with limited memory

no code implementations · 27 Nov 2019 · Julien Herrmann, Olivier Beaumont, Lionel Eyraud-Dubois, Alexis Joly, Alena Shilova

This paper introduces a new activation checkpointing method that makes it possible to significantly decrease memory usage when training Deep Neural Networks with the back-propagation algorithm.
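For context, the paper computes an optimal placement of checkpoints along a heterogeneous chain of layers; PyTorch's built-in checkpoint_sequential only cuts the chain into equal-length segments, so the sketch below illustrates the checkpointing mechanism rather than the paper's optimal strategy.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential

# A chain of heterogeneous layers, the model structure the paper targets.
chain = nn.Sequential(
    nn.Linear(256, 1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, 256),
)

x = torch.randn(16, 256, requires_grad=True)

# Keep activations only at segment boundaries (2 segments here); everything
# inside a segment is recomputed during backward, trading compute for memory.
out = checkpoint_sequential(chain, 2, x, use_reentrant=False)
out.sum().backward()
```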
