Search Results for author: Simon Dufort-Labbé

Found 2 papers, 1 paper with code

Maxwell's Demon at Work: Efficient Pruning by Leveraging Saturation of Neurons

no code implementations • 12 Mar 2024 • Simon Dufort-Labbé, Pierluca D'Oro, Evgenii Nikishin, Razvan Pascanu, Pierre-Luc Bacon, Aristide Baratin

When training deep neural networks, the phenomenon of dying neurons (units that become inactive or saturated and output zero during training) has traditionally been viewed as undesirable, linked with optimization challenges, and seen as contributing to plasticity loss in continual learning scenarios.

Continual Learning • Model Compression
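The abstract characterizes dying neurons as units that stop firing altogether. As a rough illustration (not code from the paper), here is a minimal PyTorch sketch that estimates the fraction of such dead ReLU units in a layer by probing it with a forward hook; the helper name `dead_unit_fraction` and the probe setup are assumptions for this example:

```python
import torch
import torch.nn as nn

# Hypothetical helper: fraction of units in `layer` that output zero
# for every input in a batch of probe data ("dead" / dying units).
def dead_unit_fraction(model: nn.Module, layer: nn.Module, probe: torch.Tensor) -> float:
    activations = []

    def hook(_module, _inputs, output):
        activations.append(output.detach())

    handle = layer.register_forward_hook(hook)
    with torch.no_grad():
        model(probe)
    handle.remove()

    acts = torch.cat(activations)        # shape: (batch, units)
    dead = (acts <= 0).all(dim=0)        # a unit is "dead" if it never fires
    return dead.float().mean().item()

# Example: probe the hidden ReLU layer of a small MLP with random inputs.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
probe = torch.randn(256, 16)
print(f"dead units: {dead_unit_fraction(model, model[1], probe):.1%}")
```

A pruning method that leverages saturation, as the title suggests, could use a statistic like this to identify units that are safe to remove; the actual criterion used in the paper is not shown here.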
