ICLR Workshop DeepDiffEq 2019 • Viktor Oganesyan, Alexandra Volokhova, Dmitry Vetrov
Stochastic regularization of neural networks (e.g., dropout) is a widespread technique in deep learning that improves generalization.
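As a point of reference for the technique mentioned above, here is a minimal sketch of standard (inverted) dropout in NumPy — a generic illustration, not the method studied in this paper. The function name `dropout` and the use of a seeded generator are choices made here for clarity.

```python
import numpy as np

def dropout(x, p=0.5, rng=None, training=True):
    """Inverted dropout: zero each unit with probability p,
    scale survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p  # keep each unit with prob 1-p
    return x * mask / (1.0 - p)

x = np.ones(10_000)
y = dropout(x, p=0.5, rng=np.random.default_rng(0))
```

At test time (`training=False`) the input passes through unchanged; during training, the `1/(1-p)` rescaling keeps the layer's expected output equal to its deterministic value.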