Search Results for author: Viktor Oganesyan

Found 1 papers, 1 papers with code

Stochasticity in Neural ODEs: An Empirical Study

1 code implementation ICLR Workshop DeepDiffEq 2019 Viktor Oganesyan, Alexandra Volokhova, Dmitry Vetrov

Stochastic regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
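As a minimal sketch of the kind of stochastic regularization the abstract refers to, the snippet below implements inverted dropout in NumPy (this is a generic illustration, not the paper's code): each activation is zeroed with probability `p` during training, and the survivors are rescaled by `1/(1-p)` so the expected activation is unchanged at test time.

```python
import numpy as np

def dropout(x, p=0.5, train=True, rng=None):
    """Inverted dropout: zero each unit with probability p,
    scale survivors by 1/(1-p) so E[output] == input."""
    if not train or p == 0.0:
        return x  # at test time dropout is the identity
    rng = rng if rng is not None else np.random.default_rng(0)
    mask = rng.random(x.shape) >= p  # keep with probability 1-p
    return x * mask / (1.0 - p)

x = np.ones((4, 8))
y = dropout(x, p=0.5)
# each entry of y is either 0.0 (dropped) or 2.0 (kept and rescaled)
```

At test time (`train=False`) the layer passes inputs through unchanged, which is why no rescaling is needed at inference with the inverted formulation.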

Data Augmentation · Image Classification
