1 code implementation • ICLR 2022 • Shangyuan Tong, Timur Garipov, Yang Zhang, Shiyu Chang, Tommi S. Jaakkola
Furthermore, we show that our approach can be viewed as a limit of existing notions of alignment as the transportation assignment tolerance increases.
1 code implementation • 20 Feb 2020 • Shangyuan Tong, Timur Garipov, Tommi Jaakkola
We provide sufficient conditions for local convergence; characterize the capacity balance that should guide the discriminator and generator choices; and construct examples of minimally sufficient discriminators.
no code implementations • ICLR 2020 • Diego Granziol, Timur Garipov, Dmitry Vetrov, Stefan Zohren, Stephen Roberts, Andrew Gordon Wilson
This approach is an order of magnitude faster than state-of-the-art methods for spectral visualization, and can be generically used to investigate the spectral properties of matrices in deep learning.
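The abstract does not spell out the estimator here, but fast spectral visualization of deep-learning matrices is typically built on Lanczos iterations, which only need matrix-vector products (e.g. Hessian-vector products). A minimal NumPy sketch of that general idea, with illustrative function names and no claim to match the paper's implementation:

```python
import numpy as np

def lanczos_spectrum(matvec, dim, steps, rng):
    """Run `steps` Lanczos iterations with full reorthogonalization and
    return the Ritz values (eigenvalues of the tridiagonal matrix T),
    which approximate the spectrum of the symmetric operator `matvec`."""
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    basis = [v]
    alphas, betas = [], []
    w = matvec(v)
    alphas.append(v @ w)
    w = w - alphas[-1] * v
    for _ in range(steps - 1):
        # full reorthogonalization keeps the Krylov basis numerically orthogonal
        for u in basis:
            w = w - (u @ w) * u
        beta = np.linalg.norm(w)
        if beta < 1e-12:  # invariant subspace found; stop early
            break
        betas.append(beta)
        v = w / beta
        basis.append(v)
        w = matvec(v) - beta * basis[-2]
        alphas.append(v @ w)
        w = w - alphas[-1] * v
    T = np.diag(alphas) + np.diag(betas, 1) + np.diag(betas, -1)
    return np.linalg.eigvalsh(T)
```

The key design point is that the operator is accessed only through `matvec`, so the full matrix never needs to be formed; for a neural-network Hessian, `matvec` would be a Hessian-vector product via automatic differentiation.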
1 code implementation • 20 Dec 2019 • Diego Granziol, Xingchen Wan, Timur Garipov
We present the MLRG Deep Curvature suite, a PyTorch-based, open-source package for analysis and visualisation of neural-network curvature and the loss landscape.
1 code implementation • 17 Jul 2019 • Pavel Izmailov, Wesley J. Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson
Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well calibrated uncertainty.
8 code implementations • NeurIPS 2019 • Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, Andrew Gordon Wilson
We propose SWA-Gaussian (SWAG), a simple, scalable, and general-purpose approach for uncertainty representation and calibration in deep learning.
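SWAG fits a Gaussian over the weights visited by SGD; the full method uses a diagonal-plus-low-rank covariance, but the diagonal part alone already illustrates the idea. A minimal NumPy sketch (function names are illustrative, not the paper's API):

```python
import numpy as np

def swag_moments(weight_iterates):
    """First and second moments of SGD weight iterates (SWAG-Diagonal):
    the mean is the SWA solution, the variance defines a diagonal Gaussian."""
    iters = np.asarray(weight_iterates)
    mean = iters.mean(axis=0)
    sq_mean = (iters ** 2).mean(axis=0)
    var = np.clip(sq_mean - mean ** 2, 0.0, None)  # clip for numerical safety
    return mean, var

def swag_sample(mean, var, rng):
    """Draw one weight vector from the diagonal Gaussian posterior approximation."""
    return mean + np.sqrt(var) * rng.standard_normal(mean.shape)
```

At test time one would draw several weight samples and average the resulting predictions (Bayesian model averaging).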
15 code implementations • 14 Mar 2018 • Pavel Izmailov, Dmitrii Podoprikhin, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson
Deep neural networks are typically trained by optimizing a loss function with an SGD variant, in conjunction with a decaying learning rate, until convergence.
Ranked #78 on Image Classification on CIFAR-100 (using extra training data)
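This entry is the Stochastic Weight Averaging (SWA) paper: instead of taking the final SGD iterate, it averages weight snapshots collected along the trajectory. A minimal sketch of that averaging step, assuming checkpoints stored as dicts of NumPy arrays (the real implementation also updates batch-norm statistics after averaging):

```python
import numpy as np

def stochastic_weight_average(checkpoints):
    """Equal average of weight snapshots collected along the SGD trajectory."""
    n = len(checkpoints)
    return {name: sum(ckpt[name] for ckpt in checkpoints) / n
            for name in checkpoints[0]}
```

In practice the snapshots are taken once per epoch (or per learning-rate cycle) under a constant or cyclical learning rate, and the average is typically wider and flatter in the loss landscape than any single iterate.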
10 code implementations • NeurIPS 2018 • Timur Garipov, Pavel Izmailov, Dmitrii Podoprikhin, Dmitry Vetrov, Andrew Gordon Wilson
The loss functions of deep neural networks are complex and their geometric properties are not well understood.
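This mode-connectivity paper finds low-loss curves connecting independently trained optima; one parameterization it considers is a quadratic Bezier curve through a trainable bend point. A minimal sketch of evaluating a point on such a curve (the training of the bend point itself is omitted):

```python
import numpy as np

def bezier_point(w1, w2, theta, t):
    """Point at position t in [0, 1] on a quadratic Bezier curve connecting
    weight vectors w1 and w2 through the trainable bend theta."""
    return (1 - t) ** 2 * w1 + 2 * t * (1 - t) * theta + t ** 2 * w2
```

Training amounts to sampling t uniformly, loading `bezier_point(w1, w2, theta, t)` into the network, and optimizing the loss with respect to `theta` only, so the endpoints stay fixed at the two pretrained solutions.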
no code implementations • 20 Feb 2018 • Max Kochurov, Timur Garipov, Dmitry Podoprikhin, Dmitry Molchanov, Arsenii Ashukha, Dmitry Vetrov
In industrial machine learning pipelines, data often arrive in parts.
2 code implementations • 10 Nov 2016 • Timur Garipov, Dmitry Podoprikhin, Alexander Novikov, Dmitry Vetrov
Convolutional neural networks excel in image recognition tasks, but this comes at the cost of high computational and memory complexity.
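This paper compresses network layers with the Tensor Train (TT) format, which factors a high-order tensor into a chain of small 3-way cores. A minimal NumPy sketch of the standard TT-SVD construction, here without rank truncation so the decomposition is exact (the compression in the paper comes from truncating the ranks):

```python
import numpy as np

def tt_decompose(tensor):
    """Exact (full-rank) TT-SVD: factor a d-way tensor into d 3-way cores
    of shape (r_prev, n_k, r_k) via sequential SVDs."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = len(S)  # no truncation; a TT layer would cap r for compression
        cores.append(U.reshape(r_prev, shape[k], r))
        mat = (np.diag(S) @ Vt).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[d - 1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    return res.reshape([c.shape[1] for c in cores])
```

With ranks capped at r, a fully-connected layer reshaped into a d-way tensor stores roughly d * n * r^2 parameters instead of n^d, which is the source of the memory savings.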