Search Results for author: Milad Alizadeh

Found 8 papers, 5 papers with code

Prospect Pruning: Finding Trainable Weights at Initialization using Meta-Gradients

1 code implementation ICLR 2022 Milad Alizadeh, Shyam A. Tailor, Luisa M Zintgraf, Joost van Amersfoort, Sebastian Farquhar, Nicholas Donald Lane, Yarin Gal

Pruning neural networks at initialization would enable us to find sparse models that retain the accuracy of the original network while consuming fewer computational resources for training and inference.
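The paper's method (ProsPr) scores weights with meta-gradients taken through the first few optimization steps. As a hedged stand-in, the sketch below shows the simpler one-shot flavour of pruning at initialization (a SNIP-style saliency |w · ∂L/∂w| from a single batch); the meta-gradient machinery is omitted, and all names are illustrative rather than the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def prune_at_init(model, x, y, sparsity=0.9):
    # One-shot saliency at initialization (SNIP-style stand-in; ProsPr
    # itself meta-learns these scores through several training steps).
    loss = F.cross_entropy(model(x), y)
    weights = [p for p in model.parameters() if p.dim() > 1]
    grads = torch.autograd.grad(loss, weights)
    scores = torch.cat([(w * g).abs().flatten() for w, g in zip(weights, grads)])
    threshold = scores.kthvalue(int(sparsity * scores.numel())).values
    # Keep only weights whose saliency clears the global threshold.
    return [((w * g).abs() > threshold).float() for w, g in zip(weights, grads)]

model = nn.Sequential(nn.Flatten(), nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
masks = prune_at_init(model, torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,)))
# Training then proceeds on the sparse network: w.data *= mask after each step.
```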

COIN++: Neural Compression Across Modalities

1 code implementation 30 Jan 2022 Emilien Dupont, Hrushikesh Loya, Milad Alizadeh, Adam Goliński, Yee Whye Teh, Arnaud Doucet

Neural compression algorithms are typically based on autoencoders that require specialized encoder and decoder architectures for different data modalities.

COIN: COmpression with Implicit Neural representations

1 code implementation ICLR 2021 Workshop on Neural Compression Emilien Dupont, Adam Goliński, Milad Alizadeh, Yee Whye Teh, Arnaud Doucet

We propose a new simple approach for image compression: instead of storing the RGB values for each pixel of an image, we store the weights of a neural network overfitted to the image.

Data Compression, Image Compression
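That one sentence is essentially the whole codec, so a sketch is easy to give: overfit a small coordinate network f(x, y) → (r, g, b) to a single image, then store its weights instead of the pixels. A minimal version, assuming SIREN-style sine activations as in the paper but skipping its initialization details and the weight-quantization step:

```python
import torch
import torch.nn as nn

class SirenLayer(nn.Module):
    # Linear layer followed by a sine activation (proper SIREN
    # initialization omitted for brevity).
    def __init__(self, dim_in, dim_out, w0=30.0):
        super().__init__()
        self.linear = nn.Linear(dim_in, dim_out)
        self.w0 = w0
    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

# f: (x, y) coordinate -> (r, g, b); storing f's weights *is* the code.
f = nn.Sequential(SirenLayer(2, 64), SirenLayer(64, 64), nn.Linear(64, 3))

image = torch.rand(64, 64, 3)                      # stand-in image in [0, 1]
ys, xs = torch.meshgrid(torch.linspace(-1, 1, 64),
                        torch.linspace(-1, 1, 64), indexing="ij")
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
pixels = image.reshape(-1, 3)

opt = torch.optim.Adam(f.parameters(), lr=1e-3)
for step in range(1000):                           # overfit to this one image
    opt.zero_grad()
    loss = ((f(coords) - pixels) ** 2).mean()
    loss.backward()
    opt.step()
# "Encoding" = the (quantized) state_dict; "decoding" = evaluating f on a grid.
```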

Single Shot Structured Pruning Before Training

no code implementations1 Jul 2020 Joost van Amersfoort, Milad Alizadeh, Sebastian Farquhar, Nicholas Lane, Yarin Gal

We introduce a method to speed up training by 2x and inference by 3x in deep neural networks using structured pruning applied before training.
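A hedged sketch of the general idea: score whole output channels at initialization and drop the lowest-scoring ones before any training, so the surviving network is genuinely smaller and dense, which is what buys wall-clock speedups (unlike unstructured masks). The channel score below is an illustrative SNIP-style aggregate, not necessarily the paper's exact criterion.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def channel_saliency(conv, grad):
    # Aggregate a weight-level saliency over each output channel; whole
    # channels are removed, so the pruned network stays dense.
    return (conv.weight * grad).abs().sum(dim=(1, 2, 3))

conv = nn.Conv2d(3, 32, 3, padding=1)
head = nn.Linear(32, 10)
x, y = torch.randn(16, 3, 32, 32), torch.randint(0, 10, (16,))

logits = head(conv(x).mean(dim=(2, 3)))            # global average pool
loss = F.cross_entropy(logits, y)
(grad,) = torch.autograd.grad(loss, conv.weight)

keep = channel_saliency(conv, grad).topk(16).indices   # keep the top half
pruned = nn.Conv2d(3, 16, 3, padding=1)
pruned.weight.data = conv.weight.data[keep].clone()
pruned.bias.data = conv.bias.data[keep].clone()
# Train `pruned` (and a matching 16-unit head) from here on.
```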

Gradient $\ell_1$ Regularization for Quantization Robustness

no code implementations ICLR 2020 Milad Alizadeh, Arash Behboodi, Mart van Baalen, Christos Louizos, Tijmen Blankevoort, Max Welling

We analyze the effect of quantizing weights and activations of neural networks on their loss and derive a simple regularization scheme that improves robustness against post-training quantization.

Quantization
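The regularizer itself is one line plus a double-backward pass: penalize the ℓ1 norm of the loss gradient so the loss surface is flat around the learned weights, which bounds, to first order, how much a subsequent quantization perturbation can move the loss. A minimal sketch (weights only; the paper also treats activations):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def loss_with_grad_l1(model, x, y, lam=0.01):
    # Task loss + lam * ||dL/dw||_1: small gradients mean a quantization
    # perturbation of the weights barely changes the loss.
    task_loss = F.cross_entropy(model(x), y)
    params = [p for p in model.parameters() if p.requires_grad]
    # create_graph=True makes the penalty differentiable (double backprop).
    grads = torch.autograd.grad(task_loss, params, create_graph=True)
    return task_loss + lam * sum(g.abs().sum() for g in grads)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5))
x, y = torch.randn(32, 20), torch.randint(0, 5, (32,))
loss_with_grad_l1(model, x, y).backward()  # gradients now include the penalty
```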

A Systematic Comparison of Bayesian Deep Learning Robustness in Diabetic Retinopathy Tasks

1 code implementation22 Dec 2019 Angelos Filos, Sebastian Farquhar, Aidan N. Gomez, Tim G. J. Rudner, Zachary Kenton, Lewis Smith, Milad Alizadeh, Arnoud de Kroon, Yarin Gal

From our comparison we conclude that some current techniques which solve benchmarks such as UCI "overfit" their uncertainty to the dataset: when evaluated on our benchmark, they underperform simpler baselines.

Out-of-Distribution Detection
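The failure mode is easiest to see in the kind of check the benchmark performs: a model's predictive uncertainty should rise on shifted or out-of-distribution inputs. A hedged illustration with MC dropout (one of the baselines the paper compares; the actual benchmark uses retinopathy images and a more careful protocol):

```python
import torch
import torch.nn as nn

def mc_dropout_entropy(model, x, samples=20):
    # Predictive entropy from stochastic forward passes; a model whose
    # uncertainty "overfits" stays confident even on shifted inputs.
    model.train()  # keep dropout active at test time
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(samples)])
    mean = probs.mean(dim=0)
    return -(mean * mean.clamp_min(1e-12).log()).sum(dim=-1)

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Dropout(0.3), nn.Linear(64, 2))
in_dist = torch.randn(128, 32)
shifted = torch.randn(128, 32) * 3 + 2   # crude stand-in for a shifted cohort
print(mc_dropout_entropy(model, in_dist).mean(),
      mc_dropout_entropy(model, shifted).mean())
```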

An Empirical study of Binary Neural Networks' Optimisation

1 code implementation ICLR 2019 Milad Alizadeh, Javier Fernández-Marqués, Nicholas D. Lane, Yarin Gal

In this work, we empirically identify and study the effectiveness of the various ad-hoc techniques commonly used in the literature, providing best practices for efficient training of binary models.
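Most of those techniques exist because sign() has zero gradient almost everywhere, so binary networks lean on tricks like the straight-through estimator. A minimal sketch of that particular trick (clipped-identity backward pass), as an example of the ad-hoc machinery the paper evaluates:

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    # sign() in the forward pass; straight-through estimator in the
    # backward pass (gradient passed through only where |x| <= 1).
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.sign()
    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).float()   # clipped identity

w = torch.randn(4, 4, requires_grad=True)
loss = BinarizeSTE.apply(w).sum()
loss.backward()   # w.grad is nonzero despite sign() having zero gradient
```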

BinaryFlex: On-the-Fly Kernel Generation in Binary Convolutional Networks

no code implementations ICLR 2018 Vincent W.-S. Tseng, Sourav Bhattacharya, Javier Fernández-Marqués, Milad Alizadeh, Catherine Tong, Nicholas Donald Lane

In this work we present BinaryFlex, a neural network architecture that learns weighting coefficients of a predefined orthogonal binary basis instead of learning the convolutional filters directly.
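A hedged sketch of that idea: keep a fixed set of orthogonal ±1 basis filters (regenerable on device, so they need not be stored) and learn only the per-filter mixing coefficients. The Hadamard construction and 4×4 kernels below are illustrative choices, not necessarily the paper's exact basis.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def hadamard(n):
    # Sylvester construction: rows are mutually orthogonal +/-1 vectors
    # (n must be a power of two, hence the 4x4 kernels below).
    H = torch.ones(1, 1)
    while H.shape[0] < n:
        H = torch.cat([torch.cat([H, H], 1), torch.cat([H, -H], 1)], 0)
    return H

class BinaryFlexConv(nn.Module):
    # Filters are generated on the fly as weighted sums of a fixed binary
    # basis; only the mixing coefficients are trained (and stored).
    def __init__(self, in_ch, out_ch, k=4, n_basis=16):
        super().__init__()
        self.register_buffer("basis",
                             hadamard(k * k)[:n_basis].reshape(n_basis, k, k))
        self.coeff = nn.Parameter(0.1 * torch.randn(out_ch, in_ch, n_basis))

    def forward(self, x):
        # (out, in, b) x (b, k, k) -> (out, in, k, k)
        filters = torch.einsum("oib,bkl->oikl", self.coeff, self.basis)
        return F.conv2d(x, filters)

conv = BinaryFlexConv(3, 8)
out = conv(torch.randn(2, 3, 16, 16))   # -> (2, 8, 13, 13)
```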
