Search Results for author: Andrew Brock

Found 16 papers, 15 papers with code

Drawing Multiple Augmentation Samples Per Image During Training Efficiently Decreases Test Error

no code implementations • 27 May 2021 • Stanislav Fort, Andrew Brock, Razvan Pascanu, Soham De, Samuel L. Smith

In this work, we provide a detailed empirical evaluation of how the number of augmentation samples per unique image influences model performance on held out data when training deep ResNets.

Data Augmentation • Image Classification
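
As a hedged sketch of the setup the abstract describes, the Python snippet below draws k independently augmented views of every image in a batch, so the training loss averages over several augmentation samples per unique image. The flip-only augmentation and the helper names are my own placeholders, not the authors' pipeline.

import torch

def random_flip(x):
    """Horizontally flip a CHW image tensor with probability 0.5."""
    return torch.flip(x, dims=[-1]) if torch.rand(()) < 0.5 else x

def multi_sample_batch(images, labels, k=4):
    """images: (N, C, H, W), labels: (N,) -> (N*k, C, H, W), (N*k,)."""
    views = torch.stack([random_flip(img) for img in images for _ in range(k)])
    return views, labels.repeat_interleave(k)

# usage: views, targets = multi_sample_batch(batch_images, batch_labels, k=4)
#        loss = torch.nn.functional.cross_entropy(model(views), targets)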

Perceiver: General Perception with Iterative Attention

9 code implementations • 4 Mar 2021 • Andrew Jaegle, Felix Gimeno, Andrew Brock, Andrew Zisserman, Oriol Vinyals, Joao Carreira

The perception models used in deep learning on the other hand are designed for individual modalities, often relying on domain-specific assumptions such as the local grid structures exploited by virtually all existing vision models.

3D Point Cloud Classification • Audio Classification • +1
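
A minimal sketch of the iterative-attention idea in the title, under my own simplifications and with hypothetical module names: a small learned latent array cross-attends to a large, modality-agnostic input array, so compute scales with the latent size rather than the raw input size. The published Perceiver additionally repeats and shares such blocks; this is not that implementation.

import torch
import torch.nn as nn

class TinyPerceiverBlock(nn.Module):
    """One cross-attention + latent self-attention step (simplified)."""
    def __init__(self, dim=128, num_latents=64, heads=4):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, dim))
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, inputs):
        # inputs: (B, N, dim) -- flattened pixels, audio samples, points, ...
        lat = self.latents.expand(inputs.shape[0], -1, -1)
        lat, _ = self.cross_attn(lat, inputs, inputs)  # latents read the inputs
        lat, _ = self.self_attn(lat, lat, lat)         # latents process internally
        return lat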

High-Performance Large-Scale Image Recognition Without Normalization

15 code implementations • 11 Feb 2021 • Andrew Brock, Soham De, Samuel L. Smith, Karen Simonyan

Batch normalization is a key component of most image classification models, but it has many undesirable properties stemming from its dependence on the batch size and interactions between examples.

Ranked #15 on Image Classification on ImageNet (using extra training data)

Image Classification
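
This paper's headline techniques include adaptive gradient clipping (AGC). The sketch below is a simplified, per-parameter-tensor version (the paper clips unit-wise); the constants and function name are my own assumptions rather than the reference implementation.

import torch

def adaptive_grad_clip(parameters, clip=0.01, eps=1e-3):
    """Rescale a gradient whenever its norm is too large relative to the weight norm."""
    for p in parameters:
        if p.grad is None:
            continue
        w_norm = p.detach().norm().clamp_min(eps)
        g_norm = p.grad.detach().norm()
        max_norm = clip * w_norm
        if g_norm > max_norm:
            p.grad.mul_(max_norm / (g_norm + 1e-6))

# usage, between loss.backward() and optimizer.step():
#   adaptive_grad_clip(model.parameters(), clip=0.01)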

Characterizing signal propagation to close the performance gap in unnormalized ResNets

4 code implementations • ICLR 2021 • Andrew Brock, Soham De, Samuel L. Smith

Batch Normalization is a key component in almost all state-of-the-art image classifiers, but it also introduces practical challenges: it breaks the independence between training examples within a batch, can incur compute and memory overhead, and often results in unexpected bugs.
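
A minimal sketch, from my reading of the paper, of the kind of weight-standardized convolution used to replace Batch Normalization in these unnormalized ResNets: the weights are standardized over their fan-in at every forward pass. The class name and epsilon are my own choices.

import torch
import torch.nn as nn
import torch.nn.functional as F

class WSConv2d(nn.Conv2d):
    """Conv2d whose weights are standardized per output channel and scaled
    by 1/sqrt(fan_in) on every forward pass (no BatchNorm needed)."""
    def forward(self, x):
        w = self.weight
        fan_in = w[0].numel()
        mean = w.mean(dim=(1, 2, 3), keepdim=True)
        var = w.var(dim=(1, 2, 3), keepdim=True)
        w = (w - mean) / torch.sqrt(var * fan_in + 1e-4)
        return F.conv2d(x, w, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)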

Training Generative Adversarial Networks by Solving Ordinary Differential Equations

1 code implementation • NeurIPS 2020 • Chongli Qin, Yan Wu, Jost Tobias Springenberg, Andrew Brock, Jeff Donahue, Timothy P. Lillicrap, Pushmeet Kohli

From this perspective, we hypothesise that instabilities in training GANs arise from the integration error in discretising the continuous dynamics.
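
To illustrate the integration-error framing, the sketch below integrates a gradient flow with Heun's method (a second-order Runge-Kutta step) instead of a plain Euler-like descent step. It is my own single-loss simplification: in the GAN setting the vector field stacks generator and discriminator gradients, and the paper also regularises the discriminator; the helper names are hypothetical.

import torch

def grad_field(params, loss_fn):
    """Gradient of loss_fn at a flat parameter vector."""
    p = params.detach().requires_grad_(True)
    return torch.autograd.grad(loss_fn(p), p)[0]

def heun_step(params, loss_fn, step_size=0.01):
    v1 = -grad_field(params, loss_fn)                    # slope at the current point
    v2 = -grad_field(params + step_size * v1, loss_fn)   # slope at the Euler prediction
    return params + 0.5 * step_size * (v1 + v2)          # average the two slopes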

Evolving Normalization-Activation Layers

8 code implementations • NeurIPS 2020 • Hanxiao Liu, Andrew Brock, Karen Simonyan, Quoc V. Le

Normalization layers and activation functions are fundamental components in deep networks and typically co-locate with each other.

Image Classification • Image Generation • +2
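
A minimal sketch, under my own assumptions about shapes and defaults, of an EvoNorm-S0-style layer of the kind the paper reports: normalization and activation are fused into a single expression rather than stacked as BatchNorm + ReLU.

import torch
import torch.nn as nn

class EvoNormS0(nn.Module):
    def __init__(self, channels, groups=8, eps=1e-5):
        super().__init__()
        self.groups, self.eps = groups, eps
        self.gamma = nn.Parameter(torch.ones(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, channels, 1, 1))

    def group_std(self, x):
        n, c, h, w = x.shape
        g = x.reshape(n, self.groups, c // self.groups, h, w)
        std = torch.sqrt(g.var(dim=(2, 3, 4), keepdim=True) + self.eps)
        return std.expand_as(g).reshape(n, c, h, w)

    def forward(self, x):
        # fused normalization + activation: x * sigmoid(v * x) / group_std(x)
        return x * torch.sigmoid(self.v * x) / self.group_std(x) * self.gamma + self.beta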

Large Scale GAN Training for High Fidelity Natural Image Synthesis

28 code implementations • ICLR 2019 • Andrew Brock, Jeff Donahue, Karen Simonyan

Despite recent progress in generative image modeling, successfully generating high-resolution, diverse samples from complex datasets such as ImageNet remains an elusive goal.

Conditional Image Generation
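
As a hedged illustration of one trick described in this paper, the sketch below resamples latent components whose magnitude exceeds a threshold (the "truncation trick"), trading sample diversity for fidelity. It is a simplification, not the authors' BigGAN code.

import torch

def truncated_noise(batch_size, dim, truncation=0.5):
    """Sample z ~ N(0, I), resampling any component with |z_i| > truncation."""
    z = torch.randn(batch_size, dim)
    while True:
        mask = z.abs() > truncation
        if not mask.any():
            return z
        z[mask] = torch.randn(int(mask.sum()))

# smaller truncation -> higher fidelity, lower diversity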

Implicit Weight Uncertainty in Neural Networks

3 code implementations • 3 Nov 2017 • Nick Pawlowski, Andrew Brock, Matthew C. H. Lee, Martin Rajchl, Ben Glocker

Modern neural networks tend to be overconfident on unseen, noisy or incorrectly labelled data and do not produce meaningful uncertainty measures.

Normalising Flows
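
A rough sketch, with my own simplifications and hypothetical names, of the implicit-weight-distribution idea: a small hypernetwork maps fresh noise to a layer's weights, so repeated forward passes draw different weight samples and their spread gives a crude uncertainty estimate.

import torch
import torch.nn as nn

class HyperLinear(nn.Module):
    """Linear layer whose weights are generated from noise by a hypernetwork."""
    def __init__(self, in_features, out_features, noise_dim=16):
        super().__init__()
        self.noise_dim = noise_dim
        self.shape = (out_features, in_features)
        self.hyper = nn.Sequential(
            nn.Linear(noise_dim, 64), nn.ReLU(),
            nn.Linear(64, out_features * in_features),
        )

    def forward(self, x):
        z = torch.randn(self.noise_dim)        # fresh noise -> fresh weight sample
        w = self.hyper(z).view(self.shape)
        return x @ w.t()

# averaging predictions over several forward passes approximates the predictive
# distribution; their variance is a simple uncertainty measure.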

SMASH: One-Shot Model Architecture Search through HyperNetworks

1 code implementation • ICLR 2018 • Andrew Brock, Theodore Lim, J. M. Ritchie, Nick Weston

Designing architectures for deep neural networks requires expert knowledge and substantial computation time.

Neural Architecture Search
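
A hedged sketch of the SMASH-style ranking step as I understand it: candidate architectures are scored with weights produced by a trained HyperNet, and only the best-ranked candidate is then trained normally. Every callable here (sample_architecture, encode, hypernet, val_loss) is a hypothetical placeholder, not the paper's API.

def rank_candidates(num_candidates, hypernet, sample_architecture, encode, val_loss):
    """Return the candidate with the lowest HyperNet-weighted validation loss."""
    scored = []
    for _ in range(num_candidates):
        arch = sample_architecture()        # draw a random candidate architecture
        weights = hypernet(encode(arch))    # HyperNet maps its encoding to weights
        scored.append((val_loss(arch, weights), arch))
    return min(scored, key=lambda pair: pair[0])[1]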

FreezeOut: Accelerate Training by Progressively Freezing Layers

2 code implementations • 15 Jun 2017 • Andrew Brock, Theodore Lim, J. M. Ritchie, Nick Weston

The early layers of a deep neural net have the fewest parameters, but take up the most computation.
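
A minimal sketch of a FreezeOut-style schedule with my own defaults: each layer's learning rate is cosine-annealed to zero at a layer-specific fraction of training (earlier layers first), after which the layer is frozen.

import math

def freezeout_lr(base_lr, step, total_steps, layer_index, num_layers, t0=0.5):
    """Per-layer learning rate; earlier layers (small layer_index) freeze earlier."""
    # fraction of training at which this layer freezes, spread from t0 up to 1.0
    t_freeze = t0 + (1.0 - t0) * layer_index / max(num_layers - 1, 1)
    progress = step / (t_freeze * total_steps)
    if progress >= 1.0:
        return 0.0  # frozen; the paper also skips the layer's backward pass to save compute
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))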

Neural Photo Editing with Introspective Adversarial Networks

1 code implementation • 22 Sep 2016 • Andrew Brock, Theodore Lim, J. M. Ritchie, Nick Weston

The increasingly photorealistic sample quality of generative image models suggests their feasibility in applications beyond image generation.

Image Generation
