In-Place Activated BatchNorm for Memory-Optimized Training of DNNs

CVPR 2018 · Samuel Rota Bulò, Lorenzo Porzi, Peter Kontschieder

In this work we present In-Place Activated Batch Normalization (InPlace-ABN), a novel approach to drastically reduce the training memory footprint of modern deep neural networks in a computationally efficient way. Our solution substitutes the conventionally used succession of BatchNorm + Activation layers with a single plugin layer, hence avoiding invasive framework surgery while providing straightforward applicability to existing deep learning frameworks.
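The memory saving rests on an invertibility argument: if the activation following batch normalization is invertible (e.g. leaky ReLU with a nonzero slope), the normalized pre-activation does not need to be stored for the backward pass, since it can be recomputed from the activation's output. A minimal NumPy sketch of this idea, with illustrative names and hypothetical scalar `gamma`/`beta` parameters (not the paper's implementation, which fuses this into a single CUDA layer):

```python
import numpy as np

def leaky_relu(x, slope=0.01):
    return np.where(x >= 0, x, slope * x)

def inv_leaky_relu(y, slope=0.01):
    # Leaky ReLU with slope > 0 is invertible, so the pre-activation
    # can be recovered exactly from the stored output.
    return np.where(y >= 0, y, y / slope)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))

# Forward: normalize, apply the affine transform, then the activation.
mean, var = x.mean(axis=0), x.var(axis=0)
gamma, beta = 1.5, 0.2
xhat = (x - mean) / np.sqrt(var + 1e-5)
y = leaky_relu(gamma * xhat + beta)  # only y needs to be kept in memory

# Backward: recover xhat from y instead of storing it.
xhat_recovered = (inv_leaky_relu(y) - beta) / gamma
assert np.allclose(xhat, xhat_recovered)
```

Because the intermediate buffer is recomputed rather than stored, the BatchNorm + Activation pair consumes roughly half the activation memory, at the cost of a cheap inversion during the backward pass.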



Results from Other Papers

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Semantic Segmentation | Cityscapes test | Mapillary | Mean IoU (class) | 82.0% | #17 |

Methods used in the Paper