Semi-Supervised Learning with Ladder Networks

We combine supervised learning with unsupervised learning in deep neural networks. The proposed model is trained to simultaneously minimize the sum of supervised and unsupervised cost functions by backpropagation, avoiding the need for layer-wise pre-training. Our work builds on the Ladder network proposed by Valpola (2015), which we extend by combining the model with supervision. We show that the resulting model reaches state-of-the-art performance in semi-supervised MNIST and CIFAR-10 classification, in addition to permutation-invariant MNIST classification with all labels.
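The combined objective described in the abstract is simply the sum of a supervised cost on labelled examples and an unsupervised denoising cost, optimized jointly by backpropagation. The sketch below illustrates this idea in the spirit of the Γ-model variant (a single denoising cost at the top-level representation); it is a minimal illustration, not the authors' implementation. The layer sizes, noise level, λ weighting, the detached reconstruction target, and the use of PyTorch are all assumptions made for brevity.

```python
# Minimal sketch (not the authors' code): a supervised cross-entropy cost on
# labelled data plus an unsupervised denoising cost on unlabelled data,
# minimized jointly by backpropagation. Hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GammaLikeModel(nn.Module):
    def __init__(self, in_dim=784, hidden=500, n_classes=10, noise_std=0.3):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.classifier = nn.Linear(hidden, n_classes)
        # Simple denoising mapping for the top-level representation
        # (a stand-in for the Ladder network's decoder/denoising function).
        self.denoiser = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                      nn.Linear(hidden, hidden))
        self.noise_std = noise_std

    def forward(self, x):
        z_clean = self.encoder(x)                                  # clean path
        z_noisy = z_clean + self.noise_std * torch.randn_like(z_clean)  # corrupted path
        logits = self.classifier(z_noisy)                          # classify from noisy path
        z_hat = self.denoiser(z_noisy)                             # denoised estimate of z_clean
        return logits, z_clean, z_hat

def combined_loss(model, x_labelled, y, x_unlabelled, lam=1.0):
    # Supervised cost: cross-entropy on the labelled batch.
    logits, _, _ = model(x_labelled)
    supervised = F.cross_entropy(logits, y)
    # Unsupervised cost: reconstruction error between the denoised estimate
    # and the clean representation (detached here as a simplification; the
    # original model instead batch-normalizes the clean target).
    _, z_clean, z_hat = model(x_unlabelled)
    unsupervised = F.mse_loss(z_hat, z_clean.detach())
    return supervised + lam * unsupervised
```

Both terms are differentiable, so a single optimizer step on `combined_loss` trains the encoder, classifier, and denoiser together, with no layer-wise pre-training stage.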


Datasets


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Semi-Supervised Image Classification | CIFAR-10, 4000 Labels | Γ-model | Percentage error | 20.4 | #44 |

Methods


No methods listed for this paper.