LaplaceNet: A Hybrid Graph-Energy Neural Network for Deep Semi-Supervised Classification

8 Jun 2021 · Philip Sellars, Angelica I. Aviles-Rivero, Carola-Bibiane Schönlieb

Semi-supervised learning has received a lot of recent attention because it alleviates the need for large amounts of labelled data, which can be expensive to collect, require expert knowledge and be time-consuming to obtain. Recent developments in deep semi-supervised classification have reached unprecedented performance, and the gap between supervised and semi-supervised learning is ever-decreasing. This improvement in performance has been based on the inclusion of numerous technical tricks, strong augmentation techniques and costly optimisation schemes with multi-term loss functions. We propose a new framework, LaplaceNet, for deep semi-supervised classification that has a greatly reduced model complexity. We utilise a hybrid approach in which pseudo-labels are produced by minimising the Laplacian energy on a graph; these pseudo-labels are then used to iteratively train a neural-network backbone. Our model outperforms state-of-the-art methods for deep semi-supervised classification over several benchmark datasets. Furthermore, we consider the application of strong augmentations to neural networks theoretically and justify the use of a multi-sampling approach for semi-supervised learning. We demonstrate, through rigorous experimentation, that a multi-sampling augmentation approach improves generalisation and reduces the sensitivity of the network to augmentation.
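To make the hybrid idea concrete, the sketch below illustrates graph-based pseudo-labelling by Laplacian-energy minimisation in the spirit described in the abstract. This is not the authors' implementation: the function name, the kNN graph construction, and the hyper-parameters `k` and `alpha` are illustrative assumptions; the features are taken to be embeddings from the network backbone.

```python
# A minimal sketch (not the authors' code) of pseudo-labelling by minimising
# the Laplacian (quadratic) energy on a kNN graph built from backbone features.
import numpy as np
from scipy.sparse import csr_matrix, identity, diags
from scipy.sparse.linalg import cg

def laplacian_pseudo_labels(features, labels, labelled_idx, num_classes,
                            k=10, alpha=0.99):
    """Return soft pseudo-labels for every sample.

    features     : (n, d) array of backbone embeddings
    labels       : (n,) integer labels (only entries in labelled_idx are used)
    labelled_idx : indices of the labelled samples
    """
    n = features.shape[0]
    # Cosine-similarity kNN affinity matrix (dense similarities for clarity).
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, 0.0)
    rows, cols, vals = [], [], []
    for i in range(n):
        nn = np.argpartition(-sim[i], k)[:k]      # k nearest neighbours of i
        rows.extend([i] * k)
        cols.extend(nn)
        vals.extend(np.maximum(sim[i, nn], 0.0))
    W = csr_matrix((vals, (rows, cols)), shape=(n, n))
    W = 0.5 * (W + W.T)                           # symmetrise the graph
    d = np.asarray(W.sum(axis=1)).ravel()
    D_inv_sqrt = diags(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt               # normalised affinity matrix

    # One-hot label matrix, zero rows for unlabelled samples.
    Y = np.zeros((n, num_classes))
    Y[labelled_idx, labels[labelled_idx]] = 1.0

    # Minimising the Laplacian energy amounts to solving (I - alpha*S) Z = Y,
    # here column by column with conjugate gradients.
    A = identity(n) - alpha * S
    Z = np.zeros_like(Y)
    for c in range(num_classes):
        Z[:, c], _ = cg(A, Y[:, c], maxiter=50)
    Z = np.maximum(Z, 0.0)
    return Z / (Z.sum(axis=1, keepdims=True) + 1e-12)  # soft pseudo-labels
```

Equally schematic is the multi-sampling augmentation loss mentioned in the abstract: for each image, several strongly augmented views are drawn and their losses against the fixed graph pseudo-label are averaged, which is one plausible way to realise the reduced sensitivity to any single augmentation draw. The names `model` and `augment` below are placeholders, not part of the paper.

```python
# Hedged sketch of a multi-sampling augmentation loss (PyTorch).
import torch
import torch.nn.functional as F

def multi_sample_loss(model, images, pseudo_labels, augment, num_samples=3):
    # Average the cross-entropy over several strong-augmentation draws.
    losses = []
    for _ in range(num_samples):
        logits = model(augment(images))
        losses.append(F.cross_entropy(logits, pseudo_labels))
    return torch.stack(losses).mean()
```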

Task | Dataset | Model | Metric | Value | Global Rank
Semi-Supervised Image Classification | CIFAR-100, 10000 Labels | LaplaceNet (WRN-28-8) | Percentage error | 22.11 ± 0.23 | #12
Semi-Supervised Image Classification | CIFAR-10, 4000 Labels | LaplaceNet (CNN-13) | Percentage error | 4.99 ± 0.08 | #18
Semi-Supervised Image Classification | CIFAR-10, 4000 Labels | LaplaceNet (WRN-28-2) | Percentage error | 4.35 ± 0.10 | #15
