Unshuffling Data for Improved Generalization

The inability to generalize beyond the distribution of a training set is at the core of practical limits of machine learning. We show that the common practice of mixing and shuffling training examples when training deep neural networks is not optimal...
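The abstract is truncated, but the contrast it draws — fully mixed, shuffled minibatches versus data kept in its original groupings — can be sketched. The snippet below is an illustrative assumption, not the paper's actual method: the `source_id` field and the within-group shuffle are hypothetical stand-ins for whatever grouping the paper exploits.

```python
import random

# Toy dataset: (feature, source_id) pairs. Sources stand in for distinct
# collection environments whose examples standard practice mixes together.
data = [(x, src) for src in ("A", "B") for x in range(4)]

rng = random.Random(0)

# Common practice: shuffle every example together, so each minibatch
# mixes sources and any per-source structure is lost.
mixed = data[:]
rng.shuffle(mixed)

# Hypothetical "unshuffled" alternative: shuffle only within each source
# group, so batches drawn sequentially stay source-homogeneous.
grouped = []
for src in ("A", "B"):
    block = [ex for ex in data if ex[1] == src]
    rng.shuffle(block)
    grouped.extend(block)

print([s for _, s in grouped])  # source labels stay contiguous per group
```

With full shuffling, a minibatch typically contains examples from both sources; with the grouped variant, consecutive batches see one source at a time, which is the kind of structure an unshuffling scheme could exploit.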
