Training Deep Neural Networks by optimizing over nonlocal paths in hyperparameter space

ICLR 2020 · Vlad Pushkarov, Jonathan Efroni, Mykola Maksymenko, Maciej Koch-Janusz

Hyperparameter optimization is both a practical issue and an interesting theoretical problem in the training of deep architectures. Despite many recent advances, the most commonly used methods almost universally involve training multiple, decoupled copies of the model, in effect sampling the hyperparameter space...
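As a point of reference for the "multiple, decoupled copies" baseline the abstract mentions, below is a minimal sketch of conventional random search, where each trial samples its hyperparameters independently and trains a separate copy of the model. The function `train_and_evaluate` is a hypothetical stand-in (not from the paper) that would run a real training loop and return a validation score.

```python
# Sketch of the decoupled-copies baseline: independent random sampling of
# hyperparameters, one trained model per sample. Assumed/hypothetical names:
# train_and_evaluate, random_search.
import random


def train_and_evaluate(learning_rate: float, dropout: float) -> float:
    """Placeholder: train a model with the given hyperparameters and return a
    validation score. A toy surrogate objective is used so the sketch runs."""
    return 1.0 - 100.0 * abs(learning_rate - 1e-3) - abs(dropout - 0.3)


def random_search(num_trials: int = 20, seed: int = 0):
    """Each trial is decoupled: hyperparameters are drawn once, independently."""
    rng = random.Random(seed)
    best_score, best_config = float("-inf"), None
    for _ in range(num_trials):
        config = {
            "learning_rate": 10 ** rng.uniform(-4, -1),  # log-uniform sample
            "dropout": rng.uniform(0.0, 0.5),
        }
        score = train_and_evaluate(**config)
        if score > best_score:
            best_score, best_config = score, config
    return best_config, best_score


if __name__ == "__main__":
    config, score = random_search()
    print(f"best config: {config}, score: {score:.3f}")
```

The paper's contribution, optimizing over nonlocal paths in hyperparameter space rather than sampling decoupled points, is not reproduced here; the sketch only illustrates the baseline the abstract contrasts against.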


Code

No code implementations yet.
