1 code implementation • 14 Oct 2021 • Michael Ruchte, Josif Grabocka
These works also use Multi-Task Learning (MTL) problems to benchmark MOO algorithms, treating each task as an independent objective.
1 code implementation • 24 Mar 2021 • Michael Ruchte, Josif Grabocka
Prior work either demands optimizing a new network for every point on the Pareto front, or incurs a large overhead in the number of trainable parameters by using hyper-networks conditioned on modifiable preferences.
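To make the first limitation concrete, here is a minimal, hypothetical sketch (not the authors' code) of the per-point baseline: each point on the Pareto front is obtained by a separate optimization run of a linearly scalarized objective, one run per preference weight. The toy bi-objective problem and all names are illustrative assumptions.

```python
# Hypothetical toy example: tracing the Pareto front of a bi-objective
# problem via linear scalarization, one gradient-descent run per preference.
def objectives(x):
    # Two conflicting objectives with minima at x = 0 and x = 2.
    return x ** 2, (x - 2.0) ** 2

def grad_scalarized(x, w):
    # d/dx [ w * f1(x) + (1 - w) * f2(x) ] = 2*w*x + 2*(1 - w)*(x - 2)
    return 2.0 * w * x + 2.0 * (1.0 - w) * (x - 2.0)

def pareto_point(w, lr=0.1, steps=200):
    # A fresh optimization per preference w -- the costly pattern that
    # preference-conditioned models aim to replace with a single network.
    x = 1.0
    for _ in range(steps):
        x -= lr * grad_scalarized(x, w)
    return x

# The analytic minimizer is x* = 2*(1 - w), so the front spans [0, 2].
front = [pareto_point(w) for w in (0.0, 0.25, 0.5, 0.75, 1.0)]
print(front)
```

A preference-conditioned approach would instead take the weight w as an extra input to a single model, amortizing all of these runs into one training pass.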
1 code implementation • 1 Jan 2021 • Michael Ruchte, Arber Zela, Julien Niklas Siems, Josif Grabocka, Frank Hutter
Neural Architecture Search (NAS) is a focal point of the Deep Learning community, but reproducing NAS methods is extremely challenging due to numerous low-level implementation details.