1 code implementation • ICLR 2022 • Yash Mehta, Colin White, Arber Zela, Arjun Krishnakumar, Guri Zabergja, Shakiba Moradian, Mahmoud Safari, Kaicheng Yu, Frank Hutter
The release of tabular benchmarks, such as NAS-Bench-101 and NAS-Bench-201, has significantly lowered the computational overhead for conducting scientific research in neural architecture search (NAS).
1 code implementation • 6 Oct 2022 • Arjun Krishnakumar, Colin White, Arber Zela, Renbo Tu, Mahmoud Safari, Frank Hutter
Zero-cost proxies (ZC proxies) are a recent architecture performance prediction technique aiming to significantly speed up algorithms for neural architecture search (NAS).
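To make the idea concrete, here is a minimal sketch of one simple zero-cost proxy, a gradient-norm score computed on a toy two-layer MLP at initialization. The network, shapes, and the particular score are illustrative assumptions, not the specific proxies evaluated in the paper; the point is only that a ZC proxy scores an untrained architecture from a single forward/backward pass rather than by training it.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def grad_norm_proxy(W1, W2, x):
    # Hypothetical two-layer MLP; ZC proxies score the *untrained* network.
    # Forward pass
    z1 = W1 @ x
    a1 = relu(z1)
    out = W2 @ a1
    # Backward pass for L = sum(out), written out by hand
    dout = np.ones_like(out)
    dW2 = np.outer(dout, a1)
    da1 = W2.T @ dout
    dz1 = da1 * (z1 > 0)
    dW1 = np.outer(dz1, x)
    # Proxy score: total gradient magnitude at initialization
    return np.abs(dW1).sum() + np.abs(dW2).sum()

rng = np.random.default_rng(0)
x = rng.normal(size=8)
W1 = rng.normal(size=(16, 8))
W2 = rng.normal(size=(4, 16))
score = grad_norm_proxy(W1, W2, x)
```

In practice such scores are computed for every candidate architecture and used to rank them, so that expensive training is spent only on the most promising ones.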
no code implementations • 7 Dec 2020 • Ali Davody, Mahmoud Safari, Răzvan V. Florian
We propose a new method of program learning in a Domain Specific Language (DSL) which is based on gradient descent with no direct search.
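A rough illustration of the general idea, under assumptions of my own: relax the discrete choice of DSL operation at each program slot into a softmax mixture, so the "program" becomes differentiable in its slot logits and can be fit by plain gradient descent on input/output examples. The toy DSL, the finite-difference gradients, and the target program below are all hypothetical, not the paper's actual method.

```python
import numpy as np

# Hypothetical toy DSL of numeric primitives (not the paper's DSL)
OPS = [lambda x: x + 1.0, lambda x: 2.0 * x, lambda x: x * x]

def soft_program(logits, x):
    # Each slot mixes all DSL ops with softmax weights, so the program's
    # output is a smooth function of the logits.
    for slot in logits:                      # logits shape: (n_slots, n_ops)
        w = np.exp(slot - slot.max())
        w /= w.sum()
        x = sum(wi * op(x) for wi, op in zip(w, OPS))
    return x

def loss(logits, xs, ys):
    return np.mean([(soft_program(logits, x) - y) ** 2 for x, y in zip(xs, ys)])

# Input/output examples of the target program y = 2x + 1
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs + 1.0

rng = np.random.default_rng(0)
logits = rng.normal(scale=0.1, size=(2, len(OPS)))
init_loss = loss(logits, xs, ys)

# Plain gradient descent (finite-difference gradients for brevity),
# with no combinatorial search over programs.
eps, lr = 1e-4, 0.1
for _ in range(300):
    grad = np.zeros_like(logits)
    for idx in np.ndindex(*logits.shape):
        bumped = logits.copy()
        bumped[idx] += eps
        grad[idx] = (loss(bumped, xs, ys) - loss(logits, xs, ys)) / eps
    logits -= lr * grad

final_loss = loss(logits, xs, ys)
```

After training, reading off the argmax operation per slot discretizes the relaxed mixture back into an ordinary DSL program.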
no code implementations • 20 Jan 2023 • Colin White, Mahmoud Safari, Rhea Sukthanker, Binxin Ru, Thomas Elsken, Arber Zela, Debadeepta Dey, Frank Hutter
Specialized, high-performing neural architectures are crucial to the success of deep learning in these areas.
no code implementations • 16 Dec 2023 • Rhea Sanjay Sukthanker, Arjun Krishnakumar, Mahmoud Safari, Frank Hutter
Since weight-entanglement poses compatibility challenges for gradient-based NAS methods, these two paradigms have largely developed independently in parallel sub-communities.