Search Results for author: Mahmoud Safari

Found 6 papers, 2 papers with code

SuperCoder: Program Learning Under Noisy Conditions From Superposition of States

no code implementations • 7 Dec 2020 Ali Davody, Mahmoud Safari, Răzvan V. Florian

We propose a new method for program learning in a Domain Specific Language (DSL) that is based on gradient descent and requires no direct search.

NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy

1 code implementation • ICLR 2022 Yash Mehta, Colin White, Arber Zela, Arjun Krishnakumar, Guri Zabergja, Shakiba Moradian, Mahmoud Safari, Kaicheng Yu, Frank Hutter

The release of tabular benchmarks, such as NAS-Bench-101 and NAS-Bench-201, has significantly lowered the computational overhead for conducting scientific research in neural architecture search (NAS).

Image Classification • Neural Architecture Search • +4

NAS-Bench-Suite-Zero: Accelerating Research on Zero Cost Proxies

1 code implementation • 6 Oct 2022 Arjun Krishnakumar, Colin White, Arber Zela, Renbo Tu, Mahmoud Safari, Frank Hutter

Zero-cost proxies (ZC proxies) are a recent architecture performance prediction technique aiming to significantly speed up algorithms for neural architecture search (NAS).

Neural Architecture Search

Weight-Entanglement Meets Gradient-Based Neural Architecture Search

no code implementations • 16 Dec 2023 Rhea Sanjay Sukthanker, Arjun Krishnakumar, Mahmoud Safari, Frank Hutter

Since weight-entanglement poses compatibility challenges for gradient-based NAS methods, these two paradigms have largely developed independently in parallel sub-communities.

Neural Architecture Search

Surprisingly Strong Performance Prediction with Neural Graph Features

no code implementations • 25 Apr 2024 Gabriela Kadlecová, Jovita Lukasik, Martin Pilát, Petra Vidnerová, Mahmoud Safari, Roman Neruda, Frank Hutter

Performance prediction has been a key part of the neural architecture search (NAS) process, as it speeds up NAS algorithms by avoiding resource-intensive network training.
