no code implementations • 21 Feb 2023 • Fred X. Han, Keith G. Mills, Fabian Chudak, Parsa Riahi, Mohammad Salameh, Jialin Zhang, Wei Lu, Shangling Jui, Di Niu
In this paper, we propose a general-purpose neural predictor for NAS that transfers across search spaces by representing any candidate Convolutional Neural Network (CNN) as a Computation Graph (CG) of primitive operators.
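To give a rough feel for what a computation-graph view of a CNN looks like, here is a generic torch.fx sketch (not the paper's actual CG encoding; the toy model and its layers are illustrative assumptions) that traces a small network and exposes each primitive operator as a graph node:

```python
# Illustrative only: trace a tiny CNN into a graph of primitive operators.
# This uses torch.fx's generic tracer, not the paper's CG representation.
import torch.nn as nn
import torch.fx as fx

class TinyCNN(nn.Module):  # hypothetical toy model
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        self.act = nn.ReLU()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(8, 10)

    def forward(self, x):
        x = self.pool(self.act(self.conv(x)))
        return self.fc(x.flatten(1))

graph_module = fx.symbolic_trace(TinyCNN())
# Each node is one primitive operator; edges follow node.args.
for node in graph_module.graph.nodes:
    print(node.op, node.target, [str(a) for a in node.args])
```

A predictor operating on such operator-level graphs, rather than on a search-space-specific cell encoding, is what allows it to score architectures from unseen search spaces.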
1 code implementation • 30 Nov 2022 • Keith G. Mills, Fred X. Han, Jialin Zhang, Fabian Chudak, Ali Safari Mamaghani, Mohammad Salameh, Wei Lu, Shangling Jui, Di Niu
In this paper, we propose GENNAPE, a Generalized Neural Architecture Performance Estimator, which is pretrained on open neural architecture benchmarks and aims to generalize to completely unseen architectures through combined innovations in network representation, contrastive pretraining, and a fuzzy clustering-based predictor ensemble.
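As a minimal sketch of the fuzzy clustering-based ensemble idea (the membership formula, function names, and toy predictors below are assumptions for illustration, not GENNAPE's implementation), a test architecture's feature vector can softly weight a set of cluster-specific predictors:

```python
# Illustrative sketch: fuzzy-membership-weighted predictor ensemble.
import numpy as np

def fuzzy_memberships(x, centroids, m=2.0):
    """Fuzzy c-means style soft assignment of feature vector x to centroids."""
    d = np.linalg.norm(centroids - x, axis=1) + 1e-12  # avoid divide-by-zero
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

def ensemble_predict(x, centroids, predictors):
    """Weight each cluster-specific predictor by the fuzzy membership of x."""
    w = fuzzy_memberships(x, centroids)
    return float(sum(wi * p(x) for wi, p in zip(w, predictors)))

# Toy usage: two clusters, simple linear per-cluster predictors.
centroids = np.array([[0.0, 0.0], [1.0, 1.0]])
predictors = [lambda x: 0.7 + 0.01 * x.sum(),
              lambda x: 0.9 - 0.02 * x.sum()]
print(ensemble_predict(np.array([0.2, 0.1]), centroids, predictors))
```

The appeal of a soft ensemble here is that an unseen architecture near the boundary of several known families draws on all of their predictors in proportion to similarity, rather than being forced into a single hard cluster.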
no code implementations • 29 Sep 2021 • Fred X. Han, Fabian Chudak, Keith G. Mills, Mohammad Salameh, Parsa Riahi, Jialin Zhang, Wei Lu, Shangling Jui, Di Niu
Understanding and modelling the performance of neural architectures is key to Neural Architecture Search (NAS).
1 code implementation • 25 Sep 2021 • Keith G. Mills, Fred X. Han, Jialin Zhang, Seyed Saeed Changiz Rezaei, Fabian Chudak, Wei Lu, Shuo Lian, Shangling Jui, Di Niu
Neural architecture search automates neural network design and has achieved state-of-the-art results in many deep learning applications.
no code implementations • 14 Nov 2016 • Dmytro Korenkevych, Yanbo Xue, Zhengbing Bian, Fabian Chudak, William G. Macready, Jason Rolfe, Evgeny Andriyash
We argue that this relates to the fact that we are training a quantum, rather than a classical, Boltzmann distribution in this case.
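For context, the standard definitions of the two distributions (written in our notation; the paper may parameterize them differently) make the distinction concrete:

```latex
% Classical Boltzmann distribution over spin configurations s:
\[
  p(\mathbf{s}) \;=\; \frac{e^{-E(\mathbf{s})}}{Z},
  \qquad Z = \sum_{\mathbf{s}'} e^{-E(\mathbf{s}')} .
\]
% Quantum Boltzmann distribution: the density matrix of a transverse-field
% Ising Hamiltonian, measured in the computational basis.
\[
  \rho \;=\; \frac{e^{-H}}{\operatorname{Tr} e^{-H}},
  \qquad
  H \;=\; -\sum_i \Gamma_i \sigma_i^x
          \;-\; \sum_i h_i \sigma_i^z
          \;-\; \sum_{i<j} J_{ij}\, \sigma_i^z \sigma_j^z,
  \qquad
  p(\mathbf{s}) = \langle \mathbf{s} \,|\, \rho \,|\, \mathbf{s} \rangle .
\]
% When all Gamma_i = 0, H is diagonal and p(s) reduces to the classical
% Gibbs distribution; nonzero transverse fields make it genuinely quantum.
```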