no code implementations • 16 Aug 2024 • Mohammadreza Samadi, Fred X. Han, Mohammad Salameh, Hao Wu, Fengyu Sun, Chunhua Zhou, Di Niu
This approach enables complex editing tasks, such as object movement, by aggregating multiple functions and applying them simultaneously to specific areas.
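As a rough sketch of this aggregation idea (not the paper's actual diffusion-based method; all function and variable names below are illustrative), region-restricted edit functions can be blended through per-region masks:

```python
import numpy as np

def apply_edits(image, edits):
    """Aggregate several edit functions, each restricted to its own
    spatial mask, and blend them into a single output image.

    `edits` is a list of (edit_fn, mask) pairs: edit_fn maps the full
    image to an edited image, and mask is a float array in [0, 1]
    marking the region that edit should affect."""
    out = image.copy()
    for edit_fn, mask in edits:
        edited = edit_fn(image)
        # Keep edited pixels inside the mask, prior pixels outside.
        out = mask[..., None] * edited + (1.0 - mask[..., None]) * out
    return out

# Toy usage: brighten the left half, darken a central square.
img = np.random.rand(64, 64, 3)
left = np.zeros((64, 64)); left[:, :32] = 1.0
center = np.zeros((64, 64)); center[16:48, 16:48] = 1.0
result = apply_edits(img, [(lambda x: np.clip(x * 1.3, 0, 1), left),
                           (lambda x: x * 0.5, center)])
```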
1 code implementation • CVPR 2024 • Keith G. Mills, Fred X. Han, Mohammad Salameh, Shengyao Lu, Chunhua Zhou, Jiao He, Fengyu Sun, Di Niu
Neural Architecture Search is a costly practice.
no code implementations • 21 Feb 2023 • Fred X. Han, Keith G. Mills, Fabian Chudak, Parsa Riahi, Mohammad Salameh, Jialin Zhang, Wei Lu, Shangling Jui, Di Niu
In this paper, we propose a general-purpose neural predictor for NAS that can transfer across search spaces, by representing any given candidate Convolutional Neural Network (CNN) with a Computation Graph (CG) that consists of primitive operators.
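A minimal sketch of such a computation-graph representation, using networkx (the operator vocabulary and node attributes here are assumptions for illustration, not the paper's exact schema):

```python
import networkx as nx

# Represent a small residual conv block as a graph of primitive
# operators (conv, batch-norm, relu, add) rather than search-space-
# specific macro ops, so one predictor can score architectures from
# any search space that lowers to the same primitives.
cg = nx.DiGraph()
cg.add_node("input", op="input")
cg.add_node("conv1", op="conv3x3", c_in=16, c_out=16)
cg.add_node("bn1", op="batch_norm")
cg.add_node("relu1", op="relu")
cg.add_node("add", op="add")  # residual connection
cg.add_edges_from([("input", "conv1"), ("conv1", "bn1"),
                   ("bn1", "relu1"), ("relu1", "add"),
                   ("input", "add")])

# A graph-based predictor would consume (node features, adjacency)
# pairs like this to estimate accuracy across search spaces.
features = [(n, cg.nodes[n]["op"]) for n in nx.topological_sort(cg)]
```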
1 code implementation • 30 Nov 2022 • Keith G. Mills, Di Niu, Mohammad Salameh, Weichen Qiu, Fred X. Han, Puyuan Liu, Jialin Zhang, Wei Lu, Shangling Jui
Evaluating neural network performance is critical to deep neural network design but a costly procedure.
1 code implementation • 30 Nov 2022 • Keith G. Mills, Fred X. Han, Jialin Zhang, Fabian Chudak, Ali Safari Mamaghani, Mohammad Salameh, Wei Lu, Shangling Jui, Di Niu
In this paper, we propose GENNAPE, a Generalized Neural Architecture Performance Estimator, which is pretrained on open neural architecture benchmarks, and aims to generalize to completely unseen architectures through combined innovations in network representation, contrastive pretraining, and fuzzy clustering-based predictor ensemble.
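A minimal sketch of the fuzzy clustering-based ensemble step, assuming an architecture embedding `x`, cluster `centroids`, and one trained predictor per cluster (all names hypothetical; the membership rule is standard fuzzy c-means, not necessarily the paper's exact formulation):

```python
import numpy as np

def ensemble_predict(x, centroids, predictors, m=2.0):
    """Soft ensemble: weight each cluster-specific predictor by the
    embedding's fuzzy c-means membership in that cluster, then sum
    the weighted predictions."""
    d = np.linalg.norm(centroids - x, axis=1) + 1e-12
    # Standard fuzzy c-means membership with fuzzifier m; rows sum to 1.
    u = 1.0 / np.sum((d[:, None] / d[None, :]) ** (2.0 / (m - 1.0)), axis=1)
    preds = np.array([p(x) for p in predictors])
    return float(np.dot(u, preds))

# Toy usage with two clusters and two constant predictors.
cents = np.array([[0.0, 0.0], [1.0, 1.0]])
preds = [lambda x: 0.7, lambda x: 0.9]
print(ensemble_predict(np.array([0.2, 0.1]), cents, preds))
```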
no code implementations • 29 Sep 2021 • Fred X. Han, Fabian Chudak, Keith G. Mills, Mohammad Salameh, Parsa Riahi, Jialin Zhang, Wei Lu, Shangling Jui, Di Niu
Understanding and modelling the performance of neural architectures is key to Neural Architecture Search (NAS).
1 code implementation • 25 Sep 2021 • Keith G. Mills, Fred X. Han, Jialin Zhang, Seyed Saeed Changiz Rezaei, Fabian Chudak, Wei Lu, Shuo Lian, Shangling Jui, Di Niu
Neural architecture search automates neural network design and has achieved state-of-the-art results in many deep learning applications.
no code implementations • 25 Sep 2021 • Keith G. Mills, Fred X. Han, Mohammad Salameh, Seyed Saeed Changiz Rezaei, Linglong Kong, Wei Lu, Shuo Lian, Shangling Jui, Di Niu
In this paper, we propose L$^{2}$NAS, which learns to intelligently optimize and update architecture hyperparameters via an actor neural network based on the distribution of high-performing architectures in the search history.
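As a loose sketch of the history-driven actor idea (L$^{2}$NAS's actual update is quantile-driven; the plain regression toward top architecture encodings below is a simplified stand-in, and all dimensions are illustrative):

```python
import torch
import torch.nn as nn

# Toy actor mapping a search-history state summary to continuous
# architecture hyperparameters.
actor = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 8))
opt = torch.optim.Adam(actor.parameters(), lr=1e-3)

def update(state, top_archs):
    # Pull the actor's output toward the mean encoding of the
    # high-performing architectures found so far (a crude surrogate
    # for learning their distribution).
    target = top_archs.mean(dim=0)
    loss = ((actor(state) - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Toy usage: 32-dim state, 5 top architectures with 8-dim encodings.
print(update(torch.randn(32), torch.randn(5, 8)))
```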
no code implementations • 19 May 2021 • Seyed Saeed Changiz Rezaei, Fred X. Han, Di Niu, Mohammad Salameh, Keith Mills, Shuo Lian, Wei Lu, Shangling Jui
Despite the empirical success of neural architecture search (NAS) in deep learning applications, the optimality, reproducibility and cost of NAS schemes remain hard to assess.
no code implementations • 1 Jan 2021 • Seyed Saeed Changiz Rezaei, Fred X. Han, Di Niu, Mohammad Salameh, Keith G. Mills, Shangling Jui
Despite the empirical success of neural architecture search (NAS) algorithms in deep learning applications, the optimality, reproducibility and cost of NAS schemes remain hard to assess.
no code implementations • 1 Mar 2018 • Bang Liu, Ting Zhang, Fred X. Han, Di Niu, Kunfeng Lai, Yu Xu
The proposed sentence factorization technique enables: 1) a new unsupervised distance metric that computes the semantic distance between a pair of text snippets by solving a penalized optimal transport problem while preserving the logical relationship of words in the reordered sentences, and 2) new multi-scale deep learning models for supervised semantic training, based on factorized sentence hierarchies.
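A rough sketch of a word-level transport distance with an order penalty, using scipy (a simplified stand-in: the paper solves a penalized optimal transport problem over factorized, reordered sentences, whereas the hard assignment and penalty form here are assumptions):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def sentence_distance(emb_a, emb_b, order_penalty=0.1):
    """Simplified word-mover-style distance between two sentences
    given per-word embeddings (n x d and m x d). A normalized
    position-difference term crudely stands in for the paper's
    order-preserving penalty."""
    na, nb = len(emb_a), len(emb_b)
    # Pairwise embedding distances between all word pairs.
    cost = np.linalg.norm(emb_a[:, None, :] - emb_b[None, :, :], axis=-1)
    # Penalize matching words that sit at very different positions.
    pos = np.abs(np.arange(na)[:, None] / na - np.arange(nb)[None, :] / nb)
    total = cost + order_penalty * pos
    rows, cols = linear_sum_assignment(total)
    return float(total[rows, cols].mean())

# Toy usage with random 50-dim word embeddings.
print(sentence_distance(np.random.rand(6, 50), np.random.rand(8, 50)))
```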