Search Results for author: Fred X. Han

Found 10 papers, 4 papers with code

A General-Purpose Transferable Predictor for Neural Architecture Search

no code implementations • 21 Feb 2023 • Fred X. Han, Keith G. Mills, Fabian Chudak, Parsa Riahi, Mohammad Salameh, Jialin Zhang, Wei Lu, Shangling Jui, Di Niu

In this paper, we propose a general-purpose neural predictor for NAS that can transfer across search spaces, by representing any given candidate Convolutional Neural Network (CNN) with a Computation Graph (CG) that consists of primitive operators.

Contrastive Learning · Graph Representation Learning · +1
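
A minimal sketch of the computation-graph (CG) idea described above: a small CNN block expressed as a directed graph of primitive operators. The node names, attributes, and schema here are illustrative assumptions, not the paper's actual CG format.

```python
# Hypothetical sketch: a toy residual block as a computation graph (CG)
# of primitive operators. Node/attribute names are illustrative only.
import networkx as nx

def residual_block_cg(in_channels: int, out_channels: int) -> nx.DiGraph:
    """Build a toy residual block as a DAG of primitive ops."""
    g = nx.DiGraph()
    g.add_nodes_from([
        ("input",  {"op": "input",     "channels": in_channels}),
        ("conv1",  {"op": "conv3x3",   "channels": out_channels}),
        ("bn1",    {"op": "batchnorm", "channels": out_channels}),
        ("relu1",  {"op": "relu",      "channels": out_channels}),
        ("conv2",  {"op": "conv3x3",   "channels": out_channels}),
        ("bn2",    {"op": "batchnorm", "channels": out_channels}),
        ("add",    {"op": "add",       "channels": out_channels}),
        ("output", {"op": "output",    "channels": out_channels}),
    ])
    g.add_edges_from([
        ("input", "conv1"), ("conv1", "bn1"), ("bn1", "relu1"),
        ("relu1", "conv2"), ("conv2", "bn2"), ("bn2", "add"),
        ("input", "add"),            # skip connection
        ("add", "output"),
    ])
    return g

if __name__ == "__main__":
    cg = residual_block_cg(64, 64)
    print(cg.number_of_nodes(), "ops;", cg.number_of_edges(), "edges")
```

A predictor built on such graphs can, in principle, score any CNN that decomposes into the same primitive operators, which is what makes the representation transferable across search spaces.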

GENNAPE: Towards Generalized Neural Architecture Performance Estimators

1 code implementation • 30 Nov 2022 • Keith G. Mills, Fred X. Han, Jialin Zhang, Fabian Chudak, Ali Safari Mamaghani, Mohammad Salameh, Wei Lu, Shangling Jui, Di Niu

In this paper, we propose GENNAPE, a Generalized Neural Architecture Performance Estimator, which is pretrained on open neural architecture benchmarks, and aims to generalize to completely unseen architectures through combined innovations in network representation, contrastive pretraining, and fuzzy clustering-based predictor ensemble.

Contrastive Learning · Image Classification · +1
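
A minimal sketch of the fuzzy clustering-based predictor ensemble mentioned above: an architecture embedding receives soft memberships over clusters, and the final estimate is a membership-weighted mix of per-cluster predictors. The softmax-over-distance membership rule and function names are assumptions, not GENNAPE's exact formulation.

```python
# Hypothetical sketch of a fuzzy-cluster-weighted predictor ensemble.
import numpy as np

def soft_memberships(embedding, centroids, temperature=1.0):
    """Softmax over negative distances to cluster centroids."""
    d = np.linalg.norm(centroids - embedding, axis=1)
    logits = -d / temperature
    logits -= logits.max()                      # numerical stability
    w = np.exp(logits)
    return w / w.sum()

def ensemble_predict(embedding, centroids, predictors):
    """Weight each cluster-specific predictor by its fuzzy membership."""
    w = soft_memberships(embedding, centroids)
    preds = np.array([p(embedding) for p in predictors])
    return float(np.dot(w, preds))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    centroids = rng.normal(size=(3, 8))          # K=3 clusters, dim=8
    heads = [rng.normal(size=8) for _ in range(3)]   # toy linear predictors
    predictors = [lambda e, h=h: float(h @ e) for h in heads]
    emb = rng.normal(size=8)
    print(ensemble_predict(emb, centroids, predictors))
```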

Profiling Neural Blocks and Design Spaces for Mobile Neural Architecture Search

1 code implementation • 25 Sep 2021 • Keith G. Mills, Fred X. Han, Jialin Zhang, Seyed Saeed Changiz Rezaei, Fabian Chudak, Wei Lu, Shuo Lian, Shangling Jui, Di Niu

Neural architecture search automates neural network design and has achieved state-of-the-art results in many deep learning applications.

Neural Architecture Search

L$^{2}$NAS: Learning to Optimize Neural Architectures via Continuous-Action Reinforcement Learning

no code implementations • 25 Sep 2021 • Keith G. Mills, Fred X. Han, Mohammad Salameh, Seyed Saeed Changiz Rezaei, Linglong Kong, Wei Lu, Shuo Lian, Shangling Jui, Di Niu

In this paper, we propose L$^{2}$NAS, which learns to intelligently optimize and update architecture hyperparameters via an actor neural network based on the distribution of high-performing architectures in the search history.

Hyperparameter Optimization · Neural Architecture Search · +2
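
A minimal sketch of the continuous-action actor idea described above: an MLP maps a summary of high-performing architectures in the search history to a vector of architecture hyperparameters in [0, 1]. The state construction and output handling are illustrative assumptions, not L$^{2}$NAS's actual training procedure.

```python
# Hypothetical sketch of a continuous-action actor for architecture search.
import torch
import torch.nn as nn

class ArchitectureActor(nn.Module):
    def __init__(self, state_dim: int, num_hyperparams: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_hyperparams),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # Squash outputs to [0, 1]; each entry can later be mapped
        # to a discrete architecture choice.
        return torch.sigmoid(self.net(state))

if __name__ == "__main__":
    actor = ArchitectureActor(state_dim=32, num_hyperparams=10)
    # Toy state: e.g., a summary embedding of the top-k architectures so far.
    state = torch.randn(1, 32)
    action = actor(state)
    print(action.shape, action.min().item(), action.max().item())
```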

Generative Adversarial Neural Architecture Search

no code implementations • 19 May 2021 • Seyed Saeed Changiz Rezaei, Fred X. Han, Di Niu, Mohammad Salameh, Keith Mills, Shuo Lian, Wei Lu, Shangling Jui

Despite the empirical success of neural architecture search (NAS) in deep learning applications, the optimality, reproducibility and cost of NAS schemes remain hard to assess.

Neural Architecture Search

Generative Adversarial Neural Architecture Search with Importance Sampling

no code implementations • 1 Jan 2021 • Seyed Saeed Changiz Rezaei, Fred X. Han, Di Niu, Mohammad Salameh, Keith G Mills, Shangling Jui

Despite the empirical success of neural architecture search (NAS) algorithms in deep learning applications, the optimality, reproducibility and cost of NAS schemes remain hard to assess.

Neural Architecture Search

Matching Natural Language Sentences with Hierarchical Sentence Factorization

no code implementations • 1 Mar 2018 • Bang Liu, Ting Zhang, Fred X. Han, Di Niu, Kunfeng Lai, Yu Xu

The proposed sentence factorization technique leads to the invention of: 1) a new unsupervised distance metric which calculates the semantic distance between a pair of text snippets by solving a penalized optimal transport problem while preserving the logical relationship of words in the reordered sentences, and 2) new multi-scale deep learning models for supervised semantic training, based on factorized sentence hierarchies.

Paraphrase Identification · Sentence
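
A minimal sketch related to the unsupervised distance metric described above: an optimal transport cost between two sentences represented as bags of word vectors. The paper solves a *penalized* OT problem over reordered, factorized sentences; this sketch uses plain OT via the POT library with random toy embeddings, so it is only an approximation of the idea.

```python
# Hypothetical sketch: plain optimal transport distance between two
# sentences' word embeddings (not the paper's penalized formulation).
import numpy as np
import ot  # pip install POT

def sentence_ot_distance(vecs_a: np.ndarray, vecs_b: np.ndarray) -> float:
    """OT cost between two sets of word embeddings with uniform mass."""
    a = np.full(len(vecs_a), 1.0 / len(vecs_a))      # uniform weights
    b = np.full(len(vecs_b), 1.0 / len(vecs_b))
    M = ot.dist(vecs_a, vecs_b, metric="euclidean")  # pairwise cost matrix
    return float(ot.emd2(a, b, M))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sent1 = rng.normal(size=(5, 50))   # 5 words, 50-d embeddings
    sent2 = rng.normal(size=(7, 50))   # 7 words, 50-d embeddings
    print(sentence_ot_distance(sent1, sent2))
```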
