Neural Architecture Search

774 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS takes the process by which a human engineer manually tweaks a network and learns what works well, and automates it to discover architectures more complex than those typically designed by hand.

Image credit: NAS with Reinforcement Learning
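
At its core, NAS is a search loop: sample a candidate architecture from a search space, estimate its quality, and keep the best one found. The following is a minimal random-search sketch; the search space and the evaluation stub are hypothetical placeholders, since real evaluation means actually training each candidate:

```python
import random

# Hypothetical toy search space: an architecture is one choice of depth,
# width, and operation type. Real NAS spaces are vastly larger.
SEARCH_SPACE = {
    "depth": [2, 4, 6],
    "width": [32, 64, 128],
    "op": ["conv3x3", "conv5x5", "depthwise"],
}

def sample_architecture():
    """Draw one random candidate from the search space."""
    return {name: random.choice(choices) for name, choices in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for the expensive step: build the model, train it, and
    return validation accuracy. Stubbed here with a random score."""
    return random.random()

def random_search(n_trials=100):
    """Keep the best-scoring architecture seen over n_trials samples."""
    best_arch, best_acc = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture()
        acc = evaluate(arch)
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return best_arch, best_acc
```

Most of the methods listed below replace the random sampler with something smarter (reinforcement learning, gradient-based relaxation, or a learned performance predictor) and attack the cost of the evaluation step.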

Libraries

Use these libraries to find Neural Architecture Search models and implementations

Most implemented papers

Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours

dstamoulis/single-path-nas 5 Apr 2019

Can we automatically design a Convolutional Network (ConvNet) with the highest image classification accuracy under the runtime constraint of a mobile device?
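
Hardware-aware methods like this typically answer that question by folding the device's runtime budget directly into the search objective. Below is a minimal sketch of a latency-penalized loss; the budget, weighting, and interface are illustrative assumptions, not the paper's exact formulation:

```python
import torch

def latency_aware_loss(ce_loss, predicted_latency_ms, budget_ms=20.0, lam=0.1):
    # Penalize candidates whose estimated on-device latency exceeds the
    # mobile budget; lam trades accuracy against speed. A simplified
    # stand-in for Single-Path NAS's actual objective.
    overshoot = torch.clamp(predicted_latency_ms / budget_ms - 1.0, min=0.0)
    return ce_loss + lam * overshoot
```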

ISyNet: Convolutional Neural Networks design for AI accelerator

mindspore-ai/models 4 Sep 2021

To address this problem we propose a measure of the hardware efficiency of a neural architecture search space, the matrix efficiency measure (MEM); a search space composed of hardware-efficient operations; a latency-aware scaling method; and ISyNet, a set of architectures designed to be both fast on specialized neural processing unit (NPU) hardware and accurate.

NAS-FPN: Learning Scalable Feature Pyramid Architecture for Object Detection

open-mmlab/mmdetection CVPR 2019

Here we aim to learn a better architecture for the feature pyramid network used in object detection.

PC-DARTS: Partial Channel Connections for Memory-Efficient Architecture Search

yuhuixu1993/PC-DARTS ICLR 2020

Differentiable architecture search (DARTS) provides a fast solution for finding effective network architectures, but suffers from large memory and compute overheads when jointly training a super-network and searching for an optimal architecture.
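
PC-DARTS cuts this overhead by routing only a 1/K fraction of the channels through the candidate operations while the remaining channels bypass them. A simplified PyTorch sketch follows; the class name is illustrative, and the paper's channel shuffle and edge normalization are omitted:

```python
import torch
import torch.nn as nn

class PartialChannelMixedOp(nn.Module):
    """Mixed operation with partial channel connections: only
    channels // k channels pass through the searched operations,
    shrinking the super-network's memory footprint by roughly k."""

    def __init__(self, ops, channels, k=4):
        super().__init__()
        self.ops = nn.ModuleList(ops)  # each candidate op built for channels // k
        self.c_part = channels // k

    def forward(self, x, alpha):
        # Route a 1/k channel slice through the weighted candidate ops.
        x_op, x_skip = x[:, :self.c_part], x[:, self.c_part:]
        weights = torch.softmax(alpha, dim=0)
        mixed = sum(w * op(x_op) for w, op in zip(weights, self.ops))
        # The remaining channels bypass the ops and are concatenated back.
        return torch.cat([mixed, x_skip], dim=1)
```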

AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data

awslabs/autogluon 13 Mar 2020

We introduce AutoGluon-Tabular, an open-source AutoML framework that requires only a single line of Python to train highly accurate machine learning models on an unprocessed tabular dataset such as a CSV file.
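
In current releases that single line looks roughly like this; the file names and the label column are placeholders:

```python
from autogluon.tabular import TabularDataset, TabularPredictor

# Load a raw CSV; AutoGluon handles preprocessing automatically.
train_data = TabularDataset("train.csv")

# The advertised single line: fit an ensemble to predict the "class" column.
predictor = TabularPredictor(label="class").fit(train_data)

# Predict on held-out data.
predictions = predictor.predict(TabularDataset("test.csv"))
```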

A Downsampled Variant of ImageNet as an Alternative to the CIFAR datasets

PatrykChrabaszcz/Imagenet32_Scripts 27 Jul 2017

The original ImageNet dataset is a popular large-scale benchmark for training Deep Neural Networks.

ReNAS: Relativistic Evaluation of Neural Architecture Search

huawei-noah/Efficient-Computing CVPR 2021

An effective and efficient architecture performance evaluation scheme is essential for the success of Neural Architecture Search (NAS).
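
ReNAS takes a relativistic view: a performance predictor only needs to rank architectures correctly, not predict their absolute accuracies. A common way to train such a predictor is a pairwise ranking loss; the sketch below assumes a hinge form with an illustrative margin, not necessarily the paper's exact loss:

```python
import torch

def pairwise_ranking_loss(scores, accuracies, margin=0.1):
    """Hinge ranking loss over all architecture pairs: whenever
    architecture j truly outperforms architecture i, the predicted
    score for j should exceed the score for i by at least `margin`."""
    score_diff = scores.unsqueeze(0) - scores.unsqueeze(1)      # [i, j] = s_j - s_i
    acc_diff = accuracies.unsqueeze(0) - accuracies.unsqueeze(1)
    sign = torch.sign(acc_diff)                                 # +1 where j is truly better
    losses = torch.relu(margin - sign * score_diff)
    return losses[sign != 0].mean()                             # ignore tied pairs
```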

Searching for A Robust Neural Architecture in Four GPU Hours

D-X-Y/NAS-Projects CVPR 2019

To avoid traversing all possible sub-graphs, we develop a differentiable sampler over the DAG.
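
Such a differentiable sampler can be realized with the Gumbel-softmax trick. A minimal sketch, with an illustrative function name and interface:

```python
import torch
import torch.nn.functional as F

def sampled_mixed_op(alpha, ops, x, tau=1.0):
    """GDAS-style sampling sketch: gumbel_softmax with hard=True yields
    a one-hot choice, so a single sub-graph is active per forward pass,
    while the straight-through estimator still passes gradients back to
    the architecture parameters alpha."""
    weights = F.gumbel_softmax(alpha, tau=tau, hard=True)
    # Naive sum for clarity: weights is one-hot, so only one op
    # contributes; an efficient implementation would execute only
    # the selected op.
    return sum(w * op(x) for w, op in zip(weights, ops))
```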

HourNAS: Extremely Fast Neural Architecture Search Through an Hourglass Lens

mindspore-ai/models CVPR 2021

To achieve extremely fast NAS while preserving high accuracy, we propose to identify the vital blocks and make them the priority in the architecture search.