Neural Architecture Search
774 papers with code • 26 benchmarks • 27 datasets
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS takes the process by which a human engineer manually tweaks a network and learns what works well, and automates it to discover more complex architectures.
Image credit: Neural Architecture Search with Reinforcement Learning
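To make the loop concrete, here is a minimal sketch of NAS as black-box search, assuming a toy, hypothetical search space and a stubbed evaluate(); real systems replace the random sampler with reinforcement learning, evolution, or gradient-based search, and evaluation means actually training each candidate.

```python
# A minimal sketch of the NAS loop. The search space and evaluate()
# below are hypothetical placeholders to keep the sketch runnable.
import random

SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "width": [32, 64, 128],
    "activation": ["relu", "gelu"],
}

def sample_architecture():
    # Pick one option per decision; this defines a candidate network.
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    # Placeholder: a real NAS system would train the candidate
    # (at least briefly) and return its validation accuracy.
    return random.random()

best_arch, best_score = None, -1.0
for _ in range(20):  # search budget: 20 candidate evaluations
    arch = sample_architecture()
    score = evaluate(arch)
    if score > best_score:
        best_arch, best_score = arch, score

print(best_arch, best_score)
```

Most NAS research amounts to making these two pieces cheaper: a smarter sampler and a faster evaluator, which is the common thread in the papers listed below.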
Most implemented papers
Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours
Can we automatically design a Convolutional Network (ConvNet) with the highest image classification accuracy under the runtime constraint of a mobile device?
ISyNet: Convolutional Neural Networks design for AI accelerator
To address this problem we propose the matrix efficiency measure (MEM), a measure of the hardware efficiency of a neural architecture search space; a search space comprising hardware-efficient operations; a latency-aware scaling method; and ISyNet, a set of architectures designed to be both fast on specialized neural processing unit (NPU) hardware and accurate.
NAS-FPN: Learning Scalable Feature Pyramid Architecture for Object Detection
Here we aim to learn a better feature pyramid network architecture for object detection.
PC-DARTS: Partial Channel Connections for Memory-Efficient Architecture Search
Differentiable architecture search (DARTS) provided a fast solution for finding effective network architectures, but suffered from large memory and computing overheads in jointly training a super-network and searching for an optimal architecture.
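For readers unfamiliar with the DARTS family, the sketch below shows the core mixed-operation idea in PyTorch: each edge outputs a softmax-weighted sum over candidate operations, and the architecture weights are learned by gradient descent alongside the network weights. The operation set and shapes are illustrative; PC-DARTS's contribution is to route only a subset of channels through this mixture to reduce memory.

```python
# DARTS-style mixed operation: a softmax over learnable architecture
# parameters (alphas) weights the candidate ops. Ops here are
# illustrative stand-ins for a real NAS operation set.
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),
        ])
        # One architecture parameter per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

x = torch.randn(1, 16, 8, 8)
print(MixedOp(16)(x).shape)  # torch.Size([1, 16, 8, 8])
```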
AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data
We introduce AutoGluon-Tabular, an open-source AutoML framework that requires only a single line of Python to train highly accurate machine learning models on an unprocessed tabular dataset such as a CSV file.
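The advertised usage looks roughly like the sketch below; the file names and the "class" label column are placeholders for your own data, and the calls follow the public autogluon.tabular API.

```python
# Minimal AutoGluon-Tabular usage; "train.csv"/"test.csv" and the
# "class" label column are placeholders. Requires: pip install autogluon
from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset("train.csv")  # any tabular file or DataFrame
predictor = TabularPredictor(label="class").fit(train_data)  # the "one line"

# Score held-out data and compare the models AutoGluon ensembled.
test_data = TabularDataset("test.csv")
predictions = predictor.predict(test_data)
print(predictor.leaderboard(test_data))
```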
A Downsampled Variant of ImageNet as an Alternative to the CIFAR datasets
The original ImageNet dataset is a popular large-scale benchmark for training Deep Neural Networks.
ReNAS: Relativistic Evaluation of Neural Architecture Search
An effective and efficient architecture performance evaluation scheme is essential for the success of Neural Architecture Search (NAS).
Single Path One-Shot Neural Architecture Search with Uniform Sampling
The resulting single-path one-shot supernet is easy to train and fast to search.
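Below is a minimal sketch of the single-path idea, with hypothetical two-choice layers: the supernet holds all candidate operations, but each step samples one operation per layer uniformly at random, so only a single path is executed (and, during training, updated).

```python
# Sketch of uniform single-path sampling from a supernet (SPOS-style).
# The two-choice layers are hypothetical; only the sampled path runs.
import random
import torch
import torch.nn as nn

class SuperNetLayer(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.choices = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
        ])

    def forward(self, x, choice):
        return self.choices[choice](x)

layers = nn.ModuleList([SuperNetLayer(16) for _ in range(4)])
x = torch.randn(2, 16, 8, 8)
# Uniformly sample one architecture (a single path) for this step.
path = [random.randrange(len(layer.choices)) for layer in layers]
for layer, choice in zip(layers, path):
    x = layer(x, choice)
print(path, x.shape)
```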
Searching for A Robust Neural Architecture in Four GPU Hours
To avoid traversing all the possibilities of the sub-graphs, we develop a differentiable sampler over the DAG.
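One way to implement such a differentiable sampler is the Gumbel-softmax trick, which GDAS uses: a discrete operation is drawn in the forward pass while gradients still flow to the architecture logits through the soft relaxation. The logits and toy costs below are illustrative.

```python
# Differentiable sampling over candidate ops via Gumbel-softmax.
# Three candidate ops with hypothetical logits and toy costs.
import torch
import torch.nn.functional as F

logits = torch.zeros(3, requires_grad=True)  # architecture parameters
# hard=True yields a one-hot sample (a single op is active), while the
# backward pass uses the soft relaxation, so logits get gradients.
sample = F.gumbel_softmax(logits, tau=1.0, hard=True)
loss = (sample * torch.tensor([0.3, 0.1, 0.5])).sum()  # toy op costs
loss.backward()
print(sample, logits.grad)
```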
HourNAS: Extremely Fast Neural Architecture Search Through an Hourglass Lens
To achieve extremely fast NAS while preserving high accuracy, we propose to identify the vital blocks and make them the priority in the architecture search.