AmoebaNet

AmoebaNet is a convolutional neural network discovered through regularized evolution architecture search. The search is carried out in the NASNet search space, which specifies a space of image classifiers with a fixed outer structure: a feed-forward stack of Inception-like modules called cells. Evolution therefore searches only over the internal structure of these cells.
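Regularized (aging) evolution maintains a population of trained architectures, repeatedly picks the best model from a random sample as the parent, mutates it to produce a child, and always discards the oldest member of the population rather than the worst. The following is a minimal Python sketch of that loop; the helpers random_architecture(), mutate(), and evaluate() are hypothetical placeholders the reader would supply (e.g. sampling and mutating NASNet-style cells, then training them to obtain a validation accuracy).

```python
import random
import collections

# Minimal sketch of regularized (aging) evolution.
# random_architecture(), mutate(arch), and evaluate(arch) are assumed helpers.
Model = collections.namedtuple("Model", ["arch", "accuracy"])

def regularized_evolution(cycles, population_size, sample_size,
                          random_architecture, mutate, evaluate):
    population = collections.deque()  # oldest model sits at the left end
    history = []                      # every model ever evaluated

    # Seed the population with random architectures.
    while len(population) < population_size:
        arch = random_architecture()
        model = Model(arch=arch, accuracy=evaluate(arch))
        population.append(model)
        history.append(model)

    # Main loop: tournament selection + mutation, with aging.
    while len(history) < cycles:
        sample = random.sample(list(population), sample_size)
        parent = max(sample, key=lambda m: m.accuracy)   # best of the sample
        child_arch = mutate(parent.arch)
        child = Model(arch=child_arch, accuracy=evaluate(child_arch))
        population.append(child)
        history.append(child)
        population.popleft()  # aging: remove the oldest model, not the worst

    return max(history, key=lambda m: m.accuracy)
```

The aging step is the "regularization": because even strong models are eventually retired, architectures persist only by repeatedly producing accurate offspring, which favors designs that retrain well rather than ones that were lucky once.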

Source: Regularized Evolution for Image Classifier Architecture Search

Latest Papers

RelativeNAS: Relative Neural Architecture Search via Slow-Fast Learning
Hao Tan, Ran Cheng, Shihua Huang, Cheng He, Changxiao Qiu, Fan Yang, Ping Luo
2020-09-14

GDP: Generalized Device Placement for Dataflow Graphs
Yanqi Zhou, Sudip Roy, Amirali Abdolrashidi, Daniel Wong, Peter C. Ma, Qiumin Xu, Ming Zhong, Hanxiao Liu, Anna Goldie, Azalia Mirhoseini, James Laudon
2019-09-28

Learning Data Augmentation Strategies for Object Detection
Barret Zoph, Ekin D. Cubuk, Golnaz Ghiasi, Tsung-Yi Lin, Jonathon Shlens, Quoc V. Le
2019-06-26

NAS-FPN: Learning Scalable Feature Pyramid Architecture for Object Detection
Golnaz Ghiasi, Tsung-Yi Lin, Ruoming Pang, Quoc V. Le
2019-04-16

GPipe: Efficient Training of Giant Neural Networks using Pipeline Parallelism
Yanping Huang, Youlong Cheng, Ankur Bapna, Orhan Firat, Mia Xu Chen, Dehao Chen, HyoukJoong Lee, Jiquan Ngiam, Quoc V. Le, Yonghui Wu, Zhifeng Chen
2018-11-16

Regularized Evolution for Image Classifier Architecture Search
Esteban Real, Alok Aggarwal, Yanping Huang, Quoc V. Le
2018-02-05
