1 code implementation • NeurIPS 2020 • Linnan Wang, Rodrigo Fonseca, Yuandong Tian
If the nonlinear partition function and the local model fit the ground-truth black-box function well, then good partitions and candidates can be reached with far fewer samples.
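To make the partitioning idea concrete, here is a minimal sketch, not the paper's implementation: evaluated points are labeled good or bad by the median objective value, an RBF-SVM is fit as the nonlinear partition function, and candidates are then drawn only from the region it classifies as good. The helper names and the rejection-sampling scheme are illustrative assumptions.

```python
# Illustrative sketch of learning a nonlinear partition over evaluated samples
# (assumed helpers; not the paper's code).
import numpy as np
from sklearn.svm import SVC

def learn_partition(X, y):
    """X: (n, d) evaluated points; y: (n,) objective values (lower is better)."""
    labels = (y <= np.median(y)).astype(int)   # 1 = good half, 0 = bad half
    return SVC(kernel="rbf").fit(X, labels)    # nonlinear partition function

def sample_good_candidates(clf, dim, n=1000, seed=0):
    """Rejection-sample candidates that the learned partition labels as good."""
    rng = np.random.default_rng(seed)
    cand = rng.uniform(-1.0, 1.0, size=(n, dim))
    return cand[clf.predict(cand) == 1]
```

The better the classifier's boundary tracks the true good region, the less of the sampling budget is wasted on bad candidates, which is the sample-efficiency argument above.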
2 code implementations • 11 Jun 2020 • Yiyang Zhao, Linnan Wang, Yuandong Tian, Rodrigo Fonseca, Tian Guo
One-shot NAS trains a single supernetwork (supernet) to approximate the performance of every architecture in the search space via weight-sharing.
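As a hedged illustration of weight-sharing, the PyTorch sketch below builds a toy supernet: each layer holds a few candidate operations, and any sub-architecture is evaluated by routing through one op per layer while reusing the shared weights. The layer structure and candidate ops are assumptions for illustration, not the paper's search space.

```python
# Toy weight-sharing supernet (illustrative assumption, not the paper's model).
import torch
import torch.nn as nn

class SuperLayer(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # candidate op 0
            nn.Conv2d(channels, channels, 5, padding=2),  # candidate op 1
            nn.Identity(),                                # candidate op 2 (skip)
        ])

    def forward(self, x, choice):
        # Weight-sharing: the same op weights serve every architecture
        # that selects this op at this layer.
        return self.ops[choice](x)

class Supernet(nn.Module):
    def __init__(self, channels=16, depth=4):
        super().__init__()
        self.layers = nn.ModuleList([SuperLayer(channels) for _ in range(depth)])

    def forward(self, x, arch):
        # 'arch' is one op index per layer, e.g. [0, 2, 1, 0].
        for layer, choice in zip(self.layers, arch):
            x = layer(x, choice)
        return x
```

Any sub-architecture, e.g. `Supernet()(x, [0, 2, 1, 0])`, can then be scored without training a new network from scratch, which is what makes one-shot evaluation cheap.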
1 code implementation • 6 Mar 2020 • Mohammad Shahrad, Rodrigo Fonseca, Íñigo Goiri, Gohar Chaudhry, Paul Batum, Jason Cooke, Eduardo Laureano, Colby Tresness, Mark Russinovich, Ricardo Bianchini
Function as a Service (FaaS) has been gaining popularity as a way to deploy computations to serverless backends in the cloud.
Distributed, Parallel, and Cluster Computing
no code implementations • 25 Sep 2019 • Linnan Wang, Saining Xie, Teng Li, Rodrigo Fonseca, Yuandong Tian
As a result, using a manually designed action space to perform NAS often leads to sample-inefficient exploration of architectures and can thus be sub-optimal.
1 code implementation • 17 Jun 2019 • Linnan Wang, Saining Xie, Teng Li, Rodrigo Fonseca, Yuandong Tian
To improve the sample efficiency, this paper proposes Latent Action Neural Architecture Search (LaNAS), which learns actions to recursively partition the search space into good or bad regions that contain networks with similar performance metrics.
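The sketch below illustrates the recursive good/bad partitioning in simplified form; the paper couples learned partitions with Monte Carlo tree search, which is omitted here, and the linear scoring model, split rule, and tree representation are illustrative assumptions.

```python
# Simplified sketch of recursively partitioning (architecture, accuracy)
# samples into good/bad regions (assumed details; not the paper's code).
import numpy as np
from sklearn.linear_model import Ridge

def split_node(X, y, min_size=8, depth=0, max_depth=3):
    """X: (n, d) architecture encodings; y: (n,) performance metrics."""
    if depth == max_depth or len(y) < 2 * min_size:
        return {"leaf": True, "mean": float(np.mean(y))}
    model = Ridge().fit(X, y)            # learned scoring model at this node
    scores = model.predict(X)
    good = scores >= scores.mean()       # split: predicted-good vs predicted-bad
    if good.all() or (~good).all():      # degenerate split: stop here
        return {"leaf": True, "mean": float(np.mean(y))}
    return {
        "leaf": False,
        "model": model,
        "good": split_node(X[good], y[good], min_size, depth + 1, max_depth),
        "bad":  split_node(X[~good], y[~good], min_size, depth + 1, max_depth),
    }
```

Each node's model acts as a learned action: deeper "good" branches isolate ever smaller regions of networks with similar, high performance, so the search can focus its samples there.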
1 code implementation • 26 Mar 2019 • Linnan Wang, Yiyang Zhao, Yuu Jinnai, Yuandong Tian, Rodrigo Fonseca
Neural Architecture Search (NAS) has shown great success in automating the design of neural networks, but the prohibitive computational cost of current NAS methods calls for further work on improving sample efficiency and reducing network evaluation cost to obtain better results in less time.
Ranked #7 on Neural Architecture Search on CIFAR-10 Image Classification (Params metric)
1 code implementation • 1 Jan 2019 • Linnan Wang, Saining Xie, Teng Li, Rodrigo Fonseca, Yuandong Tian
To improve the sample efficiency, this paper proposes Latent Action Neural Architecture Search (LaNAS), which learns actions to recursively partition the search space into good or bad regions that contain networks with similar performance metrics.
Ranked #15 on Image Classification on CIFAR-10
2 code implementations • 18 May 2018 • Linnan Wang, Yiyang Zhao, Yuu Jinnai, Yuandong Tian, Rodrigo Fonseca
Neural Architecture Search (NAS) has shown great success in automating the design of neural networks, but the prohibitive computational cost of current NAS methods calls for further work on improving sample efficiency and reducing network evaluation cost to obtain better results in less time.
no code implementations • 30 Jan 2018 • Alex Galakatos, Michael Markovitch, Carsten Binnig, Rodrigo Fonseca, Tim Kraska
At the core of our index is a tunable error parameter that allows a DBA to balance lookup performance and space consumption.
Databases
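As a rough illustration of how a tunable error parameter trades lookup speed for space, the sketch below fits greedy piecewise-linear segments over sorted keys so that every predicted position is within `error` slots of the true one, then answers lookups with a binary search over that small window. The segmentation strategy and function names are assumptions, not the paper's data structure.

```python
# Illustrative error-bounded learned index sketch (assumed design; not the
# paper's implementation). Keys must be sorted and distinct.
import bisect

def build_segments(keys, error):
    """Greedy piecewise-linear fit of key -> position with bounded error.

    Every key covered by a segment is predicted within +/- `error` slots of
    its true position; a larger `error` yields fewer segments (less space).
    """
    segments, start = [], 0
    while start < len(keys):
        slope, end = 0.0, start + 1
        while end < len(keys):
            s = (end - start) / (keys[end] - keys[start])
            # Extend the segment only while all covered keys stay in bounds.
            if all(abs(start + s * (keys[i] - keys[start]) - i) <= error
                   for i in range(start + 1, end + 1)):
                slope, end = s, end + 1
            else:
                break
        segments.append((keys[start], start, slope))
        start = end
    return segments

def lookup(keys, segments, error, key):
    """Predict a position with the segment's line, then search the window."""
    seg_starts = [s[0] for s in segments]
    k0, pos0, slope = segments[max(0, bisect.bisect_right(seg_starts, key) - 1)]
    pred = int(pos0 + slope * (key - k0))
    lo, hi = max(0, pred - error), min(len(keys), pred + error + 1)
    i = bisect.bisect_left(keys, key, lo, hi)
    return i if i < len(keys) and keys[i] == key else -1
```

Shrinking `error` narrows the binary-search window (faster lookups) but forces more segments (more space), which is exactly the balance the tunable parameter exposes to the DBA.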