Search Results for author: Max Moroz

Found 2 papers, 1 paper with code

Neighbourhood Distillation: On the benefits of non end-to-end distillation

no code implementations • 2 Oct 2020 • Laëtitia Shao, Max Moroz, Elad Eban, Yair Movshovitz-Attias

Instead of distilling a model end-to-end, we propose to split it into smaller sub-networks - also called neighbourhoods - that are then trained independently.
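
For intuition, here is a minimal sketch (in PyTorch, not the authors' implementation) of the idea: a teacher is split into blocks ("neighbourhoods"), and each smaller student block is fitted independently to reproduce its teacher block's output on the teacher's own intermediate activations. All module names, sizes, and training settings below are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical teacher: a stack of blocks whose intermediate outputs we can tap.
teacher_blocks = nn.ModuleList([
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 10)),
])

# Smaller student blocks (bottlenecked), one per teacher neighbourhood.
student_blocks = nn.ModuleList([
    nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 16), nn.ReLU(), nn.Linear(16, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 16), nn.ReLU(), nn.Linear(16, 10)),
])

mse = nn.MSELoss()

def train_neighbourhood(i, inputs, steps=100, lr=1e-3):
    """Train student block i alone to mimic teacher block i;
    no gradient flows between neighbourhoods (not end-to-end)."""
    opt = torch.optim.Adam(student_blocks[i].parameters(), lr=lr)
    for _ in range(steps):
        with torch.no_grad():
            x = inputs
            for j in range(i):              # teacher activations feeding block i
                x = teacher_blocks[j](x)
            target = teacher_blocks[i](x)   # teacher block's output to match
        opt.zero_grad()
        loss = mse(student_blocks[i](x), target)
        loss.backward()
        opt.step()

# Each neighbourhood can be trained independently, even in parallel.
data = torch.randn(256, 32)
for i in range(len(teacher_blocks)):
    train_neighbourhood(i, data)
```

Because every student block only needs the teacher's cached activations at its boundary, the neighbourhoods can be trained in isolation and later stitched back together into a full student network.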

Knowledge Distillation · Neural Architecture Search

Fine-Grained Stochastic Architecture Search

1 code implementation • 17 Jun 2020 • Shraman Ray Chaudhuri, Elad Eban, Hanhan Li, Max Moroz, Yair Movshovitz-Attias

Mobile neural architecture search (NAS) methods automate the design of small models, but state-of-the-art NAS methods are expensive to run.

Neural Architecture Search · object-detection +1
