Search Results for author: Ethan Sterling

Found 2 papers, 1 paper with code

Distilling Interpretable Models into Human-Readable Code

1 code implementation • 21 Jan 2021 • Walker Ravina, Ethan Sterling, Olexiy Oryeshko, Nathan Bell, Honglei Zhuang, Xuanhui Wang, Yonghui Wu, Alexander Grushetsky

The goal of model distillation is to faithfully transfer a teacher model's knowledge to a model that is faster, more generalizable, more interpretable, or possesses other desirable characteristics.
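The abstract describes the general distillation setup rather than the paper's specific method. As a generic illustration only, a minimal sketch of the standard distillation objective (temperature-scaled softmax plus cross-entropy against the softened teacher distribution, in the style of Hinton et al.); the logits and temperature below are hypothetical, not from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing more of the teacher's "dark knowledge" about non-top classes.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the softened teacher and student distributions;
    # minimized when the student reproduces the teacher's softened outputs.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# Hypothetical logits for a 3-class problem.
teacher = [2.0, 0.5, -1.0]
student = [1.0, 1.0, 0.0]
print(distill_loss(teacher, student))
```

A student that already matches the teacher's logits attains the minimum of this loss (the teacher distribution's own entropy), so training drives the student toward the teacher's behavior.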
