Search Results for author: Nazanin Sepahvand

Found 1 paper, 0 papers with code

BD-KD: Balancing the Divergences for Online Knowledge Distillation

no code implementations • 25 Dec 2022 • Ibtihel Amara, Nazanin Sepahvand, Brett H. Meyer, Warren J. Gross, James J. Clark

We show that adaptively balancing between the reverse and forward divergences shifts the focus of the training strategy to the compact student network without limiting the teacher network's learning process.
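The idea of balancing forward and reverse KL divergences can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's implementation: the function names are illustrative, and `beta` is shown as a fixed weight, whereas BD-KD balances the two divergences adaptively during training.

```python
import math

def softmax(logits, temp=1.0):
    # Temperature-scaled softmax over a list of logits.
    exps = [math.exp(l / temp) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl(p, q):
    # KL(p || q) for two discrete distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def balanced_distillation_loss(student_logits, teacher_logits,
                               beta=0.5, temp=2.0):
    # Illustrative sketch: a convex combination of the forward divergence
    # KL(teacher || student) and the reverse divergence KL(student || teacher).
    # In BD-KD the balancing weight is adapted during training; here it is
    # a fixed hyperparameter for clarity.
    p_t = softmax(teacher_logits, temp)
    p_s = softmax(student_logits, temp)
    forward = kl(p_t, p_s)   # mode-covering term, emphasizes teacher mass
    reverse = kl(p_s, p_t)   # mode-seeking term, emphasizes student mass
    return beta * forward + (1.0 - beta) * reverse
```

When the student matches the teacher exactly, both divergences vanish and the loss is zero; any mismatch makes it strictly positive.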

Knowledge Distillation · Model Compression · +1
