Search Results for author: Eda Yilmaz

Found 1 paper, 0 papers with code

Adversarial Sparse Teacher: Defense Against Distillation-Based Model Stealing Attacks Using Adversarial Examples

no code implementations • 8 Mar 2024 • Eda Yilmaz, Hacer Yalim Keles

Knowledge Distillation (KD) transfers the discriminative capabilities of an advanced teacher model to a simpler student model, enhancing the student's performance without compromising accuracy.

Knowledge Distillation
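For context, standard knowledge distillation trains the student on the teacher's temperature-softened output distribution alongside the hard labels. Below is a minimal sketch in PyTorch of that baseline loss; the model shapes, temperature, and alpha are illustrative assumptions, and this is the generic KD objective (Hinton et al., 2015), not the paper's Adversarial Sparse Teacher defense.

```python
# Minimal knowledge-distillation sketch. Hypothetical shapes and
# hyperparameters; illustrates the generic KD loss, not the
# Adversarial Sparse Teacher method from the paper above.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a softened-teacher KL term with the hard-label CE term."""
    # Soften both distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence, scaled by T^2 to keep gradient magnitudes comparable.
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Toy usage with random logits for a 10-class problem.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

A distillation-based model-stealing attack uses this same recipe with a victim model as the teacher, which is why the paper's defense targets the teacher's output distribution.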
