Search Results for author: Fotis Iliopoulos

Found 5 papers, 0 papers with code

Linear Projections of Teacher Embeddings for Few-Class Distillation

no code implementations · 30 Sep 2024 · Noel Loo, Fotis Iliopoulos, Wei Hu, Erik Vee

Knowledge Distillation (KD) has emerged as a promising approach for transferring knowledge from a larger, more complex teacher model to a smaller student model.

Binary Classification · Knowledge Distillation · +1
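
For context on the setup these papers build on, below is a minimal sketch of the standard knowledge-distillation objective (temperature-scaled KL divergence plus cross-entropy). It is a generic illustration, not the linear-projection method of the paper above; the temperature `T` and mixing weight `alpha` are illustrative choices, not values from the paper.

```python
# Generic knowledge-distillation loss: soften teacher and student logits with a
# temperature, match them with KL divergence, and mix in ordinary cross-entropy.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```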

SLaM: Student-Label Mixing for Distillation with Unlabeled Examples

no code implementations · NeurIPS 2023 · Vasilis Kontonis, Fotis Iliopoulos, Khoa Trinh, Cenk Baykal, Gaurav Menghani, Erik Vee

Knowledge distillation with unlabeled examples is a powerful training paradigm for generating compact and lightweight student models in applications where the amount of labeled data is limited but one has access to a large pool of unlabeled data.

Knowledge Distillation
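
In the unlabeled-examples setting studied here, the teacher typically produces soft pseudo-labels for the unlabeled pool, and the student is trained against them. The sketch below shows only that generic pipeline; it does not implement SLaM's student-label mixing rule, and the loader, model calls, and temperature are placeholder assumptions.

```python
# Generic teacher-pseudo-labeling loop for distillation with unlabeled data.
import torch
import torch.nn.functional as F

def train_student_on_unlabeled(student, teacher, unlabeled_loader, optimizer, T=2.0):
    teacher.eval()
    for x in unlabeled_loader:  # x: a batch of unlabeled inputs
        with torch.no_grad():
            # Teacher's softened predictions act as soft pseudo-labels.
            soft_targets = F.softmax(teacher(x) / T, dim=-1)
        student_log_probs = F.log_softmax(student(x) / T, dim=-1)
        loss = F.kl_div(student_log_probs, soft_targets, reduction="batchmean") * (T * T)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```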

Weighted Distillation with Unlabeled Examples

no code implementations · 13 Oct 2022 · Fotis Iliopoulos, Vasilis Kontonis, Cenk Baykal, Gaurav Menghani, Khoa Trinh, Erik Vee

Our method is hyperparameter-free, data-agnostic, and simple to implement.

Robust Active Distillation

no code implementations · 3 Oct 2022 · Cenk Baykal, Khoa Trinh, Fotis Iliopoulos, Gaurav Menghani, Erik Vee

Distilling knowledge from a large teacher model to a lightweight one is a widely successful approach for generating compact, powerful models in the semi-supervised learning setting where a limited amount of labeled data is available.

Active Learning · Informativeness · +1
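
The active-learning angle here is about choosing which unlabeled points are worth sending to the teacher. A common, generic selection rule is to query the teacher where the student is most uncertain (highest predictive entropy); the sketch below illustrates that idea only, not the paper's robustness mechanism for handling noisy teacher labels, and the function and variable names are hypothetical.

```python
# Informativeness-based selection: pick the unlabeled examples on which the
# student is least confident, to be soft-labeled by the teacher.
import torch
import torch.nn.functional as F

def select_for_teacher(student, unlabeled_x, budget):
    with torch.no_grad():
        probs = F.softmax(student(unlabeled_x), dim=-1)
        # Predictive entropy as an informativeness score per example.
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    top = torch.topk(entropy, k=budget).indices
    return unlabeled_x[top]
```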
