Search Results for author: Xuguang Chen

Found 2 papers, 2 papers with code

Exploring the Impact of Dataset Bias on Dataset Distillation

1 code implementation • 24 Mar 2024 • Yao Lu, Jianyang Gu, Xuguang Chen, Saeed Vahidian, Qi Xuan

Given that there are no suitable biased datasets for DD, we first construct two biased datasets, CMNIST-DD and CCIFAR10-DD, to establish a foundation for subsequent analysis.

Can pre-trained models assist in dataset distillation?

1 code implementation • 5 Oct 2023 • Yao Lu, Xuguang Chen, Yuchen Zhang, Jianyang Gu, Tianle Zhang, Yifan Zhang, Xiaoniu Yang, Qi Xuan, Kai Wang, Yang You

Dataset Distillation (DD) is a prominent technique that encapsulates knowledge from a large-scale original dataset into a small synthetic dataset for efficient training.
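To make the idea concrete, below is a minimal, self-contained sketch of one common DD approach, gradient matching: synthetic points are optimized so that a model's loss gradient on the small synthetic set resembles its gradient on the full real set. This is a generic toy illustration (two Gaussian blobs, a logistic-regression model, numerical differentiation, and a fixed pool of random initializations), not the specific method of either paper above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: two Gaussian blobs, one per binary class.
X_real = np.vstack([rng.normal(-1.0, 0.3, (50, 2)),
                    rng.normal(+1.0, 0.3, (50, 2))])
y_real = np.concatenate([np.zeros(50), np.ones(50)])

# Distilled synthetic set: a single learnable point per class.
X_syn = rng.normal(0.0, 0.1, (2, 2))
y_syn = np.array([0.0, 1.0])

# Fixed pool of random model initializations (a simplification; real DD
# methods typically resample and partially train the model between steps).
W = rng.normal(0.0, 1.0, (16, 2))

def logistic_grad(w, X, y):
    """Gradient of the logistic loss w.r.t. weights w on data (X, y)."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def match_loss(X):
    """Mean squared distance between synthetic and real gradients."""
    return np.mean([np.sum((logistic_grad(w, X, y_syn)
                            - logistic_grad(w, X_real, y_real)) ** 2)
                    for w in W])

loss_before = match_loss(X_syn)

# Optimize the synthetic points by gradient descent on the matching loss,
# using a simple finite-difference gradient for brevity.
lr, eps = 0.2, 1e-4
for _ in range(100):
    g = np.zeros_like(X_syn)
    base = match_loss(X_syn)
    for i in range(X_syn.shape[0]):
        for j in range(X_syn.shape[1]):
            Xp = X_syn.copy()
            Xp[i, j] += eps
            g[i, j] = (match_loss(Xp) - base) / eps
    X_syn -= lr * g

loss_after = match_loss(X_syn)
```

After optimization the two synthetic points carry much of the training signal of the 100 real points, which is the sense in which DD "encapsulates" a large dataset into a small one.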
