Search Results for author: Jialiang Tang

Found 2 papers, 0 papers with code

Direct Distillation between Different Domains

no code implementations • 12 Jan 2024 • Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Joey Tianyi Zhou, Chen Gong, Masashi Sugiyama

Then, we build a fusion-activation mechanism to transfer the valuable domain-invariant knowledge to the student network, while simultaneously encouraging the adapter within the teacher network to learn the domain-specific knowledge of the target data.

Domain Adaptation · Knowledge Distillation
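The abstract above describes distilling domain-invariant knowledge from a teacher into a student while an adapter in the teacher captures domain-specific knowledge of the target data. As an illustration only — the function names, the `alpha` fusion weight, and the linear fusion below are hypothetical stand-ins, not the paper's actual fusion-activation mechanism — a minimal NumPy sketch of temperature-scaled distillation from a fused teacher output might look like:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Standard Hinton-style distillation loss: KL(teacher || student)
    # on temperature-softened distributions, scaled by T^2 and
    # averaged over the batch.
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)))
    return float(T * T * kl / len(p))

def fused_teacher_logits(backbone_logits, adapter_logits, alpha=0.5):
    # Hypothetical fusion of the teacher backbone's domain-invariant
    # logits with a lightweight adapter's domain-specific correction;
    # the real method's fusion-activation mechanism is more involved.
    return (1 - alpha) * backbone_logits + alpha * adapter_logits
```

For example, the student would be trained to minimize `kd_loss(student_logits, fused_teacher_logits(backbone_out, adapter_out))` on target-domain inputs, so that the distilled signal mixes both kinds of knowledge.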
