Search Results for author: Li Gu

Found 7 papers, 5 papers with code

Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts

1 code implementation · 8 Oct 2022 · Tao Zhong, Zhixiang Chi, Li Gu, Yang Wang, Yuanhao Yu, Jin Tang

Most existing methods train a single model on multiple source domains, and the same trained model is then used on all unseen target domains.

Domain Generalization · Knowledge Distillation · +3
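The snippet below is a minimal, hypothetical sketch of the distillation idea this abstract alludes to: a student model matching the aggregated soft predictions of several frozen domain-expert models. The function name, the averaging scheme, and the temperature are illustrative assumptions, not Meta-DMoE's actual method.

```python
import torch
import torch.nn.functional as F

def distill_from_experts(student, experts, x, optimizer, T=2.0):
    """One distillation step: the student mimics the averaged
    soft predictions of frozen domain-expert models on a batch x."""
    with torch.no_grad():
        # Aggregate expert knowledge by averaging their softened outputs.
        teacher_probs = torch.stack(
            [F.softmax(e(x) / T, dim=-1) for e in experts]
        ).mean(dim=0)
    student_log_probs = F.log_softmax(student(x) / T, dim=-1)
    # KL divergence between student and aggregated teacher distributions,
    # rescaled by T^2 as is standard in temperature-based distillation.
    loss = F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * T * T
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```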

MetaFSCIL: A Meta-Learning Approach for Few-Shot Class Incremental Learning

no code implementations · CVPR 2022 · Zhixiang Chi, Li Gu, Huan Liu, Yang Wang, Yuanhao Yu, Jin Tang

The learning objective of these methods is often hand-engineered and not directly tied to the objective during testing (i.e., incrementally learning new classes).

Few-Shot Class-Incremental Learning · Incremental Learning · +1
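As a rough illustration of tying the training objective to test-time incremental learning, here is a hypothetical first-order bi-level sketch: an inner step simulates learning a few-shot batch of new classes, and the meta-update rewards adaptation that also preserves base-class accuracy. The deepcopy-based inner loop and all names are assumptions, not MetaFSCIL itself.

```python
import copy
import torch
import torch.nn.functional as F

def meta_incremental_step(model, base_batch, novel_batch, meta_opt,
                          inner_lr=0.01):
    """Adapt a copy of the model on a few-shot 'novel class' batch,
    then meta-update so that adaptation keeps base-class accuracy."""
    x_base, y_base = base_batch
    x_novel, y_novel = novel_batch

    adapted = copy.deepcopy(model)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)

    # Inner loop: simulate incrementally learning the new classes.
    inner_loss = F.cross_entropy(adapted(x_novel), y_novel)
    inner_opt.zero_grad()
    inner_loss.backward()
    inner_opt.step()

    # Outer loss: the adapted model should handle old and new classes.
    meta_loss = (F.cross_entropy(adapted(x_base), y_base)
                 + F.cross_entropy(adapted(x_novel), y_novel))
    inner_opt.zero_grad()
    meta_loss.backward()

    # First-order approximation: copy the adapted model's gradients
    # back onto the original parameters before the meta-update.
    meta_opt.zero_grad()
    for p, q in zip(model.parameters(), adapted.parameters()):
        p.grad = q.grad.clone() if q.grad is not None else None
    meta_opt.step()
    return meta_loss.item()
```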

Distilling the Posterior in Bayesian Neural Networks

no code implementations · ICML 2018 · Kuan-Chieh Wang, Paul Vicol, James Lucas, Li Gu, Roger Grosse, Richard Zemel

We propose a framework, Adversarial Posterior Distillation, to distill the SGLD samples using a Generative Adversarial Network (GAN).

Active Learning · Anomaly Detection · +1

Adversarial Distillation of Bayesian Neural Network Posteriors

1 code implementation · 27 Jun 2018 · Kuan-Chieh Wang, Paul Vicol, James Lucas, Li Gu, Roger Grosse, Richard Zemel

We propose a framework, Adversarial Posterior Distillation, to distill the SGLD samples using a Generative Adversarial Network (GAN).

Active Learning · Anomaly Detection · +1
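The two entries above describe the same core idea: train a GAN whose generator mimics a stored set of SGLD posterior samples of network weights. The sketch below is a hypothetical minimal version; the layer sizes, optimizers, and the gan_step helper are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

weight_dim, z_dim = 1024, 64  # assumed flattened-weight and noise sizes

generator = nn.Sequential(
    nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, weight_dim))
discriminator = nn.Sequential(
    nn.Linear(weight_dim, 256), nn.ReLU(), nn.Linear(256, 1))
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def gan_step(sgld_samples):
    """One adversarial step on a batch of flattened SGLD weight samples."""
    b = sgld_samples.size(0)
    fake = generator(torch.randn(b, z_dim))

    # Discriminator: real SGLD samples vs. generated weight vectors.
    d_loss = (bce(discriminator(sgld_samples), torch.ones(b, 1))
              + bce(discriminator(fake.detach()), torch.zeros(b, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator: produce weight vectors the discriminator accepts as real.
    g_loss = bce(discriminator(fake), torch.ones(b, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```

After training, sampling the generator at fresh noise vectors would stand in for drawing new posterior weight samples, which is what makes the distilled posterior cheap to query at test time.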
