Search Results for author: Li Gu

Found 10 papers, 7 papers with code

Distribution Alignment for Fully Test-Time Adaptation with Dynamic Online Data Streams

1 code implementation • 16 Jul 2024 • Ziqiang Wang, Zhixiang Chi, Yanan Wu, Li Gu, Zhi Liu, Konstantinos Plataniotis, Yang Wang

Given a model trained on source data, Test-Time Adaptation (TTA) adapts the model and performs inference on test data streams that exhibit domain shifts from the source.

Test-time Adaptation
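The snippet above describes adapting to a shifted test stream at inference time. As a minimal sketch of one common ingredient of such methods, not the paper's actual algorithm, the class below (all names hypothetical) keeps running statistics of the incoming stream and re-standardizes each batch toward the source domain's feature statistics:

```python
import numpy as np

class OnlineDistributionAligner:
    """Hypothetical sketch: align streaming test features to source statistics.

    Maintains exponential moving averages of the test stream's mean and
    variance, then re-standardizes each batch toward the source domain's
    statistics. This is only an illustration of distribution alignment,
    not the method proposed in the paper.
    """

    def __init__(self, source_mean, source_var, momentum=0.1, eps=1e-5):
        self.source_mean = np.asarray(source_mean, dtype=float)
        self.source_var = np.asarray(source_var, dtype=float)
        self.momentum = momentum
        self.eps = eps
        self.test_mean = None
        self.test_var = None

    def update(self, batch):
        # batch: (N, D) feature matrix from the incoming test stream
        m, v = batch.mean(axis=0), batch.var(axis=0)
        if self.test_mean is None:
            self.test_mean, self.test_var = m, v
        else:
            self.test_mean = (1 - self.momentum) * self.test_mean + self.momentum * m
            self.test_var = (1 - self.momentum) * self.test_var + self.momentum * v

    def align(self, batch):
        # Standardize with the stream's running statistics, then rescale
        # to the source domain's mean and variance.
        self.update(batch)
        z = (batch - self.test_mean) / np.sqrt(self.test_var + self.eps)
        return z * np.sqrt(self.source_var + self.eps) + self.source_mean
```

A batch drawn from a shifted distribution (e.g. mean 2, std 3) comes out with approximately the source statistics (here mean 0, std 1) after alignment.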

Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts

1 code implementation • 8 Oct 2022 • Tao Zhong, Zhixiang Chi, Li Gu, Yang Wang, Yuanhao Yu, Jin Tang

Most existing methods perform training on multiple source domains using a single model, and the same trained model is used on all unseen target domains.

Domain Generalization • Knowledge Distillation +3
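The title names meta-distillation from a mixture of experts. As a hedged sketch of the generic distillation step only (the function and its aggregation rule are illustrative assumptions, not Meta-DMoE's actual procedure), the snippet below averages the experts' softened predictions into a teacher target and computes the KL divergence a student would minimize:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_distill_loss(student_logits, expert_logits_list, temperature=2.0):
    """Hypothetical sketch of distillation from a mixture of experts:
    average the experts' softened predictions into a single teacher
    distribution, then return the KL divergence KL(teacher || student).
    """
    teacher = np.mean([softmax(l, temperature) for l in expert_logits_list], axis=0)
    student = softmax(student_logits, temperature)
    return float(np.sum(teacher * (np.log(teacher + 1e-12) - np.log(student + 1e-12))))
```

When the student's logits already match the experts' consensus the loss is (numerically) zero, and it grows as the student's distribution drifts from the aggregated teacher.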

MetaFSCIL: A Meta-Learning Approach for Few-Shot Class Incremental Learning

no code implementations • CVPR 2022 • Zhixiang Chi, Li Gu, Huan Liu, Yang Wang, Yuanhao Yu, Jin Tang

The learning objective of these methods is often hand-engineered and is not directly tied to the objective during testing (i.e., incrementally learning new classes).

Class-Incremental Learning • Few-Shot Class-Incremental Learning +2

Distilling the Posterior in Bayesian Neural Networks

no code implementations • ICML 2018 • Kuan-Chieh Wang, Paul Vicol, James Lucas, Li Gu, Roger Grosse, Richard Zemel

We propose a framework, Adversarial Posterior Distillation, to distill the SGLD samples using a Generative Adversarial Network (GAN).

Active Learning • Anomaly Detection +1

Adversarial Distillation of Bayesian Neural Network Posteriors

1 code implementation • 27 Jun 2018 • Kuan-Chieh Wang, Paul Vicol, James Lucas, Li Gu, Roger Grosse, Richard Zemel

We propose a framework, Adversarial Posterior Distillation, to distill the SGLD samples using a Generative Adversarial Network (GAN).

Active Learning • Anomaly Detection +1
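Both Bayesian entries distill SGLD posterior samples with a GAN. As a minimal sketch of the sampling stage only (the GAN distillation step is omitted, and the model here is a toy Gaussian-mean problem, not the papers' networks), the function below runs Stochastic Gradient Langevin Dynamics and collects the post-burn-in trajectory that a generator could later be trained to reproduce:

```python
import numpy as np

def sgld_sample(data, n_steps=5000, step=1e-3, prior_var=10.0, seed=0):
    """Hypothetical sketch of SGLD for the mean of a unit-variance Gaussian
    with a zero-mean Gaussian prior.

    Each step follows a minibatch estimate of the log-posterior gradient
    plus injected Gaussian noise; the second half of the trajectory is
    kept as approximate posterior samples.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    n = len(data)
    theta, samples = 0.0, []
    for t in range(n_steps):
        batch = rng.choice(data, size=min(32, n), replace=False)
        # Minibatch log-likelihood gradient rescaled to the full dataset,
        # plus the gradient of the Gaussian log-prior.
        grad = (n / len(batch)) * np.sum(batch - theta) - theta / prior_var
        theta += 0.5 * step * grad + rng.normal(0.0, np.sqrt(step))
        if t >= n_steps // 2:  # discard burn-in
            samples.append(theta)
    return np.array(samples)
```

For data drawn around 1.5, the retained samples concentrate near the sample mean, as the analytic Gaussian posterior predicts; in the papers' framework, a GAN is then trained so its generator matches the distribution of such samples.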
