Search Results for author: Tongtong Fang

Found 2 papers, 1 paper with code

Rethinking Importance Weighting for Deep Learning under Distribution Shift

1 code implementation • NeurIPS 2020 • Tongtong Fang, Nan Lu, Gang Niu, Masashi Sugiyama

Under distribution shift (DS), where the training data distribution differs from the test one, a powerful technique is importance weighting (IW), which handles DS in two separate steps: weight estimation (WE), which estimates the test-over-training density ratio, and weighted classification (WC), which trains the classifier from the weighted training data.
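The two-step IW pipeline described above can be sketched in a few lines. The following is a minimal illustration, not the paper's proposed method: it assumes the classic probabilistic-classification approach to density-ratio estimation (a logistic regression trained to distinguish test from training inputs) and scikit-learn's `sample_weight` interface; the function names are hypothetical.

```python
# Minimal sketch of classical two-step importance weighting (IW).
# Assumptions: logistic-regression density-ratio estimation and
# scikit-learn; not the method proposed in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_weights(x_train, x_test):
    """WE step: estimate the test-over-training density ratio
    w(x) = p_test(x) / p_train(x).

    A classifier is trained to distinguish test (label 1) from
    training (label 0) inputs; by Bayes' rule the ratio equals the
    classifier's odds, rescaled by the sample-size ratio."""
    x = np.vstack([x_train, x_test])
    s = np.concatenate([np.zeros(len(x_train)), np.ones(len(x_test))])
    clf = LogisticRegression(max_iter=1000).fit(x, s)
    p_test = clf.predict_proba(x_train)[:, 1]  # P(s=1 | x) on training points
    return (p_test / (1.0 - p_test)) * (len(x_train) / len(x_test))

def weighted_classification(x_train, y_train, weights):
    """WC step: train the classifier on importance-weighted training data."""
    return LogisticRegression(max_iter=1000).fit(
        x_train, y_train, sample_weight=weights)

# Usage: weights estimated from unlabeled test inputs reweight the
# training loss so that it approximates the test risk.
# w = estimate_weights(x_train, x_test)
# model = weighted_classification(x_train, y_train, w)
```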

Rethinking Importance Weighting for Transfer Learning

no code implementations • 19 Dec 2021 • Nan Lu, Tianyi Zhang, Tongtong Fang, Takeshi Teshima, Masashi Sugiyama

A key assumption in supervised learning is that training and test data follow the same probability distribution.

Tasks: Selection bias • Transfer Learning
