Search Results for author: Jing Bo Yang

Found 1 paper, 0 papers with code

MoCo-Pretraining Improves Representations and Transferability of Chest X-ray Models

no code implementations • 1 Jan 2021 • Hari Sowrirajan, Jing Bo Yang, Andrew Y. Ng, Pranav Rajpurkar

Using 0.1% of labeled training data, we find that a linear model trained on MoCo-pretrained representations outperforms one trained on representations without MoCo-pretraining by an AUC of 0.096 (95% CI 0.061, 0.130), indicating that MoCo-pretrained representations are of higher quality.
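The evaluation described in the abstract is a standard linear-probe comparison: fit a linear classifier on frozen pretrained representations and measure test AUC. The sketch below illustrates that protocol on synthetic features; the data, dimensions, and `linear_probe_auc` helper are hypothetical stand-ins, not the paper's actual pipeline (which uses chest X-ray encoders).

```python
# Hypothetical sketch of a linear-probe evaluation: train a linear model on
# frozen (e.g. MoCo-pretrained) representations and report test AUC.
# Features here are synthetic; in the paper they come from a CNN encoder.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def linear_probe_auc(train_X, train_y, test_X, test_y):
    """Fit a linear classifier on frozen features and return test AUC."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(train_X, train_y)
    scores = clf.predict_proba(test_X)[:, 1]  # probability of positive class
    return roc_auc_score(test_y, scores)

rng = np.random.default_rng(0)
# Synthetic stand-in: 400 "images" with 128-dim representations whose mean
# shifts slightly with the binary label, mimicking informative features.
y = rng.integers(0, 2, size=400)
X = rng.normal(size=(400, 128)) + y[:, None] * 0.3

auc = linear_probe_auc(X[:200], y[:200], X[200:], y[200:])
print(f"linear-probe AUC: {auc:.3f}")
```

Comparing this AUC between two feature sets (with vs. without MoCo pretraining) on the same held-out split yields the kind of AUC difference the abstract reports.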

Image Classification • Transfer Learning
