Search Results for author: Yihao Xue

Found 9 papers, 2 papers with code

Investigating the Benefits of Projection Head for Representation Learning

no code implementations • 18 Mar 2024 • Yihao Xue, Eric Gan, Jiayi Ni, Siddharth Joshi, Baharan Mirzasoleiman

An effective technique for obtaining high-quality representations is to add a projection head on top of the encoder during training, then discard it and use the pre-projection representations.

Contrastive Learning • Data Augmentation • +1
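The recipe in the abstract, as a minimal PyTorch sketch. The backbone, feature dimensions, and two-layer MLP head are illustrative assumptions, not the paper's exact setup:

```python
# Projection-head recipe: train with the MLP head attached, then discard it
# and use the pre-projection features downstream. (Illustrative sketch only.)
import torch
import torch.nn as nn

class EncoderWithProjection(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int = 512, proj_dim: int = 128):
        super().__init__()
        self.backbone = backbone                # produces pre-projection features
        self.projector = nn.Sequential(         # head used only during training
            nn.Linear(feat_dim, feat_dim),
            nn.ReLU(inplace=True),
            nn.Linear(feat_dim, proj_dim),
        )

    def forward(self, x):
        h = self.backbone(x)    # pre-projection representation (kept for downstream use)
        z = self.projector(h)   # post-projection embedding (fed to the contrastive loss)
        return h, z

# After training, discard self.projector and use h:
#   h, _ = model(images)       # h feeds the linear probe / downstream task
```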

Understanding the Robustness of Multi-modal Contrastive Learning to Distribution Shift

no code implementations • 8 Oct 2023 • Yihao Xue, Siddharth Joshi, Dang Nguyen, Baharan Mirzasoleiman

Recently, multimodal contrastive learning (MMCL) approaches, such as CLIP, have achieved remarkable success in learning representations that are robust against distribution shift and generalize to new domains.

Contrastive Learning • Zero-Shot Learning
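For context, a hedged sketch of the CLIP-style MMCL objective the abstract refers to: a symmetric cross-entropy over image-text similarity logits. The encoders are omitted and the temperature value is a placeholder, not the paper's configuration:

```python
# CLIP-style multimodal contrastive loss: matched image-text pairs sit on the
# diagonal of the similarity matrix and are pulled together in both directions.
import torch
import torch.nn.functional as F

def clip_style_loss(image_emb, text_emb, temperature=0.07):
    # Normalize so the dot product is cosine similarity.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = image_emb @ text_emb.t() / temperature            # (N, N) similarities
    targets = torch.arange(image_emb.size(0), device=image_emb.device)
    loss_i = F.cross_entropy(logits, targets)                  # image -> text
    loss_t = F.cross_entropy(logits.t(), targets)              # text -> image
    return (loss_i + loss_t) / 2
```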

Towards Mitigating Spurious Correlations in the Wild: A Benchmark and a more Realistic Dataset

1 code implementation • 21 Jun 2023 • Siddharth Joshi, Yu Yang, Yihao Xue, Wenhan Yang, Baharan Mirzasoleiman

Deep neural networks often exploit non-predictive features that are spuriously correlated with class labels, leading to poor performance on groups of examples without such features.
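A common way to quantify this failure mode is worst-group accuracy: compute accuracy separately per (label, spurious-attribute) group and report the minimum. A minimal NumPy sketch; the group encoding is illustrative, not the benchmark's exact protocol:

```python
# Worst-group accuracy: models that lean on spurious features score high on
# average but low on the groups where the spurious cue is absent or misleading.
import numpy as np

def worst_group_accuracy(preds, labels, groups):
    """preds, labels, groups: 1-D integer arrays of equal length."""
    accs = []
    for g in np.unique(groups):
        mask = groups == g
        accs.append((preds[mask] == labels[mask]).mean())
    return min(accs)   # performance on the hardest group
```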

Which Features are Learnt by Contrastive Learning? On the Role of Simplicity Bias in Class Collapse and Feature Suppression

no code implementations • 25 May 2023 • Yihao Xue, Siddharth Joshi, Eric Gan, Pin-Yu Chen, Baharan Mirzasoleiman

However, supervised CL is prone to collapsing representations of subclasses within a class by not capturing all their features, and unsupervised CL may suppress harder class-relevant features by focusing on learning easy class-irrelevant features; both significantly compromise representation quality.

Contrastive Learning • Representation Learning

Few-shot Adaption to Distribution Shifts By Mixing Source and Target Embeddings

no code implementations • 23 May 2023 • Yihao Xue, Ali Payani, Yu Yang, Baharan Mirzasoleiman

Pretrained machine learning models need to be adapted to distribution shifts when deployed in new target environments.
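One plausible reading of the title's "mixing source and target embeddings" is forming convex combinations of the few available target embeddings with abundant source embeddings and training on the mixture. The coefficient and pairing scheme below are assumptions, not the paper's exact method:

```python
# Illustrative embedding mixing for few-shot adaptation: each few-shot target
# embedding is blended with a randomly paired source embedding.
import torch

def mix_embeddings(source_emb, target_emb, lam=0.5):
    # Pair each target embedding with a random source embedding.
    idx = torch.randint(0, source_emb.size(0), (target_emb.size(0),))
    return lam * source_emb[idx] + (1.0 - lam) * target_emb
```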

Investigating Why Contrastive Learning Benefits Robustness Against Label Noise

no code implementations • 29 Jan 2022 • Yihao Xue, Kyle Whitecross, Baharan Mirzasoleiman

Self-supervised Contrastive Learning (CL) has recently been shown to be very effective in preventing deep networks from overfitting noisy labels.

Contrastive Learning

Toward Understanding the Influence of Individual Clients in Federated Learning

no code implementations • 20 Dec 2020 • Yihao Xue, Chaoyue Niu, Zhenzhe Zheng, Shaojie Tang, Chengfei Lv, Fan Wu, Guihai Chen

Federated learning allows mobile clients to jointly train a global model without sending their private data to a central server.

Federated Learning
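For background, an illustrative FedAvg-style aggregation showing why no private data leaves the clients: only locally trained weights are uploaded and averaged, weighted by client dataset size. This is a generic sketch, not the paper's influence-measurement method:

```python
# FedAvg-style server step: average client model weights, weighted by the
# number of local examples; raw client data never reaches the server.
import torch

@torch.no_grad()
def federated_average(client_states, client_sizes):
    """client_states: list of model state_dicts; client_sizes: list of ints."""
    total = float(sum(client_sizes))
    avg = {k: torch.zeros_like(v, dtype=torch.float32)
           for k, v in client_states[0].items()}
    for state, n in zip(client_states, client_sizes):
        for k, v in state.items():
            avg[k] += (n / total) * v.float()
    return avg
```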
