no code implementations • 20 Feb 2025 • Yihao Xue, Kristjan Greenewald, Youssef Mroueh, Baharan Mirzasoleiman
We empirically study these techniques and show that they achieve performance close to that of a supervised (still black-box) oracle, suggesting little room for improvement within this paradigm.
no code implementations • 2 Feb 2025 • Yihao Xue, Jiping Li, Baharan Mirzasoleiman
While recent work has offered theoretical insights into this phenomenon, a clear understanding of the interactions between weak and strong models that drive W2SG remains elusive.
1 code implementation • Control Engineering Practice 2024 • Xiaohan Chen, Rui Yang, Yihao Xue, Baoye Song, Zidong Wang
Recent advances in intelligent rotating machinery fault diagnosis have been enabled by the availability of massive labeled training data.
no code implementations • 18 Mar 2024 • Yihao Xue, Eric Gan, Jiayi Ni, Siddharth Joshi, Baharan Mirzasoleiman
An effective technique for obtaining high-quality representations is to add a projection head on top of the encoder during training, then discard it and use the pre-projection representations.
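A minimal sketch of this train-with-projector, discard-at-inference pattern, assuming a generic PyTorch setup; the encoder architecture and dimensions below are illustrative, not the paper's.

```python
# Illustrative sketch only: the encoder, dimensions, and usage are assumptions,
# not the paper's architecture.
import torch
import torch.nn as nn

class EncoderWithProjection(nn.Module):
    def __init__(self, in_dim=784, rep_dim=256, proj_dim=128):
        super().__init__()
        # Backbone: its outputs are the pre-projection representations we keep.
        self.encoder = nn.Sequential(nn.Linear(in_dim, rep_dim), nn.ReLU())
        # Projection head: used only to compute the training loss, then discarded.
        self.projector = nn.Sequential(
            nn.Linear(rep_dim, proj_dim), nn.ReLU(), nn.Linear(proj_dim, proj_dim)
        )

    def forward(self, x):
        h = self.encoder(x)    # pre-projection representation (kept downstream)
        z = self.projector(h)  # projected representation (fed to the training loss)
        return h, z

model = EncoderWithProjection()
h, z = model(torch.randn(32, 784))
# During training, the loss is computed on z; afterwards the projector is
# dropped and downstream tasks consume h.
```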
no code implementations • 8 Oct 2023 • Yihao Xue, Siddharth Joshi, Dang Nguyen, Baharan Mirzasoleiman
Recently, multimodal contrastive learning (MMCL) approaches, such as CLIP, have achieved remarkable success in learning representations that are robust against distribution shift and generalize to new domains.
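For reference, a minimal sketch of a CLIP-style MMCL objective: a symmetric contrastive loss over paired image and text embeddings. The embedding dimensions and temperature are illustrative assumptions, not the settings studied in the paper.

```python
# Illustrative CLIP-style contrastive loss; dimensions and temperature are
# assumptions, not the paper's configuration.
import torch
import torch.nn.functional as F

def mmcl_loss(image_emb, text_emb, temperature=0.07):
    # Normalize so that dot products are cosine similarities.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = image_emb @ text_emb.t() / temperature  # (N, N) pairwise similarities
    targets = torch.arange(len(image_emb))           # matched pairs lie on the diagonal
    # Symmetric cross-entropy aligns image->text and text->image retrieval.
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

loss = mmcl_loss(torch.randn(8, 64), torch.randn(8, 64))
```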
1 code implementation • 21 Jun 2023 • Siddharth Joshi, Yu Yang, Yihao Xue, Wenhan Yang, Baharan Mirzasoleiman
Through this, we highlight how existing group inference methods struggle in the presence of spurious features that are learned later in training.
no code implementations • 25 May 2023 • Yihao Xue, Siddharth Joshi, Eric Gan, Pin-Yu Chen, Baharan Mirzasoleiman
However, supervised CL is prone to collapsing the representations of subclasses within a class by failing to capture all of their features, while unsupervised CL may suppress harder class-relevant features in favor of learning easy class-irrelevant ones; both failure modes significantly compromise representation quality.
no code implementations • 23 May 2023 • Yihao Xue, Ali Payani, Yu Yang, Baharan Mirzasoleiman
Pretrained machine learning models need to be adapted to distribution shifts when deployed in new target environments.
no code implementations • 17 Aug 2022 • Yihao Xue, Kyle Whitecross, Baharan Mirzasoleiman
However, the effect of label noise on the test loss curve has not been fully explored.
no code implementations • 29 Jan 2022 • Yihao Xue, Kyle Whitecross, Baharan Mirzasoleiman
Self-supervised Contrastive Learning (CL) has recently been shown to be highly effective in preventing deep networks from overfitting noisy labels.
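One intuition for this robustness is that labels never enter the contrastive objective at all. Below is a minimal sketch of a SimCLR-style NT-Xent loss over two augmented views; the batch size, dimension, and temperature are illustrative assumptions.

```python
# Illustrative SimCLR-style NT-Xent loss; note that no labels appear anywhere,
# so label noise cannot corrupt this objective directly.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """z1, z2: embeddings of two augmented views of the same batch, shape (N, D)."""
    z = F.normalize(torch.cat([z1, z2]), dim=-1)  # (2N, D)
    sim = z @ z.t() / temperature                 # (2N, 2N) similarity logits
    sim.fill_diagonal_(float("-inf"))             # a sample is not its own positive
    n = len(z1)
    # The positive for each view is the other view of the same sample.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    return F.cross_entropy(sim, targets)

loss = nt_xent(torch.randn(16, 32), torch.randn(16, 32))
```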
no code implementations • 20 Dec 2020 • Yihao Xue, Chaoyue Niu, Zhenzhe Zheng, Shaojie Tang, Chengfei Lv, Fan Wu, Guihai Chen
Federated learning allows mobile clients to jointly train a global model without sending their private data to a central server.
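A minimal sketch of federated averaging (FedAvg-style), the canonical instance of this protocol: clients train locally and the server aggregates only weights, never raw data. The model, loss, and single-step local update below are simplifying assumptions, not the paper's method.

```python
# Illustrative FedAvg-style round; the model, loss, and one-step local update
# are simplifying assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def local_update(global_model, x, y, lr=0.1):
    """A client trains a private copy of the global model on its own data."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss = F.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return model.state_dict()  # only weights leave the client, never the data

def fed_avg(global_model, client_states):
    """The server replaces the global weights with the clients' average."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in client_states]).mean(dim=0)
    global_model.load_state_dict(avg)

global_model = nn.Linear(4, 1)
clients = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(3)]
fed_avg(global_model, [local_update(global_model, x, y) for x, y in clients])
```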