Search Results for author: Yoav Wald

Found 8 papers, 2 papers with code

Data Augmentations for Improved (Large) Language Model Generalization

no code implementations • NeurIPS 2023 • Amir Feder, Yoav Wald, Claudia Shi, Suchi Saria, David Blei

The reliance of text classifiers on spurious correlations can lead to poor generalization at deployment, raising concerns about their use in safety-critical domains such as healthcare.

Attribute • counterfactual • +3

Don't blame Dataset Shift! Shortcut Learning due to Gradients and Cross Entropy

no code implementations • 24 Aug 2023 • Aahlad Puli, Lily Zhang, Yoav Wald, Rajesh Ranganath

However, even when the stable feature determines the label in the training distribution and the shortcut does not provide any additional information, like in perception tasks, default-ERM still exhibits shortcut learning.

Inductive Bias

Malign Overfitting: Interpolation Can Provably Preclude Invariance

no code implementations • 28 Nov 2022 • Yoav Wald, Gal Yona, Uri Shalit, Yair Carmon

This suggests that the phenomenon of "benign overfitting," in which models generalize well despite interpolating, might not favorably extend to settings in which robustness or fairness are desirable.

Fairness • Out-of-Distribution Generalization

In the Eye of the Beholder: Robust Prediction with Causal User Modeling

no code implementations • 1 Jun 2022 • Amir Feder, Guy Horowitz, Yoav Wald, Roi Reichart, Nir Rosenfeld

Accurately predicting the relevance of items to users is crucial to the success of many social platforms.

Recommendation Systems

On Calibration and Out-of-domain Generalization

no code implementations • NeurIPS 2021 • Yoav Wald, Amir Feder, Daniel Greenfeld, Uri Shalit

In this work, we draw a link between OOD performance and model calibration, arguing that calibration across multiple domains can be viewed as a special case of an invariant representation leading to better OOD generalization.

Domain Generalization
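The abstract above relates out-of-domain generalization to calibration holding across multiple training domains. As a rough illustration of the calibration side only (not the paper's method), the sketch below computes expected calibration error (ECE) separately per domain; the domain names and toy predictions are made up:

```python
import numpy as np

def ece(probs, labels, n_bins=10):
    """Expected calibration error for binary predictions.

    probs: predicted P(y=1); labels: 0/1 ground truth.
    """
    probs, labels = np.asarray(probs, float), np.asarray(labels, int)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
        # First bin is closed on the left so probs == 0.0 are counted.
        mask = (probs > lo) & (probs <= hi) if i else (probs >= lo) & (probs <= hi)
        if mask.any():
            conf = probs[mask].mean()   # mean predicted confidence in bin
            acc = labels[mask].mean()   # empirical frequency of y=1 in bin
            total += mask.mean() * abs(acc - conf)
    return total

# A multi-domain check: compute ECE on each training domain separately.
# (Hypothetical toy data; domain names are illustrative.)
domains = {
    "hospital_A": ([0.9, 0.9, 0.1, 0.1], [1, 1, 0, 0]),
    "hospital_B": ([0.8, 0.8, 0.2, 0.2], [1, 1, 0, 0]),
}
per_domain_ece = {d: ece(p, y) for d, (p, y) in domains.items()}
```

A model that is simultaneously calibrated on every training domain is what the paper connects to invariant representations; this sketch only measures per-domain calibration, not the invariance argument itself.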

Globally Optimal Learning for Structured Elliptical Losses

1 code implementation • NeurIPS 2019 • Yoav Wald, Nofar Noy, Gal Elidan, Ami Wiesel

The core of the difficulty is the non-convexity of the objective function, implying that standard optimization algorithms may converge to sub-optimal critical points.

regression
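As a concrete (hypothetical) instance of the non-convexity mentioned in the abstract: the Student-t negative log-likelihood penalty is a classic elliptical loss, and its non-convexity in the residual is easy to verify numerically. This is only an illustrative loss, not necessarily the structured losses treated in the paper:

```python
import numpy as np

def rho(r, nu=3.0):
    # Student-t penalty on a residual r: grows like log(r^2) for large |r|,
    # so it is robust to outliers but non-convex.
    return np.log1p(np.square(r) / nu)

def objective(w, X, y, nu=3.0):
    # Robust regression objective: sum of elliptical penalties on residuals.
    # Non-convex in w, so standard gradient methods can stall at
    # sub-optimal critical points.
    return rho(y - X @ w, nu).sum()

# Midpoint check: convexity would require rho(5) <= (rho(0) + rho(10)) / 2,
# which this penalty violates.
violates_convexity = rho(5.0) > 0.5 * (rho(0.0) + rho(10.0))
```

The flattening of `rho` away from the origin is exactly what buys robustness to gross outliers, and exactly what destroys convexity of the overall objective.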
