1 code implementation • 10 Mar 2021 • Bahar Taskesen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn
Semi-discrete optimal transport problems, which evaluate the Wasserstein distance between a discrete and a generic (possibly non-discrete) probability measure, are believed to be computationally hard.
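The semi-discrete setting admits a well-known stochastic dual formulation: the Wasserstein distance can be estimated by stochastic gradient ascent on the Kantorovich dual potentials, needing only samples from the continuous measure. The sketch below is a minimal illustration of that classical dual ascent under assumed example data (a 1-D standard normal source and a two-atom target); it is not the algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete target measure: atoms y_j with weights nu_j (hypothetical example data)
atoms = np.array([[-1.0], [1.0]])
weights = np.array([0.5, 0.5])

# Continuous source measure: standard normal in 1-D, accessed only through samples
def sample_source(n):
    return rng.standard_normal((n, 1))

def semidiscrete_dual_sga(atoms, weights, sample_source,
                          n_iters=2000, batch=64, lr=0.1):
    """Stochastic gradient ascent on the semi-discrete Kantorovich dual.

    Maximizes F(phi) = sum_j nu_j * phi_j + E_x[ min_j (c(x, y_j) - phi_j) ]
    with c the squared Euclidean cost, whose optimal value is W_2^2(mu, nu).
    The gradient in phi_j is nu_j minus the mass of the j-th Laguerre cell,
    estimated here from a minibatch of source samples.
    """
    phi = np.zeros(len(weights))
    for t in range(1, n_iters + 1):
        x = sample_source(batch)                              # (batch, d)
        cost = ((x[:, None, :] - atoms[None]) ** 2).sum(-1)   # (batch, m)
        assign = np.argmin(cost - phi, axis=1)                # Laguerre-cell labels
        freq = np.bincount(assign, minlength=len(weights)) / batch
        phi += (lr / np.sqrt(t)) * (weights - freq)           # unbiased dual gradient
    # Monte-Carlo estimate of the dual objective at the final potentials
    x = sample_source(20000)
    cost = ((x[:, None, :] - atoms[None]) ** 2).sum(-1)
    return phi @ weights + (cost - phi).min(axis=1).mean()

w2_sq = semidiscrete_dual_sga(atoms, weights, sample_source)
```

For this symmetric example the closed-form value is 2 - 2*sqrt(2/pi) ≈ 0.40, which the estimate should approach up to Monte-Carlo error.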
1 code implementation • 1 Jun 2021 • Bahar Taskesen, Man-Chung Yue, Jose Blanchet, Daniel Kuhn, Viet Anh Nguyen
Given available data, we investigate novel strategies to synthesize a family of least squares estimators ("experts") that are robust with respect to moment conditions.
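As a rough illustration of a moment-robust least squares expert: many distributionally robust least squares formulations are known to reduce to (square-root) ridge-type regularization, with the ambiguity-set radius playing the role of the regularization weight. The sketch below shows that reduction in its plainest ridge form; `rho` and the reduction used are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def robust_least_squares(X, y, rho=0.1):
    """Ridge-type stand-in for a robust least squares expert.

    Hedged sketch: solves min_beta ||X beta - y||^2 + rho ||beta||^2,
    where rho acts as a proxy for the radius of the ambiguity set in a
    moment-based robust formulation.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + rho * np.eye(d), X.T @ y)
```

With a vanishing radius the estimator recovers the ordinary least squares solution, so the family interpolates between nominal and conservative fits as `rho` grows.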
1 code implementation • 30 May 2022 • Yves Rychener, Bahar Taskesen, Daniel Kuhn
Concretely, the distributions of the predictions within the two groups should be close with respect to the Kolmogorov distance, and fairness is achieved by penalizing the dissimilarity of these two distributions in the objective function of the learning problem.
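The Kolmogorov distance between the two groups' prediction distributions is the sup-norm gap between their empirical CDFs. A minimal sketch of evaluating that penalty term (the paper presumably smooths it for gradient-based training; this computes the raw distance only):

```python
import numpy as np

def kolmogorov_distance(a, b):
    """Sup-norm distance between the empirical CDFs of two samples.

    `a` and `b` are 1-D arrays of model predictions for the two groups;
    the returned value lies in [0, 1] and is 0 iff the empirical CDFs agree.
    """
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()
```

A fairness-regularized objective would then add `lam * kolmogorov_distance(preds[g == 0], preds[g == 1])` to the training loss, with `lam` a hypothetical penalty weight.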
no code implementations • 18 Jul 2020 • Bahar Taskesen, Viet Anh Nguyen, Daniel Kuhn, Jose Blanchet
We propose a distributionally robust logistic regression model with an unfairness penalty that prevents discrimination with respect to sensitive attributes such as gender or ethnicity.
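The shape of such an objective can be sketched as a standard logistic loss plus a penalty on the disparity between groups. The example below uses a simple gap in average predicted log-odds as the unfairness penalty; this proxy and the names (`fair_logistic_loss`, `lam`) are illustrative assumptions, and the paper's Wasserstein-robust formulation and penalty differ.

```python
import numpy as np

def fair_logistic_loss(theta, X, y, group, lam=1.0):
    """Logistic loss plus a simple unfairness penalty (sketch only).

    y takes values in {-1, +1}; group is a binary sensitive attribute.
    The penalty is the absolute gap between the groups' average predicted
    log-odds, a crude stand-in for the paper's fairness regularizer.
    """
    z = X @ theta
    logloss = np.mean(np.log1p(np.exp(-y * z)))
    gap = abs(z[group == 0].mean() - z[group == 1].mean())
    return logloss + lam * gap
```

Minimizing this over `theta` (e.g. with any gradient-based optimizer) trades classification accuracy against the between-group disparity, with `lam` controlling the trade-off.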
no code implementations • 9 Dec 2020 • Bahar Taskesen, Jose Blanchet, Daniel Kuhn, Viet Anh Nguyen
Leveraging the geometry of the feature space, the test statistic quantifies the distance of the empirical distribution supported on the test samples to the manifold of distributions that render a pre-trained classifier fair.
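To make the testing idea concrete in the simplest possible form, the sketch below runs a permutation test on the gap in positive-prediction rates between two groups. This demographic-parity gap is a crude proxy statistic chosen for illustration; the paper's test statistic is a projection distance to the manifold of fair distributions, which this does not implement.

```python
import numpy as np

def permutation_fairness_test(preds, group, n_perm=2000, seed=0):
    """Permutation test for demographic parity of hard predictions.

    preds: 0/1 predictions of a pre-trained classifier on test samples.
    group: binary sensitive attribute. Returns the observed rate gap and
    a permutation p-value for the null that the gap arose by chance.
    """
    rng = np.random.default_rng(seed)

    def gap(g):
        return abs(preds[g == 0].mean() - preds[g == 1].mean())

    obs = gap(group)
    perms = np.array([gap(rng.permutation(group)) for _ in range(n_perm)])
    # Add-one correction keeps the p-value strictly positive
    return obs, (1 + (perms >= obs).sum()) / (1 + n_perm)
```

A small p-value rejects the hypothesis that the classifier's predictions are independent of the sensitive attribute at the level of positive-prediction rates.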
no code implementations • 10 Aug 2023 • Jose Blanchet, Daniel Kuhn, Jiajin Li, Bahar Taskesen
In the past few years, there has been considerable interest in two prominent approaches to distributionally robust optimization (DRO): divergence-based and Wasserstein-based methods.