Search Results for author: Daniel Kuhn

Found 21 papers, 10 papers with code

Distributionally Robust Optimization with Markovian Data

1 code implementation • 12 Jun 2021 • Mengmeng Li, Tobias Sutter, Daniel Kuhn

We study a stochastic program where the probability distribution of the uncertain problem parameters is unknown and only indirectly observed via finitely many correlated samples generated by an unknown Markov chain with $d$ states.

Dimensionality Reduction
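
As a quick illustration of the setup described in the abstract (not the paper's method), the transition probabilities of an unknown $d$-state Markov chain can be estimated from a single correlated sample path by counting observed transitions; the path below is hypothetical data.

```python
from collections import Counter

def estimate_transition_matrix(path, d):
    """Maximum-likelihood estimate of a d-state Markov chain's transition
    matrix from one observed sample path (states labeled 0..d-1).
    States never visited in path[:-1] get an all-zero row."""
    counts = Counter(zip(path, path[1:]))   # transition counts (i -> j)
    row_totals = Counter(path[:-1])         # visits to each state i
    return [[counts[(i, j)] / row_totals[i] if row_totals[i] else 0.0
             for j in range(d)] for i in range(d)]

# Hypothetical 2-state sample path.
path = [0, 0, 1, 0, 1, 1, 1, 0, 0, 1]
P = estimate_transition_matrix(path, 2)
```

Each visited state's row is a probability distribution over successor states, so the rows sum to one.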

Robust Generalization despite Distribution Shift via Minimum Discriminating Information

1 code implementation • 8 Jun 2021 • Tobias Sutter, Andreas Krause, Daniel Kuhn

Training models that perform well under distribution shifts is a central challenge in machine learning.

Generalization Bounds

Sequential Domain Adaptation by Synthesizing Distributionally Robust Experts

1 code implementation • 1 Jun 2021 • Bahar Taskesen, Man-Chung Yue, Jose Blanchet, Daniel Kuhn, Viet Anh Nguyen

Given the available data, we investigate novel strategies for synthesizing a family of least squares estimator experts that are robust with respect to moment conditions.

Domain Adaptation

Semi-Discrete Optimal Transport: Hardness, Regularization and Numerical Solution

1 code implementation • 10 Mar 2021 • Bahar Taskesen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn

Semi-discrete optimal transport problems, which evaluate the Wasserstein distance between a discrete and a generic (possibly non-discrete) probability measure, are believed to be computationally hard.
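
The paper concerns the semi-discrete case, which is computationally hard; for contrast, the fully discrete one-dimensional case is easy. For two equal-size empirical measures on the line, the optimal transport plan matches sorted samples, so the type-1 Wasserstein distance is the mean absolute difference of the order statistics. A minimal sketch with hypothetical samples:

```python
def wasserstein_1d(xs, ys):
    """W1 distance between two equal-size empirical measures on the real
    line: the optimal coupling matches the i-th smallest x to the
    i-th smallest y."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

# Illustrative samples (hypothetical data).
print(wasserstein_1d([0.0, 1.0, 2.0], [0.5, 1.5, 2.5]))  # 0.5
```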

Small Errors in Random Zeroth Order Optimization are Imaginary

no code implementations • 9 Mar 2021 • Wouter Jongeneel, Man-Chung Yue, Daniel Kuhn

We show, however, that for the majority of zeroth-order methods this smoothing parameter cannot be chosen arbitrarily small, as numerical cancellation errors will otherwise dominate.

Optimization and Control
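
The cancellation phenomenon the abstract refers to is easy to reproduce: a finite-difference quotient subtracts two nearly equal function values, so its accuracy degrades as the step shrinks, whereas a complex-step derivative (suggested by the paper's title, "imaginary" errors) involves no subtraction. A generic numerical illustration, not the paper's algorithm:

```python
import cmath
import math

x, h = 1.0, 1e-12
exact = math.exp(x)          # d/dx exp(x) = exp(x)

# Forward finite difference: subtracting nearly equal numbers at a tiny
# step h triggers catastrophic cancellation.
fd = (math.exp(x + h) - math.exp(x)) / h

# Complex-step derivative Im f(x + ih) / h: no subtraction, so no
# cancellation even for tiny h (valid for real-analytic f).
cs = cmath.exp(complex(x, h)).imag / h

print(abs(fd - exact), abs(cs - exact))
```

Running this shows the complex-step estimate accurate to near machine precision while the finite difference loses many digits at the same step size.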

Topological Linear System Identification via Moderate Deviations Theory

no code implementations • 5 Mar 2021 • Wouter Jongeneel, Tobias Sutter, Daniel Kuhn

Two dynamical systems are topologically equivalent when their phase-portraits can be morphed into each other by a homeomorphic coordinate transformation on the state space.

Optimization and Control

A Statistical Test for Probabilistic Fairness

no code implementations • 9 Dec 2020 • Bahar Taskesen, Jose Blanchet, Daniel Kuhn, Viet Anh Nguyen

Leveraging the geometry of the feature space, the test statistic quantifies the distance of the empirical distribution supported on the test samples to the manifold of distributions that render a pre-trained classifier fair.

Fairness

A Distributionally Robust Approach to Fair Classification

no code implementations • 18 Jul 2020 • Bahar Taskesen, Viet Anh Nguyen, Daniel Kuhn, Jose Blanchet

We propose a distributionally robust logistic regression model with an unfairness penalty that prevents discrimination with respect to sensitive attributes such as gender or ethnicity.

Classification • Fairness +1

Reliable Frequency Regulation through Vehicle-to-Grid: From EU Legislation to Robust Optimization

1 code implementation • 12 May 2020 • Dirk Lauinger, François Vuille, Daniel Kuhn

Vehicle-to-grid increases the low utilization rate of privately owned electric vehicles by making their batteries available to the electricity grid.

Optimization and Control

On Linear Optimization over Wasserstein Balls

no code implementations • 15 Apr 2020 • Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann

In this technical note we prove that the Wasserstein ball is weakly compact under mild conditions, and we offer necessary and sufficient conditions for the existence of optimal solutions.

Bridging Bayesian and Minimax Mean Square Error Estimation via Wasserstein Distributionally Robust Optimization

1 code implementation • 8 Nov 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Peyman Mohajerin Esfahani

The proposed model can be viewed as a zero-sum game between a statistician choosing an estimator -- that is, a measurable function of the observation -- and a fictitious adversary choosing a prior -- that is, a pair of signal and noise distributions ranging over independent Wasserstein balls -- with the goal to minimize and maximize the expected squared estimation error, respectively.

Calculating Optimistic Likelihoods Using (Geodesically) Convex Optimization

1 code implementation • NeurIPS 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann

A fundamental problem arising in many areas of machine learning is the evaluation of the likelihood of a given observation under different nominal distributions.

Wasserstein Distributionally Robust Optimization: Theory and Applications in Machine Learning

no code implementations • 23 Aug 2019 • Daniel Kuhn, Peyman Mohajerin Esfahani, Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh

The goal of data-driven decision-making is to learn a decision from finitely many training samples that will perform well on unseen test samples.

Decision Making

RLOC: Neurobiologically Inspired Hierarchical Reinforcement Learning Algorithm for Continuous Control of Nonlinear Dynamical Systems

no code implementations • 7 Mar 2019 • Ekaterina Abramova, Luke Dickens, Daniel Kuhn, Aldo Faisal

We show that a small number of locally optimal linear controllers are able to solve global nonlinear control problems with unknown dynamics when combined with a reinforcement learner in this hierarchical framework.

Continuous Control • Hierarchical Reinforcement Learning +1

Wasserstein Distributionally Robust Kalman Filtering

1 code implementation • NeurIPS 2018 • Soroosh Shafieezadeh-Abadeh, Viet Anh Nguyen, Daniel Kuhn, Peyman Mohajerin Esfahani

Despite the non-convex nature of the ambiguity set, we prove that the estimation problem is equivalent to a tractable convex program.

Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator

no code implementations • 18 May 2018 • Viet Anh Nguyen, Daniel Kuhn, Peyman Mohajerin Esfahani

We introduce a distributionally robust maximum likelihood estimation model with a Wasserstein ambiguity set to infer the inverse covariance matrix of a $p$-dimensional Gaussian random vector from $n$ independent samples.
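
The motivation for shrinkage is that with few samples the sample covariance is rank-deficient and cannot be inverted. The sketch below illustrates only that rank-deficiency point with hypothetical numbers and a simple ridge-style shrinkage; it is not the paper's Wasserstein estimator, which solves a distributionally robust optimization problem.

```python
# Hypothetical tiny example: p = 2 variables but only n = 2 samples, so
# the centered sample covariance has rank <= n - 1 = 1 and is singular.
xs = [(0.0, 0.0), (2.0, 2.0)]
n = len(xs)
mean = tuple(sum(col) / n for col in zip(*xs))
c = [[sum((x[i] - mean[i]) * (x[j] - mean[j]) for x in xs) / n
     for j in range(2)] for i in range(2)]

det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
# det == 0: the sample covariance cannot be inverted directly.

rho = 0.1  # illustrative shrinkage intensity
s = [[c[i][j] + (rho if i == j else 0.0) for j in range(2)] for i in range(2)]
det_s = s[0][0] * s[1][1] - s[0][1] * s[1][0]
# det_s > 0: the shrunk matrix is invertible, so a precision
# (inverse covariance) estimate exists.
```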

Regularization via Mass Transportation

1 code implementation • 27 Oct 2017 • Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Peyman Mohajerin Esfahani

The goal of regression and classification methods in supervised learning is to minimize the empirical risk, that is, the expectation of some loss function quantifying the prediction error under the empirical distribution.

Generalization Bounds
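
The empirical risk mentioned in the abstract is simply the average loss over the training sample, i.e. the expectation of the loss under the empirical distribution. A minimal sketch with hypothetical data and a hypothetical identity predictor:

```python
def empirical_risk(predict, data, loss):
    """Average loss over the sample: the expectation of the loss under
    the empirical distribution placing mass 1/n on each data point."""
    return sum(loss(predict(x), y) for x, y in data) / len(data)

squared = lambda y_hat, y: (y_hat - y) ** 2

data = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.2)]       # hypothetical (x, y) pairs
risk = empirical_risk(lambda x: x, data, squared)  # identity predictor
```

Regularization via mass transportation, as in the paper, replaces this plain average with a worst case over a Wasserstein ball around the empirical distribution.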

Size Matters: Cardinality-Constrained Clustering and Outlier Detection via Conic Optimization

no code implementations • 22 May 2017 • Napat Rujeerapaiboon, Kilian Schindler, Daniel Kuhn, Wolfram Wiesemann

Plain vanilla K-means clustering has proven to be successful in practice, yet it suffers from outlier sensitivity and may produce highly unbalanced clusters.

Outlier Detection
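
The outlier sensitivity of K-means noted in the abstract stems from the centroid being a mean: a single extreme point drags it arbitrarily far. A one-dimensional illustration with hypothetical data (the paper's conic-optimization formulation is not sketched here):

```python
def centroid(points):
    """K-means cluster center: the arithmetic mean of the cluster."""
    return sum(points) / len(points)

cluster = [1.0, 1.2, 0.8, 1.1, 0.9]       # hypothetical 1-D cluster near 1.0
m_clean = centroid(cluster)               # ~1.0
m_with_outlier = centroid(cluster + [100.0])  # ~17.5: one outlier dominates
```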

Robust Data-Driven Dynamic Programming

no code implementations • NeurIPS 2013 • Grani Adiwena Hanasusanto, Daniel Kuhn

In stochastic optimal control the distribution of the exogenous noise is typically unknown and must be inferred from limited data before dynamic programming (DP)-based solution schemes can be applied.
