
1 code implementation • 12 Jun 2021 • Mengmeng Li, Tobias Sutter, Daniel Kuhn

We study a stochastic program where the probability distribution of the uncertain problem parameters is unknown and only indirectly observed via finitely many correlated samples generated by an unknown Markov chain with $d$ states.

1 code implementation • 8 Jun 2021 • Tobias Sutter, Andreas Krause, Daniel Kuhn

Training models that perform well under distribution shifts is a central challenge in machine learning.

1 code implementation • 1 Jun 2021 • Bahar Taskesen, Man-Chung Yue, Jose Blanchet, Daniel Kuhn, Viet Anh Nguyen

Given available data, we investigate novel strategies to synthesize a family of least squares estimator experts that are robust with regard to moment conditions.

1 code implementation • 10 Mar 2021 • Bahar Taskesen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn

Semi-discrete optimal transport problems, which evaluate the Wasserstein distance between a discrete and a generic (possibly non-discrete) probability measure, are believed to be computationally hard.
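The semi-discrete case is the computationally hard one; for intuition, the fully discrete case on the real line with equally many equally weighted atoms has a simple closed form. A minimal sketch of that special case (not the paper's method):

```python
import numpy as np

def wasserstein_1d(xs, ys):
    """Type-1 Wasserstein distance between two empirical measures on the
    real line with the same number of equally weighted atoms. In 1D the
    optimal coupling matches sorted atoms, so the distance is the mean
    absolute difference of the order statistics."""
    xs, ys = np.sort(np.asarray(xs, float)), np.sort(np.asarray(ys, float))
    assert xs.shape == ys.shape
    return float(np.mean(np.abs(xs - ys)))

# Shifting every atom by a constant c moves the measure by exactly |c|.
print(wasserstein_1d([0.0, 1.0, 2.0], [0.5, 1.5, 2.5]))  # 0.5
```

When one measure is non-discrete (the semi-discrete setting of the paper), no such sorting trick applies, which is where the hardness arises.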

no code implementations • 9 Mar 2021 • Wouter Jongeneel, Man-Chung Yue, Daniel Kuhn

We show, however, that for most zeroth-order methods this smoothing parameter cannot be chosen arbitrarily small, since numerical cancellation errors would otherwise dominate.
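The cancellation phenomenon is easy to reproduce with a plain central finite-difference gradient estimate (an illustrative sketch, not the paper's estimator): shrinking the step first reduces the truncation error, then the subtraction of two nearly equal function values destroys the estimate.

```python
import math

def central_diff(f, x, h):
    """Two-point central finite-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = math.cos(1.0)  # d/dx sin(x) at x = 1
for h in (1e-2, 1e-5, 1e-8, 1e-13):
    est = central_diff(math.sin, 1.0, h)
    print(f"h = {h:.0e}   error = {abs(est - exact):.1e}")
# The error first shrinks with h (truncation), then grows again as the
# subtraction f(x+h) - f(x-h) loses significant digits (cancellation).
```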

Optimization and Control

no code implementations • 5 Mar 2021 • Wouter Jongeneel, Tobias Sutter, Daniel Kuhn

Two dynamical systems are topologically equivalent when their phase-portraits can be morphed into each other by a homeomorphic coordinate transformation on the state space.
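In symbols, the standard textbook definition reads as follows (stated here for topological conjugacy, where time is not reparametrized; topological equivalence additionally allows an orientation-preserving rescaling of time):

```latex
\text{Flows } \varphi_t : X \to X \text{ and } \psi_t : Y \to Y
\text{ are topologically conjugate if there exists a homeomorphism }
h : X \to Y \text{ such that}
\qquad
h \circ \varphi_t \;=\; \psi_t \circ h
\quad \text{for all } t \in \mathbb{R}.
```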

Optimization and Control

no code implementations • 9 Dec 2020 • Bahar Taskesen, Jose Blanchet, Daniel Kuhn, Viet Anh Nguyen

Leveraging the geometry of the feature space, the test statistic quantifies the distance of the empirical distribution supported on the test samples to the manifold of distributions that render a pre-trained classifier fair.

no code implementations • 18 Jul 2020 • Bahar Taskesen, Viet Anh Nguyen, Daniel Kuhn, Jose Blanchet

We propose a distributionally robust logistic regression model with an unfairness penalty that prevents discrimination with respect to sensitive attributes such as gender or ethnicity.

1 code implementation • 12 May 2020 • Dirk Lauinger, François Vuille, Daniel Kuhn

Vehicle-to-grid increases the low utilization rate of privately owned electric vehicles by making their batteries available to the electricity grid.

Optimization and Control

no code implementations • 15 Apr 2020 • Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann

In this technical note we prove that the Wasserstein ball is weakly compact under mild conditions, and we offer necessary and sufficient conditions for the existence of optimal solutions.

1 code implementation • 8 Nov 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Peyman Mohajerin Esfahani

The proposed model can be viewed as a zero-sum game between a statistician choosing an estimator -- that is, a measurable function of the observation -- and a fictitious adversary choosing a prior -- that is, a pair of signal and noise distributions ranging over independent Wasserstein balls -- with the goal to minimize and maximize the expected squared estimation error, respectively.
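Schematically, and assuming the additive observation model $y = x + w$ with signal $x$ and noise $w$ (the excerpt above does not spell out the observation model), the zero-sum game can be written as:

```latex
\min_{\delta(\cdot)} \;
\max_{(\mathbb{P}_x,\, \mathbb{P}_w)\, \in\, \mathcal{B}_x \times \mathcal{B}_w}
\; \mathbb{E}\!\left[ \lVert x - \delta(y) \rVert^2 \right],
\qquad y = x + w,
```

where $\mathcal{B}_x$ and $\mathcal{B}_w$ are the independent Wasserstein balls for the signal and noise distributions, $\delta$ is the statistician's estimator, and the expectation is the squared estimation error both players fight over.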

1 code implementation • NeurIPS 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann

The likelihood function is a fundamental component in Bayesian statistics.

1 code implementation • NeurIPS 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann

A fundamental problem arising in many areas of machine learning is the evaluation of the likelihood of a given observation under different nominal distributions.
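A minimal illustration of this problem statement, evaluating the log-likelihood of one observation under two candidate nominal Gaussian distributions (an assumption for the demo; this is not the paper's robust approach):

```python
import numpy as np

def gaussian_loglik(x, mu, sigma):
    """Log-density of a univariate Gaussian N(mu, sigma^2) at x."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

x = 1.2
print(gaussian_loglik(x, mu=0.0, sigma=1.0))
print(gaussian_loglik(x, mu=1.0, sigma=1.0))  # higher: x is nearer mu = 1
```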

no code implementations • 23 Aug 2019 • Daniel Kuhn, Peyman Mohajerin Esfahani, Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh

The goal of data-driven decision-making is to learn a decision from finitely many training samples that will perform well on unseen test samples.

no code implementations • 7 Mar 2019 • Ekaterina Abramova, Luke Dickens, Daniel Kuhn, Aldo Faisal

We show that a small number of locally optimal linear controllers are able to solve global nonlinear control problems with unknown dynamics when combined with a reinforcement learner in this hierarchical framework.

1 code implementation • NeurIPS 2018 • Soroosh Shafieezadeh-Abadeh, Viet Anh Nguyen, Daniel Kuhn, Peyman Mohajerin Esfahani

Despite the non-convex nature of the ambiguity set, we prove that the estimation problem is equivalent to a tractable convex program.

no code implementations • 18 May 2018 • Viet Anh Nguyen, Daniel Kuhn, Peyman Mohajerin Esfahani

We introduce a distributionally robust maximum likelihood estimation model with a Wasserstein ambiguity set to infer the inverse covariance matrix of a $p$-dimensional Gaussian random vector from $n$ independent samples.
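The classical plug-in baseline simply inverts the sample covariance, which degrades badly when $n$ is small relative to $p$; that regime is what robust and regularized estimators target. A hedged sketch of the baseline (the tridiagonal ground-truth precision matrix is an assumption for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5
# Hypothetical ground-truth precision matrix: tridiagonal, hence sparse.
true_prec = 2.0 * np.eye(p) - 0.6 * np.eye(p, k=1) - 0.6 * np.eye(p, k=-1)
true_cov = np.linalg.inv(true_prec)

def sample_precision(n):
    """Plug-in estimator: invert the sample covariance of n Gaussian draws.
    Accurate for large n, but very noisy when n is close to p."""
    x = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
    s = np.cov(x, rowvar=False)
    return np.linalg.inv(s)

for n in (10, 100, 10_000):
    err = np.linalg.norm(sample_precision(n) - true_prec)
    print(f"n = {n:6d}   Frobenius error = {err:.2f}")
```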

1 code implementation • 27 Oct 2017 • Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Peyman Mohajerin Esfahani

The goal of regression and classification methods in supervised learning is to minimize the empirical risk, that is, the expectation of some loss function quantifying the prediction error under the empirical distribution.
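A minimal concrete instance of this definition: with the squared loss, the empirical risk is the mean squared prediction error over the training sample, and its minimizer is the ordinary least-squares solution (an illustrative sketch on synthetic data):

```python
import numpy as np

# Synthetic regression data: y = X w_true + noise.
rng = np.random.default_rng(1)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

def empirical_risk(w):
    """Empirical risk under the squared loss: (1/n) * sum_i (y_i - w.x_i)^2."""
    return float(np.mean((y - X @ w) ** 2))

# Ordinary least squares minimizes the empirical risk in closed form.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

print(empirical_risk(w_hat))        # close to the noise variance 0.01
print(empirical_risk(np.zeros(d)))  # much larger for a bad hypothesis
```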

no code implementations • 22 May 2017 • Napat Rujeerapaiboon, Kilian Schindler, Daniel Kuhn, Wolfram Wiesemann

Plain vanilla K-means clustering has proven to be successful in practice, yet it suffers from outlier sensitivity and may produce highly unbalanced clusters.
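The outlier sensitivity is easy to demonstrate with plain Lloyd's algorithm, sketched here in numpy on hypothetical 1D data: a single extreme point ends up claiming its own centroid while the two genuine clusters are merged, and the resulting clusters are maximally unbalanced.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's algorithm (vanilla K-means), sketched in numpy."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(points[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

# Two tight 1D clusters around 0 and 10, plus one extreme outlier at 1000.
pts = np.array([[0.0], [0.1], [-0.1], [10.0], [10.1], [9.9], [1000.0]])
centers, labels = kmeans(pts, k=2)
print(np.sort(centers.ravel()))
# The squared-error objective lets the outlier capture a centroid of its
# own, merging the two real clusters under the remaining centroid.
```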

no code implementations • NeurIPS 2015 • Soroosh Shafieezadeh-Abadeh, Peyman Mohajerin Esfahani, Daniel Kuhn

This paper proposes a distributionally robust approach to logistic regression.
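For reference, the non-robust baseline that such a model robustifies: ordinary logistic regression fitted by gradient descent on the empirical log-loss (a minimal sketch on synthetic data; the paper replaces the empirical expectation with a worst case over an ambiguity set of distributions).

```python
import numpy as np

# Synthetic binary classification data.
rng = np.random.default_rng(2)
n, d = 400, 2
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0])
y = (X @ w_true + 0.3 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent on the mean logistic log-loss.
w = np.zeros(d)
for _ in range(500):
    grad = X.T @ (sigmoid(X @ w) - y) / n
    w -= 0.5 * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == (y == 1.0))
print(f"training accuracy: {acc:.2f}")
```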

no code implementations • NeurIPS 2013 • Grani Adiwena Hanasusanto, Daniel Kuhn

In stochastic optimal control the distribution of the exogenous noise is typically unknown and must be inferred from limited data before dynamic programming (DP)-based solution schemes can be applied.
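A toy version of this pipeline (a hypothetical setup for illustration, not the paper's method): the unknown noise distribution is replaced by the empirical distribution of observed samples, after which value iteration, a standard DP scheme, is applied.

```python
import numpy as np

rng = np.random.default_rng(3)
noise_samples = rng.integers(-1, 2, size=1000)  # observed draws in {-1, 0, 1}
vals, counts = np.unique(noise_samples, return_counts=True)
probs = counts / counts.sum()                   # empirical noise distribution

S, A, gamma = 11, 3, 0.9                        # states 0..10, actions {-1, 0, +1}
cost = np.abs(np.arange(S) - 5.0)               # stage cost: distance to state 5

def step(s, a, w):
    """Clipped random-walk dynamics: next state = state + action + noise."""
    return int(np.clip(s + (a - 1) + w, 0, S - 1))

# Value iteration with expectations taken under the *empirical* distribution.
V = np.zeros(S)
for _ in range(200):
    Q = np.array([[cost[s] + gamma * sum(p * V[step(s, a, w)]
                                         for w, p in zip(vals, probs))
                   for a in range(A)] for s in range(S)])
    V = Q.min(axis=1)

print(V.round(2))  # cost-to-go is smallest near the target state 5
```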
