Search Results for author: Yang Kang

Found 7 papers, 0 papers with code

Machine Learning's Dropout Training is Distributionally Robust Optimal

no code implementations · 13 Sep 2020 · Jose Blanchet, Yang Kang, Jose Luis Montiel Olea, Viet Anh Nguyen, Xuhui Zhang

This paper shows that dropout training in Generalized Linear Models is the minimax solution of a two-player, zero-sum game where an adversarial nature corrupts a statistician's covariates using a multiplicative nonparametric errors-in-variables model.
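As a rough illustration of the setup described above (not the paper's implementation), the sketch below trains a logistic-regression GLM with dropout applied to the covariates, i.e. each covariate is corrupted by mean-one multiplicative Bernoulli noise, which is the errors-in-variables perturbation the abstract refers to. All data, names, and hyperparameters here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data for a logistic-regression GLM (purely illustrative).
n, d = 200, 5
X = rng.normal(size=(n, d))
true_beta = rng.normal(size=d)
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)

def dropout_sgd(X, y, keep_prob=0.8, lr=0.1, epochs=200):
    """Gradient descent for logistic regression with dropout on covariates.

    Each covariate is kept with probability `keep_prob` and rescaled by
    1 / keep_prob, so the corrupting noise is multiplicative with mean one,
    mirroring the nonparametric errors-in-variables model in the abstract.
    """
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(epochs):
        # Fresh multiplicative dropout mask each pass.
        mask = (rng.random((n, d)) < keep_prob) / keep_prob
        Xd = X * mask
        p = 1 / (1 + np.exp(-Xd @ beta))       # predicted probabilities
        beta -= lr * Xd.T @ (p - y) / n        # logistic-loss gradient step
    return beta

beta_hat = dropout_sgd(X, y)
```

In the game-theoretic reading, the statistician picks `beta` while nature picks the distribution of the multiplicative mask; the paper's claim is that this training procedure is minimax optimal for that zero-sum game.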

A Distributionally Robust Boosting Algorithm

no code implementations · 20 May 2019 · Jose Blanchet, Yang Kang, Fan Zhang, Zhangyi Hu

Distributionally Robust Optimization (DRO) has been shown to provide a flexible framework for decision making under uncertainty and statistical estimation.

Data-driven Optimal Cost Selection for Distributionally Robust Optimization

no code implementations · 19 May 2017 · Jose Blanchet, Yang Kang, Fan Zhang, Karthyek Murthy

Recently, Blanchet, Kang, and Murthy (2016) and Blanchet and Kang (2017) showed that several machine learning algorithms, such as square-root Lasso, Support Vector Machines, and regularized logistic regression, among many others, can be represented exactly as distributionally robust optimization (DRO) problems.
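To make the equivalence concrete, one of the algorithms cited, square-root Lasso, minimizes the root-mean-square residual plus an l1 penalty; the DRO representation identifies this penalized objective with a worst-case RMS error over a Wasserstein ball around the empirical distribution, the penalty weight playing the role of the ball's radius. A minimal sketch of the penalized objective, on invented data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented regression data for illustration only.
n, d = 100, 4
true_beta = np.array([1.0, -2.0, 0.0, 0.5])
X = rng.normal(size=(n, d))
y = X @ true_beta + 0.1 * rng.normal(size=n)

def sqrt_lasso_objective(beta, X, y, lam):
    """Square-root Lasso loss: RMS residual plus an l1 penalty.

    Under the DRO representation discussed above, this equals the
    worst-case RMS prediction error over an optimal-transport ball
    around the empirical distribution, with `lam` tied to the radius.
    """
    residual = y - X @ beta
    return np.sqrt(np.mean(residual ** 2)) + lam * np.sum(np.abs(beta))

obj = sqrt_lasso_objective(true_beta, X, y, lam=0.1)
```

The practical upshot of the cited papers is that the penalty level need not be tuned blindly: it can be chosen as the radius of a plausible distributional neighborhood, which is what the "optimal cost selection" in this paper's title refers to.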

Doubly Robust Data-Driven Distributionally Robust Optimization

no code implementations · 19 May 2017 · Jose Blanchet, Yang Kang, Fan Zhang, Fei He, Zhangyi Hu

Data-driven Distributionally Robust Optimization (DD-DRO) via optimal transport has been shown to encompass a wide range of popular machine learning algorithms.

Semi-supervised Learning based on Distributionally Robust Optimization

no code implementations · 28 Feb 2017 · Jose Blanchet, Yang Kang

We propose a novel method for semi-supervised learning (SSL) based on data-driven distributionally robust optimization (DRO) using optimal transport metrics.
