no code implementations • 30 Aug 2023 • Daoli Zhu, Lei Zhao, Shuzhong Zhang
In this paper we propose a proximal subgradient method (Prox-SubGrad) for solving nonconvex and nonsmooth optimization problems without assuming Lipschitz continuity conditions.
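For orientation, here is a minimal sketch of a generic proximal subgradient iteration on a composite objective f(x) + g(x). The fixed step size, the toy ℓ1 problem, and the helper names are illustrative assumptions; this is not the Prox-SubGrad scheme or the stepsize rules analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_subgrad(subgrad_f, prox_g, x0, step=1e-2, iters=500):
    """Generic proximal subgradient iteration (fixed step for simplicity):
    x_{k+1} = prox_{step*g}(x_k - step * s_k),  s_k a subgradient of f at x_k."""
    x = x0.copy()
    for _ in range(iters):
        x = prox_g(x - step * subgrad_f(x), step)
    return x

# Toy example: minimize ||Ax - b||_1 + lam * ||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 5)), rng.standard_normal(20), 0.1
subgrad_f = lambda x: A.T @ np.sign(A @ x - b)    # a subgradient of ||Ax - b||_1
prox_g = lambda v, t: soft_threshold(v, t * lam)  # prox of t * lam * ||.||_1
x_hat = prox_subgrad(subgrad_f, prox_g, np.zeros(5))
```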
no code implementations • 2 Jun 2023 • Lei Zhao, Daoli Zhu, Shuzhong Zhang
Under an assumption, which we call the primal-dual variational coherence condition, we prove the convergence of ALAVI.
no code implementations • 2 Sep 2022 • Casey Garner, Gilad Lerman, Shuzhong Zhang
Matrix functions are used to rewrite smooth spectrally constrained matrix optimization problems as smooth unconstrained problems over the set of symmetric matrices; the resulting problems are then solved via the cubic-regularized Newton method.
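As a rough illustration of the solver mentioned, here is a minimal sketch of one cubic-regularized Newton step on a generic smooth unconstrained problem, with the cubic subproblem solved numerically. The regularization weight, the toy function, and the subproblem solver are assumptions; the matrix-function reformulation itself is not shown.

```python
import numpy as np
from scipy.optimize import minimize

def cubic_newton_step(grad, hess, x, sigma=1.0):
    """One cubic-regularized Newton step: minimize the cubic model
    m(s) = g^T s + 0.5 s^T H s + (sigma/3) * ||s||^3 over s."""
    g, H = grad(x), hess(x)
    model = lambda s: g @ s + 0.5 * s @ H @ s + (sigma / 3.0) * np.linalg.norm(s) ** 3
    s_star = minimize(model, np.zeros_like(x), method="BFGS").x
    return x + s_star

# Toy smooth nonconvex function f(x) = 0.25*||x||^4 - ||x||^2
f_grad = lambda x: (x @ x) * x - 2.0 * x
f_hess = lambda x: (x @ x) * np.eye(x.size) + 2.0 * np.outer(x, x) - 2.0 * np.eye(x.size)
x = np.array([1.5, -0.5])
for _ in range(20):
    x = cubic_newton_step(f_grad, f_hess, x, sigma=1.0)
```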
no code implementations • 30 Oct 2020 • Derek Singh, Shuzhong Zhang
This paper expands the work on distributionally robust newsvendor to incorporate moment constraints.
no code implementations • 29 Jun 2020 • Wenye Li, Shuzhong Zhang
Random projection is often used to project higher-dimensional vectors onto a lower-dimensional space, while approximately preserving their pairwise distances.
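A minimal sketch of a plain Gaussian random projection in the spirit of the Johnson-Lindenstrauss lemma; the dimensions and scaling below are illustrative assumptions, not the specific construction studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 1000, 5000, 200            # number of points, original and reduced dimensions
X = rng.standard_normal((n, d))

# Gaussian random projection: entries ~ N(0, 1/k), so pairwise squared
# distances are preserved in expectation.
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R

# Check the distortion on one pair of points.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(f"relative distortion: {abs(proj - orig) / orig:.3f}")
```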
no code implementations • 18 Jun 2020 • Derek Singh, Shuzhong Zhang
This paper expands the notion of robust profit opportunities in financial markets to incorporate distributional uncertainty using Wasserstein distance as the ambiguity measure.
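For illustration, a minimal sketch (assuming SciPy is available) of the order-1 Wasserstein distance between two one-dimensional empirical samples, which is the kind of ambiguity measure referred to here; the sample data and distribution parameters are hypothetical, and the distributionally robust formulation itself is not reproduced.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
nominal = rng.normal(loc=0.05, scale=0.20, size=1000)    # nominal return samples
perturbed = rng.normal(loc=0.02, scale=0.25, size=1000)  # a perturbed alternative

# 1-Wasserstein distance between the two empirical distributions; a
# Wasserstein ball around the nominal law is a typical ambiguity set.
print(wasserstein_distance(nominal, perturbed))
```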
no code implementations • 20 Apr 2020 • Derek Singh, Shuzhong Zhang
A relaxation is introduced, for which we coin the term statistical arbitrage.
no code implementations • 4 Oct 2019 • Derek Singh, Shuzhong Zhang
This paper investigates calculations of robust XVA, in particular, credit valuation adjustment (CVA) and funding valuation adjustment (FVA) for over-the-counter derivatives under distributional uncertainty using Wasserstein distance as the ambiguity measure.
1 code implementation • 31 May 2018 • Tianyi Lin, Shiqian Ma, Yinyu Ye, Shuzhong Zhang
Due to its connection to Newton's method, IPM is often classified as a second-order method -- a class of methods associated with stability and accuracy at the expense of scalability.
Optimization and Control
no code implementations • 6 Feb 2018 • Jason Causey, Junyu Zhang, Shiqian Ma, Bo Jiang, Jake Qualls, David G. Politte, Fred Prior, Shuzhong Zhang, Xiuzhen Huang
Here we present NoduleX, a systematic approach to predict lung nodule malignancy from CT data, based on deep learning convolutional neural networks (CNN).
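Purely as an illustration of a CNN classifier on fixed-size CT patches, here is a small PyTorch sketch with a hypothetical 1 x 64 x 64 input and assumed layer sizes; this is not the NoduleX architecture.

```python
import torch
import torch.nn as nn

class SmallNoduleCNN(nn.Module):
    """Illustrative 2D CNN for binary malignancy classification of
    fixed-size CT patches (assumed 1 x 64 x 64); not the NoduleX model."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),            # benign vs. malignant logits
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallNoduleCNN()
logits = model(torch.randn(4, 1, 64, 64))   # a batch of 4 dummy patches
```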
no code implementations • 5 Oct 2017 • Junyu Zhang, Shiqian Ma, Shuzhong Zhang
For tensor or machine learning models of prohibitively large size, we present a sampling-based stochastic algorithm with the same iteration complexity bound in expectation.
no code implementations • 17 Feb 2017 • Yangyang Xu, Shuzhong Zhang
We show that the rate can be accelerated to $O(1/t^2)$ if the objective is strongly convex.
no code implementations • 19 May 2016 • Xiang Gao, Yangyang Xu, Shuzhong Zhang
Assuming mere convexity, we establish its $O(1/t)$ convergence rate in terms of the objective value and feasibility measure.
no code implementations • 9 May 2016 • Bo Jiang, Tianyi Lin, Shiqian Ma, Shuzhong Zhang
In particular, we consider in this paper some constrained nonconvex optimization models in block decision variables, with or without coupled affine constraints.
no code implementations • 16 May 2015 • Tianyi Lin, Shiqian Ma, Shuzhong Zhang
The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance.
no code implementations • 27 Jan 2013 • Tianyi Lin, Shiqian Ma, Shuzhong Zhang
The classical alternating direction type methods usually assume that the two convex functions have relatively easy proximal mappings.
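As an example of what "relatively easy proximal mappings" means, here is a minimal ADMM sketch for the lasso problem min_x 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z, where one subproblem reduces to a linear solve and the other to soft-thresholding. The problem data, penalty parameter, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """ADMM for min_x 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z.
    Both proximal mappings are 'easy': a linear solve and soft-thresholding."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))   # cache the factorization
    for _ in range(iters):
        # x-update: minimize the smooth term plus the quadratic penalty (linear system)
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: prox of lam*||.||_1 (soft-thresholding)
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update
        u = u + x - z
    return z

rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 20)), rng.standard_normal(50)
x_hat = admm_lasso(A, b, lam=0.1)
```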