no code implementations • 29 Aug 2024 • Frank E. Curtis, Xin Jiang, Qi Wang
An interior-point algorithm framework is proposed, analyzed, and tested for solving nonlinearly constrained continuous optimization problems.
no code implementations • 7 Aug 2023 • Frank E. Curtis, Xin Jiang, Qi Wang
In this paper, new almost-sure convergence guarantees are proved for the primal iterates, Lagrange multipliers, and stationarity measures generated by a stochastic SQP algorithm in this subclass of methods.
no code implementations • 28 Apr 2023 • Frank E. Curtis, Vyacheslav Kungurtsev, Daniel P. Robinson, Qi Wang
A stochastic-gradient-based interior-point algorithm for minimizing a continuously differentiable objective function (that may be nonconvex) subject to bound constraints is presented, analyzed, and demonstrated through experimental results.
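The bound-constrained setting can be illustrated with a minimal log-barrier sketch. This is not the paper's algorithm (which involves carefully designed neighborhoods and step-size control); it is a generic illustration, under assumed parameter choices, of taking stochastic gradient steps on the objective plus a logarithmic barrier for the bounds while keeping iterates strictly interior:

```python
import numpy as np

def barrier_sgd(grad_est, x0, lb, ub, mu=0.1, step=0.01, iters=2000):
    """Minimize f(x) subject to lb < x < ub via (stochastic) gradient steps
    on the log-barrier subproblem f(x) - mu * sum(log(x-lb) + log(ub-x))."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        # barrier gradient pushes iterates away from the bounds
        g = grad_est(x) - mu / (x - lb) + mu / (ub - x)
        x = x - step * g
        # safeguard (illustrative): keep iterates strictly interior
        x = np.clip(x, lb + 1e-8, ub - 1e-8)
    return x
```

For a deterministic check, with objective (x - 2)^2 on [0, 1] and barrier parameter mu = 0.1, the iterates settle near the barrier-subproblem minimizer just inside the upper bound.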
1 code implementation • 24 Jun 2021 • Albert S. Berahas, Frank E. Curtis, Michael J. O'Neill, Daniel P. Robinson
A sequential quadratic optimization algorithm is proposed for solving smooth nonlinear equality constrained optimization problems in which the objective function is defined by an expectation of a stochastic function.
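The core of any such method is an SQP step in which the exact objective gradient is replaced by a stochastic estimate. The sketch below shows only that generic step (solving the Newton/KKT system of the quadratic subproblem); the paper's algorithm additionally handles step sizes and merit parameters adaptively, which is omitted here:

```python
import numpy as np

def stochastic_sqp_step(x, lam, grad_est, c, J, H):
    """One SQP step for min f(x) s.t. c(x) = 0, where grad_est is a (possibly
    stochastic) estimate of the objective gradient at x, c = c(x) the
    constraint values, J the constraint Jacobian, and H a Hessian model.
    Solves the KKT system
        [H  J^T] [d   ]   [-(grad_est + J^T lam)]
        [J   0 ] [dlam] = [-c                   ]
    and returns updated primal iterate and Lagrange multipliers."""
    n, m = len(x), len(c)
    K = np.block([[H, J.T], [J, np.zeros((m, m))]])
    rhs = -np.concatenate([grad_est + J.T @ lam, c])
    sol = np.linalg.solve(K, rhs)
    return x + sol[:n], lam + sol[n:]
```

With a deterministic gradient, exact data, and a quadratic objective, a single step recovers the exact KKT point, which makes the linear algebra easy to sanity-check.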
1 code implementation • 29 Jul 2020 • Frank E. Curtis, Yutong Dai, Daniel P. Robinson
We consider the problem of minimizing an objective function that is the sum of a convex function and a group sparsity-inducing regularizer.
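For this composite problem class, the standard baseline is the proximal-gradient method, where the proximal operator of a group-L2 regularizer is block soft-thresholding. The sketch below is that generic baseline under assumed step-size and group-structure choices, not the algorithm developed in the paper:

```python
import numpy as np

def group_prox(v, groups, lam, step):
    """Block soft-thresholding: proximal operator of
    step * lam * sum_g ||v_g||_2, applied group by group."""
    out = v.copy()
    for g in groups:
        norm = np.linalg.norm(v[g])
        out[g] = 0.0 if norm <= step * lam else (1.0 - step * lam / norm) * v[g]
    return out

def prox_gradient(grad_f, x0, groups, lam, step=0.1, iters=500):
    """Proximal-gradient iteration for min f(x) + lam * sum_g ||x_g||_2."""
    x = x0.copy()
    for _ in range(iters):
        x = group_prox(x - step * grad_f(x), groups, lam, step)
    return x
```

A characteristic feature, and the reason such regularizers are used, is that entire groups of variables are set exactly to zero when their signal is weak.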
Subjects: Optimization and Control; MSC classes: 49M37, 65K05, 65K10, 65Y20, 68Q25, 90C30, 90C60
1 code implementation • 20 Jul 2020 • Albert Berahas, Frank E. Curtis, Daniel P. Robinson, Baoyu Zhou
In this setting, it is assumed to be intractable to compute objective function and derivative values explicitly, although stochastic function and gradient estimates can be computed.
1 code implementation • 15 May 2020 • Frank E. Curtis, Minhan Li
In this paper, a strategy is proposed that allows the use of inexact solutions of these subproblems; as proved in the paper, such solutions can be incorporated without loss of theoretical convergence guarantees.
Subjects: Optimization and Control
no code implementations • 18 Jan 2020 • Frank E. Curtis, Katya Scheinberg
Optimization lies at the heart of machine learning and signal processing.
no code implementations • 29 Dec 2017 • Frank E. Curtis, Katya Scheinberg, Rui Shi
An algorithm is proposed for solving stochastic and finite sum minimization problems.
1 code implementation • 14 Nov 2017 • Chenxin Ma, Martin Jaggi, Frank E. Curtis, Nathan Srebro, Martin Takáč
In this paper, an accelerated variant of CoCoA+ is proposed and shown to possess a convergence rate of $\mathcal{O}(1/t^2)$ in terms of reducing suboptimality.
1 code implementation • 30 Jun 2017 • Frank E. Curtis, Katya Scheinberg
We then discuss some of the distinctive features of these optimization problems, focusing on the examples of logistic regression and the training of deep neural networks.
4 code implementations • 15 Jun 2016 • Léon Bottou, Frank E. Curtis, Jorge Nocedal
This paper provides a review and commentary on the past, present, and future of numerical optimization algorithms in the context of machine learning applications.
no code implementations • 10 Aug 2015 • Zheng Han, Frank E. Curtis
Isotonic regression (IR) is a non-parametric calibration method used in supervised learning.
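Isotonic regression fits a nondecreasing sequence to data in least-squares sense, and the classical solver is the pool-adjacent-violators algorithm (PAVA). The sketch below is that textbook algorithm (not the paper's method), maintaining pooled blocks as (sum, count) pairs and merging whenever monotonicity is violated:

```python
def isotonic_regression(y):
    """Pool Adjacent Violators Algorithm (PAVA): returns the nondecreasing
    sequence minimizing squared error to y."""
    sums, counts = [], []
    for v in y:
        sums.append(float(v))
        counts.append(1)
        # merge backwards while the previous block's mean exceeds this one's
        while len(sums) > 1 and sums[-2] / counts[-2] > sums[-1] / counts[-1]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c
    # expand block means back to the full sequence length
    out = []
    for s, c in zip(sums, counts):
        out.extend([s / c] * c)
    return out
```

For example, the out-of-order pair in [1, 3, 2, 4] is pooled to its mean, giving the monotone fit [1, 2.5, 2.5, 4].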