no code implementations • ICML 2020 • Alper Atamturk, Andres Gomez
We give safe screening rules to eliminate variables from regression with L0 regularization or cardinality constraint.
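To make the setting concrete, here is a brute-force sketch of the cardinality-constrained least-squares problem the screening rules target (this is not the paper's method; the screening rules prune columns of the design matrix *before* any such search, shrinking the combinatorial space):

```python
import itertools
import numpy as np

def best_subset_ls(X, y, k):
    """Brute-force solve min ||y - X b||^2 s.t. ||b||_0 <= k (tiny p only)."""
    best = (np.inf, None, None)
    for S in itertools.combinations(range(X.shape[1]), k):
        cols = list(S)
        b, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        r = y - X[:, cols] @ b
        obj = float(r @ r)
        if obj < best[0]:
            best = (obj, S, b)
    return best

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 6))
b_true = np.zeros(6)
b_true[[1, 4]] = [2.0, -3.0]
y = X @ b_true + 0.01 * rng.standard_normal(30)
obj, S, b = best_subset_ls(X, y, 2)
# with near-noiseless data the search recovers the true support {1, 4}
```

The exponential cost of this enumeration is exactly why eliminating variables up front matters.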
no code implementations • 7 Aug 2022 • Ramtin Madani, Mersedeh Ashraphijuo, Mohsen Kheirandishfard, Alper Atamturk
In the first part of this work [32], we introduce a convex parabolic relaxation for quadratically-constrained quadratic programs, along with a sequential penalized parabolic relaxation algorithm to recover near-optimal feasible solutions.
no code implementations • 7 Aug 2022 • Ramtin Madani, Mersedeh Ashraphijuo, Mohsen Kheirandishfard, Alper Atamturk
For general quadratically-constrained quadratic programming (QCQP), we propose a parabolic relaxation described with convex quadratic constraints.
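As a hedged numerical sanity check (the constraint form below is my recollection of the parabolic construction, not quoted from the paper): lifting the QCQP with a matrix variable Y standing in for xxᵀ, the relaxation imposes convex parabolic inequalities of the form (x_i ± x_j)² ≤ Y_ii + Y_jj ± 2Y_ij. Any genuinely feasible lifted point satisfies them with equality, so the relaxation never cuts off feasible solutions:

```python
import numpy as np

# Assumed parabolic constraints on the lifted variable Y ~ x x^T:
#   Y_ii + Y_jj + 2 Y_ij >= (x_i + x_j)^2
#   Y_ii + Y_jj - 2 Y_ij >= (x_i - x_j)^2
rng = np.random.default_rng(1)
x = rng.standard_normal(4)
Y = np.outer(x, x)  # exact lift: Y = x x^T

for i in range(4):
    for j in range(4):
        # both inequalities hold with equality on the exact lift
        assert abs(Y[i, i] + Y[j, j] + 2 * Y[i, j] - (x[i] + x[j]) ** 2) < 1e-12
        assert abs(Y[i, i] + Y[j, j] - 2 * Y[i, j] - (x[i] - x[j]) ** 2) < 1e-12
```

Relaxed points may have Y ≠ xxᵀ, which is what the sequential penalization is meant to correct.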
no code implementations • 1 Feb 2022 • Anna Deza, Alper Atamturk
In logistic regression, it is often desirable to utilize regularization to promote sparse solutions, particularly for problems with a large number of features compared to available labels.
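A minimal sketch of what an exact sparsity (ℓ0) constraint means in logistic regression, using brute-force support enumeration with plain gradient descent per support (an illustration of the problem, not the paper's algorithm, which would handle this far more scalably):

```python
import itertools
import numpy as np

def nll(X, y, b):
    """Numerically stable logistic negative log-likelihood."""
    z = X @ b
    return float(np.sum(np.logaddexp(0.0, z) - y * z))

def fit_support(X, y, iters=300, lr=0.1):
    """Gradient descent on the average logistic loss, dense b."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ b)))
        b -= lr * X.T @ (p - y) / len(y)
    return b

def best_subset_logreg(X, y, k):
    """min NLL(b) s.t. ||b||_0 <= k, by enumerating size-k supports."""
    best = (np.inf, None)
    for S in itertools.combinations(range(X.shape[1]), k):
        cols = list(S)
        b = fit_support(X[:, cols], y)
        v = nll(X[:, cols], y, b)
        if v < best[0]:
            best = (v, S)
    return best

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
b_true = np.array([2.0, 0.0, 0.0, -2.0, 0.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(X @ b_true)))).astype(float)
val, S = best_subset_logreg(X, y, 2)
```

With many features and few labels this enumeration is hopeless, which is the regime the abstract is pointing at.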
1 code implementation • 17 Feb 2021 • Tuncay Altun, Ramtin Madani, Alper Atamturk, Ross Baldick, Ali Davoudi
This paper provides an enhanced modeling of the contingency response that collectively reflects high-fidelity physical and operational characteristics of power grids.
no code implementations • 29 Dec 2020 • Alper Atamturk, Andres Gomez
We show that the convex hull of the epigraph of the quadratic can be obtained from inequalities for the underlying supermodular set function by lifting them into nonlinear inequalities in the original space of variables.

no code implementations • 31 Dec 2019 • Alper Atamturk, Vishnu Narayanan
Using polarity, we give an outer polyhedral approximation for the epigraph of set functions.
no code implementations • 29 Jan 2019 • Alper Atamturk, Andres Gomez
Sparse regression models are increasingly prevalent due to their ease of interpretability and superior out-of-sample performance.
no code implementations • 6 Nov 2018 • Alper Atamturk, Andres Gomez, Shaoning Han
Signal estimation problems with smoothness and sparsity priors can be naturally modeled as quadratic optimization with $\ell_0$-"norm" constraints.
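For a tiny support-enumeration sketch of such a model (illustrative only, not the paper's approach): estimate a signal x from observations y under a quadratic smoothness prior on first differences and a hard ℓ0 budget, i.e. min ‖y − x‖² + λ‖Dx‖² subject to ‖x‖₀ ≤ k. For a fixed support the problem is an unconstrained convex quadratic, solved by one linear system:

```python
import itertools
import numpy as np

def fused_l0_estimate(y, lam, k):
    """min ||y - x||^2 + lam*||Dx||^2  s.t. ||x||_0 <= k, brute force (tiny n)."""
    n = len(y)
    D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)  # first-difference operator
    Q = np.eye(n) + lam * D.T @ D                 # Hessian/2 of the objective
    best = (np.inf, None)
    for size in range(k + 1):
        for S in itertools.combinations(range(n), size):
            S = list(S)
            x = np.zeros(n)
            if S:
                # stationarity restricted to the support: Q[S,S] x_S = y_S
                x[S] = np.linalg.solve(Q[np.ix_(S, S)], y[S])
            obj = float((y - x) @ (y - x) + lam * (D @ x) @ (D @ x))
            if obj < best[0]:
                best = (obj, x)
    return best

y = np.array([0.0, 0.0, 5.0, 5.0, 5.0, 0.0, 0.0, 0.0])
obj, x = fused_l0_estimate(y, 0.1, 3)
# the budget k=3 is spent on the spike block at indices 2..4
```

The per-support subproblem being convex is what makes the ℓ0 constraint the sole source of combinatorial hardness here.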