Search Results for author: Andres Gomez

Found 12 papers, 0 papers with code

Safe screening rules for L0-regression

no code implementations ICML 2020 Alper Atamturk, Andres Gomez

We give safe screening rules to eliminate variables from regression with L0 regularization or a cardinality constraint.

regression
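
The screening rules themselves are the paper's contribution and are not reproduced here. As a reference point, the underlying L0-regularized regression problem, $\min_\beta \|y - X\beta\|^2 + \lambda \|\beta\|_0$, can be solved by brute-force best-subset search for a tiny number of features. The sketch below is illustrative only (the function name `l0_regression` is ours, not the paper's), and is exponential in the feature count.

```python
import itertools
import numpy as np

def l0_regression(X, y, lam):
    """Brute-force best-subset search for
    min_beta ||y - X beta||^2 + lam * ||beta||_0.
    Exponential in the number of features; illustration only."""
    n, p = X.shape
    best_obj, best_beta = float(np.sum(y ** 2)), np.zeros(p)  # empty support
    for k in range(1, p + 1):
        for S in itertools.combinations(range(p), k):
            cols = list(S)
            # Least-squares fit restricted to the candidate support.
            beta_S, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            resid = y - X[:, cols] @ beta_S
            obj = resid @ resid + lam * k
            if obj < best_obj:
                best_obj = obj
                best_beta = np.zeros(p)
                best_beta[cols] = beta_S
    return best_beta, best_obj
```

A safe screening rule would prune candidate features before this search, shrinking the combinatorial space without changing the optimal solution.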

Solution Path of Time-varying Markov Random Fields with Discrete Regularization

no code implementations 25 Jul 2023 Salar Fattahi, Andres Gomez

More specifically, we show that the entire solution path of the time-varying MRF for all sparsity levels can be obtained in $\mathcal{O}(pT^3)$ time, where $T$ is the number of time steps and $p$ is the number of unknown parameters at any given time.

Optimal Robust Classification Trees

no code implementations AAAI Workshop AdvML 2022 Nathan Justin, Sina Aghaei, Andres Gomez, Phebe Vayanos

In many high-stakes domains, the data used to drive machine learning algorithms is noisy (due, e.g., to the sensitive nature of the data being collected, limited resources available to validate the data, etc.).

Classification, Robust Classification

Memory-Aware Partitioning of Machine Learning Applications for Optimal Energy Use in Batteryless Systems

no code implementations 5 Aug 2021 Andres Gomez, Andreas Tretter, Pascal Alexander Hager, Praveenth Sanmugarajah, Luca Benini, Lothar Thiele

By leveraging inter-kernel data dependencies, these energy-bounded execution cycles minimize the number of system activations and non-volatile data transfers, and thus the total energy overhead.

Total Energy

Scalable Inference of Sparsely-changing Gaussian Markov Random Fields

no code implementations NeurIPS 2021 Salar Fattahi, Andres Gomez

Most existing methods for the inference of time-varying Markov random fields (MRFs) rely on regularized maximum likelihood estimation (MLE), which typically suffers from weak statistical guarantees and high computational cost.

Scalable Inference of Sparsely-changing Markov Random Fields with Strong Statistical Guarantees

no code implementations NeurIPS 2021 Salar Fattahi, Andres Gomez

In this paper, we study the problem of inferring time-varying Markov random fields (MRF), where the underlying graphical model is both sparse and changes sparsely over time.

Supermodularity and valid inequalities for quadratic optimization with indicators

no code implementations 29 Dec 2020 Alper Atamturk, Andres Gomez

We show that the convex hull of the epigraph of the quadratic can be obtained from inequalities for the underlying supermodular set function by lifting them into nonlinear inequalities in the original space of variables.


Ideal formulations for constrained convex optimization problems with indicator variables

no code implementations 30 Jun 2020 Linchuan Wei, Andres Gomez, Simge Kucukyavuz

Motivated by modern regression applications, in this paper, we study the convexification of a class of convex optimization problems with indicator variables and combinatorial constraints on the indicators.

regression

Learning Optimal Classification Trees: Strong Max-Flow Formulations

no code implementations 21 Feb 2020 Sina Aghaei, Andres Gomez, Phebe Vayanos

To fill this gap in the literature, we propose a flow-based MIP formulation for optimal binary classification trees that has a stronger linear programming relaxation.

Binary Classification, Classification +1
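
The paper's flow-based MIP handles general tree depths and needs an integer-programming solver; as a toy stand-in for the combinatorial problem it solves, a depth-1 tree (a decision stump) can be found optimally by exhaustive search. The sketch below is our illustration, not the paper's formulation, and the name `optimal_stump` is invented here.

```python
import numpy as np

def optimal_stump(X, y):
    """Exhaustively search all depth-1 axis-aligned splits and return
    the one minimizing misclassifications when each leaf predicts the
    majority label of its points. A toy stand-in for the optimal-tree
    problem; the paper solves general depths via a flow-based MIP."""
    best_err, best_split = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            err = 0
            for mask in (left, ~left):
                if mask.any():
                    _, counts = np.unique(y[mask], return_counts=True)
                    # Majority-vote errors in this leaf.
                    err += mask.sum() - counts.max()
            if err < best_err:
                best_err, best_split = err, (j, t)
    return best_err, best_split
```

Even this depth-1 case hints at why deeper optimal trees are hard: the number of candidate split combinations grows exponentially with depth, motivating strong MIP relaxations.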

Rank-one Convexification for Sparse Regression

no code implementations 29 Jan 2019 Alper Atamturk, Andres Gomez

Sparse regression models are increasingly prevalent due to their ease of interpretability and superior out-of-sample performance.

regression

Sparse and Smooth Signal Estimation: Convexification of L0 Formulations

no code implementations 6 Nov 2018 Alper Atamturk, Andres Gomez, Shaoning Han

Signal estimation problems with smoothness and sparsity priors can be naturally modeled as quadratic optimization with $\ell_0$-"norm" constraints.
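
The model described in the abstract, $\min_\beta \|y - \beta\|^2 + \lambda \sum_i (\beta_{i+1} - \beta_i)^2$ subject to $\|\beta\|_0 \le k$, can be solved exactly for tiny signals by enumerating supports: on a fixed support the problem is an unconstrained quadratic. The sketch below is only an illustration of the formulation (the function name is ours), not the paper's convexification method, and it scales exponentially in the signal length.

```python
import itertools
import numpy as np

def sparse_smooth_estimate(y, lam, k):
    """Brute-force solver for
    min ||y - beta||^2 + lam * sum_i (beta_{i+1} - beta_i)^2
    s.t. ||beta||_0 <= k.
    For each support of size <= k the restricted problem reduces to a
    linear system; exponential in len(y), illustration only."""
    n = len(y)
    L = np.diff(np.eye(n), axis=0)   # first-difference operator
    D = L.T @ L                      # smoothness penalty matrix
    best_obj, best_beta = np.inf, np.zeros(n)
    for size in range(k + 1):
        for S in itertools.combinations(range(n), size):
            cols = list(S)
            beta = np.zeros(n)
            if cols:
                # Stationarity on the support: (I + lam * D)_SS beta_S = y_S.
                A = np.eye(len(cols)) + lam * D[np.ix_(cols, cols)]
                beta[cols] = np.linalg.solve(A, y[cols])
            obj = np.sum((y - beta) ** 2) + lam * beta @ D @ beta
            if obj < best_obj:
                best_obj, best_beta = obj, beta
    return best_beta
```

The paper's contribution is replacing this combinatorial search with convex relaxations of the $\ell_0$ constraint.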

Intrusion Prevention and Detection in Grid Computing - The ALICE Case

no code implementations 20 Apr 2017 Andres Gomez, Camilo Lara, Udo Kebschull

A remarkable example of a Grid in High Energy Physics (HEP) research is the one used by the ALICE experiment at the European Organization for Nuclear Research (CERN).
