no code implementations • 10 Jun 2024 • Chung Kyong Nguen, Oscar Hernan Madrid Padilla, Arash A. Amini
We consider the two-sample testing problem for networks, where the goal is to determine whether two sets of networks originated from the same stochastic model.
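A minimal sketch of the two-sample setup, assuming each network is given as an adjacency matrix and using a simple permutation test on average edge density as a stand-in statistic (the statistic and test here are illustrative, not the procedure proposed in the paper):

```python
import numpy as np

def permutation_test(sample_a, sample_b, n_perm=2000, seed=0):
    """Two-sample permutation test on a simple network statistic (mean edge density)."""
    rng = np.random.default_rng(seed)
    stat = lambda nets: np.mean([A.mean() for A in nets])   # average edge density per sample
    observed = abs(stat(sample_a) - stat(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(len(pooled))
        a_idx, b_idx = perm[:n_a], perm[n_a:]
        diff = abs(stat([pooled[i] for i in a_idx]) - stat([pooled[i] for i in b_idx]))
        count += diff >= observed
    return (count + 1) / (n_perm + 1)    # permutation p-value

# toy example: two samples of Erdos-Renyi adjacency matrices with different densities
rng = np.random.default_rng(1)
sample_a = [(rng.random((20, 20)) < 0.3).astype(float) for _ in range(10)]
sample_b = [(rng.random((20, 20)) < 0.4).astype(float) for _ in range(10)]
print(permutation_test(sample_a, sample_b))
```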
no code implementations • 2 Feb 2024 • Marcos Matabuena, Juan C. Vidal, Oscar Hernan Madrid Padilla, Jukka-Pekka Onnela
In this paper, we introduce a kNN-based regression method that synergizes the scalability and adaptability of traditional non-parametric kNN models with a novel variable selection technique.
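A rough sketch of the general idea, using a marginal-correlation screen as a hypothetical placeholder for the paper's variable selection technique and scikit-learn's KNeighborsRegressor for the kNN fit:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def screened_knn_fit(X, y, n_keep=5, k=10):
    """Select covariates by absolute marginal correlation, then fit kNN on the kept columns."""
    cors = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    keep = np.argsort(cors)[-n_keep:]                # indices of retained covariates
    model = KNeighborsRegressor(n_neighbors=k).fit(X[:, keep], y)
    return model, keep

# toy data: only the first three of fifty covariates matter
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))
y = X[:, 0] + np.sin(X[:, 1]) + 0.5 * X[:, 2] ** 2 + 0.1 * rng.normal(size=500)
model, keep = screened_knn_fit(X, y)
print(sorted(keep))
```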
no code implementations • 30 Aug 2023 • Carlos Misael Madrid Padilla, Oscar Hernan Madrid Padilla, Daren Wang
In such a context, we study trend filtering, a nonparametric estimator introduced by \cite{mammen1997locally} and \cite{rudin1992nonlinear}.
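As a reference point, a minimal sketch of second-order trend filtering on a univariate signal, assuming cvxpy is available; the difference operator and the tuning parameter lam are illustrative choices:

```python
import numpy as np
import cvxpy as cp

def trend_filter(y, lam=10.0, order=2):
    """Trend filtering: least squares plus an l1 penalty on discrete differences of the fit."""
    n = len(y)
    D = np.eye(n)
    for _ in range(order):                  # build the order-th difference operator
        D = np.diff(D, axis=0)
    theta = cp.Variable(n)
    obj = 0.5 * cp.sum_squares(y - theta) + lam * cp.norm1(D @ theta)
    cp.Problem(cp.Minimize(obj)).solve()
    return theta.value

# piecewise-linear signal plus Gaussian noise
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.abs(x - 0.5) + 0.05 * rng.normal(size=200)
fit = trend_filter(y)
```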
no code implementations • 7 Aug 2022 • Marcos Matabuena, J. C. Vidal, Oscar Hernan Madrid Padilla, Dino Sejdinovic
Biclustering algorithms partition data and covariates simultaneously, providing new insights in several domains, such as analyzing gene expression to discover new biological functions.
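For context, a tiny biclustering example using scikit-learn's SpectralCoclustering as a generic stand-in (not the algorithm studied in the paper), on synthetic data with one planted block:

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering

# synthetic nonnegative data with a planted block (e.g., genes x conditions)
rng = np.random.default_rng(0)
data = rng.random((60, 40))
data[:30, :20] += 2.0                        # one elevated bicluster

model = SpectralCoclustering(n_clusters=2, random_state=0).fit(data)
print(model.row_labels_[:5], model.column_labels_[:5])
```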
no code implementations • 26 Jul 2022 • Oscar Hernan Madrid Padilla
We study the problem of variance estimation in general graph-structured problems.
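One classical baseline in this setting is a difference-based estimator that averages squared differences of the noisy observations across graph edges; a rough sketch of that generic idea (not necessarily the estimator analyzed in the paper):

```python
import numpy as np

def edge_difference_variance(y, edges):
    """Estimate the noise variance by averaging squared differences across graph edges,
    assuming the underlying signal changes little over most edges."""
    diffs = np.array([(y[i] - y[j]) ** 2 for i, j in edges])
    return diffs.mean() / 2.0

# chain graph example: piecewise-constant signal plus N(0, 0.25) noise
rng = np.random.default_rng(0)
theta = np.repeat([0.0, 2.0, -1.0], 100)
y = theta + 0.5 * rng.normal(size=theta.size)
edges = [(i, i + 1) for i in range(len(y) - 1)]
print(edge_difference_variance(y, edges))    # roughly 0.25
```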
no code implementations • 3 Feb 2022 • Gabriel Ruiz, Oscar Hernan Madrid Padilla, Qing Zhou
We demonstrate a novel application of this general approach to estimate the topological ordering of a DAG.
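To fix terminology, a topological ordering lists the nodes of a DAG so that every edge points from an earlier node to a later one. A small sketch using Kahn's algorithm on a known adjacency structure; the paper's contribution is estimating such an ordering from data, which this snippet does not attempt:

```python
from collections import deque

def topological_order(adj):
    """Kahn's algorithm: repeatedly remove nodes with no remaining incoming edges."""
    indeg = {v: 0 for v in adj}
    for v in adj:
        for w in adj[v]:
            indeg[w] += 1
    queue = deque(v for v in adj if indeg[v] == 0)
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in adj[v]:
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    return order

# DAG: X1 -> X2 -> X4, X1 -> X3 -> X4
print(topological_order({"X1": ["X2", "X3"], "X2": ["X4"], "X3": ["X4"], "X4": []}))
```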
1 code implementation • 25 Jun 2021 • Haiyan Jiang, Shanshan Qin, Oscar Hernan Madrid Padilla
In this paper, we consider a new variant of principal component analysis (PCA), aiming to capture grouping and/or sparse structures in the factor loadings simultaneously.
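For illustration only, scikit-learn's SparsePCA captures the sparse-loadings aspect (though not the grouping structure considered in the paper); a minimal usage sketch:

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
X[:, :5] += np.outer(rng.normal(size=200), np.ones(5))   # shared factor on the first 5 variables

spca = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)
print(np.round(spca.components_[0], 2))   # many loadings are shrunk exactly to zero
```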
1 code implementation • NeurIPS 2021 • Oscar Hernan Madrid Padilla, Yi Yu, Alessandro Rinaldo
We study piece-wise constant signals corrupted by additive Gaussian noise over a $d$-dimensional lattice.
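A small data-generation sketch of this setting, assuming a 2-dimensional lattice, a piecewise-constant mean, and i.i.d. Gaussian noise (the estimator studied in the paper is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                     # side length of the 2-d lattice
theta = np.zeros((n, n))                   # piecewise-constant mean (the estimation target)
theta[:n // 2, :n // 2] = 2.0              # one constant block
theta[n // 2:, n // 2:] = -1.0             # another constant block
y = theta + rng.normal(scale=1.0, size=(n, n))   # observed noisy lattice
```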
no code implementations • 14 Jan 2021 • Yi Yu, Oscar Hernan Madrid Padilla, Daren Wang, Alessandro Rinaldo
The goal is to detect the change point as quickly as possible, if it exists, subject to a constraint on the number or probability of false alarms.
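A bare-bones illustration of the online detection setup, using a one-sided CUSUM statistic for a mean shift with known pre-change mean; this is a classical baseline rather than the procedure proposed in the paper, and the threshold is what controls false alarms:

```python
import numpy as np

def cusum_monitor(stream, mu0=0.0, drift=0.5, threshold=8.0):
    """Declare a change as soon as the one-sided CUSUM statistic exceeds the threshold."""
    s = 0.0
    for t, x in enumerate(stream, start=1):
        s = max(0.0, s + (x - mu0 - drift))
        if s > threshold:
            return t                        # declared change time
    return None                             # no alarm raised

rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0, 1, 200), rng.normal(2, 1, 200)])  # mean shifts at t = 201
print(cusum_monitor(stream))
```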
1 code implementation • 16 Oct 2020 • Oscar Hernan Madrid Padilla, Wesley Tansey, Yanzhen Chen
Overall, the theoretical and empirical results provide insight into the strong performance of ReLU neural networks for quantile regression across a broad range of function classes and error distributions.
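A compact sketch of quantile regression with a ReLU network, assuming PyTorch and the standard pinball (check) loss; the architecture and hyperparameters are illustrative choices:

```python
import torch
import torch.nn as nn

def pinball_loss(pred, target, tau):
    """Check loss for the tau-th conditional quantile."""
    u = target - pred
    return torch.mean(torch.maximum(tau * u, (tau - 1) * u))

torch.manual_seed(0)
x = torch.rand(1000, 1) * 4 - 2
y = torch.sin(2 * x) + 0.3 * torch.randn_like(x) * (1 + x.abs())   # heteroscedastic noise

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = pinball_loss(net(x), y, tau=0.9)   # fit the 0.9 conditional quantile
    loss.backward()
    opt.step()
```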
no code implementations • 2 Sep 2020 • Alfonso Landeros, Oscar Hernan Madrid Padilla, Hua Zhou, Kenneth Lange
The current paper studies the problem of minimizing a loss $f(\boldsymbol{x})$ subject to constraints of the form $\boldsymbol{D}\boldsymbol{x} \in S$, where $S$ is a closed set, convex or not, and $\boldsymbol{D}$ is a matrix that fuses parameters.
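A quadratic-penalty sketch of this formulation for the special case where $f$ is smooth and $S$ is the nonnegative orthant, with the distance-to-$S$ term handled by projection; this mirrors the spirit of distance-penalty approaches but is not the paper's algorithm, and the step size and penalty weight are illustrative:

```python
import numpy as np

def penalized_descent(a, D, rho=10.0, lr=0.02, n_iter=5000):
    """Minimize 0.5*||x - a||^2 + (rho/2)*dist(Dx, S)^2 with S the nonnegative orthant,
    using plain gradient descent; dist is the Euclidean distance to S."""
    proj = lambda z: np.maximum(z, 0.0)             # projection onto S
    x = np.zeros_like(a)
    for _ in range(n_iter):
        z = D @ x
        grad = (x - a) + rho * D.T @ (z - proj(z))  # gradient of loss plus penalty
        x -= lr * grad
    return x

# fusion-type constraint: successive differences of x should be nonnegative (monotone fit)
a = np.array([3.0, 1.0, 2.0, 0.5, 4.0])
D = np.diff(np.eye(len(a)), axis=0)                 # D x = successive differences
print(np.round(penalized_descent(a, D), 3))         # approximately monotone version of a
```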
no code implementations • 4 Dec 2019 • Alexandre Belloni, Mingli Chen, Oscar Hernan Madrid Padilla, Zixuan Wang
We propose a generalization of the linear panel quantile regression model to accommodate both \textit{sparse} and \textit{dense} parts: sparse means that, although the number of available covariates is large, potentially only a much smaller number of them have a nonzero impact on each conditional quantile of the response variable; the dense part is represented by a low-rank matrix that can be approximated by latent factors and their loadings.
no code implementations • 26 May 2019 • Hangjian Li, Oscar Hernan Madrid Padilla, Qing Zhou
Structural learning of directed acyclic graphs (DAGs) or Bayesian networks has been studied extensively under the assumption that data are independent.
1 code implementation • 19 May 2015 • Wesley Tansey, Oscar Hernan Madrid Padilla, Arun Sai Suggala, Pradeep Ravikumar
Specifically, VS-MRFs are the joint graphical model distributions where the node-conditional distributions belong to generic exponential families with general vector space domains.
no code implementations • 24 Feb 2015 • Oscar Hernan Madrid Padilla, James G. Scott
We present an approach for penalized tensor decomposition (PTD) that estimates smoothly varying latent factors in multi-way data.
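A rough numpy sketch of the general idea for a rank-1 three-way decomposition, where alternating least-squares updates are combined with a roughness penalty on one factor; the penalty form and parameters are illustrative and not taken from the paper:

```python
import numpy as np

def smooth_rank1_cp(X, lam=5.0, n_iter=100, seed=0):
    """Rank-1 CP-style decomposition X ~ outer(a, b, c) with a squared-difference
    smoothness penalty on the mode-1 factor a."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    a, b, c = rng.normal(size=I), rng.normal(size=J), rng.normal(size=K)
    D = np.diff(np.eye(I), axis=0)                     # first-difference operator
    for _ in range(n_iter):
        # penalized ridge-type update for the smooth factor a
        rhs = np.einsum('ijk,j,k->i', X, b, c)
        a = np.linalg.solve((b @ b) * (c @ c) * np.eye(I) + lam * D.T @ D, rhs)
        # ordinary least-squares updates for b and c
        b = np.einsum('ijk,i,k->j', X, a, c) / ((a @ a) * (c @ c))
        c = np.einsum('ijk,i,j->k', X, a, b) / ((a @ a) * (b @ b))
    return a, b, c

# smoothly varying latent factor along the first mode, plus noise
rng = np.random.default_rng(1)
a0 = np.sin(np.linspace(0, np.pi, 50))
X = np.einsum('i,j,k->ijk', a0, rng.normal(size=8), rng.normal(size=6))
X += 0.1 * rng.normal(size=X.shape)
a_hat, b_hat, c_hat = smooth_rank1_cp(X)
```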
no code implementations • 12 Apr 2014 • Mingyuan Zhou, Oscar Hernan Madrid Padilla, James G. Scott
We define a family of probability distributions for random count matrices with a potentially unbounded number of rows and columns.