no code implementations • 9 Nov 2023 • Luciano Vinas, Arash A. Amini
Alternatively, the problem can be viewed as clustering in the presence of a structured (smooth) contaminant.
1 code implementation • 8 Oct 2023 • Luciano Vinas, Arash A. Amini
We posit that many of the current GNN architectures may be over-engineered.
no code implementations • 18 Jul 2023 • Nathaniel Josephs, Arash A. Amini, Marina Paez, Lizhen Lin
We introduce the nested stochastic block model (NSBM) to cluster a collection of networks while simultaneously detecting communities within each network.
1 code implementation • 27 Sep 2022 • Luciano Vinas, Arash A. Amini, Jade Fischer, Atchar Sudhyadhom
A spatially regularized Gaussian mixture model, LapGM, is proposed for the bias field correction and magnetic resonance normalization problem.
no code implementations • 28 Jun 2022 • Arash A. Amini, Richard Baumgartner, Dai Feng
We show that for polynomial alignment, there is an \emph{over-aligned} regime, in which TKRR can achieve a faster rate than what is achievable by full KRR.
no code implementations • 12 Jun 2022 • Arash A. Amini, Bryon Aragam, Qing Zhou
We introduce and study the neighbourhood lattice decomposition of a distribution, which is a compact, non-graphical representation of conditional independence that is valid in the absence of a faithful graphical representation.
no code implementations • 23 Jan 2022 • Qiaoling Ye, Arash A. Amini, Qing Zhou
We consider the task of learning causal structures from data stored on multiple machines, and propose a novel structure learning method called distributed annealing on regularized likelihood score (DARLS) to solve this problem.
1 code implementation • 30 Dec 2020 • Linfan Zhang, Arash A. Amini
When applied sequentially, the test can also be used to determine the number of communities.
no code implementations • 7 Sep 2019 • Arash A. Amini, Zahra S. Razaee
We derive nonasymptotic exponential concentration inequalities for Lipschitz kernels assuming that the data points are independent draws from a class of multivariate distributions on $\mathbb R^d$, including the strongly log-concave distributions under affine transformations.
no code implementations • 3 Sep 2019 • Arash A. Amini, Bryon Aragam, Qing Zhou
Knowing when a graphical model is perfect with respect to a distribution is essential in order to relate separation in the graph to conditional independence in the distribution, and this is particularly important when performing inference from data.
no code implementations • 14 Jun 2019 • Arash A. Amini
We show that, as long as the RKHS is infinite-dimensional, there is a threshold on $r$ above which the spectrally truncated KRR surprisingly outperforms the full KRR in terms of the minimax risk, where the minimum is taken over the regularization parameter.
1 code implementation • 28 Apr 2019 • Qiaoling Ye, Arash A. Amini, Qing Zhou
We propose a novel structure learning method, annealing on regularized Cholesky score (ARCS), to search over topological sorts, or permutations of nodes, for a high-scoring Bayesian network.
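The search strategy can be sketched with a toy version: simulated annealing over permutations of nodes, scored by sequential regressions along the ordering. The score below uses a crude thresholding penalty as a stand-in for the paper's regularized Cholesky score; all names, the cooling schedule, and the penalty are illustrative assumptions, not the ARCS method itself.

```python
import numpy as np

def chol_score(X, perm, lam=2.0, tau=0.2):
    """Toy ordering score: regress each variable on its predecessors in
    `perm`, sum n*log(residual variance), and add a thresholded-L0-style
    penalty on large coefficients (a stand-in for the paper's
    regularized Cholesky score; lam and tau are illustrative)."""
    n = X.shape[0]
    total = 0.0
    for i, j in enumerate(perm):
        preds, r, k = perm[:i], X[:, j], 0
        if preds:
            A = X[:, preds]
            beta, *_ = np.linalg.lstsq(A, r, rcond=None)
            r = r - A @ beta
            k = int(np.sum(np.abs(beta) > tau))  # "active" predecessors
        total += n * np.log(r @ r / n) + lam * k
    return total

def anneal_order(X, n_iter=2000, t0=1.0, seed=0):
    """Simulated annealing over permutations: propose adjacent swaps and
    accept with Metropolis probability under a decreasing temperature."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    perm = list(rng.permutation(p))
    cur_s = chol_score(X, perm)
    best, best_s = perm[:], cur_s
    for t in range(n_iter):
        temp = t0 / (1 + t)                     # simple cooling schedule
        i = int(rng.integers(p - 1))
        cand = perm[:]
        cand[i], cand[i + 1] = cand[i + 1], cand[i]
        s = chol_score(X, cand)
        if s < cur_s or rng.random() < np.exp((cur_s - s) / temp):
            perm, cur_s = cand, s
            if s < best_s:
                best, best_s = cand[:], s
    return best, best_s
```

Without the penalty, the log-likelihood part is the same for every ordering (it equals $\log\det$ of the Gram matrix via the Cholesky factorization), which is exactly why a regularization term is needed to distinguish orderings.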
1 code implementation • 30 Mar 2019 • Arash A. Amini, Marina S. Paez, Lizhen Lin
Moreover, our model automatically picks up the necessary number of communities at each layer (as validated by real data examples).
2 code implementations • 21 Mar 2019 • Arash A. Amini, Marina Paez, Lizhen Lin, Zahra S. Razaee
We propose an exact slice sampler for the hierarchical Dirichlet process (HDP) and its associated mixture models (Teh et al., 2006).
no code implementations • 19 Mar 2019 • Parthe Pandit, Mojtaba Sahraee-Ardakan, Arash A. Amini, Sundeep Rangan, Alyson K. Fletcher
We derive precise upper bounds on the mean-squared estimation error in terms of the number of samples, dimensions of the process, the lag $p$ and other key statistical properties of the model.
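The estimator whose error these bounds control can be sketched as ordinary least squares on lagged copies of the series. This is a generic VAR($p$) least-squares fit, not the paper's analysis; the function name and layout are assumptions.

```python
import numpy as np

def fit_var(X, p):
    """Least-squares estimate of VAR(p) coefficient matrices from a
    multivariate time series X of shape (T, d). Returns the stacked
    coefficients (A_1, ..., A_p) as a (d, d*p) matrix.
    (Illustrative sketch of the estimator, not the paper's bounds.)"""
    T, d = X.shape
    # Design row at time t holds the lags [x_{t-1}, ..., x_{t-p}]
    Z = np.hstack([X[p - k - 1 : T - k - 1] for k in range(p)])
    Y = X[p:]
    A, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return A.T
```

The paper's results quantify how the error of estimators like this scales with the sample size $T$, the dimension $d$, the lag $p$, and the stability properties of the process.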
no code implementations • 15 Mar 2018 • Zhixin Zhou, Arash A. Amini
The provable algorithm is derived from a general class of pseudo-likelihood biclustering algorithms that employ simple EM-type updates.
no code implementations • 12 Mar 2018 • Zhixin Zhou, Arash A. Amini
We consider spectral clustering algorithms for community detection under a general bipartite stochastic block model (SBM).
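The basic idea can be sketched for two communities on each side: take the SVD of the biadjacency matrix and split rows and columns by the sign of the second singular vectors. This is a minimal illustration; the paper analyzes more refined spectral variants and general numbers of communities.

```python
import numpy as np

def bipartite_spectral_split(B):
    """Two-community spectral clustering for a bipartite SBM: the sign
    of the second left (resp. right) singular vector of the biadjacency
    matrix B splits the row (resp. column) nodes. (A minimal sketch,
    not the paper's full algorithm.)"""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    row_labels = (U[:, 1] > 0).astype(int)
    col_labels = (Vt[1] > 0).astype(int)
    return row_labels, col_labels
```

For more than two communities one would keep the top-$k$ singular vectors and run k-means on the resulting embeddings of the two node sets.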
1 code implementation • 3 Nov 2017 • Arash A. Amini, Bryon Aragam, Qing Zhou
We study the computational complexity of computing these structures and show that, under a sparsity assumption, they can be computed in polynomial time, even without assuming perfectness with respect to a graph.
no code implementations • 15 Mar 2017 • Zahra S. Razaee, Arash A. Amini, Jingyi Jessica Li
Community detection or clustering is a fundamental task in the analysis of network data.
1 code implementation • 29 Nov 2015 • Bryon Aragam, Arash A. Amini, Qing Zhou
We study a family of regularized score-based estimators for learning the structure of a directed acyclic graph (DAG) for a multivariate normal distribution from high-dimensional data with $p\gg n$.
1 code implementation • 21 Jun 2014 • Arash A. Amini, Elizaveta Levina
We place our SDP and previously proposed SDPs in a unified framework, as relaxations of the MLE over various sub-classes of the SBM, revealing a connection to sparse PCA.
no code implementations • NeurIPS 2013 • Arash A. Amini, XuanLong Nguyen
We propose a general formalism of iterated random functions with semigroup property, under which exact and approximate Bayesian posterior updates can be viewed as specific instances.
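A toy instance of this viewpoint is the conjugate Beta-Bernoulli model, where each exact posterior update is a map on the hyperparameter state and composing updates has the semigroup property (folding observations one at a time equals a single batch update). This is only a minimal illustration of the formalism, not the paper's general construction.

```python
def update(state, x):
    """One exact Bayesian update for a Beta-Bernoulli model, written as
    an iterated (random) function on the pseudo-count state (a, b).
    (A toy instance of the iterated-random-functions viewpoint.)"""
    a, b = state
    return (a + x, b + 1 - x)

def posterior_mean(state):
    """Posterior mean of the success probability under Beta(a, b)."""
    a, b = state
    return a / (a + b)
```

Because the composition of updates depends on the data only through sufficient statistics, applying `update` sequentially over a sample gives the same posterior as a single batch update with the total counts.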
no code implementations • 10 Jul 2012 • Arash A. Amini, Aiyou Chen, Peter J. Bickel, Elizaveta Levina
Many algorithms have been proposed for fitting network models with communities, but most of them do not scale well to large networks, and often fail on sparse networks.