no code implementations • NeurIPS 2023 • Rachel Ward, Tamara G. Kolda

We show that, for a rank-$r$ matrix $\mathbf{A} \in \mathbb{R}^{m \times n}$, $T = C (\frac{\sigma_1(\mathbf{A})}{\sigma_r(\mathbf{A})})^2 \log(1/\epsilon)$ iterations of alternating gradient descent suffice to reach an $\epsilon$-optimal factorization $\| \mathbf{A} - \mathbf{X} \mathbf{Y}^{T} \|^2 \leq \epsilon \| \mathbf{A}\|^2$ with high probability starting from an atypical random initialization.
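The iteration analyzed here is easy to state concretely. Below is a minimal NumPy sketch of alternating gradient descent on $f(\mathbf{X}, \mathbf{Y}) = \|\mathbf{A} - \mathbf{X}\mathbf{Y}^T\|_F^2$: the problem sizes, singular spectrum, step size, and initialization scale are illustrative choices, not the constants from the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 20, 15, 3

# Rank-r target with known singular values (sizes and spectrum illustrative).
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
A = U @ np.diag([1.0, 0.8, 0.5]) @ V.T

# Small random initialization; eta is an illustrative step size, not the
# constant from the paper's analysis.
X = 0.01 * rng.standard_normal((m, r))
Y = 0.01 * rng.standard_normal((n, r))
eta = 0.2

for _ in range(1000):
    R = A - X @ Y.T
    X = X + eta * R @ Y        # gradient step on X with Y held fixed
    R = A - X @ Y.T
    Y = Y + eta * R.T @ X      # gradient step on Y with X held fixed

# Relative squared error, matching the epsilon-optimality criterion above.
rel_err = np.linalg.norm(A - X @ Y.T) ** 2 / np.linalg.norm(A) ** 2
print(rel_err)
```

With a well-conditioned target like this one, the factors grow out of the small initialization in a few dozen iterations and then converge linearly, consistent with the $(\sigma_1/\sigma_r)^2 \log(1/\epsilon)$ iteration count in the statement.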

1 code implementation • 14 Feb 2022 • João M. Pereira, Joe Kileel, Tamara G. Kolda

In this work, we develop theory and numerical methods for \emph{implicit computations} with moment tensors of GMMs, reducing the computational and storage costs to $\mathcal{O}(n^2)$ and $\mathcal{O}(n^3)$, respectively, for general covariance matrices, and to $\mathcal{O}(n)$ and $\mathcal{O}(n)$, respectively, for diagonal ones.
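The savings come from contracting moment tensors against vectors without ever materializing them. The sketch below illustrates the generic idea on an empirical third moment; the paper's contribution is doing this implicitly for GMM *model* moments with general covariances, which involves more structure than shown here.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 500, 10
S = rng.standard_normal((N, n))    # N samples in R^n (illustrative data)
v = rng.standard_normal(n)

# Explicit: build the n x n x n empirical third moment, O(n^3) storage.
M3 = np.einsum('ia,ib,ic->abc', S, S, S) / N
explicit = np.einsum('abc,b,c->a', M3, v, v)

# Implicit: the same contraction without forming M3 -- O(N*n) work, O(n) storage.
proj = S @ v                        # <s_i, v> for each sample
implicit = (proj ** 2) @ S / N      # (1/N) * sum_i <s_i, v>^2 * s_i

print(np.allclose(explicit, implicit))
```

The two results agree to floating-point precision; only the cost of obtaining them differs.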

1 code implementation • 27 Oct 2021 • Eric Phipps, Nick Johnson, Tamara G. Kolda

In this paper, we develop a method which we call OnlineGCP for computing the Generalized Canonical Polyadic (GCP) tensor decomposition of streaming data.

no code implementations • 19 Apr 2021 • Aydin Buluc, Tamara G. Kolda, Stefan M. Wild, Mihai Anitescu, Anthony DeGennaro, John Jakeman, Chandrika Kamath, Ramakrishnan Kannan, Miles E. Lopes, Per-Gunnar Martinsson, Kary Myers, Jelani Nelson, Juan M. Restrepo, C. Seshadhri, Draguna Vrabie, Brendt Wohlberg, Stephen J. Wright, Chao Yang, Peter Zwart

Randomized algorithms have propelled advances in artificial intelligence and represent a foundational research area in advancing AI for Science.

no code implementations • 4 Jun 2019 • Tamara G. Kolda, David Hong

The stochastic gradient is formed from randomly sampled elements of the tensor and is efficient because it can be computed using the sparse matricized-tensor-times-Khatri-Rao product (MTTKRP) tensor kernel.
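A minimal sketch of that idea in NumPy: sample a batch of tensor entries, evaluate the rank-$r$ CP model only at those entries, and scatter a sampled analogue of the MTTKRP into the gradient for one factor matrix. All sizes and the batch size are illustrative, and the dense tensor here stands in for sparse storage.

```python
import numpy as np

rng = np.random.default_rng(2)
I, J, K, r = 8, 9, 10, 3

# Low-rank ground-truth tensor and rough factor estimates (illustrative).
T1, T2, T3 = (rng.standard_normal((d, r)) for d in (I, J, K))
X = np.einsum('il,jl,kl->ijk', T1, T2, T3)
A1, A2, A3 = (0.1 * rng.standard_normal((d, r)) for d in (I, J, K))

def sampled_gradient(num_samples=64):
    """Stochastic gradient of the squared loss w.r.t. A1, formed only from
    randomly sampled entries -- a sampled analogue of the MTTKRP."""
    i = rng.integers(I, size=num_samples)
    j = rng.integers(J, size=num_samples)
    k = rng.integers(K, size=num_samples)
    m = np.sum(A1[i] * A2[j] * A3[k], axis=1)   # model values at samples
    resid = m - X[i, j, k]                      # sampled residuals
    G1 = np.zeros_like(A1)
    # Scatter resid * (A2[j,:] * A3[k,:]) into the sampled rows of G1.
    np.add.at(G1, i, resid[:, None] * (A2[j] * A3[k]))
    return 2 * G1 * (I * J * K / num_samples)   # unbiased rescaling

g = sampled_gradient()
print(g.shape)
```

Because only sampled entries are touched, the per-step cost scales with the batch size rather than the number of tensor entries.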

no code implementations • 22 Aug 2018 • David Hong, Tamara G. Kolda, Jed A. Duersch

Tensor decomposition is a fundamental unsupervised machine learning method in data science, with applications including network analysis and sensor data processing.

no code implementations • 22 Aug 2018 • Clifford Anderson-Bergman, Tamara G. Kolda, Kina Kincher-Winoto

If some marginals are continuous but not normal, the semiparametric copula-based principal component analysis (COCA) method is an alternative to PCA that combines a Gaussian copula with nonparametric marginals.
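The copula step can be sketched in a few lines: replace each marginal by its ranks, scale into $(0,1)$, push through the standard normal quantile, and run ordinary PCA on the transformed scores. COCA as developed in the paper involves more than this recipe, so treat the following as the underlying idea rather than the method; all sizes and the data-generating choice are illustrative.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(3)
n, p = 200, 4

# Data with skewed, non-normal marginals (illustrative).
Z = rng.standard_normal((n, p)) @ rng.standard_normal((p, p))
X = np.exp(Z)

# Step 1: nonparametric marginals -- replace each column by its ranks, scale
# into (0, 1), and push through the standard normal quantile (copula step).
ranks = X.argsort(axis=0).argsort(axis=0) + 1     # ranks 1..n per column
U = ranks / (n + 1)
inv_cdf = np.vectorize(NormalDist().inv_cdf)
G = inv_cdf(U)                                    # approximately normal scores

# Step 2: ordinary PCA on the transformed scores.
G = G - G.mean(axis=0)
_, s, Vt = np.linalg.svd(G, full_matrices=False)
components = Vt                                   # principal directions
print(components.shape)
```

Because only ranks enter the transform, the result is invariant to any monotone distortion of the marginals, which is the point of the semiparametric formulation.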

1 code implementation • 15 Dec 2011 • C. Seshadhri, Tamara G. Kolda, Ali Pinar

Community structure plays a significant role in the analysis of social networks and similar graphs, yet this structure is little understood and not well captured by most models.

Social and Information Networks • Physics and Society

no code implementations • 11 Dec 2011 • Eric C. Chi, Tamara G. Kolda

We present a new algorithm for Poisson tensor factorization called CANDECOMP-PARAFAC Alternating Poisson Regression (CP-APR) that is based on a majorization-minimization approach.
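In the matrix (two-way) special case, the majorization-minimization approach reduces to the familiar multiplicative updates for KL-divergence nonnegative factorization, which monotonically decrease the Poisson objective. The sketch below shows that special case only; sizes, priors, and iteration count are illustrative, and the full CP-APR algorithm for tensors is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)
m, n, r = 15, 12, 3

# Count data from a low-rank Poisson model (illustrative sizes and priors).
Btrue = rng.gamma(2.0, 1.0, (m, r))
Ctrue = rng.gamma(2.0, 1.0, (n, r))
X = rng.poisson(Btrue @ Ctrue.T).astype(float)

B = rng.random((m, r)) + 0.1             # strictly positive initialization
C = rng.random((n, r)) + 0.1
eps = 1e-10                              # guards against divide-by-zero

def kl(X, M):
    # Poisson negative log-likelihood up to constants in X.
    return np.sum(M - X * np.log(M + eps))

loss0 = kl(X, B @ C.T)
for _ in range(200):
    M = B @ C.T + eps
    B *= ((X / M) @ C) / C.sum(axis=0)   # MM / multiplicative update for B
    M = B @ C.T + eps
    C *= ((X / M).T @ B) / B.sum(axis=0) # MM / multiplicative update for C

print(kl(X, B @ C.T), "vs initial", loss0)
```

The multiplicative form keeps the factors nonnegative automatically, one of the practical attractions of the MM derivation.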

Numerical Analysis

1 code implementation • 21 May 2010 • Daniel M. Dunlavy, Tamara G. Kolda, Evrim Acar

We show how the well-known Katz method for link prediction can be extended to bipartite graphs and, moreover, approximated in a scalable way using a truncated singular value decomposition.
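Since the Katz score matrix is $\sum_{k \geq 1} \beta^k \mathbf{A}^k = (\mathbf{I} - \beta\mathbf{A})^{-1} - \mathbf{I}$, a low-rank spectral truncation gives a cheap approximation. The sketch below uses an ordinary symmetric graph and its eigendecomposition; the paper's bipartite construction uses the SVD of the biadjacency matrix and odd powers, so this is the analogous idea rather than the paper's exact formulation. Sizes, edge density, and the truncation rank are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 40

# Random symmetric adjacency matrix for an undirected graph (illustrative).
A = np.triu((rng.random((n, n)) < 0.1).astype(float), 1)
A = A + A.T

lam, U = np.linalg.eigh(A)
beta = 0.5 / np.abs(lam).max()          # keeps the Katz series convergent

# Exact Katz scores: sum_{k>=1} beta^k A^k = (I - beta*A)^{-1} - I.
K_exact = np.linalg.inv(np.eye(n) - beta * A) - np.eye(n)

# Truncated approximation from the d dominant eigenpairs:
# K ~= sum_i (beta*lam_i / (1 - beta*lam_i)) * u_i u_i^T.
d = 10
idx = np.argsort(-np.abs(lam))[:d]
f = beta * lam[idx] / (1 - beta * lam[idx])
K_approx = (U[:, idx] * f) @ U[:, idx].T

rel = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
print(rel)
```

The truncated form needs only $d$ eigenpairs (or singular triplets, in the bipartite case), which is what makes the method scalable on large graphs.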

no code implementations • 12 May 2010 • Evrim Acar, Tamara G. Kolda, Daniel M. Dunlavy, Morten Morup

In the presence of missing data, CP can be formulated as a weighted least squares problem that models only the known entries.
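The masked objective is straightforward to write down. Below is a sketch that optimizes it with plain gradient descent on the three CP factors; the paper uses a more capable first-order method, and the mask fraction, sizes, and step size here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
I, J, K, r = 8, 8, 8, 2

# Low-rank ground truth with roughly 30% of entries missing (illustrative).
T1, T2, T3 = (rng.standard_normal((d, r)) for d in (I, J, K))
X = np.einsum('il,jl,kl->ijk', T1, T2, T3)
W = (rng.random((I, J, K)) > 0.3).astype(float)   # 1 = observed, 0 = missing

def model(A1, A2, A3):
    return np.einsum('il,jl,kl->ijk', A1, A2, A3)

def loss(A1, A2, A3):
    # Weighted least squares: only the observed entries enter the objective.
    return np.sum((W * (X - model(A1, A2, A3))) ** 2)

A1, A2, A3 = (0.1 * rng.standard_normal((d, r)) for d in (I, J, K))
eta = 0.002                                       # illustrative step size
loss0 = loss(A1, A2, A3)
for _ in range(2000):
    R = W * (model(A1, A2, A3) - X)               # masked residual
    A1 = A1 - 2 * eta * np.einsum('ijk,jl,kl->il', R, A2, A3)
    R = W * (model(A1, A2, A3) - X)
    A2 = A2 - 2 * eta * np.einsum('ijk,il,kl->jl', R, A1, A3)
    R = W * (model(A1, A2, A3) - X)
    A3 = A3 - 2 * eta * np.einsum('ijk,il,jl->kl', R, A1, A2)

print(loss0, "->", loss(A1, A2, A3))
```

Because the mask zeros out the unknown entries, the missing values never influence the gradients, which is exactly the point of the weighted formulation.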

Numerical Analysis • Data Analysis, Statistics and Probability • G.1.3; G.1.6
