no code implementations • 11 Aug 2024 • Brett W. Larsen, Tamara G. Kolda, Anru R. Zhang, Alex H. Williams

We refer to tensors with some infinite-dimensional modes as quasitensors, and we call the approach of decomposing a tensor with some continuous RKHS modes CP-HiFi (hybrid infinite- and finite-dimensional) tensor decomposition.
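For background, the classical finite-dimensional CP decomposition that CP-HiFi generalizes can be sketched with a plain alternating-least-squares (ALS) loop. This is standard material, not the paper's RKHS-based method, and all function names here are illustrative:

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move `mode` to the front, then flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Kronecker product of A (I x R) and B (J x R) -> (I*J x R).
    R = A.shape[1]
    return (A[:, None, :] * B[None, :, :]).reshape(-1, R)

def cp_als(T, rank, n_iter=200, seed=0):
    """Rank-`rank` CP decomposition of a 3-way tensor via alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Each factor update is an ordinary least-squares solve against an unfolding.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

The quasitensor setting replaces one or more of these finite factor matrices with functions in an RKHS; the ALS skeleton above is only the finite-dimensional special case.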

1 code implementation • 23 Oct 2023 • Muhang Tian, Bernie Chen, Allan Guo, Shiyi Jiang, Anru R. Zhang

The proposed diffusion-model-based method reliably and efficiently generates synthetic EHR time series, facilitating downstream medical data analysis.

no code implementations • 2 Jul 2023 • Runshi Tang, Ming Yuan, Anru R. Zhang

The MOP-UP algorithm consists of two steps: Average Subspace Capture (ASC) and Alternating Projection (AP).

no code implementations • 9 Mar 2023 • Jiashun Jin, Zheng Tracy Ke, Paxton Turner, Anru R. Zhang

Using a degree-corrected block model (DCBM), we establish phase transitions of this testing problem concerning the size of the small community and the edge densities in small and large communities.
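A DCBM with a small planted community can be simulated directly from its definition, P(A_ij = 1) = theta_i * theta_j * P[c_i, c_j]. The sampler below is an illustrative sketch, not code from the paper:

```python
import numpy as np

def sample_dcbm(theta, labels, P, seed=0):
    """Sample a symmetric adjacency matrix from a degree-corrected block model.

    theta  : per-node degree parameters
    labels : integer community label of each node
    P      : community-level edge probability matrix
    """
    rng = np.random.default_rng(seed)
    n = len(theta)
    # Edge probabilities theta_i * theta_j * P[c_i, c_j], clipped to [0, 1].
    probs = np.clip(np.outer(theta, theta) * P[np.ix_(labels, labels)], 0, 1)
    upper = rng.random((n, n)) < probs
    A = np.triu(upper, 1)          # sample each dyad once (upper triangle)
    A = (A | A.T).astype(int)      # symmetrize
    np.fill_diagonal(A, 0)         # no self-loops
    return A
```

The testing problem in the paper asks whether such a graph contains a small dense community at all, with phase transitions governed by the community size and the within/between edge densities.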

no code implementations • 26 Sep 2022 • Chenyin Gao, Shu Yang, Anru R. Zhang

With the proposed design, we are able to characterize our denoiser with fewer parameters and train it based on a single image, which considerably improves the model's generalizability and reduces the cost of data acquisition.

no code implementations • 22 Sep 2022 • Sitan Chen, Sinho Chewi, Jerry Li, Yuanzhi Li, Adil Salim, Anru R. Zhang

We provide theoretical convergence guarantees for score-based generative models (SGMs) such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of large-scale real-world generative models such as DALL·E 2.
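For context, the forward (noising) process of a DDPM, whose reversal is what such convergence guarantees concern, has the closed form x_t = sqrt(abar_t) x_0 + sqrt(1 - abar_t) eps. A minimal sketch, assuming a standard linear variance schedule (the schedule parameters are illustrative defaults, not values from the paper):

```python
import numpy as np

def make_schedule(T=1000, beta_start=1e-4, beta_end=0.02):
    # Linear variance schedule; alpha_bar[t] = prod_{s<=t} (1 - beta_s).
    betas = np.linspace(beta_start, beta_end, T)
    alpha_bar = np.cumprod(1.0 - betas)
    return betas, alpha_bar

def forward_noise(x0, t, alpha_bar, rng):
    """Closed-form sample of x_t given x_0, together with the noise used."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return xt, eps
```

By the final step alpha_bar is nearly zero, so x_T is approximately standard Gaussian; the generative model runs this process in reverse using a learned score/noise predictor.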

1 code implementation • 17 Jun 2022 • Yuetian Luo, Anru R. Zhang

We study the tensor-on-tensor regression, where the goal is to connect tensor responses to tensor covariates with a low Tucker rank parameter tensor/matrix without the prior knowledge of its intrinsic rank.

no code implementations • 8 Apr 2022 • Sitan Chen, Jerry Li, Yuanzhi Li, Anru R. Zhang

Our first main result is a polynomial-time algorithm for learning quadratic transformations of Gaussians in a smoothed setting.

no code implementations • 23 Oct 2021 • Yuetian Luo, Xudong Li, Anru R. Zhang

By applying the general procedure to the fixed-rank positive semidefinite (PSD) and general matrix optimization, we establish an exact Riemannian gradient connection under two geometries at every point on the manifold and sandwich inequalities between the spectra of Riemannian Hessians at Riemannian first-order stationary points (FOSPs).

no code implementations • 3 Aug 2021 • Yuetian Luo, Xudong Li, Anru R. Zhang

In this paper, we consider the geometric landscape connection of the widely studied manifold and factorization formulations in low-rank positive semidefinite (PSD) and general matrix optimization.

1 code implementation • 24 Apr 2021 • Yuetian Luo, Anru R. Zhang

In this paper, we consider the estimation of a low Tucker rank tensor from a number of noisy linear measurements.

no code implementations • 29 Dec 2020 • Dong Xia, Anru R. Zhang, Yuchen Zhou

In all these models, we observe that, unlike many matrix/vector settings in existing work, debiasing is not required to establish the asymptotic distribution of estimates or to perform statistical inference on low-rank tensors.

1 code implementation • 18 Dec 2020 • Rungang Han, Yuetian Luo, Miaoyan Wang, Anru R. Zhang

High-order clustering aims to identify heterogeneous substructures in multiway datasets that arise commonly in neuroimaging, genomics, social network studies, etc.
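One common spectral recipe for clustering along a single tensor mode, shown here only as an illustrative baseline and not necessarily the paper's exact algorithm, is to run k-means on the leading singular vectors of that mode's unfolding:

```python
import numpy as np

def lloyd_kmeans(X, k, n_iter=50):
    # Minimal Lloyd's algorithm with farthest-point initialization
    # (illustrative; not robust to empty clusters).
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels

def spectral_cluster_mode(T, mode, k):
    """Cluster along one mode: SVD of that mode's unfolding, then k-means
    on the rows of the top-k left singular vectors."""
    M = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
    U, _, _ = np.linalg.svd(M, full_matrices=False)
    return lloyd_kmeans(U[:, :k], k)
```

High-order clustering methods refine this idea by exploiting low-rank structure jointly across all modes rather than one unfolding at a time.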

no code implementations • 17 Nov 2020 • Yuetian Luo, Wen Huang, Xudong Li, Anru R. Zhang

In this paper, we propose the Recursive Importance Sketching algorithm for Rank-constrained least squares Optimization (RISRO).

1 code implementation • 6 Oct 2020 • Yuchen Zhou, Anru R. Zhang, Lili Zheng, Yazhen Wang

This paper studies a general framework for high-order tensor SVD.
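The classical truncated higher-order SVD (HOSVD) is the standard baseline such a framework builds on: each factor comes from the top left singular vectors of the corresponding unfolding, and the core from a multilinear projection. A minimal sketch (function names are illustrative):

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move `mode` to the front, then flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_product(core, Us):
    # Multiply `core` by each matrix in `Us` along the corresponding mode.
    out = core
    for k, U in enumerate(Us):
        out = np.moveaxis(np.tensordot(U, np.moveaxis(out, k, 0), axes=1), 0, k)
    return out

def hosvd(T, ranks):
    """Truncated HOSVD: top-r left singular vectors of each unfolding,
    plus the core obtained by projecting T onto those subspaces."""
    Us = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
          for k, r in enumerate(ranks)]
    core = tucker_product(T, [U.T for U in Us])
    return core, Us
```

On a tensor of exact multilinear rank, this truncation is exact; for noisy tensors it gives the standard warm start that iterative refinements improve on.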

no code implementations • 12 Sep 2020 • Yuetian Luo, Anru R. Zhang

We note the significance of hypergraphic planted clique (HPC) detection in the investigation of computational hardness for a range of tensor problems.

no code implementations • 6 Aug 2020 • Yuetian Luo, Garvesh Raskutti, Ming Yuan, Anru R. Zhang

A rate-matching deterministic lower bound for tensor reconstruction is also provided, demonstrating the optimality of higher-order orthogonal iteration (HOOI).
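HOOI itself is the standard alternating scheme: repeatedly project the tensor onto the current subspaces of all other modes, then re-extract the leading singular vectors for the mode being updated. A minimal numpy sketch (illustrative, not the paper's code):

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    # Multiply tensor T by matrix M along `mode`.
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hooi(T, ranks, n_iter=20):
    """Higher-order orthogonal iteration for a Tucker approximation."""
    # Initialize factors with truncated HOSVD.
    Us = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
          for k, r in enumerate(ranks)]
    for _ in range(n_iter):
        for k in range(len(ranks)):
            # Project T onto the current subspaces of every other mode...
            Y = T
            for m, U in enumerate(Us):
                if m != k:
                    Y = mode_product(Y, U.T, m)
            # ...then update U_k from the leading left singular vectors.
            Us[k] = np.linalg.svd(unfold(Y, k), full_matrices=False)[0][:, :ranks[k]]
    core = T
    for m, U in enumerate(Us):
        core = mode_product(core, U.T, m)
    return core, Us
```

Each inner update is a best rank-r subspace fit given the other modes, which is why HOOI monotonically improves the Tucker approximation error.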

no code implementations • 21 May 2020 • Yuetian Luo, Anru R. Zhang

We also develop tight computational thresholds: when the signal-to-noise ratio falls below these thresholds, we prove that no polynomial-time algorithm can solve these problems under the computational hardness conjectures for hypergraphic planted clique (HPC) detection and hypergraphic planted dense subgraph (HPDS) recovery.

no code implementations • 26 Feb 2020 • Rungang Han, Rebecca Willett, Anru R. Zhang

Under mild conditions on the loss function, we establish both an upper bound on statistical error and the linear rate of computational convergence through a general deterministic analysis.

no code implementations • 21 Sep 2019 • T. Tony Cai, Anru R. Zhang, Yuchen Zhou

We study sparse group Lasso for high-dimensional double sparse linear regression, where the parameter of interest is simultaneously element-wise and group-wise sparse.
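The sparse group Lasso penalty lam1*||x||_1 + lam2*sum_g ||x_g||_2 has a well-known proximal operator: elementwise soft-thresholding followed by group-wise shrinkage. A minimal sketch with illustrative helper names:

```python
import numpy as np

def prox_sparse_group_lasso(x, groups, lam1, lam2):
    """Proximal operator of lam1*||x||_1 + lam2*sum_g ||x_g||_2.

    `groups` is a list of index arrays partitioning the coordinates.
    """
    # Step 1: elementwise soft-thresholding (the l1 part).
    z = np.sign(x) * np.maximum(np.abs(x) - lam1, 0.0)
    # Step 2: block soft-thresholding (the group l2 part),
    # which zeroes out entire weak groups.
    out = np.zeros_like(z)
    for g in groups:
        norm = np.linalg.norm(z[g])
        if norm > lam2:
            out[g] = (1.0 - lam2 / norm) * z[g]
    return out
```

Running this prox inside a proximal gradient loop yields estimates that are sparse both at the group level and within surviving groups, matching the double-sparsity structure studied in the paper.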

no code implementations • 21 Oct 2018 • Anru R. Zhang, Yuchen Zhou

The non-asymptotic tail bounds of random variables play crucial roles in probability, statistics, and machine learning.
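A canonical example of such a non-asymptotic tail bound is Bernstein's inequality, stated here as standard background rather than a result specific to this paper:

```latex
% Bernstein's inequality: for independent, mean-zero random variables
% X_1, \dots, X_n with |X_i| \le M almost surely and
% \sigma^2 = \sum_{i=1}^{n} \mathbb{E}[X_i^2],
\[
\mathbb{P}\Bigl(\sum_{i=1}^{n} X_i \ge t\Bigr)
\le \exp\!\left(-\frac{t^2/2}{\sigma^2 + M t/3}\right),
\qquad t \ge 0,
\]
% which exhibits sub-Gaussian decay for small t (where \sigma^2 dominates)
% and sub-exponential decay for large t (where M t/3 dominates).
```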

no code implementations • 19 Oct 2018 • Anru R. Zhang, T. Tony Cai, Yihong Wu

A general framework for principal component analysis (PCA) in the presence of heteroskedastic noise is introduced.
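The difficulty with heteroskedastic noise is that it biases the diagonal of the sample Gram matrix. One remedy is to keep the off-diagonal fixed and iteratively re-impute the diagonal from a low-rank approximation. The sketch below follows that HeteroPCA-style iteration in spirit; it is an illustrative implementation, not the authors' code:

```python
import numpy as np

def hetero_pca(Sigma_hat, r, n_iter=100):
    """Estimate the rank-r principal subspace of a Gram matrix whose
    diagonal is corrupted by heteroskedastic noise."""
    def top_r(N):
        # Rank-r eigendecomposition by largest absolute eigenvalue.
        w, V = np.linalg.eigh(N)
        idx = np.argsort(np.abs(w))[::-1][:r]
        return w[idx], V[:, idx]

    N = Sigma_hat.copy()
    np.fill_diagonal(N, 0.0)  # discard the noise-biased diagonal
    for _ in range(n_iter):
        w, V = top_r(N)
        low_rank = (V * w) @ V.T
        # Impute the diagonal from the rank-r fit; off-diagonals stay fixed.
        np.fill_diagonal(N, np.diag(low_rank))
    _, V = top_r(N)
    return V  # estimated principal subspace
```

Because only the diagonal is updated, the clean off-diagonal information anchors the iteration, and the imputed diagonal converges toward that of the underlying low-rank signal.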
