no code implementations • 30 Jan 2024 • Juefei Chen, Longxiu Huang, Yimin Wei
This approach extends coseparable nonnegative matrix factorization (NMF) to the tensor setting, yielding what we term coseparable nonnegative tensor factorization (NTF).
no code implementations • 28 Jan 2024 • HanQin Cai, Longxiu Huang, Chandra Kundu, Bowen Su
Matrix completion is one of the crucial tools in modern data science research.
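To make the matrix completion problem concrete, here is a minimal hard-impute-style sketch (not the method of the paper above): alternate between filling in missing entries with the current estimate and projecting onto rank-r matrices. All sizes, the rank, and the observation rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rank-2 ground truth and a random observation mask (assumed setup).
m, n, r = 30, 30, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.6  # observe ~60% of entries

# Hard-impute: fill missing entries with the current estimate,
# then project onto the set of rank-r matrices via truncated SVD.
X = np.where(mask, A, 0.0)
for _ in range(300):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_lowrank = (U[:, :r] * s[:r]) @ Vt[:r]
    X = np.where(mask, A, X_lowrank)  # keep observed entries fixed

err = np.linalg.norm(X_lowrank - A) / np.linalg.norm(A)
```

With enough observed entries relative to the rank, this simple iteration typically recovers the matrix to small relative error.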
1 code implementation • 6 May 2023 • HanQin Cai, Zehan Chao, Longxiu Huang, Deanna Needell
We study the tensor robust principal component analysis (TRPCA) problem, a tensorial extension of matrix robust principal component analysis (RPCA), which aims to split the given tensor into an underlying low-rank component and a sparse outlier component.
no code implementations • 17 Mar 2023 • Zheng Tan, Longxiu Huang, HanQin Cai, Yifei Lou
Tensor completion is an important problem in modern data analysis.
no code implementations • 24 Feb 2023 • Longxiu Huang, Xia Li, Deanna Needell
Additionally, simulations demonstrate the efficiency of the proposed methods for solving convex problems in the presence of adversaries.
1 code implementation • 20 Aug 2022 • HanQin Cai, Longxiu Huang, Pengyu Li, Deanna Needell
While uniform sampling has been widely studied in the matrix completion literature, CUR sampling instead approximates a low-rank matrix via row and column samples.
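The basic CUR approximation behind such sampling schemes can be sketched in a few lines: sample columns C and rows R of the matrix, and reconstruct via the pseudoinverse of their intersection U. The sizes and the uniform random index choice here are assumptions for illustration, not the sampling scheme of the paper above.

```python
import numpy as np

rng = np.random.default_rng(2)

# Exactly rank-3 matrix to approximate.
A = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))

# Sample row and column indices, oversampling the rank for robustness.
rows = rng.choice(50, size=10, replace=False)
cols = rng.choice(40, size=10, replace=False)

# CUR approximation: A ≈ C @ pinv(U) @ R, where U is the intersection block.
C = A[:, cols]
R = A[rows, :]
U = A[np.ix_(rows, cols)]
A_cur = C @ np.linalg.pinv(U) @ R

err = np.linalg.norm(A_cur - A) / np.linalg.norm(A)
```

For an exactly rank-r matrix, generic samples of at least r rows and r columns span its row and column spaces, so the reconstruction is exact up to floating-point error.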
no code implementations • 28 Feb 2022 • Xia Li, Longxiu Huang, Deanna Needell
Developing large-scale distributed methods that are robust to adversarial or corrupted workers is an important part of making such methods practical for real-world problems.
no code implementations • 31 Jan 2022 • Pengyu Li, Christine Tseng, Yaxuan Zheng, Joyce A. Chew, Longxiu Huang, Benjamin Jarman, Deanna Needell
Classification and topic modeling are popular techniques in machine learning that extract information from large-scale datasets.
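A common workhorse for topic modeling in this line of work is nonnegative matrix factorization; here is a minimal sketch using the classic Lee-Seung multiplicative updates on a toy document-term matrix. The matrix sizes, the number of topics, and the iteration count are illustrative assumptions, not the setup of the paper above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy document-term matrix: 8 documents over a 6-word vocabulary,
# generated from 2 underlying "topics" (exact nonnegative rank-2 structure).
W_true = rng.random((8, 2))
H_true = rng.random((2, 6))
V = W_true @ H_true

# Lee-Seung multiplicative updates for V ≈ W @ H with W, H >= 0.
k, eps = 2, 1e-9
W = rng.random((8, k))
H = rng.random((k, 6))
err0 = np.linalg.norm(V - W @ H)
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(V - W @ H)
```

After fitting, each row of H can be read as a topic's word distribution and each row of W as a document's topic weights.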
no code implementations • 23 Aug 2021 • HanQin Cai, Zehan Chao, Longxiu Huang, Deanna Needell
We study the problem of tensor robust principal component analysis (TRPCA), which aims to separate an underlying low-multilinear-rank tensor and a sparse outlier tensor from their sum.
1 code implementation • 19 Mar 2021 • HanQin Cai, Keaton Hamm, Longxiu Huang, Deanna Needell
Low rank tensor approximation is a fundamental tool in modern machine learning and data science.
no code implementations • 5 Jan 2021 • HanQin Cai, Keaton Hamm, Longxiu Huang, Deanna Needell
Additionally, we consider hybrid randomized and deterministic sampling methods that produce a compact CUR decomposition of a given matrix, and apply them to video sequences to produce canonical frames.
1 code implementation • 14 Oct 2020 • HanQin Cai, Keaton Hamm, Longxiu Huang, Jiaqi Li, Tao Wang
Robust principal component analysis (RPCA) is a widely used tool for dimension reduction.
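The RPCA model can be illustrated with a standard principal component pursuit solver (an augmented-Lagrangian scheme in the style of Candès et al., not the accelerated method of the paper above); the problem sizes and outlier level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic low-rank plus sparse data.
m, n, r = 50, 50, 3
L0 = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
S0 = np.where(rng.random((m, n)) < 0.05, 10 * rng.standard_normal((m, n)), 0.0)
X = L0 + S0

def shrink(M, tau):
    """Entrywise soft-thresholding."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svt(M, tau):
    """Singular value thresholding (soft-threshold the singular values)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# Principal component pursuit: min ||L||_* + lam * ||S||_1  s.t.  L + S = X.
lam = 1.0 / np.sqrt(max(m, n))
mu = m * n / (4.0 * np.abs(X).sum())
S = np.zeros_like(X)
Y = np.zeros_like(X)
for _ in range(500):
    L = svt(X - S + Y / mu, 1.0 / mu)
    S = shrink(X - L + Y / mu, lam / mu)
    Y = Y + mu * (X - L - S)
    if np.linalg.norm(X - L - S) <= 1e-7 * np.linalg.norm(X):
        break

err_L = np.linalg.norm(L - L0) / np.linalg.norm(L0)
```

On synthetic data with incoherent low-rank structure and sufficiently sparse outliers, this iteration recovers both components to small relative error.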
no code implementations • EMNLP (NLP-COVID19) 2020 • Rachel Grotheer, Yihuan Huang, Pengyu Li, Elizaveta Rebrova, Deanna Needell, Longxiu Huang, Alona Kryshchenko, Xia Li, Kyung Ha, Oleksandr Kryshchenko
A dataset of COVID-19-related scientific literature is compiled by combining articles from several online libraries and selecting those with open access and full text available.
no code implementations • 22 Mar 2019 • Keaton Hamm, Longxiu Huang
This article discusses a useful tool in dimensionality reduction and low-rank matrix approximation called the CUR decomposition.