no code implementations • 19 Jul 2022 • Danit Shifman Abukasis, Izack Cohen, Xiaochen Xian, Kejun Huang, Gonen Singer
Resource-constrained classification tasks are common in real-world applications such as allocating tests for disease diagnosis, hiring decisions when filling a limited number of positions, and defect detection in manufacturing settings under a limited inspection budget.
no code implementations • 31 Jan 2022 • Cheng Qian, Kejun Huang, Lucas Glass, Rakshith S. Srinivasa, Jimeng Sun
Tensor completion aims at imputing missing entries from a partially observed tensor.
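A generic baseline for this task (not the method of the paper above) fits a CP model to the observed entries only, via alternating least squares with a 0/1 observation mask; the fitted model then supplies values for the missing entries. The function name and signature here are ours:

```python
import numpy as np

def cp_complete(T, W, rank=1, iters=50, seed=0):
    """Impute missing entries of a 3-way tensor by fitting a CP model
    to the observed entries (W is a 0/1 mask) with alternating least
    squares.  Generic baseline sketch, not the paper's algorithm."""
    rng = np.random.default_rng(seed)
    A = [rng.random((n, rank)) for n in T.shape]

    def solve_mode(mode):
        # Bring the updated mode to the front and flatten the rest.
        Tm = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        Wm = np.moveaxis(W, mode, 0).reshape(T.shape[mode], -1)
        others = [A[m] for m in range(3) if m != mode]
        # Khatri-Rao product of the two fixed factors.
        KR = np.einsum('jr,kr->jkr', others[0], others[1]).reshape(-1, rank)
        for i in range(T.shape[mode]):
            obs = Wm[i].astype(bool)
            # Least squares restricted to this slice's observed entries.
            A[mode][i], *_ = np.linalg.lstsq(KR[obs], Tm[i, obs], rcond=None)

    for _ in range(iters):
        for mode in range(3):
            solve_mode(mode)
    return np.einsum('ir,jr,kr->ijk', *A)
```

On noiseless low-rank data with enough observed entries per slice, this recovers the missing entries essentially exactly; with noise or structure constraints, more careful methods (such as those studied in the paper) are needed.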
no code implementations • 29 Sep 2021 • Aysegul Bumin, Kejun Huang
In this paper, we study the stochastic proximal point algorithm (SPPA) for general empirical risk minimization (ERM) problems as well as deep learning problems.
no code implementations • 1 Jan 2021 • Aysegul Bumin, Kejun Huang
SPPA has been shown to converge faster and more stably than the celebrated stochastic gradient descent (SGD) algorithm and its many variants for convex problems.
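The distinguishing step of SPPA is that each iteration solves a proximal subproblem on the sampled loss term instead of taking a gradient step. For a single least-squares term $f_i(w)=\tfrac12(x_i^\top w - y_i)^2$ the subproblem has a closed-form solution, which makes for a minimal illustrative sketch (function name and setup are ours, not from the paper):

```python
import numpy as np

def sppa_least_squares(X, y, eta=1.0, epochs=50, seed=0):
    """SPPA for sum_i 0.5*(x_i^T w - y_i)^2.  Each step solves
    w+ = argmin_w' 0.5*(x^T w' - y)^2 + ||w' - w||^2 / (2*eta)
    exactly; for one least-squares term this is available in closed
    form.  Illustrative sketch; the paper treats general ERM losses."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            x, t = X[i], y[i]
            nx2 = x @ x
            # Closed-form value of x^T w_new at the prox solution.
            a = (x @ w + eta * t * nx2) / (1.0 + eta * nx2)
            w = w - eta * (a - t) * x
    return w
```

Note that the effective step is automatically damped by the factor $1/(1+\eta\|x\|^2)$, which is one intuition for why SPPA tolerates large step sizes better than SGD.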
no code implementations • NeurIPS 2020 • Songtao Lu, Meisam Razaviyayn, Bo Yang, Kejun Huang, Mingyi Hong
To the best of our knowledge, this is the first time that first-order algorithms with polynomial per-iteration complexity and global sublinear rate are designed to find SOSPs of the important class of non-convex problems with linear constraints (almost surely).
no code implementations • 15 Jun 2020 • Xiao Fu, Nico Vervliet, Lieven De Lathauwer, Kejun Huang, Nicolas Gillis
This article offers a comprehensive tutorial on the computational aspects of structured matrix and tensor factorization.
no code implementations • NeurIPS 2019 • Shahana Ibrahim, Xiao Fu, Nikos Kargas, Kejun Huang
The data deluge comes with high demands for data labeling.
no code implementations • 9 Jul 2019 • Songtao Lu, Meisam Razaviyayn, Bo Yang, Kejun Huang, Mingyi Hong
This paper proposes low-complexity algorithms for finding approximate second-order stationary points (SOSPs) of problems with smooth non-convex objective and linear constraints.
no code implementations • 16 Jan 2019 • Xiao Fu, Shahana Ibrahim, Hoi-To Wai, Cheng Gao, Kejun Huang
In this work, we propose a stochastic optimization framework for large-scale CPD with constraints/regularizations.
no code implementations • 6 Jan 2019 • Bo Yang, Xiao Fu, Nicholas D. Sidiropoulos, Kejun Huang
Linear mixture models have proven very useful in a plethora of applications, e.g., topic modeling, clustering, and source separation.
no code implementations • 3 Mar 2018 • Xiao Fu, Kejun Huang, Nicholas D. Sidiropoulos, Wing-Kin Ma
Perhaps a bit surprisingly, the understanding of its model identifiability---the major reason behind the interpretability in many applications such as topic mining and hyperspectral imaging---had been rather limited until recent years.
no code implementations • ICML 2018 • Kejun Huang, Xiao Fu, Nicholas D. Sidiropoulos
We present a new algorithm for identifying the transition and emission probabilities of a hidden Markov model (HMM) from the emitted data.
no code implementations • 21 Nov 2017 • Kejun Huang, Nicholas D. Sidiropoulos
We study the problem of nonnegative rank-one approximation of a nonnegative tensor, and show that the globally optimal solution that minimizes the generalized Kullback-Leibler divergence can be efficiently obtained, i.e., it is not NP-hard.
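The global optimum under the generalized KL divergence has a simple closed form: the outer product of the tensor's mode marginals, scaled by the total sum. A direct sketch of that closed form (function name is ours):

```python
import numpy as np

def kl_rank1(X):
    """Globally optimal nonnegative rank-one approximation of a
    nonnegative 3-way tensor under the generalized KL divergence:
    Y[i,j,k] = r[i]*c[j]*t[k] / S**2, where r, c, t are the mode
    marginals and S is the total sum of X."""
    S = X.sum()
    r = X.sum(axis=(1, 2))   # mode-1 marginal
    c = X.sum(axis=(0, 2))   # mode-2 marginal
    t = X.sum(axis=(0, 1))   # mode-3 marginal
    return np.einsum('i,j,k->ijk', r, c, t) / S**2
```

When $X$ is a (scaled) joint probability tensor, this is exactly the product-of-marginals "independence" approximation; if $X$ is itself rank one, it is recovered exactly.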
no code implementations • 20 Nov 2017 • Kejun Huang, Xiao Fu, Nicholas D. Sidiropoulos
However, since the procedure involves non-smooth kernel density functions, the convergence behavior of Epanechnikov mean shift lacks theoretical support as of this writing---most of the existing analyses are based on smooth functions and thus cannot be applied to Epanechnikov mean shift.
no code implementations • 2 Sep 2017 • Xiao Fu, Kejun Huang, Nicholas D. Sidiropoulos
In this letter, we propose a new identification criterion that guarantees the recovery of the low-rank latent factors in the nonnegative matrix factorization (NMF) model, under mild conditions.
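For context, the NMF model itself factors a nonnegative matrix $V \approx WH$ with $W, H \ge 0$. A standard baseline fit (Lee-Seung multiplicative updates for the Frobenius loss) looks as follows; this is generic background, not the identification criterion proposed in the letter, and the function name is ours:

```python
import numpy as np

def nmf_mu(V, rank, iters=500, seed=0, eps=1e-9):
    """Plain NMF via Lee-Seung multiplicative updates under the
    Frobenius loss.  The updates preserve nonnegativity because every
    factor is multiplied by a ratio of nonnegative quantities."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 0.1
    H = rng.random((rank, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Note that such a fit is generally not unique; identifiability criteria of the kind discussed in the letter specify conditions under which the latent $W$ and $H$ are recoverable up to permutation and scaling.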
no code implementations • NeurIPS 2016 • Kejun Huang, Xiao Fu, Nicholas D. Sidiropoulos
In topic modeling, many algorithms that guarantee identifiability of the topics have been developed under the premise that there exist anchor words -- i.e., words that only appear (with positive probability) in one topic.
no code implementations • 15 Aug 2016 • Xiao Fu, Kejun Huang, Bo Yang, Wing-Kin Ma, Nicholas D. Sidiropoulos
This paper considers \emph{volume minimization} (VolMin)-based structured matrix factorization (SMF).
no code implementations • 6 Jul 2016 • Nicholas D. Sidiropoulos, Lieven De Lathauwer, Xiao Fu, Kejun Huang, Evangelos E. Papalexakis, Christos Faloutsos
Tensors or {\em multi-way arrays} are functions of three or more indices $(i, j, k,\cdots)$ -- similar to matrices (two-way arrays), which are functions of two indices $(r, c)$ for (row, column).
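The rank-$R$ canonical polyadic (CP) model referenced throughout this line of work expresses such a three-way array as a sum of $R$ rank-one terms, $T_{ijk}=\sum_r A_{ir}B_{jr}C_{kr}$ -- the multi-way analogue of a rank-$R$ matrix factorization. A minimal numeric illustration of the definition (generic, not code from the paper):

```python
import numpy as np

# Build a rank-2 CP tensor: T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r].
rng = np.random.default_rng(0)
R = 2
A, B, C = (rng.standard_normal((n, R)) for n in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
assert T.shape == (4, 5, 6)

# Each frontal slice T[:, :, k] is the matrix A @ diag(C[k]) @ B.T,
# which is how CP connects three-way arrays back to matrix algebra.
k = 3
assert np.allclose(T[:, :, k], A @ np.diag(C[k]) @ B.T)
```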
no code implementations • 31 May 2016 • Xiao Fu, Kejun Huang, Mingyi Hong, Nicholas D. Sidiropoulos, Anthony Man-Cho So
Generalized canonical correlation analysis (GCCA) aims at finding latent low-dimensional common structure from multiple views (feature vectors in different domains) of the same entities.
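In its classic MAX-VAR form, the common representation is the top-$k$ eigenvector block of the sum of projection matrices onto each view's column space. The dense sketch below shows that textbook formulation for illustration; the paper above is precisely about scalable algorithms that avoid this kind of $n \times n$ eigen-computation, and the function name is ours:

```python
import numpy as np

def maxvar_gcca(views, k):
    """MAX-VAR GCCA, dense textbook form: G consists of the top-k
    eigenvectors of sum_i P_i, where P_i projects onto the column
    space of view i.  Not scalable -- illustration only."""
    n = views[0].shape[0]
    P = np.zeros((n, n))
    for X in views:
        P += X @ np.linalg.pinv(X)   # projector onto col(X)
    vals, vecs = np.linalg.eigh(P)   # eigenvalues in ascending order
    return vecs[:, -k:]              # top-k eigenvectors
```

When all views share an exact common subspace, the returned columns span it; in the noisy case they give the subspace most correlated across views.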
no code implementations • 16 Jul 2015 • Xiao Fu, Kejun Huang, Wing-Kin Ma, Nicholas D. Sidiropoulos, Rasmus Bro
Convergence of the proposed algorithm is also easy to analyze under the framework of alternating optimization and its variants.
no code implementations • 13 Jun 2015 • Kejun Huang, Nicholas D. Sidiropoulos, Athanasios P. Liavas
We propose a general algorithmic framework for constrained matrix and tensor factorization, which is widely used in signal processing and machine learning.