Speedup Matrix Completion with Side Information: Application to Multi-Label Learning

NeurIPS 2013  ·  Miao Xu, Rong Jin, Zhi-Hua Zhou

In standard matrix completion theory, at least $O(n\ln^2 n)$ observed entries are required to perfectly recover a low-rank matrix $M$ of size $n\times n$, leading to a large number of observations when $n$ is large. In many real tasks, side information is often available in addition to the observed entries. In this work, we develop a novel theory of matrix completion that explicitly exploits the side information to reduce the number of observed entries required. We show that, under appropriate conditions, with the assistance of side information matrices, the number of observed entries needed for a perfect recovery of matrix $M$ can be dramatically reduced to $O(\ln n)$. We demonstrate the effectiveness of the proposed approach for matrix completion in transductive incomplete multi-label learning.
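The key idea behind using side information is to parametrize the target matrix as $M = AZB^{\top}$, where $A$ and $B$ are the given side information matrices, so that only the much smaller core matrix $Z$ has to be recovered from the observed entries. The sketch below is a minimal, illustrative Python/NumPy version of this idea under assumed choices: it solves a nuclear-norm-regularized least-squares problem over $Z$ by plain proximal gradient descent. The function names (`maxide_sketch`, `svd_soft_threshold`) and parameters (`lam`, `step`, `n_iter`) are hypothetical, not the authors' reference implementation, which uses a more sophisticated accelerated solver.

```python
import numpy as np

def svd_soft_threshold(Z, tau):
    """Singular-value soft-thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def maxide_sketch(M_obs, mask, A, B, lam=1e-2, step=1e-3, n_iter=500):
    """Illustrative sketch (not the paper's solver): recover M ~ A @ Z @ B.T
    from the observed entries M_obs[mask] plus side information A (n x r_a)
    and B (m x r_b), by minimizing

        0.5 * ||P_Omega(A Z B^T - M)||_F^2 + lam * ||Z||_*

    with proximal gradient descent. Step size and lam are assumed values.
    """
    r_a, r_b = A.shape[1], B.shape[1]
    Z = np.zeros((r_a, r_b))
    for _ in range(n_iter):
        # residual on the observed entries only
        residual = mask * (A @ Z @ B.T - M_obs)
        # gradient of the smooth data-fitting term with respect to Z
        grad = A.T @ residual @ B
        # gradient step followed by singular-value soft-thresholding
        Z = svd_soft_threshold(Z - step * grad, step * lam)
    return A @ Z @ B.T
```

Because $Z$ is only $r_a \times r_b$ with $r_a, r_b \ll n$, each SVD in the proximal step is cheap, which is also where the reduction in the required number of observed entries comes from: far fewer unknowns have to be pinned down than in full $n \times n$ matrix completion.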
