Spectral Embedding Norm: Looking Deep into the Spectrum of the Graph Laplacian

25 Oct 2018  ·  Xiuyuan Cheng, Gal Mishne

The extraction of clusters from a dataset that includes multiple clusters and a significant background component is a non-trivial task of practical importance. In image analysis, this manifests, for example, in anomaly detection and target detection. The traditional spectral clustering algorithm, which relies on the leading $K$ eigenvectors to detect $K$ clusters, fails in such cases. In this paper we propose the {\it spectral embedding norm}, which sums the squared values of the first $I$ normalized eigenvectors, where $I$ can be significantly larger than $K$. We prove that this quantity can be used to separate clusters from the background in unbalanced settings, including extreme cases such as outlier detection. The performance of the algorithm is not sensitive to the choice of $I$, and we demonstrate its application on synthetic and real-world remote sensing and neuroimaging datasets.
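The sketch below illustrates the quantity described in the abstract: for each point, sum the squared entries of the first $I$ eigenvectors of a normalized graph Laplacian, and use a large value of this norm to flag cluster points against a diffuse background. The Gaussian-kernel affinity, the choice of `sigma`, and the median threshold are illustrative assumptions for this example, not the authors' exact construction.

```python
# Minimal sketch of a spectral embedding norm, assuming a Gaussian-kernel graph
# and the symmetric normalized Laplacian; only the per-point sum of squared
# eigenvector entries follows directly from the abstract's description.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh


def spectral_embedding_norm(X, sigma=1.0, I=50):
    """Per-point score: sum of squared entries of the first I eigenvectors."""
    # Gaussian-kernel affinity matrix (assumed graph construction).
    D2 = cdist(X, X, metric="sqeuclidean")
    W = np.exp(-D2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Symmetric normalized Laplacian L_sym = I - D^{-1/2} W D^{-1/2}.
    deg = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    L_sym = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

    # Eigenvectors at the low end of the spectrum (I smallest eigenvalues).
    _, V = eigh(L_sym, subset_by_index=[0, I - 1])

    # Spectral embedding norm: sum of squared eigenvector entries per point.
    return (V ** 2).sum(axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two small clusters plus a diffuse background component.
    clusters = np.vstack([rng.normal(c, 0.1, size=(40, 2)) for c in ([0, 0], [3, 3])])
    background = rng.uniform(-5.0, 8.0, size=(200, 2))
    X = np.vstack([clusters, background])

    score = spectral_embedding_norm(X, sigma=0.5, I=30)
    # Points with a large norm are flagged as cluster members (threshold is a heuristic).
    is_cluster = score > np.median(score)
    print("flagged as cluster:", is_cluster.sum(), "of", len(X))
```

Note that $I$ here only needs to be large enough to cover the cluster-related part of the spectrum; per the abstract, the score is not sensitive to its exact value.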
