Binarized Mode Seeking for Scalable Visual Pattern Discovery

CVPR 2017  ·  Wei Zhang, Xiaochun Cao, Rui Wang, Yuanfang Guo, Zhineng Chen

This paper studies visual pattern discovery in large-scale image collections via binarized mode seeking, where images are represented only as binary codes for efficient storage and computation. We address this problem from the perspective of mode seeking in binary space. First, we propose binary mean shift (bMS), which discovers frequent patterns by mode seeking directly in binary space; a binomial-based kernel and a binary constraint are introduced for binarized analysis. Second, we extend bMS to a more general form, contrastive binary mean shift (cbMS), which maximizes contrastive density in binary space to find informative patterns that are both frequent and discriminative for the dataset. With the binarized algorithms and optimization, our methods achieve substantial computation (50x) and storage (32x) improvements over standard techniques operating in Euclidean space, while incurring only a modest loss in performance. Furthermore, cbMS discovers more informative patterns by suppressing modes with low discriminability. We evaluate our methods on the annotated ILSVRC dataset (1M images) and an unannotated (blind) Flickr dataset (10M images), demonstrating both the scalability and the effectiveness of our algorithms for discovering frequent and informative patterns in large-scale collections.
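
To make the idea of mode seeking directly over binary codes concrete, below is a minimal illustrative sketch in Python. It is not the authors' implementation: the paper's binomial-based kernel and binary constraint are approximated here by an exponential kernel over Hamming distance and a majority-vote re-binarization step, and the bandwidth, iteration count, and function names are assumptions made for the example.

```python
# Sketch of mean shift over binary codes (illustrative only, not the paper's bMS).
import numpy as np

def binary_mean_shift(codes, bandwidth=8.0, n_iter=20):
    """codes: (N, D) array of {0, 1} binary codes; returns one mode per point."""
    codes = codes.astype(np.uint8)
    modes = codes.astype(np.float64).copy()
    for _ in range(n_iter):
        new_modes = np.empty_like(modes)
        for i, m in enumerate(modes):
            # Hamming distance from the current (binarized) mode to every code.
            m_bin = (m >= 0.5).astype(np.uint8)
            dist = np.count_nonzero(codes != m_bin, axis=1)
            # Kernel weights decay with Hamming distance (assumed exponential form,
            # standing in for the paper's binomial-based kernel).
            w = np.exp(-dist / bandwidth)
            # Weighted mean of the codes; re-binarizing afterwards (majority vote
            # under the kernel weights) keeps the mode in binary space.
            new_modes[i] = (w @ codes) / w.sum()
        modes = new_modes
    return (modes >= 0.5).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two noisy clusters of 64-bit codes around two random "pattern" centers.
    centers = rng.integers(0, 2, size=(2, 64))
    codes = np.vstack([
        np.abs(centers[i] - (rng.random((100, 64)) < 0.1)) for i in (0, 1)
    ]).astype(np.uint8)
    modes = binary_mean_shift(codes)
    print("distinct modes found:", len(np.unique(modes, axis=0)))
```

Points that converge to the same binary mode are treated as instances of the same frequent pattern; cbMS would additionally weigh each mode against a background density so that frequent-but-uninformative modes are suppressed.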
