Sketch-based Manga Retrieval using Manga109 Dataset

15 Oct 2015 · Yusuke Matsui, Kota Ito, Yuji Aramaki, Toshihiko Yamasaki, Kiyoharu Aizawa

Manga (Japanese comics) are popular worldwide. However, current e-manga archives offer only limited search support, such as keyword-based search by title or author and tag-based categorization. To make the manga search experience more intuitive, efficient, and enjoyable, we propose a content-based manga retrieval system. First, we propose a manga-specific image-describing framework consisting of efficient margin labeling, edge orientation histogram (EOH) feature description, and approximate nearest-neighbor search using product quantization. Second, we propose a sketch-based interface as a natural way to interact with manga content; the interface provides sketch-based querying, relevance feedback, and query retouch. For evaluation, we built a novel dataset of manga images, Manga109, which consists of 109 comic books totaling 21,142 pages drawn by professional manga artists. To the best of our knowledge, Manga109 is currently the largest dataset of manga images available for research. We conducted a comparative study, a localization evaluation, and a large-scale qualitative study. The experiments verified that (1) the retrieval accuracy of the proposed method is higher than that of previous methods, (2) the proposed method can localize an object instance with reasonable runtime and accuracy, and (3) sketch querying is useful for manga search.
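
The framework above pairs an edge orientation histogram (EOH) descriptor with product-quantization-based approximate nearest-neighbor search. The Python sketch below is only a rough illustration of that pairing, not the authors' implementation: it computes a simple grid-of-orientation-histograms descriptor with OpenCV and NumPy, and answers queries against a product-quantized descriptor database via asymmetric distance computation. All parameters (bin count, grid size, number of sub-vectors and centroids) are placeholder choices, and margin labeling, window extraction, and localization are omitted.

    import numpy as np
    import cv2  # OpenCV, used here for gradients and k-means


    def eoh_descriptor(gray, n_bins=8, grid=(4, 4)):
        """EOH-style descriptor: a grid of edge-orientation histograms.

        `gray` is a 2-D grayscale image; `n_bins` and `grid` are
        illustrative choices, not the paper's settings.
        """
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        mag = np.sqrt(gx ** 2 + gy ** 2)
        ang = np.mod(np.arctan2(gy, gx), np.pi)  # orientation in [0, pi)

        h, w = gray.shape
        cells = []
        for i in range(grid[0]):
            for j in range(grid[1]):
                ys = slice(i * h // grid[0], (i + 1) * h // grid[0])
                xs = slice(j * w // grid[1], (j + 1) * w // grid[1])
                hist, _ = np.histogram(ang[ys, xs], bins=n_bins,
                                       range=(0, np.pi), weights=mag[ys, xs])
                cells.append(hist)
        desc = np.concatenate(cells).astype(np.float32)
        norm = np.linalg.norm(desc)
        return desc / norm if norm > 0 else desc


    class ProductQuantizer:
        """Minimal product quantizer for approximate nearest-neighbor search."""

        def __init__(self, n_subvectors=8, n_centroids=256):
            self.m = n_subvectors    # descriptor is split into m sub-vectors
            self.k = n_centroids     # centroids per sub-space (256 fits uint8)
            self.codebooks = []

        def fit(self, X):
            """Learn one k-means codebook per sub-space (needs >= k samples)."""
            self.ds = X.shape[1] // self.m   # assumes dim divisible by m
            criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER,
                        20, 1e-3)
            for i in range(self.m):
                sub = np.ascontiguousarray(
                    X[:, i * self.ds:(i + 1) * self.ds], dtype=np.float32)
                _, _, centers = cv2.kmeans(sub, self.k, None, criteria, 3,
                                           cv2.KMEANS_PP_CENTERS)
                self.codebooks.append(centers)
            return self

        def encode(self, X):
            """Quantize each descriptor to m centroid indices (one byte each)."""
            codes = np.empty((len(X), self.m), dtype=np.uint8)
            for i, centers in enumerate(self.codebooks):
                sub = X[:, i * self.ds:(i + 1) * self.ds]
                dists = ((sub[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
                codes[:, i] = dists.argmin(axis=1)
            return codes

        def search(self, query, codes, topk=10):
            """Asymmetric distances: raw query vs. quantized database codes."""
            lut = np.empty((self.m, self.k), dtype=np.float32)
            for i, centers in enumerate(self.codebooks):
                q_sub = query[i * self.ds:(i + 1) * self.ds]
                lut[i] = ((centers - q_sub) ** 2).sum(axis=1)
            approx = lut[np.arange(self.m), codes].sum(axis=1)
            return np.argsort(approx)[:topk]


    # Usage sketch (database_windows and query_sketch are hypothetical inputs):
    #   db = np.stack([eoh_descriptor(w) for w in database_windows])
    #   pq = ProductQuantizer().fit(db)
    #   codes = pq.encode(db)
    #   hits = pq.search(eoh_descriptor(query_sketch), codes, topk=10)

In the paper's system, descriptors are computed over regions of each page after margin-area labeling, and PQ-based search is what keeps query time practical at the scale of Manga109; the sketch only shows how such a descriptor and quantizer could fit together.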


Datasets

Introduced in the Paper: Manga109
