Sketch-Based Image Retrieval
36 papers with code • 3 benchmarks • 4 datasets
Latest papers
It's All About Your Sketch: Democratising Sketch Control in Diffusion Models
This paper unravels the potential of sketches for diffusion models, addressing the deceptive promise of direct sketch control in generative AI.
Symmetrical Bidirectional Knowledge Alignment for Zero-Shot Sketch-Based Image Retrieval
In this paper, we propose a novel Symmetrical Bidirectional Knowledge Alignment (SBKA) for zero-shot sketch-based image retrieval.
LogoNet: a fine-grained network for instance-level logo sketch retrieval
To our knowledge, this is the first publicly available instance-level logo sketch dataset.
Zero-Shot Everything Sketch-Based Image Retrieval, and in Explainable Style
This paper studies the problem of zero-shot sketch-based image retrieval (ZS-SBIR), with two significant differentiators from prior art: (i) we tackle all variants (inter-category, intra-category, and cross-dataset) of ZS-SBIR with just one network (``everything''), and (ii) we would really like to understand how this sketch-photo matching operates (``explainable'').
Data-Free Sketch-Based Image Retrieval
For the first time, we identify that data-free learning (DFL) can prove to be a much more practical paradigm for data-scarce tasks like Sketch-Based Image Retrieval (SBIR), where the difficulty of acquiring paired photos and hand-drawn sketches limits data-dependent cross-modal learning algorithms.
Photo Pre-Training, but for Sketch
This lack of sketch data has imposed a few "peculiar" design choices on the community -- the most representative of them all is perhaps the coerced use of photo-based pre-training (i.e., no sketch) for many core tasks that otherwise dictate specific sketch understanding.
Cross-Modal Fusion Distillation for Fine-Grained Sketch-Based Image Retrieval
Representation learning for sketch-based image retrieval has mostly been tackled by learning embeddings that discard modality-specific information.
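The shared-embedding approach described above can be sketched as follows: once separately trained sketch and photo encoders map both modalities into a common space, retrieval reduces to nearest-neighbour search by cosine similarity. This is a minimal illustrative sketch, not the paper's method; the embeddings and function names are assumptions.

```python
import numpy as np

def retrieve(sketch_emb, photo_embs, k=2):
    """Rank gallery photos by cosine similarity to a sketch embedding.

    Assumes both modalities were already mapped into a shared embedding
    space by pre-trained encoders (hypothetical here).
    """
    s = sketch_emb / np.linalg.norm(sketch_emb)
    p = photo_embs / np.linalg.norm(photo_embs, axis=1, keepdims=True)
    sims = p @ s                      # cosine similarity per photo
    return np.argsort(-sims)[:k]      # indices of the top-k matches

# Toy shared-space embeddings, made up for illustration.
photos = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
sketch = np.array([0.9, 0.1])
print(retrieve(sketch, photos))  # → [0 2]: photo 0 is the closest match
```

In a real fine-grained SBIR system the gallery embeddings would be pre-computed and indexed; only the query sketch is encoded at retrieval time.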
Test-time Training for Data-efficient UCDR
Image retrieval under generalized test scenarios has gained significant momentum in the literature, and the recently proposed protocol of Universal Cross-domain Retrieval is a pioneer in this direction.
Abstracting Sketches through Simple Primitives
Toward equipping machines with such capabilities, we propose the Primitive-based Sketch Abstraction task where the goal is to represent sketches using a fixed set of drawing primitives under the influence of a budget.
Adaptive Fine-Grained Sketch-Based Image Retrieval
To solve this new problem, we introduce a novel model-agnostic meta-learning (MAML) based framework with several key modifications: (1) as a retrieval task trained with a margin-based contrastive loss, we simplify MAML training in the inner loop to make it more stable and tractable.
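A margin-based contrastive loss of the kind mentioned above can be sketched as a triplet objective: pull the matching photo embedding toward the sketch anchor and push a non-matching one away until a margin is satisfied. This is a generic illustrative sketch under assumed toy embeddings, not the paper's exact formulation.

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=0.2):
    """Margin-based contrastive (triplet) loss on embedding vectors.

    Loss is zero once the negative is farther from the anchor than the
    positive by at least `margin`; all inputs are assumed to live in a
    shared sketch-photo embedding space.
    """
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# Toy case: the positive photo is already much closer than the negative.
a = np.array([0.0, 0.0])   # sketch anchor
p = np.array([0.1, 0.0])   # matching photo
n = np.array([1.0, 0.0])   # non-matching photo
print(triplet_margin_loss(a, p, n))  # → 0.0, margin already satisfied
```

In the meta-learning setting described by the paper, inner-loop updates would minimize this kind of loss on a few user-specific triplets to adapt the retrieval model.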