Attention Mechanisms

Self-supervised Equivariant Attention Mechanism

Introduced by Wang et al. in Self-supervised Equivariant Attention Mechanism for Weakly Supervised Semantic Segmentation

Self-supervised Equivariant Attention Mechanism, or SEAM, is an attention mechanism for weakly supervised semantic segmentation. SEAM applies consistency regularization to CAMs produced from differently transformed versions of an image, providing self-supervision for network learning. To further improve prediction consistency, SEAM introduces a pixel correlation module (PCM), which captures contextual appearance information for each pixel and revises the original CAMs with learned affinity attention maps. SEAM is implemented as a siamese network trained with an equivariant cross regularization (ECR) loss, which regularizes the original CAMs against the revised CAMs across the two branches.

Source: Self-supervised Equivariant Attention Mechanism for Weakly Supervised Semantic Segmentation
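The two ideas above can be sketched in a few lines of NumPy. This is a simplified illustration, not the authors' implementation: `pixel_correlation_module` mimics the PCM's affinity-weighted CAM revision (here using a ReLU'd cosine similarity with row-wise L1 normalization as an assumed affinity form), and `equivariant_consistency_loss` shows the equivariance idea that the CAM of a transformed image should match the transformed CAM of the original image.

```python
import numpy as np

def pixel_correlation_module(features, cam):
    """Sketch of SEAM's PCM: revise a CAM with an affinity attention map.

    features: (HW, C) per-pixel embeddings; cam: (HW, K) class activation maps.
    Affinity is a ReLU'd cosine similarity between pixel pairs, L1-normalized
    per row, so each revised pixel activation becomes a weighted average over
    pixels with similar appearance (an assumed, simplified affinity form).
    """
    norm = np.linalg.norm(features, axis=1, keepdims=True) + 1e-8
    f = features / norm
    affinity = np.maximum(f @ f.T, 0.0)                      # ReLU(cos sim)
    affinity /= affinity.sum(axis=1, keepdims=True) + 1e-8   # row-wise L1 norm
    return affinity @ cam                                    # revised CAM

def equivariant_consistency_loss(cam_original, cam_transformed, transform):
    """L1 gap between transform(CAM(x)) and CAM(transform(x))."""
    return np.abs(transform(cam_original) - cam_transformed).mean()

# Toy usage: 16 pixels, 4-dim features, 3 classes, flip as the transform.
rng = np.random.default_rng(0)
features = rng.normal(size=(16, 4))
cam = rng.uniform(size=(16, 3))
revised = pixel_correlation_module(features, cam)
flip = lambda c: c[::-1]
loss = equivariant_consistency_loss(cam, flip(cam), flip)
```

Because each affinity row is a convex combination, the revised CAM stays within the per-class range of the original CAM while being smoothed toward appearance-consistent regions; the consistency loss is zero exactly when the CAMs are perfectly equivariant under the transform.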
