Attention Modules

Slot Attention

Introduced by Locatello et al. in Object-Centric Learning with Slot Attention

Slot Attention is an architectural component that interfaces with perceptual representations, such as the output of a convolutional neural network, and produces a set of task-dependent abstract representations called slots. The slots are exchangeable and can bind to any object in the input, specializing through a competitive procedure over multiple rounds of attention. Using this iterative attention mechanism, Slot Attention produces a set of output vectors with permutation symmetry. Unlike the capsules used in Capsule Networks, slots do not specialize to one particular type or class of object, which could harm generalization. Instead, they act akin to object files: slots share a common representational format, so each slot can store (and bind to) any object in the input. This allows Slot Attention to generalize systematically to unseen compositions, more objects, and more slots.
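The competitive, iterative attention described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it omits the learned linear projections, GRU update, MLP, and layer normalization of the actual architecture, and uses the inputs directly as keys and values. The key ideas it does show are (1) a softmax over the slot axis, so slots compete for each input element, and (2) a slot update from the attention-weighted mean of the inputs, repeated for a few rounds.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(inputs, num_slots=4, num_iters=3, seed=0):
    """Simplified Slot Attention sketch.

    inputs: (n_inputs, dim) array of perceptual features
            (e.g. flattened CNN feature-map positions).
    Returns: (num_slots, dim) slot representations.
    """
    rng = np.random.default_rng(seed)
    n, d = inputs.shape
    # All slots are drawn from a shared Gaussian: they use a common
    # representational format and can bind to any object in the input.
    slots = rng.normal(size=(num_slots, d))
    k = inputs  # stand-ins for the learned key/value projections
    v = inputs
    for _ in range(num_iters):
        q = slots                               # queries come from the slots
        logits = k @ q.T / np.sqrt(d)           # (n_inputs, num_slots)
        # Softmax over the *slot* axis: slots compete for each input element.
        attn = softmax(logits, axis=1)
        # Normalize per slot, then take the weighted mean of the inputs.
        attn = attn / attn.sum(axis=0, keepdims=True)
        slots = attn.T @ v                      # (num_slots, dim)
    return slots
```

Because the slots are initialized i.i.d. from the same distribution and updated by the same rule, the output set is permutation-symmetric: nothing ties a particular slot index to a particular object.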

Source: Object-Centric Learning with Slot Attention



