Attention

A novel built-in attention mechanism that is complementary to prior attention mechanisms (e.g., squeeze-and-excitation, transformer self-attention), which are external (i.e., not built-in; see the paper for more details).

Source: Weight Excitation: Built-in Attention Mechanisms in Convolutional Neural Networks
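
The core idea is that attention is applied to a convolution layer's own weights rather than to its activations, so the mechanism lives inside the layer instead of operating on feature maps. Below is a minimal PyTorch sketch of that idea. The `WeightExcitedConv2d` class, the SE-style bottleneck over per-channel weight statistics, and the `reduction` parameter are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightExcitedConv2d(nn.Module):
    """Illustrative conv layer with built-in attention on its weights.

    Hypothetical sketch: an SE-style bottleneck computes one attention
    scalar per output channel from the weight tensor itself, and the
    reweighted kernel is used in the convolution. The paper's exact
    weight-excitation formulation may differ.
    """
    def __init__(self, in_ch, out_ch, kernel_size, reduction=4, **kwargs):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, **kwargs)
        hidden = max(out_ch // reduction, 1)
        # Small MLP producing one attention weight per output channel.
        self.fc1 = nn.Linear(out_ch, hidden)
        self.fc2 = nn.Linear(hidden, out_ch)

    def forward(self, x):
        w = self.conv.weight                      # (out_ch, in_ch, k, k)
        # Summarize each output channel's kernel by mean absolute weight.
        stats = w.abs().mean(dim=(1, 2, 3))       # (out_ch,)
        attn = torch.sigmoid(self.fc2(F.relu(self.fc1(stats))))
        # Excite the weights, not the activations: built-in attention.
        w = w * attn.view(-1, 1, 1, 1)
        return F.conv2d(x, w, self.conv.bias,
                        stride=self.conv.stride,
                        padding=self.conv.padding,
                        dilation=self.conv.dilation,
                        groups=self.conv.groups)

# Usage: drop-in replacement for nn.Conv2d
layer = WeightExcitedConv2d(16, 32, 3, padding=1)
y = layer(torch.randn(2, 16, 8, 8))
print(y.shape)  # torch.Size([2, 32, 8, 8])
```

Because the excitation is a function of the weights and sits inside the layer, it adds no attention computation over feature maps at inference beyond the kernel reweighting, which is what distinguishes this "built-in" scheme from external mechanisms like squeeze-and-excitation.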
