Attention Mechanisms

Fast Voxel Query

Introduced by Mao et al. in Voxel Transformer for 3D Object Detection

Fast Voxel Query is a module used in the Voxel Transformer (VoTr) 3D object detection model to accelerate its self-attention, specifically Local and Dilated Attention. For each querying voxel index $v_{i}$, the attending voxel indices $v_{j}$ are determined by Local and Dilated Attention. Each $v_{j}$ is then hashed and used as a key to look up the corresponding non-empty index $j$ in a hash table. Finally, the non-empty index $j$ is used to gather the attending feature $f_{j}$ from the sparse feature set $\mathcal{F}$ for multi-head attention.
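
As a rough illustration, the sketch below builds such a hash table on the CPU with NumPy and uses it to recover the non-empty index $j$ for an attending voxel $v_{j}$ before gathering its feature $f_{j}$. The hash function, table size, probing scheme, and names are assumptions made for clarity, not VoTr's actual GPU implementation.

```python
import numpy as np

# Minimal sketch of a Fast Voxel Query lookup (illustrative assumptions: the hash
# function, table size, probing scheme, and names below are not VoTr's actual code,
# which builds the table and gathers features on the GPU).

HASH_SIZE = 2 ** 20  # assumed hash table size
EMPTY = -1           # marker for an unused slot / an empty voxel

def hash_voxel(vx, vy, vz):
    # Simple spatial hash of integer voxel coordinates (an assumption for illustration).
    return ((vx * 73856093) ^ (vy * 19349663) ^ (vz * 83492791)) % HASH_SIZE

def pack(vx, vy, vz):
    # Pack a coordinate into a single integer key (assumes coordinates < 2^21).
    return (int(vx) << 42) | (int(vy) << 21) | int(vz)

def build_hash_table(voxel_coords):
    # Map each non-empty voxel coordinate v_j to its row j in the sparse feature set F,
    # resolving collisions with linear probing.
    keys = np.full(HASH_SIZE, EMPTY, dtype=np.int64)
    vals = np.full(HASH_SIZE, EMPTY, dtype=np.int64)
    for j, (vx, vy, vz) in enumerate(voxel_coords):
        slot = hash_voxel(vx, vy, vz)
        while keys[slot] != EMPTY:
            slot = (slot + 1) % HASH_SIZE
        keys[slot], vals[slot] = pack(vx, vy, vz), j
    return keys, vals

def query(keys, vals, vx, vy, vz):
    # Look up the non-empty index j for an attending voxel v_j; EMPTY means the voxel is empty.
    slot = hash_voxel(vx, vy, vz)
    key = pack(vx, vy, vz)
    while keys[slot] != EMPTY:
        if keys[slot] == key:
            return int(vals[slot])
        slot = (slot + 1) % HASH_SIZE
    return EMPTY

# Usage: gather attending features f_j from F for voxels proposed by Local/Dilated Attention.
coords = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])  # non-empty voxel coordinates
F = np.random.randn(len(coords), 64)                  # one feature vector per non-empty voxel
keys, vals = build_hash_table(coords)
j = query(keys, vals, 4, 5, 6)                        # attending voxel index v_j
f_j = F[j] if j != EMPTY else np.zeros(F.shape[1])    # empty voxels contribute no feature
```

The hash table makes each lookup roughly constant time, so attention over a sparse set of non-empty voxels avoids scanning the full dense voxel grid.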

Source: Voxel Transformer for 3D Object Detection


Tasks


Task | Papers | Share
--- | --- | ---
3D Object Detection | 1 | 25.00%
Computational Efficiency | 1 | 25.00%
Object Detection | 1 | 25.00%
Object Recognition | 1 | 25.00%

