Skip Connection Blocks

Big-Little Modules are blocks for image models with two branches: one drawn from a deep model and one from a shallower counterpart. They were proposed as part of the BigLittle-Net architecture. The two branches, known as the Big-Branch (more layers and channels, operating at low resolution) and the Little-Branch (fewer layers and channels, operating at high resolution), are fused with a linear combination using unit weights.

Source: Big-Little Net: An Efficient Multi-Scale Feature Representation for Visual and Speech Recognition
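Below is a minimal PyTorch sketch of the idea, assuming unit-weight fusion via element-wise addition; the specific layer counts, channel widths, and the 2x downsampling factor are illustrative assumptions, not the exact configuration from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BigLittleModule(nn.Module):
    """Illustrative Big-Little module; layer/channel choices are assumptions."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        # Big-Branch: more layers and channels, applied to a downsampled input.
        self.big = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, 3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels, out_channels, 3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )
        # Little-Branch: fewer layers and channels, applied at full resolution,
        # with a 1x1 conv to match the Big-Branch channel count before fusion.
        little_channels = out_channels // 2
        self.little = nn.Sequential(
            nn.Conv2d(in_channels, little_channels, 3, padding=1),
            nn.BatchNorm2d(little_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(little_channels, out_channels, 1),
        )

    def forward(self, x):
        # Big-Branch runs at low resolution, then is upsampled back.
        big = self.big(F.avg_pool2d(x, 2))
        big = F.interpolate(big, size=x.shape[-2:], mode="bilinear",
                            align_corners=False)
        # Fuse the branches with unit weights (element-wise sum).
        return F.relu(big + self.little(x))


# Usage example
y = BigLittleModule(64, 128)(torch.randn(1, 64, 32, 32))
print(y.shape)  # torch.Size([1, 128, 32, 32])
```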
