DLA, or Deep Layer Aggregation, iteratively and hierarchically merges features across the layers of a neural network, yielding networks with better accuracy and fewer parameters.
In iterative deep aggregation (IDA), aggregation begins at the shallowest, smallest scale and then iteratively merges deeper, larger scales. In this way, shallow features are refined as they propagate through successive stages of aggregation.
In hierarchical deep aggregation (HDA), blocks and stages are merged in a tree to preserve and combine feature channels. With HDA, shallower and deeper layers are combined to learn richer combinations that span more of the feature hierarchy. While IDA effectively combines stages, it is insufficient for fusing the many blocks of a network, as it is still only sequential.

Source: Deep Layer Aggregation
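The two merging orders can be sketched as follows. This is a toy illustration under stated assumptions: NumPy arrays stand in for feature maps, and a simple channel-align-and-average function (`aggregate`, a name invented here) stands in for the learned convolution + batch norm + nonlinearity aggregation node of the paper.

```python
import numpy as np

def aggregate(x, y):
    # Hypothetical stand-in for a learned aggregation node: align the
    # deeper feature's channels with the shallower one's, then average.
    return 0.5 * (x + y[..., : x.shape[-1]])

def ida(stages):
    # Iterative deep aggregation: start at the shallowest stage and fold
    # in each deeper stage in turn, so the shallow feature is refined as
    # it passes through successive aggregation nodes.
    out = stages[0]
    for deeper in stages[1:]:
        out = aggregate(out, deeper)
    return out

def hda(blocks):
    # Hierarchical deep aggregation: merge blocks pairwise in a balanced
    # binary tree, so aggregation nodes combine shallower and deeper
    # blocks across the hierarchy instead of purely sequentially.
    if len(blocks) == 1:
        return blocks[0]
    mid = len(blocks) // 2
    return aggregate(hda(blocks[:mid]), hda(blocks[mid:]))

# Toy features: same spatial size, channel count doubling with depth.
stages = [np.full((2, 2, 4 * 2 ** i), float(i + 1)) for i in range(3)]
print(ida(stages).shape)  # output keeps the shallowest stage's channels
print(hda(stages).shape)
```

Note what the tree structure omits here: in the actual architecture the aggregation nodes are learned, operate at multiple resolutions, and HDA also routes each node's output back into the backbone as input to the next sub-tree.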