The Maxout Unit is a generalization of the ReLU and the leaky ReLU functions. It is a piecewise linear function that returns the maximum of a set of learned affine transformations of its input, and it was designed to be used in conjunction with dropout. Both ReLU and leaky ReLU are special cases of Maxout. In the common two-piece case:
$$f\left(x\right) = \max\left(w^{T}_{1}x + b_{1}, w^{T}_{2}x + b_{2}\right)$$
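For example, with $w_{2} = 0$ and $b_{2} = 0$ the unit reduces to $\max\left(w^{T}_{1}x + b_{1}, 0\right)$, i.e. a ReLU applied to the first affine map, and with $w_{2} = \alpha w_{1}$, $b_{2} = \alpha b_{1}$ for $0 < \alpha < 1$ it reduces to a leaky ReLU with slope $\alpha$.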
The main drawback of Maxout is that it is computationally expensive: each neuron needs one set of weights and biases per linear piece, which doubles the number of parameters per neuron in the two-piece case above.
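A minimal PyTorch sketch of a maxout layer is shown below; the class name, the `num_pieces` argument, and the sizes in the usage example are illustrative assumptions, not taken from the paper's released code.

```python
import torch
import torch.nn as nn


class Maxout(nn.Module):
    """Maxout layer: computes `num_pieces` affine maps of the input and
    returns their element-wise maximum. num_pieces=2 corresponds to the
    two-piece formula above. (Illustrative sketch, not the paper's code.)"""

    def __init__(self, in_features: int, out_features: int, num_pieces: int = 2):
        super().__init__()
        self.out_features = out_features
        self.num_pieces = num_pieces
        # All affine pieces are fused into a single linear projection.
        self.linear = nn.Linear(in_features, out_features * num_pieces)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.linear(x)                                    # (..., out_features * num_pieces)
        z = z.view(*x.shape[:-1], self.out_features, self.num_pieces)
        return z.max(dim=-1).values                           # max over the linear pieces


# Usage: a two-piece Maxout unit, i.e. max(w1^T x + b1, w2^T x + b2) per output.
layer = Maxout(in_features=128, out_features=64, num_pieces=2)
y = layer(torch.randn(32, 128))                               # -> shape (32, 64)
```

Note how the parameter cost scales linearly with `num_pieces`: the fused linear layer holds `in_features * out_features * num_pieces` weights, twice that of a plain linear layer when `num_pieces=2`.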
Source: Maxout Networks
| Task | Papers | Share |
|---|---|---|
| Image Classification | 9 | 15.00% |
| General Classification | 8 | 13.33% |
| Speech Recognition | 4 | 6.67% |
| Object Detection | 3 | 5.00% |
| Semantic Segmentation | 3 | 5.00% |
| Image Generation | 2 | 3.33% |
| Graph Classification | 2 | 3.33% |
| Object Recognition | 2 | 3.33% |
| Face Verification | 2 | 3.33% |