The Maxout unit is a generalization of the ReLU and leaky ReLU functions. It is a learnable piecewise linear activation that outputs the maximum over a set of affine transformations of its input, and it is designed to be used in conjunction with dropout. Both ReLU and leaky ReLU are special cases of Maxout; with two pieces:
$$f\left(x\right) = \max\left(w^{T}_{1}x + b_{1}, w^{T}_{2}x + b_{2}\right)$$
The main drawback of Maxout is that it is computationally expensive: each linear piece has its own weights and biases, so a two-piece Maxout doubles the number of parameters per neuron.
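The definition above can be sketched as a small NumPy function. This is a minimal illustration, not the paper's implementation; the shapes and the `maxout` name are assumptions chosen for clarity. It also shows how ReLU falls out as the two-piece special case where one piece is the identity and the other is the zero map.

```python
import numpy as np

def maxout(x, W, b):
    """Maxout activation: elementwise max over k affine pieces.

    x: (d,) input vector
    W: (k, m, d) weights for k pieces, each mapping d inputs to m units
    b: (k, m) biases
    returns: (m,) output, the maximum over the k pieces per unit
    """
    z = np.einsum('kmd,d->km', W, x) + b  # (k, m) pre-activations
    return z.max(axis=0)

# ReLU as a special case (k = 2): piece 0 is the identity, piece 1 is zero,
# so max(x, 0) is computed elementwise.
d = 3
W = np.stack([np.eye(d), np.zeros((d, d))])  # (2, d, d)
b = np.zeros((2, d))
x = np.array([-1.0, 0.5, 2.0])
print(maxout(x, W, b))  # -> [0.  0.5 2. ]
```

Swapping the zero piece for a small-slope piece (e.g. `0.01 * np.eye(d)`) recovers leaky ReLU in the same way.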
Source: Maxout Networks
Task | Papers | Share |
---|---|---|
Image Classification | 9 | 13.64% |
General Classification | 8 | 12.12% |
Speech Recognition | 4 | 6.06% |
Object Detection | 3 | 4.55% |
Semantic Segmentation | 3 | 4.55% |
Image Generation | 2 | 3.03% |
Graph Classification | 2 | 3.03% |
Object Recognition | 2 | 3.03% |
Face Verification | 2 | 3.03% |