Kaiming Initialization, or He Initialization, is an initialization method for neural networks that takes into account the nonlinearity of activation functions, such as ReLU activations.
A proper initialization method should avoid reducing or magnifying the magnitudes of input signals exponentially. He et al. derive that, for a layer $l$ with $n_{l}$ input connections, the forward signal variance is preserved across layers (the factor $\frac{1}{2}$ accounts for the ReLU zeroing out half of its inputs) when:
$$\frac{1}{2}n_{l}\text{Var}\left[w_{l}\right] = 1 $$
This implies an initialization scheme of:
$$ w_{l} \sim \mathcal{N}\left(0, 2/n_{l}\right)$$
That is, a zero-centered Gaussian with standard deviation $\sqrt{2/n_{l}}$ (whose variance is the $2/n_{l}$ shown in the equation above). Biases are initialized at $0$.
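The scheme above can be sketched in a few lines of NumPy. The `kaiming_init` helper below is illustrative (not from the paper's code): it draws weights from a zero-centered Gaussian with standard deviation $\sqrt{2/n_{l}}$, where $n_{l}$ is the layer's fan-in, and sets biases to zero.

```python
import numpy as np

def kaiming_init(fan_in, fan_out, rng=None):
    """Kaiming (He) initialization for a ReLU layer:
    weights ~ N(0, 2 / fan_in), biases = 0."""
    rng = np.random.default_rng() if rng is None else rng
    std = np.sqrt(2.0 / fan_in)          # sqrt(2 / n_l)
    W = rng.normal(0.0, std, size=(fan_in, fan_out))
    b = np.zeros(fan_out)
    return W, b

# Example: a layer with 512 inputs gets std = sqrt(2/512) ~= 0.0625
W, b = kaiming_init(512, 256)
```

In practice, deep learning frameworks ship this directly; for example, PyTorch exposes it as `torch.nn.init.kaiming_normal_`.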
Source: Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification

| Task | Papers | Share |
| --- | --- | --- |
| Image Classification | 84 | 12.12% |
| General Classification | 72 | 10.39% |
| Semantic Segmentation | 41 | 5.92% |
| Object Detection | 32 | 4.62% |
| Self-Supervised Learning | 18 | 2.60% |
| Quantization | 14 | 2.02% |
| Instance Segmentation | 13 | 1.88% |
| COVID-19 Diagnosis | 9 | 1.30% |
| Time Series | 8 | 1.15% |