Xavier Initialization, also known as Glorot Initialization, is an initialization scheme for neural networks. Biases are initialized to 0, and the weights $W_{ij}$ at each layer are initialized as:
$$ W_{ij} \sim U\left[-\frac{1}{\sqrt{n}}, \frac{1}{\sqrt{n}}\right] $$
where $U$ is a uniform distribution and $n$ is the size of the previous layer (the number of columns in $W$).
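The scheme above can be sketched in a few lines of NumPy. The function name `xavier_uniform` and its signature are illustrative, not from a specific library; it samples weights uniformly from $[-1/\sqrt{n}, 1/\sqrt{n}]$ with $n$ the fan-in (previous layer size, i.e. the number of columns of $W$) and zeros the biases:

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Sketch of Xavier/Glorot uniform initialization.

    Weights are drawn from U[-1/sqrt(fan_in), 1/sqrt(fan_in)];
    W has shape (fan_out, fan_in), so fan_in is the number of columns.
    Biases are initialized to zero.
    """
    rng = np.random.default_rng() if rng is None else rng
    limit = 1.0 / np.sqrt(fan_in)
    W = rng.uniform(-limit, limit, size=(fan_out, fan_in))
    b = np.zeros(fan_out)
    return W, b
```

Libraries such as PyTorch (`torch.nn.init.xavier_uniform_`) provide built-in versions; note that some variants scale by the sum of fan-in and fan-out rather than fan-in alone.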

| Task | Papers | Share |
| --- | --- | --- |
| General Classification | 15 | 12.30% |
| Object Detection | 14 | 11.48% |
| Image Classification | 14 | 11.48% |
| Semantic Segmentation | 6 | 4.92% |
| Quantization | 4 | 3.28% |
| Autonomous Driving | 3 | 2.46% |
| Specificity | 3 | 2.46% |
| Object Recognition | 3 | 2.46% |
| Video Summarization | 2 | 1.64% |