Xavier Initialization

Xavier Initialization, also known as Glorot Initialization, is an initialization scheme for neural networks. Biases are initialized to 0 and the weights $W_{ij}$ at each layer are initialized as:

$$ W_{ij} \sim U\left[-\frac{\sqrt{6}}{\sqrt{fan_{in} + fan_{out}}}, \frac{\sqrt{6}}{\sqrt{fan_{in} + fan_{out}}}\right] $$

where $U$ is a uniform distribution, $fan_{in}$ is the size of the previous layer (the number of columns in $W$), and $fan_{out}$ is the size of the current layer (the number of rows in $W$). The bounds are chosen so that the variance of activations and gradients stays roughly constant across layers.
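The sampling rule above can be sketched in NumPy; the function name `xavier_uniform` and the `(fan_out, fan_in)` weight shape are illustrative choices, not part of the original formulation:

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Sample a (fan_out, fan_in) weight matrix from
    U[-sqrt(6)/sqrt(fan_in + fan_out), +sqrt(6)/sqrt(fan_in + fan_out)]."""
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_out, fan_in))

# Example: a layer mapping 256 inputs to 128 outputs.
W = xavier_uniform(fan_in=256, fan_out=128)
b = np.zeros(128)  # biases are initialized to 0
```

Deep learning frameworks ship equivalents of this rule (e.g. `torch.nn.init.xavier_uniform_` in PyTorch, `GlorotUniform` in Keras), so in practice the built-in initializer is usually preferred over a hand-rolled one.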


| Task | Papers | Share |
| --- | --- | --- |
| General Classification | 15 | 9.87% |
| Object Detection | 14 | 9.21% |
| Image Classification | 14 | 9.21% |
| Classification | 10 | 6.58% |
| Semantic Segmentation | 6 | 3.95% |
| Quantization | 4 | 2.63% |
| Test | 4 | 2.63% |
| Face Recognition | 3 | 1.97% |
| Face Verification | 3 | 1.97% |

