# Xavier Initialization

Xavier Initialization, also known as Glorot Initialization, is an initialization scheme for neural networks. Biases are initialized to 0, and the weights $W_{ij}$ at each layer are initialized as:

$$W_{ij} \sim U\left[-\frac{1}{\sqrt{n}}, \frac{1}{\sqrt{n}}\right]$$

where $U$ denotes a uniform distribution and $n$ is the size of the previous layer (the number of columns in $W$). Sampling within $\pm 1/\sqrt{n}$ keeps the variance of activations roughly constant across layers, avoiding vanishing or exploding signals early in training.
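A minimal NumPy sketch of this scheme, assuming weights are stored with shape `(n_in, n_out)` so that `n` in the formula is the fan-in `n_in` (the function and argument names here are illustrative, not from the original paper):

```python
import numpy as np

def xavier_init(n_in, n_out, rng=None):
    """Xavier/Glorot uniform init: W ~ U[-1/sqrt(n_in), 1/sqrt(n_in)], b = 0."""
    rng = np.random.default_rng() if rng is None else rng
    limit = 1.0 / np.sqrt(n_in)
    # Weights drawn uniformly within +/- 1/sqrt(fan_in)
    W = rng.uniform(-limit, limit, size=(n_in, n_out))
    # Biases initialized to zero, as the scheme prescribes
    b = np.zeros(n_out)
    return W, b

W, b = xavier_init(256, 128)
```

Note that the original Glorot and Bengio paper also proposes a "normalized" variant with limit $\sqrt{6/(n_{in}+n_{out})}$; the bound above is the simpler fan-in-only form stated here.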
