Initialization

Xavier Initialization

Xavier Initialization, or Glorot Initialization, is an initialization scheme for neural networks. Biases are initialized to 0 and the weights $W_{ij}$ at each layer are initialized as:

$$ W_{ij} \sim U\left[-\frac{1}{\sqrt{n}}, \frac{1}{\sqrt{n}}\right] $$

Where $U$ is a uniform distribution and $n$ is the size of the previous layer (number of columns in $W$).
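A minimal NumPy sketch of this scheme, following the formula above. The function name `xavier_init` and the $(n_\text{out}, n_\text{in})$ weight shape are assumptions for illustration; the number of columns of $W$ equals the size of the previous layer, matching the definition of $n$ given here.

```python
import numpy as np

def xavier_init(n_in, n_out, rng=None):
    """Xavier/Glorot initialization (illustrative sketch).

    Weights are drawn from U[-1/sqrt(n_in), 1/sqrt(n_in)], where n_in is the
    size of the previous layer (number of columns of W); biases are set to 0.
    """
    rng = np.random.default_rng() if rng is None else rng
    limit = 1.0 / np.sqrt(n_in)
    W = rng.uniform(-limit, limit, size=(n_out, n_in))  # columns = previous layer size
    b = np.zeros(n_out)                                  # biases initialized to 0
    return W, b

# Example: initialize a layer mapping 256 inputs to 128 outputs
W, b = xavier_init(256, 128)
```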
