no code implementations • 25 Mar 2024 • Samuel Chun-Hei Lam, Justin Sirignano, Ziheng Wang
Then, using a Poisson equation, we prove that the fluctuations of the model updates around the limit distribution, caused by the randomly arriving data samples, vanish as the number of parameter updates $\rightarrow \infty$.
1 code implementation • 28 Aug 2023 • Samuel Chun-Hei Lam, Justin Sirignano, Konstantinos Spiliopoulos
Mathematical methods are developed to characterize the asymptotics of recurrent neural networks (RNNs) as the number of hidden units, data samples in the sequence, hidden state updates, and training steps simultaneously grow to infinity.