Initialization Bias of Fourier Neural Operator: Revisiting the Edge of Chaos

10 Oct 2023 · Takeshi Koshizuka, Masahiro Fujisawa, Yusuke Tanaka, Issei Sato

This paper investigates the initialization bias of the Fourier neural operator (FNO). We establish a mean-field theory for FNO and analyze the behavior of randomly initialized FNOs from an "edge of chaos" perspective. We find that forward and backward signal propagation exhibits characteristics unique to FNO, induced by mode truncation, while also sharing similarities with densely connected networks. Building on this observation, we propose an edge-of-chaos initialization scheme for FNO that mitigates the negative initialization bias responsible for training instability. Experimental results demonstrate the effectiveness of our initialization scheme, enabling stable training of deep FNOs without skip connections.
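To make the setting concrete, the following is a minimal sketch of the object the abstract analyzes: a stack of random FNO layers, each applying a spectral convolution truncated to the lowest k Fourier modes plus a pointwise linear term, followed by a nonlinearity. The 1/sqrt(width)-style variance-preserving scales below are an illustrative edge-of-chaos-flavored choice, not the paper's actual initialization constants; all names here are hypothetical.

```python
import numpy as np

def fourier_layer(x, R, W, k):
    """One random FNO layer (illustrative, not the paper's exact model).

    x: (width, n_grid) real signal, one row per channel
    R: (k, width, width) complex spectral weights for the kept modes
    W: (width, width) pointwise linear weights
    """
    width, n = x.shape
    xf = np.fft.rfft(x, axis=1)          # per-channel Fourier coefficients
    out_f = np.zeros_like(xf)
    for m in range(k):                   # mode truncation: mix only the
        out_f[:, m] = R[m] @ xf[:, m]    # lowest k modes across channels
    spectral = np.fft.irfft(out_f, n=n, axis=1)
    return np.tanh(spectral + W @ x)     # spectral + pointwise, then tanh

rng = np.random.default_rng(0)
width, n, k = 32, 64, 8
# Variance-preserving scales (assumed for illustration): complex spectral
# weights with total variance 1/width, real pointwise weights with 1/width.
R = (rng.standard_normal((k, width, width))
     + 1j * rng.standard_normal((k, width, width))) / np.sqrt(2 * width)
W = rng.standard_normal((width, width)) / np.sqrt(width)

x = rng.standard_normal((width, n))
for _ in range(10):                      # deep stack; weights re-used for brevity
    x = fourier_layer(x, R, W, k)
print(float(x.var()))                    # signal neither explodes nor vanishes
```

Under this kind of scaling, the per-layer signal variance settles near a fixed point rather than exploding or collapsing, which is the qualitative behavior the edge-of-chaos analysis characterizes; the paper derives the precise conditions, including the effect of mode truncation on backward propagation.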
