no code implementations • 20 Nov 2020 • Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano
In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in a function space.
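As a minimal numerical sketch of the density (universality) property, not taken from the paper: a two-layer network with enough randomly chosen hidden units can approximate a continuous target on a compact interval when only the output layer is fitted. The width, parameter ranges, and target below are all illustrative choices.

```python
import numpy as np

# Illustrative sketch: approximate sin(3x) on [-1, 1] by a two-layer
# ReLU network f(x) = sum_j c_j * relu(a_j * x + b_j), with random
# hidden parameters (a_j, b_j) and a least-squares fit of c.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
target = np.sin(3.0 * x)

width = 500                           # hidden-layer width (illustrative)
a = rng.uniform(-5.0, 5.0, width)     # random hidden weights
b = rng.uniform(-5.0, 5.0, width)     # random hidden biases
hidden = np.maximum(a[None, :] * x[:, None] + b[None, :], 0.0)  # ReLU features

# Fit only the output layer by least squares.
c, *_ = np.linalg.lstsq(hidden, target, rcond=None)
approx = hidden @ c
max_err = np.max(np.abs(approx - target))
print(max_err < 1e-2)
```

Increasing the width drives the sup-norm error down, which is the finite-sample face of the density statement.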
no code implementations • 27 Nov 2019 • Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano
In this paper, we specify what functions induce the bounded composition operators on a reproducing kernel Hilbert space (RKHS) associated with an analytic positive definite function defined on $\mathbf{R}^d$.
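For context, the object in question is the composition operator induced by a self-map of $\mathbf{R}^d$; a standard definition (sketched here, with conventions that may differ slightly from the paper's) is:

```latex
% Composition operator C_\varphi induced by a map \varphi : \mathbf{R}^d \to \mathbf{R}^d,
% acting on functions f in the RKHS H_k associated with the kernel k:
C_\varphi f = f \circ \varphi,
\qquad (C_\varphi f)(x) = f(\varphi(x)).
% The question is for which \varphi the operator C_\varphi maps H_k into
% itself and is bounded; this depends jointly on \varphi and on k.
```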
no code implementations • 19 May 2018 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda, Kei Hagihara, Yoshihiro Sawano, Takuo Matsubara, Noboru Murata
We prove that the global minimum of the backpropagation (BP) training problem of neural networks with an arbitrary nonlinear activation is given by the ridgelet transform.
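For reference, one common form of the ridgelet transform and its dual (normalization conventions vary across papers, so treat this as a sketch rather than the paper's exact definition):

```latex
% Ridgelet transform of f : \mathbf{R}^d \to \mathbf{R} with respect to a
% ridgelet function \psi : \mathbf{R} \to \mathbf{C}:
\mathcal{R}f(a, b) = \int_{\mathbf{R}^d} f(x)\, \overline{\psi(a \cdot x - b)}\, \mathrm{d}x,
\qquad (a, b) \in \mathbf{R}^d \times \mathbf{R}.
% Under an admissibility condition pairing \psi with the activation \eta,
% the dual transform reconstructs f as a continuum of hidden units:
% f(x) = \int_{\mathbf{R}^d \times \mathbf{R}} \mathcal{R}f(a, b)\,
%        \eta(a \cdot x - b)\, \mathrm{d}a\, \mathrm{d}b.
```

In this picture, $(a, b)$ plays the role of a hidden unit's weight and bias, and $\mathcal{R}f(a, b)$ the corresponding output coefficient, which is how the transform connects to the BP training problem.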