Search Results for author: Isao Ishikawa

Found 16 papers, 2 papers with code

Fully-Connected Network on Noncompact Symmetric Space and Ridgelet Transform based on Helgason-Fourier Analysis

no code implementations • 3 Mar 2022 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

Based on the well-established framework of the Helgason-Fourier transform on the noncompact symmetric space, we present a fully-connected network and its associated ridgelet transform on the noncompact symmetric space, covering the hyperbolic neural network (HNN) and the SPDNet as special cases.
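
For orientation, here is a minimal sketch of a hyperbolic feed-forward layer, one of the special cases (HNN) the paper covers, in the Möbius-operation formulation on the Poincaré ball; the curvature parameter c and all function names are illustrative assumptions, not the paper's construction on general noncompact symmetric spaces.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Moebius addition on the Poincare ball of curvature -c."""
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2*c*xy + c*y2) * x + (1 - c*x2) * y
    den = 1 + 2*c*xy + c**2 * x2 * y2
    return num / den

def expmap0(v, c=1.0):
    """Exponential map at the origin of the Poincare ball."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return v
    return np.tanh(np.sqrt(c) * nv) * v / (np.sqrt(c) * nv)

def logmap0(x, c=1.0):
    """Logarithmic map at the origin (inverse of expmap0)."""
    nx = np.linalg.norm(x)
    if nx < 1e-12:
        return x
    return np.arctanh(np.sqrt(c) * nx) * x / (np.sqrt(c) * nx)

def hyperbolic_linear(x, W, b, c=1.0):
    """W acts in the tangent space at the origin; the bias is Moebius-added."""
    h = expmap0(W @ logmap0(x, c), c)
    return mobius_add(h, b, c)

x = np.array([0.2, -0.1])                  # a point inside the unit ball
W, b = np.eye(2), np.array([0.05, 0.0])
print(hyperbolic_linear(x, W, b))          # output stays inside the ball
```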

Koopman Spectrum Nonlinear Regulator and Provably Efficient Online Learning

1 code implementation • 30 Jun 2021 • Motoya Ohnishi, Isao Ishikawa, Kendall Lowrey, Masahiro Ikeda, Sham Kakade, Yoshinobu Kawahara

In this work, we present a novel paradigm of controlling nonlinear systems via the minimization of the Koopman spectrum cost: a cost over the Koopman operator of the controlled dynamics.

online learning • reinforcement-learning
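
The spectrum cost is defined over the Koopman operator of the controlled dynamics. As a hedged illustration of that underlying object (not the paper's regulator or its online-learning guarantees), the sketch below estimates a Koopman operator from snapshot data by EDMD-style least squares; the monomial dictionary and toy dynamics are assumptions.

```python
import numpy as np

def edmd_koopman(X, Y, dictionary):
    """Least-squares Koopman estimate K with Psi(Y) ~= Psi(X) @ K.

    X, Y: (n, d) arrays of state pairs with Y[i] = F(X[i]).
    dictionary: maps (n, d) states to (n, m) features.
    """
    PsiX, PsiY = dictionary(X), dictionary(Y)
    K = np.linalg.lstsq(PsiX, PsiY, rcond=None)[0]
    return K  # spectrum of K approximates the Koopman spectrum

# monomial dictionary up to degree 2 for a 1-D system (illustrative)
dictionary = lambda s: np.hstack([np.ones_like(s), s, s**2])

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(500, 1))
y = 0.9 * x                      # linear toy dynamics F(x) = 0.9 x
K = edmd_koopman(x, y, dictionary)
print(np.sort(np.abs(np.linalg.eigvals(K))))  # contains ~0.81, 0.9, 1.0
```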

Ghosts in Neural Networks: Existence, Structure and Role of Infinite-Dimensional Null Space

no code implementations • 9 Jun 2021 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

In this paper, we present a structure theorem of the null space for a general class of neural networks.
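
As a toy illustration of a null-space element, far simpler than the paper's infinite-dimensional structure theorem, positive homogeneity of the ReLU already yields distinct parameter combinations that realize the zero function:

```python
import numpy as np

# ReLU is positively homogeneous: c*relu(w.x + b) == (c/s)*relu(s*w.x + s*b)
# for any s > 0, so the signed combination below is a nonzero parameter
# configuration that outputs the zero function -- a toy "ghost".
relu = lambda z: np.maximum(z, 0.0)

x = np.random.randn(1000, 3)
w, b, c, s = np.random.randn(3), 0.7, 1.3, 2.5

ghost = c * relu(x @ w + b) - (c / s) * relu(x @ (s * w) + s * b)
print(np.max(np.abs(ghost)))  # 0.0 up to floating-point error
```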

Reproducing kernel Hilbert C*-module and kernel mean embeddings

no code implementations • 27 Jan 2021 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the properties of a reproducing kernel Hilbert space (RKHS).

Universal Approximation Property of Neural Ordinary Differential Equations

no code implementations • 4 Dec 2020 • Takeshi Teshima, Koichi Tojo, Masahiro Ikeda, Isao Ishikawa, Kenta Oono

The neural ordinary differential equation (NODE) is an invertible neural network architecture that is promising for its free-form Jacobian and the availability of a tractable Jacobian determinant estimator.
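
A minimal sketch of the NODE map itself, assuming a fixed-step RK4 integrator; invertibility comes from running the same flow backward in time (the paper's universality argument is a separate matter):

```python
import numpy as np

def node_flow(x0, f, t0=0.0, t1=1.0, steps=100):
    """Integrate dx/dt = f(t, x) with fixed-step RK4; invert the map by
    integrating from t1 back to t0 with the same solver."""
    h = (t1 - t0) / steps
    x, t = np.array(x0, dtype=float), t0
    for _ in range(steps):
        k1 = f(t, x)
        k2 = f(t + h/2, x + h/2 * k1)
        k3 = f(t + h/2, x + h/2 * k2)
        k4 = f(t + h, x + h * k3)
        x = x + h/6 * (k1 + 2*k2 + 2*k3 + k4)
        t += h
    return x

f = lambda t, x: np.tanh(x)               # a toy vector field
y = node_flow(np.array([0.5, -1.0]), f)   # forward map
x = node_flow(y, f, t0=1.0, t1=0.0)       # approximate inverse
print(np.allclose(x, [0.5, -1.0], atol=1e-6))
```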

A global universality of two-layer neural networks with ReLU activations

no code implementations • 20 Nov 2020 • Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano

In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in function spaces.
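
A small numerical companion to the density statement: with random hidden weights and least-squares outer weights (both assumptions for illustration, not the paper's proof technique), a two-layer ReLU network closely fits a smooth 1-D target.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 400)[:, None]
target = np.sin(3 * np.pi * x[:, 0])

m = 200                                        # hidden width
a = rng.normal(size=(1, m))                    # inner weights
b = rng.uniform(-1, 1, size=m)                 # inner biases
H = np.maximum(x @ a - b, 0.0)                 # ReLU features, shape (400, m)
c = np.linalg.lstsq(H, target, rcond=None)[0]  # outer weights

print(np.max(np.abs(H @ c - target)))          # small uniform error
```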

Kernel Mean Embeddings of Von Neumann-Algebra-Valued Measures

no code implementations • 29 Jul 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Yoshinobu Kawahara

Kernel mean embedding (KME) is a powerful tool to analyze probability measures for data, where the measures are conventionally embedded into a reproducing kernel Hilbert space (RKHS).
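
For the conventional scalar-valued case, the empirical KME and the induced RKHS distance (squared MMD) can be sketched as follows; the Gaussian kernel and sample sizes are illustrative assumptions, and the paper's von Neumann-algebra-valued setting is a strict generalization of this.

```python
import numpy as np

def gauss_kernel(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Squared RKHS distance between empirical kernel mean embeddings."""
    return (gauss_kernel(X, X, sigma).mean()
            - 2 * gauss_kernel(X, Y, sigma).mean()
            + gauss_kernel(Y, Y, sigma).mean())

rng = np.random.default_rng(0)
P = rng.normal(0, 1, size=(300, 2))
Q = rng.normal(0.5, 1, size=(300, 2))
print(mmd2(P, P[::-1]), mmd2(P, Q))  # ~0 vs. clearly positive
```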

Ridge Regression with Over-Parametrized Two-Layer Networks Converge to Ridgelet Spectrum

no code implementations • 7 Jul 2020 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

We develop a new theory of ridgelet transform, a wavelet-like integral transform that provides a powerful and general framework for the theoretical study of neural networks involving not only the ReLU but general activation functions.

Numerical Integration
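
For reference, the Euclidean ridgelet transform and its dual (the integral representation of a two-layer network) in a common notation from this line of work; conventions and admissibility conditions may differ slightly from the paper.

```latex
% Ridgelet transform of f with respect to \rho:
\mathcal{R}_{\rho} f(a, b)
  = \int_{\mathbb{R}^d} f(x)\, \overline{\rho(a \cdot x - b)}\, \mathrm{d}x,
  \qquad (a, b) \in \mathbb{R}^d \times \mathbb{R},
% and the dual operator realizing a continuous two-layer network:
S_{\sigma}[\gamma](x)
  = \int_{\mathbb{R}^d \times \mathbb{R}} \gamma(a, b)\, \sigma(a \cdot x - b)\,
    \mathrm{d}a\, \mathrm{d}b,
% for admissible pairs (\rho, \sigma),
% S_{\sigma}[\mathcal{R}_{\rho} f] = f up to an admissibility constant.
```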

Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators

no code implementations • NeurIPS 2020 • Takeshi Teshima, Isao Ishikawa, Koichi Tojo, Kenta Oono, Masahiro Ikeda, Masashi Sugiyama

We answer this question by showing a convenient criterion: a CF-INN is universal if its layers contain affine coupling and invertible linear functions as special cases.

Image Generation • Representation Learning
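
A minimal sketch of the affine coupling layer whose universality the paper establishes; the conditioner networks s_net and t_net below are toy stand-ins for learned MLPs.

```python
import numpy as np

def affine_coupling(x, s_net, t_net, inverse=False):
    """RealNVP-style affine coupling: the first half of x parametrizes
    an invertible elementwise affine map of the second half."""
    x1, x2 = np.split(x, 2, axis=-1)
    s, t = s_net(x1), t_net(x1)
    if not inverse:
        y2 = x2 * np.exp(s) + t
    else:
        y2 = (x2 - t) * np.exp(-s)
    return np.concatenate([x1, y2], axis=-1)

# toy conditioners (illustrative stand-ins for learned networks)
s_net = lambda h: np.tanh(h)
t_net = lambda h: 0.5 * h

x = np.random.randn(4)
y = affine_coupling(x, s_net, t_net)
x_rec = affine_coupling(y, s_net, t_net, inverse=True)
print(np.allclose(x, x_rec))  # True: the layer is exactly invertible
```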

Analysis via Orthonormal Systems in Reproducing Kernel Hilbert $C^*$-Modules and Applications

no code implementations • 2 Mar 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the properties of a reproducing kernel Hilbert space (RKHS).

Composition operators on reproducing kernel Hilbert spaces with analytic positive definite functions

no code implementations • 27 Nov 2019 • Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano

In this paper, we specify what functions induce the bounded composition operators on a reproducing kernel Hilbert space (RKHS) associated with an analytic positive definite function defined on $\mathbf{R}^d$.
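
For orientation, the operator in question is the standard composition operator; this is a hedged restatement of the setup, not the paper's characterization.

```latex
C_{\varphi} : \mathcal{H}_k \to \mathcal{H}_k,
\qquad C_{\varphi} f = f \circ \varphi,
% boundedness asks for a constant C with
\| f \circ \varphi \|_{\mathcal{H}_k} \le C \, \| f \|_{\mathcal{H}_k}
\quad \text{for all } f \in \mathcal{H}_k .
```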

Krylov Subspace Method for Nonlinear Dynamical Systems with Random Noise

no code implementations • 9 Sep 2019 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Yoichi Matsuo, Yoshinobu Kawahara

In this paper, we address a lifted representation of nonlinear dynamical systems with random noise based on transfer operators, and develop a novel Krylov subspace method for estimating the operators using finite data, with consideration of the unboundedness of operators.

Anomaly Detection
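
As background for the Krylov construction (in the classical finite-dimensional setting, not the paper's transfer-operator setting), the Arnoldi iteration builds an orthonormal Krylov basis and a small matrix whose eigenvalues approximate those of the operator:

```python
import numpy as np

def arnoldi(A, b, k):
    """Orthonormal basis Q of the Krylov subspace span{b, Ab, ..., A^{k-1} b}
    and the small Hessenberg matrix H with A Q_k ~= Q_{k+1} H."""
    n = len(b)
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        v = A @ Q[:, j]
        for i in range(j + 1):           # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:          # exact invariant subspace found
            return Q[:, :j + 1], H[:j + 1, :j]
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H

A = np.diag([3.0, 2.0, 1.0, 0.5]) + 0.1 * np.random.randn(4, 4)
Q, H = arnoldi(A, np.ones(4), 3)
print(np.linalg.eigvals(H[:3, :3]))  # Ritz values approximate eigenvalues of A
```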

Metric on random dynamical systems with vector-valued reproducing kernel Hilbert spaces

no code implementations • 17 Jun 2019 • Isao Ishikawa, Akinori Tanaka, Masahiro Ikeda, Yoshinobu Kawahara

We empirically illustrate our metric with synthetic data, and evaluate it in the context of the independence test for random processes.
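
One standard scalar-kernel statistic for such independence tests is HSIC; the sketch below is a common biased estimator with an illustrative Gaussian kernel, and is distinct from the paper's vector-valued RKHS metric.

```python
import numpy as np

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: a kernel statistic that is ~0 when
    the samples X and Y are (empirically) independent."""
    n = len(X)
    def gram(Z):
        d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2))
    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n**2

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
print(hsic(X, rng.normal(size=(200, 1))))             # independent: near 0
print(hsic(X, X + 0.1 * rng.normal(size=(200, 1))))   # dependent: larger
```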

The global optimum of shallow neural network is attained by ridgelet transform

no code implementations • 19 May 2018 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda, Kei Hagihara, Yoshihiro Sawano, Takuo Matsubara, Noboru Murata

We prove that the global minimum of the backpropagation (BP) training problem of neural networks with an arbitrary nonlinear activation is given by the ridgelet transform.
