Search Results for author: Masahiro Ikeda

Found 25 papers, 4 papers with code

Koopman operators with intrinsic observables in rigged reproducing kernel Hilbert spaces

1 code implementation • 4 Mar 2024 • Isao Ishikawa, Yuka Hashimoto, Masahiro Ikeda, Yoshinobu Kawahara

This paper presents a novel approach for estimating the Koopman operator defined on a reproducing kernel Hilbert space (RKHS) and its spectra.
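
As a rough point of reference (not the rigged-RKHS construction developed in the paper), a kernel-EDMD-style estimate of Koopman eigenvalues from snapshot pairs can be sketched as follows; the Gaussian kernel, the regularization, and the toy linear system are assumptions of this sketch.

```python
import numpy as np

def gauss_gram(A, B, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||A[i] - B[j]||^2 / (2 sigma^2))
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def koopman_eigenvalues(X, Y, sigma=1.0, reg=1e-8):
    """Kernel-EDMD sketch: project the Koopman operator onto the span of
    kernel sections at the snapshots X and diagonalize the finite matrix."""
    G = gauss_gram(X, X, sigma)                       # Gram matrix of inputs
    A = gauss_gram(X, Y, sigma)                       # cross-Gram with images Y = F(X)
    K = np.linalg.solve(G + reg * np.eye(len(X)), A)  # regularized Galerkin projection
    return np.linalg.eigvals(K)

# Toy linear system x -> diag(0.9, 0.5) x; its eigenvalues should show up in the spectrum.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
Y = X @ np.diag([0.9, 0.5])
print(sorted(koopman_eigenvalues(X, Y), key=abs, reverse=True)[:5])
```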

A unified Fourier slice method to derive ridgelet transform for a variety of depth-2 neural networks

no code implementations • 25 Feb 2024 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

For depth-2 fully-connected networks on a Euclidean space, the ridgelet transform has been derived in closed form, which allows us to describe how the parameters are distributed.

$C^*$-Algebraic Machine Learning: Moving in a New Direction

no code implementations • 4 Feb 2024 • Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri

Machine learning has a long collaborative tradition with several fields of mathematics, such as statistics, probability and linear algebra.

Deep Ridgelet Transform: Voice with Koopman Operator Proves Universality of Formal Deep Networks

no code implementations • 5 Oct 2023 • Sho Sonoda, Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda

We identify hidden layers inside a deep neural network (DNN) with group actions on the data domain, and formulate a formal deep network as a dual voice transform with respect to the Koopman operator, a linear representation of the group action.


Joint Group Invariant Functions on Data-Parameter Domain Induce Universal Neural Networks

no code implementations • 5 Oct 2023 • Sho Sonoda, Hideyuki Ishi, Isao Ishikawa, Masahiro Ikeda

The symmetry and geometry of input data are considered to be encoded in the internal data representation inside the neural network, but the specific encoding rule has been less investigated.


Learning in RKHM: a $C^*$-Algebraic Twist for Kernel Machines

no code implementations • 21 Oct 2022 • Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri

Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years.

Dynamic Structure Estimation from Bandit Feedback

no code implementations • 2 Jun 2022 • Motoya Ohnishi, Isao Ishikawa, Yuko Kuroki, Masahiro Ikeda

This work presents a novel method for structure estimation of an underlying dynamical system.

Universality of Group Convolutional Neural Networks Based on Ridgelet Analysis on Groups

no code implementations • 30 May 2022 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

We show the universality of depth-2 group convolutional neural networks (GCNNs) in a unified and constructive manner based on the ridgelet theory.

Fully-Connected Network on Noncompact Symmetric Space and Ridgelet Transform based on Helgason-Fourier Analysis

no code implementations • 3 Mar 2022 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

Based on the well-established framework of the Helgason-Fourier transform on the noncompact symmetric space, we present a fully-connected network and its associated ridgelet transform on the noncompact symmetric space, covering the hyperbolic neural network (HNN) and the SPDNet as special cases.

Koopman Spectrum Nonlinear Regulator and Provably Efficient Online Learning

1 code implementation • 30 Jun 2021 • Motoya Ohnishi, Isao Ishikawa, Kendall Lowrey, Masahiro Ikeda, Sham Kakade, Yoshinobu Kawahara

In this work, we present a novel paradigm of controlling nonlinear systems via the minimization of the Koopman spectrum cost: a cost over the Koopman operator of the controlled dynamics.

Reinforcement Learning (RL)

Ghosts in Neural Networks: Existence, Structure and Role of Infinite-Dimensional Null Space

no code implementations • 9 Jun 2021 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

In this paper, we present a structure theorem of the null space for a general class of neural networks.

Reproducing kernel Hilbert C*-module and kernel mean embeddings

no code implementations • 27 Jan 2021 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).
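
For orientation, the scalar-valued RKHS case referred to here reduces, via the representer theorem, to kernel ridge regression; a minimal sketch follows, where the Gaussian kernel and the toy data are assumptions, and the C*-algebraic RKHM generalization of the paper is not attempted.

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def fit_krr(X, y, lam=1e-2, sigma=1.0):
    # Representer theorem: the minimizer lies in span{k(x_i, .)}, so training
    # reduces to the linear system (K + lam * n * I) alpha = y.
    n = len(X)
    alpha = np.linalg.solve(rbf(X, X, sigma) + lam * n * np.eye(n), y)
    return lambda Xnew: rbf(Xnew, X, sigma) @ alpha

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
predict = fit_krr(X, y)
print(predict(np.array([[0.0], [1.5]])))  # approximately sin(0) and sin(1.5)
```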

Universal Approximation Property of Neural Ordinary Differential Equations

no code implementations • 4 Dec 2020 • Takeshi Teshima, Koichi Tojo, Masahiro Ikeda, Isao Ishikawa, Kenta Oono

Neural ordinary differential equations (NODEs) are an invertible neural network architecture that is promising for its free-form Jacobian and the availability of a tractable Jacobian determinant estimator.
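
As a plain illustration of why NODEs are invertible (the learned vector field is simply integrated backwards in time), here is a fixed-step Euler sketch with a small random MLP as the vector field; both the integrator and the toy field are assumptions of this sketch, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.standard_normal((16, 2)) * 0.1, np.zeros(16)
W2, b2 = rng.standard_normal((2, 16)) * 0.1, np.zeros(2)

def vector_field(x, t):
    # A small MLP defining dx/dt = f(x, t); autonomous here, t is unused.
    return W2 @ np.tanh(W1 @ x + b1) + b2

def odeint_euler(x0, t0=0.0, t1=1.0, steps=1000):
    # Forward map x(t1); integrating from t1 back to t0 approximately inverts it.
    x, dt = x0.copy(), (t1 - t0) / steps
    for k in range(steps):
        x = x + dt * vector_field(x, t0 + k * dt)
    return x

x0 = np.array([1.0, -0.5])
x1 = odeint_euler(x0)                       # forward pass
x0_rec = odeint_euler(x1, t0=1.0, t1=0.0)   # backward pass (approximate inverse)
print(x1, x0_rec)
```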

A global universality of two-layer neural networks with ReLU activations

no code implementations • 20 Nov 2020 • Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano

In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in function spaces.

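Concretely, universality is a density statement; in its most familiar form (uniform approximation on a compact set $K$, stated here for ReLU networks) it reads as below, while the paper studies density in more general function spaces.

$$\forall f \in C(K),\ \forall \varepsilon > 0,\ \exists N,\ \{(a_i, b_i, c_i)\}_{i=1}^{N}:\quad \sup_{x \in K}\Big| f(x) - \sum_{i=1}^{N} c_i\,\mathrm{ReLU}(a_i \cdot x - b_i) \Big| < \varepsilon.$$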

Kernel Mean Embeddings of Von Neumann-Algebra-Valued Measures

no code implementations • 29 Jul 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Yoshinobu Kawahara

Kernel mean embedding (KME) is a powerful tool to analyze probability measures for data, where the measures are conventionally embedded into a reproducing kernel Hilbert space (RKHS).
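
For orientation, the conventional Hilbert-space-valued KME of a sample is $\mu_X = \frac{1}{n}\sum_i k(x_i, \cdot)$, and distances between embeddings (the MMD) reduce to Gram-matrix averages. A minimal sketch with a Gaussian kernel and a biased estimator (both assumptions here), not the von-Neumann-algebra-valued construction of the paper:

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    """Squared RKHS distance between the empirical KMEs of X and Y:
    ||mu_X - mu_Y||^2 = mean k(X, X) + mean k(Y, Y) - 2 mean k(X, Y)."""
    return rbf(X, X, sigma).mean() + rbf(Y, Y, sigma).mean() - 2.0 * rbf(X, Y, sigma).mean()

rng = np.random.default_rng(3)
X = rng.standard_normal((300, 2))
Y = rng.standard_normal((300, 2)) + 1.0                 # shifted distribution
print(mmd2(X[:150], X[150:]), mmd2(X[:150], Y[:150]))   # small vs. clearly larger
```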

Ridge Regression with Over-Parametrized Two-Layer Networks Converge to Ridgelet Spectrum

no code implementations • 7 Jul 2020 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

We develop a new theory of ridgelet transform, a wavelet-like integral transform that provides a powerful and general framework for the theoretical study of neural networks involving not only the ReLU but general activation functions.

Inductive Bias • Numerical Integration • +1
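
For reference, the ridgelet transform $R$ mentioned in this abstract and its dual (network-building) transform $S$ take roughly the following form; normalization constants and the admissibility condition linking the ridgelet function $\psi$ to the activation $\sigma$ vary across papers and are omitted here.

$$R[f](a, b) = \int_{\mathbb{R}^d} f(x)\,\overline{\psi(a \cdot x - b)}\,dx, \qquad S[\gamma](x) = \int_{\mathbb{R}^d \times \mathbb{R}} \gamma(a, b)\,\sigma(a \cdot x - b)\,da\,db.$$

For admissible pairs $(\psi, \sigma)$, $S[R[f]]$ reconstructs $f$ up to a constant, and a depth-2 network is a finite discretization of $S[\gamma]$.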

Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators

no code implementations • NeurIPS 2020 • Takeshi Teshima, Isao Ishikawa, Koichi Tojo, Kenta Oono, Masahiro Ikeda, Masashi Sugiyama

We answer this question by showing a convenient criterion: a CF-INN is universal if its layers contain affine coupling and invertible linear functions as special cases.

Image Generation • Representation Learning
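
A minimal NumPy sketch of the affine coupling layer named in the criterion (RealNVP-style; the tiny fixed scale and shift maps are placeholders, not the paper's networks) shows why such layers are exactly invertible:

```python
import numpy as np

rng = np.random.default_rng(4)
Ws, Wt = rng.standard_normal((2, 2)) * 0.1, rng.standard_normal((2, 2)) * 0.1

def s(x1): return np.tanh(x1 @ Ws.T)   # log-scale map (placeholder network)
def t(x1): return x1 @ Wt.T            # shift map (placeholder network)

def coupling_forward(x):
    # Split; transform the second half conditioned on the untouched first half.
    x1, x2 = x[:2], x[2:]
    return np.concatenate([x1, x2 * np.exp(s(x1)) + t(x1)])

def coupling_inverse(y):
    # Exact inverse: the first half is unchanged, so s(y1), t(y1) are recomputable.
    y1, y2 = y[:2], y[2:]
    return np.concatenate([y1, (y2 - t(y1)) * np.exp(-s(y1))])

x = np.array([0.3, -1.2, 0.7, 2.0])
print(np.allclose(coupling_inverse(coupling_forward(x)), x))  # True
```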

Hypergraph Clustering Based on PageRank

1 code implementation • 15 Jun 2020 • Yuuki Takai, Atsushi Miyauchi, Masahiro Ikeda, Yuichi Yoshida

For both algorithms, we discuss theoretical guarantees on the conductance of the output vertex set.

Clustering
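
To make the quality measure concrete: for an ordinary graph, the conductance of a vertex set S is the cut weight leaving S divided by the smaller of the two volumes. A direct NumPy computation of this graph case (the paper works with the hypergraph generalization):

```python
import numpy as np

def conductance(adj, S):
    """Conductance of vertex set S in an undirected graph with adjacency matrix adj:
    phi(S) = cut(S, complement) / min(vol(S), vol(complement))."""
    adj = np.asarray(adj, dtype=float)
    mask = np.zeros(adj.shape[0], dtype=bool)
    mask[list(S)] = True
    cut = adj[mask][:, ~mask].sum()   # total weight of edges leaving S
    vol_S = adj[mask].sum()           # sum of degrees inside S
    vol_C = adj[~mask].sum()          # sum of degrees outside S
    return cut / min(vol_S, vol_C)

# Two triangles joined by a single edge: {0, 1, 2} is a low-conductance cluster.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1
print(conductance(A, {0, 1, 2}))   # 1/7, roughly 0.143
```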

Analysis via Orthonormal Systems in Reproducing Kernel Hilbert $C^*$-Modules and Applications

no code implementations • 2 Mar 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).

Composition operators on reproducing kernel Hilbert spaces with analytic positive definite functions

no code implementations • 27 Nov 2019 • Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano

In this paper, we specify what functions induce the bounded composition operators on a reproducing kernel Hilbert space (RKHS) associated with an analytic positive definite function defined on $\mathbf{R}^d$.
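
For context, the composition operator induced by a map $\varphi$ acts by precomposition, and boundedness on the RKHS $\mathcal{H}_k$ is the property being characterized:

$$C_\varphi f = f \circ \varphi, \qquad C_\varphi \ \text{bounded on}\ \mathcal{H}_k \iff \sup_{\|f\|_{\mathcal{H}_k} \le 1} \| f \circ \varphi \|_{\mathcal{H}_k} < \infty.$$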

Krylov Subspace Method for Nonlinear Dynamical Systems with Random Noise

no code implementations • 9 Sep 2019 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Yoichi Matsuo, Yoshinobu Kawahara

In this paper, we address a lifted representation of nonlinear dynamical systems with random noise based on transfer operators, and develop a novel Krylov subspace method for estimating the operators using finite data, with consideration of the unboundedness of operators.

Anomaly Detection
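
As background on the generic Krylov-subspace ingredient (not the transfer-operator-specific construction of the paper), the Arnoldi iteration builds an orthonormal basis of $K_m(A, b) = \mathrm{span}\{b, Ab, \dots, A^{m-1}b\}$ together with a small Hessenberg matrix whose eigenvalues approximate those of $A$; a minimal sketch:

```python
import numpy as np

def arnoldi(matvec, b, m):
    """Arnoldi iteration: orthonormal Krylov basis Q and Hessenberg matrix H
    with Q[:, :m].T @ A @ Q[:, :m] approximately equal to H[:m, :m]."""
    n = b.shape[0]
    Q, H = np.zeros((n, m + 1)), np.zeros((m + 1, m))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        v = matvec(Q[:, j])
        for i in range(j + 1):            # modified Gram-Schmidt orthogonalization
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:           # breakdown: subspace is invariant
            break
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H

rng = np.random.default_rng(5)
A = rng.standard_normal((50, 50)) / np.sqrt(50)
Q, H = arnoldi(lambda x: A @ x, rng.standard_normal(50), m=20)
print(np.sort(np.abs(np.linalg.eigvals(H[:20, :20])))[-3:])  # leading Ritz values
```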

Metric on random dynamical systems with vector-valued reproducing kernel Hilbert spaces

no code implementations • 17 Jun 2019 • Isao Ishikawa, Akinori Tanaka, Masahiro Ikeda, Yoshinobu Kawahara

We empirically illustrate our metric with synthetic data, and evaluate it in the context of the independence test for random processes.

The global optimum of shallow neural network is attained by ridgelet transform

no code implementations • 19 May 2018 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda, Kei Hagihara, Yoshihiro Sawano, Takuo Matsubara, Noboru Murata

We prove that the global minimum of the backpropagation (BP) training problem of neural networks with an arbitrary nonlinear activation is given by the ridgelet transform.
