
no code implementations • 15 Apr 2022 • Isao Ishikawa, Takeshi Teshima, Koichi Tojo, Kenta Oono, Masahiro Ikeda, Masashi Sugiyama

Invertible neural networks (INNs) are neural network architectures with invertibility by design.
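To illustrate invertibility by design, here is a minimal sketch of an affine coupling layer, a standard INN building block (this is a generic toy, not the specific construction studied in the paper; the scale and shift functions `s` and `t` are hypothetical stand-ins for learned subnetworks). The forward map transforms half of the coordinates conditioned on the other half, so the inverse is available in closed form:

```python
import numpy as np

def coupling_forward(x, s, t):
    # Split input; transform the second half conditioned on the first.
    x1, x2 = x[:2], x[2:]
    y2 = x2 * np.exp(s(x1)) + t(x1)
    return np.concatenate([x1, y2])

def coupling_inverse(y, s, t):
    # Closed-form inverse: the first half passes through unchanged,
    # so s(y1) and t(y1) can be recomputed exactly.
    y1, y2 = y[:2], y[2:]
    x2 = (y2 - t(y1)) * np.exp(-s(y1))
    return np.concatenate([y1, x2])

# Toy scale/shift functions (hypothetical stand-ins for learned MLPs).
s = lambda h: np.tanh(h)
t = lambda h: 0.5 * h

x = np.array([0.3, -1.2, 2.0, 0.7])
y = coupling_forward(x, s, t)
x_rec = coupling_inverse(y, s, t)
```

Note that invertibility holds for any choice of `s` and `t`, which is what "invertible by design" means here.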

no code implementations • 3 Mar 2022 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

Based on the well-established framework of the Helgason-Fourier transform on the noncompact symmetric space, we present a fully-connected network and its associated ridgelet transform on the noncompact symmetric space, covering the hyperbolic neural network (HNN) and the SPDNet as special cases.

1 code implementation • 30 Jun 2021 • Motoya Ohnishi, Isao Ishikawa, Kendall Lowrey, Masahiro Ikeda, Sham Kakade, Yoshinobu Kawahara

In this work, we present a novel paradigm of controlling nonlinear systems via the minimization of the Koopman spectrum cost: a cost over the Koopman operator of the controlled dynamics.
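As background for the Koopman spectrum idea, a finite-dimensional approximation of the Koopman operator can be estimated from snapshot pairs by least squares (the classic dynamic mode decomposition step); its eigenvalues give the spectrum that the proposed cost is defined over. This is a generic sketch with hypothetical linear dynamics, not the paper's control method:

```python
import numpy as np

rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])          # hypothetical linear dynamics
X = rng.normal(size=(2, 200))            # snapshots x_t
Y = A_true @ X                           # successor snapshots x_{t+1}

# K minimizes ||Y - K X||_F in least squares: K = Y X^+ (pseudoinverse).
K = Y @ np.linalg.pinv(X)
eigvals = np.linalg.eigvals(K)           # approximate Koopman spectrum
```

For truly linear, noise-free data the least-squares estimate recovers the dynamics matrix exactly; for nonlinear systems one would first lift the state through a dictionary of observables.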

no code implementations • 9 Jun 2021 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

In this paper, we present a structure theorem of the null space for a general class of neural networks.

no code implementations • 27 Jan 2021 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).
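A minimal example of the RKHS property in action is kernel ridge regression: by the representer theorem, the RKHS solution is a kernel expansion over the training points, so fitting reduces to a linear solve. This is a generic textbook sketch (kernel, bandwidth, and regularization are arbitrary choices), not the paper's construction:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=10.0):
    # Gaussian (RBF) kernel: k(x, z) = exp(-gamma * ||x - z||^2).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0])

# Representer theorem: f(x) = sum_i alpha_i k(x, x_i),
# with alpha = (K + lam I)^{-1} y.
lam = 1e-3
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

f = lambda Xnew: rbf_kernel(Xnew, X) @ alpha
pred = f(np.array([[0.5]]))[0]           # estimate of sin(1.5)
```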

no code implementations • 4 Dec 2020 • Takeshi Teshima, Koichi Tojo, Masahiro Ikeda, Isao Ishikawa, Kenta Oono

Neural ordinary differential equations (NODEs) are an invertible neural network architecture, promising for their free-form Jacobian and the availability of a tractable Jacobian determinant estimator.
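The invertibility of a NODE comes from running the learned vector field backward in time. A rough numerical sketch (forward Euler with a hypothetical vector field standing in for a trained network; the discretized inverse is only approximate, with error shrinking as the step size does):

```python
import numpy as np

def node_forward(x, vf, n_steps=1000, T=1.0):
    # Integrate dz/dt = vf(z) forward in time with explicit Euler steps.
    h = T / n_steps
    z = x.copy()
    for _ in range(n_steps):
        z = z + h * vf(z)
    return z

def node_inverse(y, vf, n_steps=1000, T=1.0):
    # Approximate inverse: integrate the same field backward in time.
    h = T / n_steps
    z = y.copy()
    for _ in range(n_steps):
        z = z - h * vf(z)
    return z

vf = lambda z: np.tanh(z)   # hypothetical stand-in for a learned field
x = np.array([0.5, -0.3])
y = node_forward(x, vf)
x_rec = node_inverse(y, vf)
```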

no code implementations • 20 Nov 2020 • Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano

In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in function spaces.
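Density of two-layer networks can be illustrated numerically: fix random hidden weights and fit only the output layer by least squares. As the width grows, the span of the hidden features approximates a smooth target increasingly well. This is a generic random-features sketch (all weights and scales below are arbitrary choices), not the paper's argument:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)[:, None]
target = np.sin(4 * x[:, 0])

width = 100
W = rng.normal(size=(1, width)) * 3.0   # random hidden weights
b = rng.uniform(-3, 3, size=width)      # random hidden biases
H = np.tanh(x @ W + b)                  # two-layer hidden features

# Fit the output layer only: min_c ||H c - target||_2.
c, *_ = np.linalg.lstsq(H, target, rcond=None)
err = np.max(np.abs(H @ c - target))
```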

no code implementations • 29 Jul 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Yoshinobu Kawahara

Kernel mean embedding (KME) is a powerful tool to analyze probability measures for data, where the measures are conventionally embedded into a reproducing kernel Hilbert space (RKHS).
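A concrete use of KME is the maximum mean discrepancy: the RKHS distance between the embeddings of two distributions, estimated from samples. A minimal sketch with a Gaussian kernel (the biased V-statistic estimator; kernel and bandwidth are arbitrary choices, not the paper's setting):

```python
import numpy as np

def rbf(X, Z, gamma=0.5):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(X, Z, gamma=0.5):
    # Squared RKHS distance between the empirical kernel mean
    # embeddings of the two samples (biased V-statistic estimator).
    return (rbf(X, X, gamma).mean()
            + rbf(Z, Z, gamma).mean()
            - 2 * rbf(X, Z, gamma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(size=(400, 1)), rng.normal(size=(400, 1)))
diff = mmd2(rng.normal(size=(400, 1)), rng.normal(loc=2.0, size=(400, 1)))
```

For samples from the same distribution the estimate is near zero; for well-separated distributions it is clearly positive.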

no code implementations • 7 Jul 2020 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda

We develop a new theory of ridgelet transform, a wavelet-like integral transform that provides a powerful and general framework for the theoretical study of neural networks involving not only the ReLU but general activation functions.
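One common way to write a ridgelet transform and its reconstruction formula, sketched here in a simplified form (writing $\eta$ for the activation and $\rho$ for a dual function satisfying an admissibility condition; the paper's formulation is more general):

```latex
\mathcal{R}_{\rho}f(a,b) = \int_{\mathbb{R}^d} f(x)\,\overline{\rho(a \cdot x - b)}\,\mathrm{d}x,
\qquad
f(x) = \iint_{\mathbb{R}^d \times \mathbb{R}} \mathcal{R}_{\rho}f(a,b)\,\eta(a \cdot x - b)\,\mathrm{d}a\,\mathrm{d}b,
```

up to a normalizing constant, so that a network $x \mapsto \int c(a,b)\,\eta(a \cdot x - b)\,\mathrm{d}a\,\mathrm{d}b$ with coefficients $c = \mathcal{R}_{\rho}f$ reproduces $f$.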

no code implementations • NeurIPS 2020 • Takeshi Teshima, Isao Ishikawa, Koichi Tojo, Kenta Oono, Masahiro Ikeda, Masashi Sugiyama

We answer this question by showing a convenient criterion: a CF-INN is universal if its layers contain affine coupling and invertible linear functions as special cases.

no code implementations • 2 Mar 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).

no code implementations • 27 Nov 2019 • Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano

In this paper, we specify what functions induce the bounded composition operators on a reproducing kernel Hilbert space (RKHS) associated with an analytic positive definite function defined on $\mathbf{R}^d$.

no code implementations • 9 Sep 2019 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Yoichi Matsuo, Yoshinobu Kawahara

In this paper, we address a lifted representation of nonlinear dynamical systems with random noise based on transfer operators, and develop a novel Krylov subspace method for estimating the operators using finite data, with consideration of the unboundedness of operators.

no code implementations • 17 Jun 2019 • Isao Ishikawa, Akinori Tanaka, Masahiro Ikeda, Yoshinobu Kawahara

We empirically illustrate our metric with synthetic data, and evaluate it in the context of independence testing for random processes.

2 code implementations • NeurIPS 2018 • Isao Ishikawa, Keisuke Fujii, Masahiro Ikeda, Yuka Hashimoto, Yoshinobu Kawahara

The development of a metric for structural data is a long-standing problem in pattern recognition and machine learning.

no code implementations • 19 May 2018 • Sho Sonoda, Isao Ishikawa, Masahiro Ikeda, Kei Hagihara, Yoshihiro Sawano, Takuo Matsubara, Noboru Murata

We prove that the global minimum of the backpropagation (BP) training problem of neural networks with an arbitrary nonlinear activation is given by the ridgelet transform.
