no code implementations • 9 Apr 2024 • Yuka Hashimoto, Ryuichiro Hataya
This interaction enables the circuits to share information with one another, which contributes to improved generalization performance in machine learning tasks.
1 code implementation • 4 Mar 2024 • Isao Ishikawa, Yuka Hashimoto, Masahiro Ikeda, Yoshinobu Kawahara
This paper presents a novel approach for estimating the Koopman operator defined on a reproducing kernel Hilbert space (RKHS) and its spectra.
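The standard data-driven route to such an estimate is kernel (extended) dynamic mode decomposition: build Gram matrices from snapshot pairs and read the Koopman spectrum off a finite-rank matrix. The sketch below is a generic kernel-EDMD illustration, not the estimator proposed in this paper; the Gaussian kernel, bandwidth, and regularization are illustrative choices.

```python
import numpy as np

def gaussian_gram(X, Y, sigma=0.5):
    """Gram matrix k(x_i, y_j) for a Gaussian RBF kernel."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def kernel_koopman_spectrum(X, Y, sigma=0.5, reg=1e-6):
    """Estimate Koopman eigenvalues from snapshot pairs (x_i, y_i = T(x_i)).

    With G[i, j] = k(x_i, x_j) and A[i, j] = k(y_i, x_j), a finite-rank
    representation of the Koopman operator is (G + reg*I)^(-1) A; its
    eigenvalues approximate part of the operator's spectrum.
    """
    G = gaussian_gram(X, X, sigma)
    A = gaussian_gram(Y, X, sigma)
    K = np.linalg.solve(G + reg * np.eye(len(X)), A)
    return np.linalg.eigvals(K)

# Toy linear dynamics x -> 0.9 x, observed through 200 sampled states.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
Y = 0.9 * X
eigs = kernel_koopman_spectrum(X, Y)
print(np.sort(np.abs(eigs))[-5:])  # magnitudes of the dominant estimates
```

The regularized solve stands in for the pseudo-inverse; without it the Gaussian Gram matrix is numerically rank-deficient and the spectrum estimate becomes unstable.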
no code implementations • 4 Feb 2024 • Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri
Machine learning has a long collaborative tradition with several fields of mathematics, such as statistics, probability and linear algebra.
no code implementations • 5 Oct 2023 • Sho Sonoda, Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda
We identify hidden layers inside a deep neural network (DNN) with group actions on the data domain, and formulate a formal deep network as a dual voice transform with respect to the Koopman operator, a linear representation of the group action.
no code implementations • 12 Feb 2023 • Yuka Hashimoto, Sho Sonoda, Isao Ishikawa, Atsushi Nitanda, Taiji Suzuki
Our bound is tighter than existing norm-based bounds when the condition numbers of weight matrices are small.
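The quantities involved are easy to inspect in practice: classical norm-based bounds scale with the product of the layers' spectral norms, while the favorable regime here is small condition numbers, i.e. near-orthogonal weight matrices. A minimal numpy sketch of these per-layer statistics (illustrative only, not the paper's bound):

```python
import numpy as np

def layer_stats(weights):
    """Spectral norm and condition number of each weight matrix.

    Classical norm-based bounds grow with the product of spectral norms
    sigma_max(W); condition-number-aware bounds are favorable when each
    kappa = sigma_max / sigma_min is small (near-orthogonal weights).
    """
    stats = []
    for W in weights:
        s = np.linalg.svd(W, compute_uv=False)  # singular values, descending
        stats.append((s[0], s[0] / s[-1]))      # (spectral norm, condition number)
    return stats

# An orthogonal matrix has all singular values 1, hence condition number 1.
Q, _ = np.linalg.qr(np.random.default_rng(1).normal(size=(4, 4)))
(norm, kappa), = layer_stats([Q])
print(round(norm, 6), round(kappa, 6))  # prints 1.0 1.0
```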
no code implementations • 26 Jan 2023 • Ryuichiro Hataya, Yuka Hashimoto
We propose a new generalization of neural networks based on noncommutative $C^*$-algebras.
no code implementations • 21 Oct 2022 • Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri
Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years.
no code implementations • 20 Jun 2022 • Yuka Hashimoto, Zhao Wang, Tomoko Matsui
We apply our framework to practical problems such as density estimation and few-shot learning and show that our framework enables us to learn features of data even with a limited number of samples.
no code implementations • 27 Jan 2021 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara
Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).
no code implementations • 29 Jul 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Yoshinobu Kawahara
Kernel mean embedding (KME) is a powerful tool to analyze probability measures for data, where the measures are conventionally embedded into a reproducing kernel Hilbert space (RKHS).
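The conventional RKHS embedding mentioned here is, for a sample $\{x_i\}$, the mean element $\mu_X = \frac{1}{n}\sum_i k(x_i,\cdot)$; distances between such embeddings give the maximum mean discrepancy (MMD). A minimal sketch of the standard (biased) estimator, with an illustrative Gaussian kernel:

```python
import numpy as np

def mmd2(X, Y, sigma=1.0):
    """Squared MMD: RKHS distance between the kernel mean embeddings of two samples.

    ||mu_X - mu_Y||^2 expands into three Gram-matrix averages:
    mean k(x, x') + mean k(y, y') - 2 mean k(x, y).
    """
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2))
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(300, 1))
Y = rng.normal(0.0, 1.0, size=(300, 1))   # same distribution as X
Z = rng.normal(3.0, 1.0, size=(300, 1))   # shifted distribution
print(mmd2(X, Y) < mmd2(X, Z))  # prints True: matching samples embed closer
```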
no code implementations • 2 Mar 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara
Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).
no code implementations • 9 Sep 2019 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Yoichi Matsuo, Yoshinobu Kawahara
In this paper, we consider a lifted representation of nonlinear dynamical systems with random noise based on transfer operators, and develop a novel Krylov subspace method for estimating the operators from finite data, taking the unboundedness of the operators into account.
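The Krylov-subspace idea can be illustrated with the textbook Arnoldi iteration: repeated applications of an operator build an orthonormal basis of a Krylov subspace, and the eigenvalues of the resulting small Hessenberg matrix (Ritz values) approximate the operator's dominant spectrum. This is a generic sketch of that mechanism, not the paper's estimator for unbounded transfer operators.

```python
import numpy as np

def arnoldi(A, b, m):
    """Arnoldi iteration: orthonormal basis Q of span{b, Ab, ..., A^(m-1) b}
    and the Hessenberg matrix H = Q* A Q.

    Eigenvalues of H (Ritz values) approximate dominant eigenvalues of A,
    using only repeated applications of the operator.
    """
    n = len(b)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):          # Gram-Schmidt against previous vectors
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:         # invariant subspace found: stop early
            return Q[:, : j + 1], H[: j + 1, : j + 1]
        Q[:, j + 1] = v / H[j + 1, j]
    return Q[:, :m], H[:m, :m]

A = np.diag([3.0, 2.0, 1.0, 0.5])
Q, H = arnoldi(A, np.ones(4), m=4)
ritz = np.sort(np.linalg.eigvals(H).real)
print(ritz)  # at full dimension m = n the spectrum of A is recovered exactly
```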
2 code implementations • NeurIPS 2018 • Isao Ishikawa, Keisuke Fujii, Masahiro Ikeda, Yuka Hashimoto, Yoshinobu Kawahara
The development of a metric for structural data is a long-term problem in pattern recognition and machine learning.