Search Results for author: Yuka Hashimoto

Found 13 papers, 2 papers with code

Quantum Circuit $C^*$-algebra Net

no code implementations • 9 Apr 2024 • Yuka Hashimoto, Ryuichiro Hataya

This interaction enables the circuits to share information with one another, which contributes to improved generalization performance in machine learning tasks.

Image Classification • Quantum Machine Learning

Koopman operators with intrinsic observables in rigged reproducing kernel Hilbert spaces

1 code implementation • 4 Mar 2024 • Isao Ishikawa, Yuka Hashimoto, Masahiro Ikeda, Yoshinobu Kawahara

This paper presents a novel approach for estimating the Koopman operator defined on a reproducing kernel Hilbert space (RKHS) and its spectra.
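
As background for the entry above, the sketch below shows a plain regularized kernel EDMD estimator in an ordinary RKHS, which is one common way to estimate a Koopman operator and its spectrum from snapshot data. It is not the paper's rigged-RKHS / intrinsic-observable construction, conventions for the Gram matrices vary across the literature, and the Gaussian kernel, regularization value, and function names are illustrative choices.

    import numpy as np

    def gaussian_gram(X, Y, sigma=1.0):
        """Gram matrix k(x_i, y_j) for the Gaussian (RBF) kernel."""
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def kernel_edmd(X, Y, sigma=1.0, reg=1e-6):
        """Estimate a finite-dimensional Koopman matrix from snapshot pairs
        (x_t, y_t = x_{t+1}): in the span of k(., x_i) the Koopman action is
        approximated by (G + reg I)^{-1} A with G = k(X, X), A = k(X, Y)."""
        G = gaussian_gram(X, X, sigma)      # Gram matrix of the inputs
        A = gaussian_gram(X, Y, sigma)      # cross Gram with the time-shifted data
        n = G.shape[0]
        return np.linalg.solve(G + reg * np.eye(n), A)

    # toy example: noisy rotation of the plane
    rng = np.random.default_rng(0)
    theta = 0.1
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    X = rng.standard_normal((200, 2))
    Y = X @ R.T + 0.01 * rng.standard_normal((200, 2))

    K_hat = kernel_edmd(X, Y, sigma=1.0, reg=1e-6)
    eigvals = np.linalg.eigvals(K_hat)      # approximate Koopman spectrum
    print(np.sort(np.abs(eigvals))[-5:])    # leading eigenvalue magnitudes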

$C^*$-Algebraic Machine Learning: Moving in a New Direction

no code implementations • 4 Feb 2024 • Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri

Machine learning has a long collaborative tradition with several fields of mathematics, such as statistics, probability and linear algebra.

Deep Ridgelet Transform: Voice with Koopman Operator Proves Universality of Formal Deep Networks

no code implementations • 5 Oct 2023 • Sho Sonoda, Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda

We identify hidden layers inside a deep neural network (DNN) with group actions on the data domain, and formulate a formal deep network as a dual voice transform with respect to the Koopman operator, a linear representation of the group action.

LEMMA
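
For readers new to ridgelet analysis, the display below recalls the classical shallow-network ridgelet pair that this line of work builds on; it is standard background with the usual symbols ($\sigma$ the activation, $\psi$ the analyzing function, $\gamma$ a coefficient density), not the paper's deep, group-action formulation.

    $$f(x) \;=\; \int_{\mathbb{R}^m \times \mathbb{R}} \gamma(a,b)\,\sigma(a \cdot x - b)\,\mathrm{d}a\,\mathrm{d}b, \qquad
    (R_\psi f)(a,b) \;=\; \int_{\mathbb{R}^m} f(x)\,\overline{\psi(a \cdot x - b)}\,\mathrm{d}x .$$

Under an admissibility condition on $(\sigma,\psi)$, the reconstruction $f \propto \int (R_\psi f)(a,b)\,\sigma(a \cdot x - b)\,\mathrm{d}a\,\mathrm{d}b$ holds, which is the universality statement that the paper lifts from shallow networks to formal deep networks via the Koopman operator.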

Koopman-based generalization bound: New aspect for full-rank weights

no code implementations • 12 Feb 2023 • Yuka Hashimoto, Sho Sonoda, Isao Ishikawa, Atsushi Nitanda, Taiji Suzuki

Our bound is tighter than existing norm-based bounds when the condition numbers of weight matrices are small.
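
To make the claim above concrete (this is only the standard definition, not the bound from the paper itself): the condition number of a weight matrix $W_j$ is

    $$\kappa(W_j) \;=\; \frac{\sigma_{\max}(W_j)}{\sigma_{\min}(W_j)} \;\ge\; 1,$$

so $\kappa(W_j)$ is small exactly when $W_j$ is well conditioned, for instance $\kappa(W_j) = 1$ for (scaled) orthogonal weights; this full-rank, well-conditioned regime is where the abstract reports the Koopman-based bound to be tighter than purely norm-based bounds.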

Learning in RKHM: a $C^*$-Algebraic Twist for Kernel Machines

no code implementations • 21 Oct 2022 • Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri

Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years.
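
As a reference point for the entry above: supervised learning in an ordinary RKHS is typified by kernel ridge regression, sketched below. The paper instead works in a reproducing kernel Hilbert $C^*$-module (RKHM), which this sketch does not cover; the kernel choice, regularization value, and function names are illustrative assumptions.

    import numpy as np

    def rbf_kernel(X, Y, sigma=1.0):
        """Gaussian kernel Gram matrix between the rows of X and Y."""
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def fit_krr(X, y, sigma=1.0, lam=1e-3):
        """Kernel ridge regression: alpha = (K + lam * n * I)^{-1} y."""
        n = X.shape[0]
        K = rbf_kernel(X, X, sigma)
        return np.linalg.solve(K + lam * n * np.eye(n), y)

    def predict_krr(alpha, X_train, X_test, sigma=1.0):
        """Predict f(x) = sum_i alpha_i k(x, x_i) at each test point."""
        return rbf_kernel(X_test, X_train, sigma) @ alpha

    # toy usage: regress a noisy sine wave
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
    alpha = fit_krr(X, y, sigma=0.5, lam=1e-3)
    print(predict_krr(alpha, X, np.array([[0.0], [1.5]]), sigma=0.5))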

$C^*$-algebra Net: A New Approach Generalizing Neural Network Parameters to $C^*$-algebra

no code implementations • 20 Jun 2022 • Yuka Hashimoto, Zhao Wang, Tomoko Matsui

We apply our framework to practical problems such as density estimation and few-shot learning and show that our framework enables us to learn features of data even with a limited number of samples.

Density Estimation • Few-Shot Learning

Reproducing kernel Hilbert $C^*$-module and kernel mean embeddings

no code implementations • 27 Jan 2021 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).

Kernel Mean Embeddings of Von Neumann-Algebra-Valued Measures

no code implementations • 29 Jul 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Yoshinobu Kawahara

Kernel mean embedding (KME) is a powerful tool to analyze probability measures for data, where the measures are conventionally embedded into a reproducing kernel Hilbert space (RKHS).
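
For context on the entry above: the conventional scalar-kernel, RKHS-valued kernel mean embedding that the paper generalizes to von Neumann-algebra-valued measures looks as follows in its empirical form, where distances between embeddings give the (biased) maximum mean discrepancy estimator. The kernel and names are illustrative, not from the paper.

    import numpy as np

    def gaussian_k(X, Y, sigma=1.0):
        """Gaussian kernel Gram matrix."""
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def mmd2(X, Y, sigma=1.0):
        """Squared MMD between the empirical KMEs of samples X ~ P and Y ~ Q:
        ||mu_P - mu_Q||^2 = E k(x,x') - 2 E k(x,y) + E k(y,y') (biased estimator)."""
        Kxx = gaussian_k(X, X, sigma)
        Kyy = gaussian_k(Y, Y, sigma)
        Kxy = gaussian_k(X, Y, sigma)
        return Kxx.mean() - 2.0 * Kxy.mean() + Kyy.mean()

    rng = np.random.default_rng(0)
    P1 = rng.normal(0.0, 1.0, size=(500, 1))
    P2 = rng.normal(0.0, 1.0, size=(500, 1))   # same distribution as P1
    Q = rng.normal(0.5, 1.0, size=(500, 1))    # shifted distribution
    print(mmd2(P1, P2, sigma=1.0))             # close to 0
    print(mmd2(P1, Q, sigma=1.0))              # noticeably larger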

Analysis via Orthonormal Systems in Reproducing Kernel Hilbert $C^*$-Modules and Applications

no code implementations • 2 Mar 2020 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the property of reproducing kernel Hilbert space (RKHS).

Krylov Subspace Method for Nonlinear Dynamical Systems with Random Noise

no code implementations • 9 Sep 2019 • Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Yoichi Matsuo, Yoshinobu Kawahara

In this paper, we address a lifted representation of nonlinear dynamical systems with random noise based on transfer operators, and develop a novel Krylov subspace method for estimating the operators from finite data while taking their unboundedness into account.

Anomaly Detection
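
To illustrate the Krylov-subspace idea behind the entry above in its most generic form, the sketch below runs a plain Arnoldi iteration, which builds a Krylov subspace from matrix-vector products alone and approximates leading eigenvalues by Ritz values. It is standard numerical linear algebra, not the paper's RKHS-based construction for transfer operators with random noise; the toy matrix and all names are illustrative.

    import numpy as np

    def arnoldi(apply_A, v0, m):
        """Arnoldi iteration: orthonormal basis Q of the Krylov subspace
        span{v0, A v0, ..., A^{m-1} v0} and the small matrix H = Q* A Q,
        using only matrix-vector products apply_A(v)."""
        n = v0.shape[0]
        Q = np.zeros((n, m + 1))
        H = np.zeros((m + 1, m))
        Q[:, 0] = v0 / np.linalg.norm(v0)
        for j in range(m):
            w = apply_A(Q[:, j])
            for i in range(j + 1):              # Gram-Schmidt against previous vectors
                H[i, j] = Q[:, i] @ w
                w = w - H[i, j] * Q[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] < 1e-12:             # invariant subspace found, stop early
                return Q[:, : j + 1], H[: j + 1, : j + 1]
            Q[:, j + 1] = w / H[j + 1, j]
        return Q[:, :m], H[:m, :m]

    # toy usage: Ritz values of a symmetric random matrix approximate its extreme eigenvalues
    rng = np.random.default_rng(0)
    B = rng.standard_normal((300, 300)) / np.sqrt(300)
    A = B + B.T
    Q, H = arnoldi(lambda v: A @ v, rng.standard_normal(300), m=30)
    ritz = np.linalg.eigvals(H)
    print(np.sort(np.abs(ritz))[-3:])                       # leading Ritz magnitudes
    print(np.sort(np.abs(np.linalg.eigvals(A)))[-3:])       # true leading magnitudes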
