no code implementations • 5 Feb 2024 • Yuji Kawamata, Ryoki Motai, Yukihiko Okada, Akira Imakura, Tetsuya Sakurai
Second, our method enables collaborative estimation across different parties as well as multiple time points, because the dimensionality-reduced intermediate representations can be accumulated.
no code implementations • 25 Oct 2023 • Dai Hai Nguyen, Tetsuya Sakurai, Hiroshi Mamitsuka
Notably, the optimization techniques, namely black-box VI and natural-gradient VI, can be reinterpreted as specific instances of the proposed Wasserstein gradient descent.
no code implementations • 1 Aug 2023 • Akihiro Mizoguchi, Anna Bogdanova, Akira Imakura, Tetsuya Sakurai
However, federated learning suffers from low accuracy in non-identically and independently distributed (non-IID) settings, i.e., when the data partitioning has a large label bias, and is therefore considered unsuitable for compound datasets, which tend to exhibit such bias.
1 code implementation • 31 Jul 2023 • Dai Hai Nguyen, Tetsuya Sakurai
We consider a general optimization problem of minimizing a composite objective functional defined over a class of probability distributions.
no code implementations • 6 Dec 2022 • Anna Bogdanova, Akira Imakura, Tetsuya Sakurai, Tomoya Fujii, Teppei Sakamoto, Hiroyuki Abe
Transparency of machine learning models used for decision support in various industries is becoming essential for ensuring their ethical use.
1 code implementation • 15 Nov 2022 • Meng Huang, Jiangtao Ma, Changzhou Long, Junpeng Zhang, Xiucai Ye, Tetsuya Sakurai
To analyze lncRNA regulation at the level of individual cells, however, we focus on single-cell RNA-sequencing (scRNA-seq) data instead of bulk data.
1 code implementation • 15 Nov 2022 • Meng Huang, Xiucai Ye, Tetsuya Sakurai
In this paper, to unveil interpretable development-specific gene signatures in human PFC, we propose a novel gene selection method, named Interpretable Causality Gene Selection (ICGS), which adopts a Bayesian Network (BN) to represent causality between multiple gene variables and a development variable.
1 code implementation • 8 Sep 2022 • Yifan He, Claus Aranha, Tetsuya Sakurai
We compare the proposed method with PushGP, as well as a method using subprograms manually extracted by a human.
no code implementations • 31 Aug 2022 • Akira Imakura, Tetsuya Sakurai, Yukihiko Okada, Tomoya Fujii, Teppei Sakamoto, Hiroyuki Abe
This study then proposes a non-readily identifiable DC analysis that shares only non-readily identifiable data across multiple medical datasets containing personal information.
no code implementations • 26 Aug 2022 • Akira Imakura, Masateru Kihira, Yukihiko Okada, Tetsuya Sakurai
DC analysis centralizes individually constructed dimensionality-reduced intermediate representations and realizes integrated analysis via collaboration representations without sharing the original data.
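A minimal sketch of this intermediate-representation idea, assuming each party reduces its data with a local PCA and a shareable anchor dataset is used to align the representations at the center (the names `anchor`, `local_reduce`, and the least-squares alignment are illustrative choices, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two parties hold disjoint samples of the same 20-dimensional data.
X1 = rng.normal(size=(50, 20))
X2 = rng.normal(size=(40, 20))

# A shareable anchor dataset (e.g., random or public data) visible to all parties.
anchor = rng.normal(size=(30, 20))

def local_reduce(X, dim=5):
    """Party-local dimensionality reduction via PCA; only the reduced
    representations (not X or the projection matrix) leave the party."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    F = Vt[:dim].T                      # local projection, kept private
    return X @ F, anchor @ F            # intermediate reps of data and anchor

(Z1, A1), (Z2, A2) = local_reduce(X1), local_reduce(X2)

# Centralized step: map party 2's representation into party 1's space
# by least-squares alignment of the shared anchor representations.
G2, *_ = np.linalg.lstsq(A2, A1, rcond=None)
Z_all = np.vstack([Z1, Z2 @ G2])        # collaboration representation
```

The integrated analysis (classification, regression, etc.) then runs on `Z_all`, while the original `X1` and `X2` never leave their parties.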
no code implementations • 16 Aug 2022 • Yuji Kawamata, Ryoki Motai, Yukihiko Okada, Akira Imakura, Tetsuya Sakurai
Many existing methods for distributed data focus on resolving the lack of subjects (samples) and can only reduce random errors in estimating treatment effects.
no code implementations • 1 Aug 2022 • Dai Hai Nguyen, Tetsuya Sakurai
We consider the optimization problem of minimizing an objective functional that admits a variational form and is defined over probability distributions on a constrained domain, a setting that poses challenges to both theoretical analysis and algorithmic design.
1 code implementation • 18 Jun 2021 • Hongmin Li, Xiucai Ye, Akira Imakura, Tetsuya Sakurai
In LSEC, an efficient ensemble-generation framework based on large-scale spectral clustering is designed to produce various base clusterings at low computational complexity.
1 code implementation • 30 Apr 2021 • Hongmin Li, Xiucai Ye, Akira Imakura, Tetsuya Sakurai
In this paper, we propose a divide-and-conquer based large-scale spectral clustering method to strike a good balance between efficiency and effectiveness.
Ranked #2 on Image/Document Clustering on pendigits
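A minimal sketch of the divide-and-conquer idea, assuming a landmark-based variant: instead of building the full n×n affinity matrix, a small set of landmarks is sampled, a thin point-to-landmark affinity is built, and the spectral embedding comes from its SVD (the landmark sampling and the tiny k-means are illustrative, not the authors' exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(0, 0.3, size=(100, 2)),
               rng.normal(3, 0.3, size=(100, 2))])

def landmark_spectral_clustering(X, k=2, n_landmarks=20, sigma=1.0, iters=20):
    # Divide: sample landmarks so only an n x m affinity (m << n) is needed.
    L = X[rng.choice(len(X), n_landmarks, replace=False)]
    d2 = ((X[:, None, :] - L[None, :, :]) ** 2).sum(-1)
    B = np.exp(-d2 / (2 * sigma ** 2))
    B /= B.sum(1, keepdims=True)
    # Conquer: spectral embedding from the thin SVD of the bipartite graph.
    D = np.diag(1.0 / np.sqrt(B.sum(0)))
    U, _, _ = np.linalg.svd(B @ D, full_matrices=False)
    emb = U[:, :k]
    # Simple k-means on the embedding.
    centers = emb[rng.choice(len(emb), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((emb[:, None] - centers[None]) ** 2).sum(-1), 1)
        centers = np.array([emb[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels

labels = landmark_spectral_clustering(X)
```

The cost is dominated by the n×m affinity and its SVD, avoiding the O(n²) affinity and O(n³) eigendecomposition of plain spectral clustering.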
no code implementations • 27 Jan 2021 • Akira Imakura, Anna Bogdanova, Takaya Yamazoe, Kazumasa Omote, Tetsuya Sakurai
Distributed data analysis without revealing the individual data has recently attracted significant attention in several applications.
1 code implementation • 20 Nov 2020 • Hongmin Li, Xiucai Ye, Akira Imakura, Tetsuya Sakurai
Instead of directly using the clustering results obtained from each base spectral clustering algorithm, the proposed method learns a robust representation of the graph Laplacian by ensemble learning from the spectral embedding of each base spectral clustering algorithm.
Ranked #1 on Image/Document Clustering on Wine
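A minimal sketch of combining spectral embeddings rather than hard labels, assuming the base algorithms are the same pipeline run with different kernel widths (a stand-in for genuinely different base clusterers) and the ensemble similarity is the average of the embeddings' Gram matrices:

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, size=(60, 2)),
               rng.normal(3, 0.3, size=(60, 2))])

def spectral_embedding(X, sigma, k=2):
    """Eigenvectors of the smallest eigenvalues of the normalized Laplacian."""
    d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    d = W.sum(1)
    L_sym = np.eye(len(X)) - W / np.sqrt(np.outer(d, d))
    _, vecs = np.linalg.eigh(L_sym)     # eigh returns ascending eigenvalues
    return vecs[:, :k]

# "Base algorithms": different kernel widths; combine their embeddings,
# not their hard cluster labels.
sigmas = [0.5, 1.0, 2.0]
S = sum(E @ E.T for E in (spectral_embedding(X, s) for s in sigmas)) / len(sigmas)
```

`S` acts as a robust ensemble similarity (high within clusters, near zero across them), and any spectral clustering can then be run on the graph it defines.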
no code implementations • 13 Nov 2020 • Anna Bogdanova, Akie Nakai, Yukihiko Okada, Akira Imakura, Tetsuya Sakurai
Dimensionality Reduction is a commonly used element in a machine learning pipeline that helps to extract important features from high-dimensional data.
no code implementations • 9 Nov 2020 • Akira Imakura, Hiroaki Inaba, Yukihiko Okada, Tetsuya Sakurai
This paper proposes an interpretable, non-model-sharing collaborative data analysis method as a form of federated learning, an emerging technology for analyzing distributed data.
no code implementations • 16 Oct 2019 • Momo Matsuda, Keiichi Morikuni, Akira Imakura, Xiucai Ye, Tetsuya Sakurai
Irregular features disrupt the desired classification.
no code implementations • 20 Feb 2019 • Akira Imakura, Tetsuya Sakurai
In this paper, we propose a data collaboration analysis method for distributed datasets.
no code implementations • 25 Dec 2018 • Yusei Miura, Tetsuya Sakurai, Claus Aranha, Toshiya Senda, Ryuichi Kato, Yusuke Yamada
We compared our crystallization image recognition method with a high-precision method based on Inception-V3.
no code implementations • 18 May 2018 • Momo Matsuda, Keiichi Morikuni, Tetsuya Sakurai
Spectral dimensionality reduction methods enable linear separations of complex data with high-dimensional features in a reduced space.
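A minimal illustration of that claim, assuming a standard Laplacian-eigenmaps-style embedding (the concentric-circles data and the Gaussian kernel width are illustrative choices): two classes that no line separates in the original space become separable by a single threshold on one spectral coordinate.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two concentric circles: not linearly separable in the original 2-D space.
t = rng.uniform(0, 2 * np.pi, 200)
r = np.repeat([1.0, 3.0], 100)
X = np.c_[r * np.cos(t), r * np.sin(t)] + rng.normal(0, 0.05, (200, 2))

# Spectral embedding via the normalized graph Laplacian of a Gaussian kernel.
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
W = np.exp(-d2 / (2 * 0.4 ** 2))
d = W.sum(1)
L_sym = np.eye(200) - W / np.sqrt(np.outer(d, d))
_, vecs = np.linalg.eigh(L_sym)
emb = vecs[:, 1]                 # first nontrivial eigenvector: one coordinate

# In the reduced space a single linear threshold separates the circles.
labels = (emb > np.median(emb)).astype(int)
```

The kernel graph connects each circle densely within itself and only weakly across circles, so the Fiedler vector is nearly constant on each circle with opposite signs.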
no code implementations • 16 May 2016 • Tetsuya Sakurai, Akira Imakura, Yuto Inoue, Yasunori Futamura
In this paper, we propose a novel approach for computing weight matrices of fully-connected DNNs by using two types of semi-nonnegative matrix factorizations (semi-NMFs).
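A minimal sketch of the semi-NMF building block, assuming the standard multiplicative updates of Ding, Li & Jordan (2010): a mixed-sign matrix X is factored as F·Gᵀ with F unconstrained and G nonnegative. This shows the factorization only, not the authors' full scheme for assembling DNN weight matrices from it.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(30, 50))          # mixed-sign data matrix

def semi_nmf(X, k=5, iters=200, eps=1e-9):
    """Semi-NMF: X ~ F @ G.T with F unconstrained, G >= 0."""
    n, m = X.shape
    F = rng.normal(size=(n, k))
    G = rng.uniform(size=(m, k))
    pos = lambda A: (np.abs(A) + A) / 2    # elementwise positive part
    neg = lambda A: (np.abs(A) - A) / 2    # elementwise negative part
    for _ in range(iters):
        # F update: exact least squares given G.
        F = X @ G @ np.linalg.pinv(G.T @ G)
        # G update: multiplicative rule that preserves nonnegativity.
        XtF, FtF = X.T @ F, F.T @ F
        G *= np.sqrt((pos(XtF) + G @ neg(FtF)) /
                     (neg(XtF) + G @ pos(FtF) + eps))
    return F, G

F, G = semi_nmf(X)
err = np.linalg.norm(X - F @ G.T) / np.linalg.norm(X)
```

Because the F step is an exact least-squares solve and the G step is a monotone multiplicative update, the reconstruction error decreases across iterations while G stays nonnegative.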