no code implementations • 5 Feb 2024 • Yuji Kawamata, Ryoki Motai, Yukihiko Okada, Akira Imakura, Tetsuya Sakurai
Second, our method enables collaborative estimation not only between different parties but also across multiple time points, because the dimensionality-reduced intermediate representations can be accumulated.
no code implementations • 1 Aug 2023 • Akihiro Mizoguchi, Anna Bogdanova, Akira Imakura, Tetsuya Sakurai
However, federated learning suffers from low accuracy in non-independent and identically distributed (non-IID) settings, i.e., when the data partitioning has a large label bias, and is therefore considered unsuitable for compound datasets, which tend to exhibit such bias.
no code implementations • 6 Dec 2022 • Anna Bogdanova, Akira Imakura, Tetsuya Sakurai, Tomoya Fujii, Teppei Sakamoto, Hiroyuki Abe
Transparency of machine learning models used for decision support in various industries is becoming essential for ensuring their ethical use.
no code implementations • 31 Aug 2022 • Akira Imakura, Tetsuya Sakurai, Yukihiko Okada, Tomoya Fujii, Teppei Sakamoto, Hiroyuki Abe
This study then proposes a non-readily identifiable DC analysis that shares only non-readily identifiable data across multiple medical datasets containing personal information.
no code implementations • 26 Aug 2022 • Akira Imakura, Masateru Kihira, Yukihiko Okada, Tetsuya Sakurai
DC analysis centralizes individually constructed dimensionality-reduced intermediate representations and realizes integrated analysis via collaboration representations without sharing the original data.
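The mechanism described above can be sketched as a minimal toy example. Here two parties share only dimensionality-reduced intermediate representations of their private data plus representations of a shareable anchor dataset, and a center aligns the two representations with a simple least-squares map before running a centralized analysis. All names, the random-projection reduction, and the alignment scheme are illustrative stand-ins, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two parties hold disjoint samples with the same features.
X1 = rng.normal(size=(50, 20))     # party 1's private data
X2 = rng.normal(size=(60, 20))     # party 2's private data
X_anc = rng.normal(size=(30, 20))  # shareable anchor data known to both

def reduce_fn(seed, dim=5):
    # Each party picks its own dimensionality-reduction map (here a random
    # projection); only reduced representations ever leave the party.
    P = np.random.default_rng(seed).normal(size=(20, dim))
    return lambda X: X @ P

f1, f2 = reduce_fn(1), reduce_fn(2)

# Parties centralize only intermediate representations, never the raw data.
Y1, Y2 = f1(X1), f2(X2)
A1, A2 = f1(X_anc), f2(X_anc)

# Center: map party 2's representation toward party 1's via least squares on
# the anchor representations (a simple stand-in for the collaboration map).
G, *_ = np.linalg.lstsq(A2, A1, rcond=None)
Y2_in_1 = Y2 @ G

# Integrated analysis can now run on the stacked collaboration representation.
Y = np.vstack([Y1, Y2_in_1])
print(Y.shape)  # (110, 5)
```

In the actual method the collaboration representation is constructed more carefully, but the flow — local reduction, sharing reduced data plus an anchor, central alignment — is the same.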
no code implementations • 16 Aug 2022 • Yuji Kawamata, Ryoki Motai, Yukihiko Okada, Akira Imakura, Tetsuya Sakurai
Many existing methods for distributed data focus on resolving the lack of subjects (samples) and can only reduce random errors in estimating treatment effects.
no code implementations • 27 Mar 2022 • Toyotaro Suzumura, Akiyoshi Sugiki, Hiroyuki Takizawa, Akira Imakura, Hiroshi Nakamura, Kenjiro Taura, Tomohiro Kudoh, Toshihiro Hanawa, Yuji Sekiya, Hiroki Kobayashi, Shin Matsushima, Yohei Kuga, Ryo Nakamura, Renhe Jiang, Junya Kawase, Masatoshi Hanai, Hiroshi Miyazaki, Tsutomu Ishizaki, Daisuke Shimotoku, Daisuke Miyamoto, Kento Aida, Atsuko Takefusa, Takashi Kurimoto, Koji Sasayama, Naoya Kitagawa, Ikki Fujiwara, Yusuke Tanimura, Takayuki Aoki, Toshio Endo, Satoshi Ohshima, Keiichiro Fukazawa, Susumu Date, Toshihiro Uchibayashi
The growing amount of data and advances in data science have created a need for a new kind of cloud platform that provides users with flexibility, strong security, and the ability to couple with supercomputers and edge devices through high-performance networks.
no code implementations • CVPR 2022 • Ryuki Yamamoto, Hidekata Hontani, Akira Imakura, Tatsuya Yokota
Tensor completion using the multiway delay-embedding transform (MDT), also known as Hankelization, suffers from large memory requirements and high computational cost despite its high potential for image modeling.
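The memory cost mentioned above is easy to see in the one-dimensional case of delay embedding, where a signal of length n becomes a Hankel matrix of roughly n·tau entries. This is an illustrative sketch of that 1-D case only, not the paper's multiway transform:

```python
import numpy as np

def hankelize(x, tau):
    """Delay-embed a 1-D signal into a Hankel matrix whose columns are
    length-tau sliding windows (the 1-D case of delay embedding)."""
    n = len(x)
    return np.stack([x[i:i + tau] for i in range(n - tau + 1)], axis=1)

x = np.arange(6.0)      # toy signal [0, 1, 2, 3, 4, 5]
H = hankelize(x, tau=3)
print(H.shape)          # (3, 4): anti-diagonals of H are constant
```

Each sample appears in up to tau windows, so the embedded object is roughly tau times larger than the input; applied along every mode of a tensor, this blow-up multiplies, which is the cost the paper targets.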
1 code implementation • 18 Jun 2021 • Hongmin Li, Xiucai Ye, Akira Imakura, Tetsuya Sakurai
In LSEC, an efficient ensemble generation framework based on large-scale spectral clustering is designed to generate diverse base clusterings at low computational complexity.
1 code implementation • 30 Apr 2021 • Hongmin Li, Xiucai Ye, Akira Imakura, Tetsuya Sakurai
In this paper, we propose a divide-and-conquer based large-scale spectral clustering method to strike a good balance between efficiency and effectiveness.
Ranked #2 on Image/Document Clustering on pendigits
no code implementations • 27 Jan 2021 • Akira Imakura, Anna Bogdanova, Takaya Yamazoe, Kazumasa Omote, Tetsuya Sakurai
Distributed data analysis without revealing the individual data has recently attracted significant attention in several applications.
1 code implementation • 20 Nov 2020 • Hongmin Li, Xiucai Ye, Akira Imakura, Tetsuya Sakurai
Instead of directly using the clustering results obtained from each base spectral clustering algorithm, the proposed method learns a robust representation of the graph Laplacian by ensemble learning from the spectral embeddings of the base spectral clustering algorithms.
Ranked #1 on Image/Document Clustering on Wine
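The idea of combining spectral embeddings rather than hard clustering labels can be sketched with plain numpy. Here several base embeddings are computed from Gaussian affinities at different scales and concatenated into one combined representation, which a final clustering step would consume; the sigma values and the concatenation scheme are illustrative choices, not the paper's ensemble-learning procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated toy clusters in 2-D.
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])

def spectral_embedding(X, sigma, k=2):
    # Gaussian affinity, symmetric normalized Laplacian, bottom-k eigenvectors.
    D2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    W = np.exp(-D2 / (2 * sigma ** 2))
    d = W.sum(1)
    L = np.eye(len(X)) - W / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(L)   # ascending eigenvalues
    return vecs[:, :k]

# Ensemble: pool embeddings from several base scales instead of trusting a
# single base clustering result.
embs = [spectral_embedding(X, s) for s in (0.5, 1.0, 2.0)]
E = np.hstack(embs)  # combined representation for the final clustering step
print(E.shape)       # (40, 6)
```

A final k-means on `E` (or, in the paper, a learned robust Laplacian) then produces the consensus clustering.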
no code implementations • 13 Nov 2020 • Anna Bogdanova, Akie Nakai, Yukihiko Okada, Akira Imakura, Tetsuya Sakurai
Dimensionality Reduction is a commonly used element in a machine learning pipeline that helps to extract important features from high-dimensional data.
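As a reminder of the basic building block the entry above refers to, a standard dimensionality reduction step (here PCA via SVD, purely illustrative) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))  # toy high-dimensional data

# PCA via SVD: project centered data onto the top-2 principal directions.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T               # reduced 2-D features
print(Z.shape)                  # (100, 2)
```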
no code implementations • 9 Nov 2020 • Akira Imakura, Hiroaki Inaba, Yukihiko Okada, Tetsuya Sakurai
This paper proposes an interpretable non-model-sharing collaborative data analysis method as a form of federated learning, an emerging technology for analyzing distributed data.
no code implementations • 16 Oct 2019 • Momo Matsuda, Keiichi Morikuni, Akira Imakura, Xiucai Ye, Tetsuya Sakurai
Irregular features disrupt the desired classification.
no code implementations • 20 Feb 2019 • Akira Imakura, Tetsuya Sakurai
In this paper, we propose a data collaboration analysis method for distributed datasets.
no code implementations • 16 May 2016 • Tetsuya Sakurai, Akira Imakura, Yuto Inoue, Yasunori Futamura
In this paper, we propose a novel approach for computing weight matrices of fully-connected DNNs by using two types of semi-nonnegative matrix factorizations (semi-NMFs).
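The paper applies two types of semi-NMFs to compute DNN weight matrices; as an illustration of the underlying factorization only (not the paper's training method), here is a standard semi-NMF with the multiplicative updates of Ding et al., which factor mixed-sign data X as F Gᵀ with G nonnegative and F unconstrained:

```python
import numpy as np

def semi_nmf(X, k, iters=100, seed=0):
    """Factor X ~ F @ G.T with G >= 0 and F unconstrained, using standard
    semi-NMF multiplicative updates (Ding et al.)."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((m, k))                        # nonnegative init
    pos = lambda A: (np.abs(A) + A) / 2           # positive part
    neg = lambda A: (np.abs(A) - A) / 2           # negative part
    for _ in range(iters):
        # F: exact least-squares solve given G (F may have any sign).
        F = X @ G @ np.linalg.pinv(G.T @ G)
        # G: multiplicative update that preserves nonnegativity.
        XtF, FtF = X.T @ F, F.T @ F
        G *= np.sqrt((pos(XtF) + G @ neg(FtF)) /
                     (neg(XtF) + G @ pos(FtF) + 1e-12))
    return F, G

X = np.random.default_rng(1).normal(size=(30, 10))  # mixed-sign data
F, G = semi_nmf(X, k=4)
err = np.linalg.norm(X - F @ G.T) / np.linalg.norm(X)
print(round(err, 3))  # relative reconstruction error
```

Because the least-squares step for F never increases the residual, the relative error stays at most 1; in the paper this factorization is repurposed so that the factors serve as fully-connected layer weights.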