Search Results for author: Shota Harada

Found 5 papers, 1 paper with code

Cluster Entropy: Active Domain Adaptation in Pathological Image Segmentation

no code implementations • 26 Apr 2023 • Xiaoqing Liu, Kengo Araki, Shota Harada, Akihiko Yoshizawa, Kazuhiro Terada, Mariyo Kurata, Naoki Nakajima, Hiroyuki Abe, Tetsuo Ushiku, Ryoma Bise

Domain shift is an important problem in pathological image segmentation: a network trained on a source domain (images collected at a specific hospital) does not work well on a target domain (images from different hospitals) because the image features differ.

Tasks: Image Segmentation, Semantic Segmentation, +2

Cluster-Guided Semi-Supervised Domain Adaptation for Imbalanced Medical Image Classification

no code implementations • 2 Mar 2023 • Shota Harada, Ryoma Bise, Kengo Araki, Akihiko Yoshizawa, Kazuhiro Terada, Mariyo Kurata, Naoki Nakajima, Hiroyuki Abe, Tetsuo Ushiku, Seiichi Uchida

Semi-supervised domain adaptation builds a classifier for a target domain by adapting a classifier trained on another (source) domain, using many unlabeled samples and only a small number of labeled samples from the target domain (a generic sketch of this setup appears below).

Tasks: Clustering, Domain Adaptation, +4
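
As a rough illustration of the semi-supervised domain adaptation setting described above (not the paper's cluster-guided method), the following PyTorch-style sketch combines a supervised loss on the few labeled target samples with an entropy-minimization term on the many unlabeled ones. The model, optimizer, batches, and the weight lam are hypothetical placeholders.

```python
import torch.nn.functional as F

def ssda_step(model, optimizer, labeled_batch, unlabeled_batch, lam=0.1):
    """One generic semi-supervised DA update: supervised loss on the small
    labeled target batch plus entropy minimization on the unlabeled batch."""
    x_l, y_l = labeled_batch      # few labeled target samples
    x_u = unlabeled_batch         # many unlabeled target samples

    # Supervised cross-entropy on the labeled target samples
    sup_loss = F.cross_entropy(model(x_l), y_l)

    # Entropy regularizer: encourage confident predictions on unlabeled samples
    probs_u = F.softmax(model(x_u), dim=1)
    ent_loss = -(probs_u * probs_u.clamp_min(1e-8).log()).sum(dim=1).mean()

    loss = sup_loss + lam * ent_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```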

Character-independent font identification

1 code implementation • 24 Jan 2020 • Daichi Haraguchi, Shota Harada, Brian Kenji Iwana, Yuto Shinahara, Seiichi Uchida

Moreover, we analyzed the relationship between character classes and font identification accuracy.

Tasks: Font Recognition

Biosignal Generation and Latent Variable Analysis with Recurrent Generative Adversarial Networks

no code implementations • 17 May 2019 • Shota Harada, Hideaki Hayashi, Seiichi Uchida

GAN-based generative models learn only a mapping from a random input distribution to the distribution of the training data. The relationship between the input and the generated data is therefore unclear, and the characteristics of the generated data cannot be controlled (a minimal generator sketch appears below).

Tasks: Data Augmentation, Time Series, +1
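
To make the point about GAN-based generation concrete, here is a minimal recurrent generator in the spirit of the paper's setting (biosignal sequences). The architecture and sizes are assumptions, not the model from the paper: an LSTM maps a noise sequence to a synthetic signal, and nothing in this mapping ties individual noise dimensions to interpretable signal characteristics.

```python
import torch
import torch.nn as nn

class RecurrentGenerator(nn.Module):
    """Maps a sequence of random noise vectors to a synthetic 1-D signal."""
    def __init__(self, noise_dim=16, hidden_dim=64, signal_dim=1):
        super().__init__()
        self.rnn = nn.LSTM(noise_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, signal_dim)

    def forward(self, z):
        h, _ = self.rnn(z)              # z: (batch, seq_len, noise_dim)
        return torch.tanh(self.out(h))  # signal: (batch, seq_len, signal_dim)

# The noise-to-signal mapping is learned end to end, so which noise component
# controls which signal property (amplitude, rhythm, ...) remains opaque.
g = RecurrentGenerator()
fake = g(torch.randn(8, 128, 16))  # eight synthetic signals of length 128
```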
