no code implementations • 19 Jun 2023 • Kenta Oono, Nontawat Charoenphakdee, Kotatsu Bito, Zhengyan Gao, Yoshiaki Ota, Shoichiro Yamaguchi, Yohei Sugawara, Shin-ichi Maeda, Kunihiko Miyoshi, Yuki Saito, Koki Tsuda, Hiroshi Maruyama, Kohei Hayashi
In this paper, we propose Virtual Human Generative Model (VHGM), a machine learning model for estimating attributes related to healthcare, lifestyle, and personality.
1 code implementation • 4 Aug 2020 • Masanori Koyama, Shoichiro Yamaguchi
Popular approaches in this field hypothesize that such a predictor should be an invariant predictor, one that captures the mechanism remaining constant across environments.
no code implementations • 19 Nov 2019 • Homanga Bharadhwaj, Shoichiro Yamaguchi, Shin-ichi Maeda
Efficiently transferring learned policies to an unknown environment with changed dynamics configurations, in the presence of motor noise, is essential for operating robots in the real world; our work is a novel attempt in that direction.
no code implementations • 8 Oct 2019 • Kyo Kutsuzawa, Hitoshi Kusano, Ayaka Kume, Shoichiro Yamaguchi
The appropriate motions can be found efficiently by searching the latent space of the trained cGANs instead of the motion space, while avoiding poor local optima.
no code implementations • ICLR Workshop LLD 2019 • Takuya Shimada, Shoichiro Yamaguchi, Kohei Hayashi, Sosuke Kobayashi
Data augmentation by mixing samples, such as Mixup, has been widely used, typically for classification tasks.
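As background, the standard Mixup recipe referenced here forms convex combinations of pairs of inputs and their one-hot labels, with the mixing coefficient drawn from a Beta distribution. A minimal sketch (the function name and `alpha` default are illustrative, not from this paper):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Mix two training examples and their one-hot labels.

    The same coefficient lam ~ Beta(alpha, alpha) is applied to
    both the inputs and the labels.
    """
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y, lam

# Toy example: mix two 4-dim inputs with one-hot labels over 3 classes.
x1, y1 = np.ones(4), np.array([1.0, 0.0, 0.0])
x2, y2 = np.zeros(4), np.array([0.0, 1.0, 0.0])
x, y, lam = mixup(x1, y1, x2, y2)
```

Because the labels are mixed with the same coefficient as the inputs, the resulting soft label still sums to one and encodes how much of each source example the mixed input contains.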
no code implementations • NeurIPS 2019 • Kenji Fukumizu, Shoichiro Yamaguchi, Yoh-ichi Mototake, Mirai Tanaka
We theoretically study the landscape of the training error for neural networks in overparameterized cases.
no code implementations • ICLR 2019 • Shoichiro Yamaguchi, Masanori Koyama
We propose Distributional Concavity (DC) regularization for Generative Adversarial Networks (GANs), a functional gradient-based method that promotes the entropy of the generator distribution and works against mode collapse.
1 code implementation • 8 Feb 2019 • Yoshihiro Nagano, Shoichiro Yamaguchi, Yasuhiro Fujita, Masanori Koyama
Hyperbolic space is a geometry that is known to be well-suited for representation learning of data with an underlying hierarchical structure.
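To illustrate why hyperbolic space suits hierarchies: in the Poincaré ball model, geodesic distances grow rapidly toward the boundary, so a tree's exponentially many leaves can be embedded with low distortion. A minimal sketch of the standard Poincaré-ball distance (an illustration of the geometry, not this paper's specific model):

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance between two points inside the unit Poincare ball."""
    sq = np.sum((u - v) ** 2)          # squared Euclidean distance
    du = 1.0 - np.sum(u ** 2)          # conformal factors shrink near the boundary,
    dv = 1.0 - np.sum(v ** 2)          # inflating distances there
    return np.arccosh(1.0 + 2.0 * sq / (du * dv))

origin = np.zeros(2)
mid = np.array([0.5, 0.0])
near_boundary = np.array([0.95, 0.0])
```

For a point at radius r, the distance from the origin is log((1 + r) / (1 - r)), so points near the boundary are far from everything else, which is exactly the room a deep hierarchy needs.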