Search Results for author: Shoichiro Yamaguchi

Found 8 papers, 2 papers with code

Virtual Human Generative Model: Masked Modeling Approach for Learning Human Characteristics

no code implementations • 19 Jun 2023 • Kenta Oono, Nontawat Charoenphakdee, Kotatsu Bito, Zhengyan Gao, Yoshiaki Ota, Shoichiro Yamaguchi, Yohei Sugawara, Shin-ichi Maeda, Kunihiko Miyoshi, Yuki Saito, Koki Tsuda, Hiroshi Maruyama, Kohei Hayashi

In this paper, we propose Virtual Human Generative Model (VHGM), a machine learning model for estimating healthcare, lifestyle, and personality attributes.
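The masked-modeling idea named in the title can be illustrated with a minimal sketch: hide a random subset of a person's attributes and have the model predict them from the visible rest. The attribute values and mask ratio below are invented for illustration and are not from the paper.

```python
import numpy as np

def mask_attributes(x, mask_ratio, rng):
    """Hide a random subset of attributes; NaN marks a masked (hidden) entry."""
    mask = rng.random(x.shape) < mask_ratio   # True = attribute is hidden
    x_masked = np.where(mask, np.nan, x)      # model must predict the NaNs
    return x_masked, mask

rng = np.random.default_rng(1)
attrs = np.array([172.0, 65.0, 36.6, 120.0])  # e.g. height, weight, temp, BP
x_masked, mask = mask_attributes(attrs, 0.5, rng)
```

A model trained this way learns the joint dependencies between attributes, so at inference time any subset can be given as input and the remainder estimated.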

When is invariance useful in an Out-of-Distribution Generalization problem?

1 code implementation • 4 Aug 2020 • Masanori Koyama, Shoichiro Yamaguchi

Popular approaches in this field use the hypothesis that such a predictor shall be an invariant predictor that captures the mechanism that remains constant across environments.

Out-of-Distribution Generalization

MANGA: Method Agnostic Neural-policy Generalization and Adaptation

no code implementations • 19 Nov 2019 • Homanga Bharadhwaj, Shoichiro Yamaguchi, Shin-ichi Maeda

Efficiently transferring learned policies to an unknown environment with changed dynamics configurations, in the presence of motor noise, is essential for operating robots in the real world; our work is a novel attempt in that direction.

Imitation Learning Reinforcement Learning (RL)

Motion Generation Considering Situation with Conditional Generative Adversarial Networks for Throwing Robots

no code implementations • 8 Oct 2019 • Kyo Kutsuzawa, Hitoshi Kusano, Ayaka Kume, Shoichiro Yamaguchi

The appropriate motions can be found efficiently by searching the latent space of the trained cGANs instead of the motion space, while avoiding poor local optima.
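The search strategy described here, optimizing in the generator's low-dimensional latent space rather than the raw motion space, can be sketched as follows. The `generator` below is a toy stand-in for a trained cGAN, and the hill-climbing search is one generic choice of optimizer, not the paper's method.

```python
import numpy as np

def generator(z, c):
    """Placeholder for a trained cGAN generator: latent z + condition c -> motion."""
    return np.tanh(z) * c           # toy stand-in, not the paper's model

def motion_cost(motion, target):
    return np.sum((motion - target) ** 2)

def search_latent(target, c, n_iter=200, pop=32, seed=0):
    """Hill climbing over the low-dimensional latent space instead of
    the high-dimensional motion space."""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=4)
    best_z, best_cost = z, motion_cost(generator(z, c), target)
    for _ in range(n_iter):
        candidates = z + 0.1 * rng.normal(size=(pop, 4))
        costs = [motion_cost(generator(zc, c), target) for zc in candidates]
        i = int(np.argmin(costs))
        if costs[i] < best_cost:          # accept only improving steps
            best_z, best_cost = candidates[i], costs[i]
            z = candidates[i]
    return best_z, best_cost

# target motion produced by some (unknown to the searcher) latent vector
target = generator(np.array([0.5, -0.3, 0.2, 0.1]), 2.0)
best_z, best_cost = search_latent(target, 2.0)
```

Because the latent space is smooth and much smaller than the motion space, even a simple local search like this recovers a low-cost motion quickly.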


Distributional Concavity Regularization for GANs

no code implementations • ICLR 2019 • Shoichiro Yamaguchi, Masanori Koyama

We propose Distributional Concavity (DC) regularization for Generative Adversarial Networks (GANs), a functional gradient-based method that promotes the entropy of the generator distribution and works against mode collapse.

A Wrapped Normal Distribution on Hyperbolic Space for Gradient-Based Learning

1 code implementation • 8 Feb 2019 • Yoshihiro Nagano, Shoichiro Yamaguchi, Yasuhiro Fujita, Masanori Koyama

Hyperbolic space is a geometry that is known to be well-suited for representation learning of data with an underlying hierarchical structure.
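The wrapped normal construction of this paper samples a Gaussian in the tangent space at the hyperboloid's origin, parallel-transports the vector to the desired mean, and maps it onto the manifold with the exponential map. A minimal NumPy sketch in the Lorentz model (variable names are ours):

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: <x, y>_L = -x0*y0 + x1*y1 + ... + xn*yn
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def sample_wrapped_normal(mu, sigma, rng):
    """One sample from a wrapped normal on the hyperboloid H^n, mean mu."""
    n = mu.shape[0] - 1
    mu0 = np.zeros(n + 1)
    mu0[0] = 1.0                                  # origin of the hyperboloid
    # 1) sample in the tangent space at the origin (first coordinate is 0)
    v = np.concatenate([[0.0], rng.normal(0.0, sigma, size=n)])
    # 2) parallel-transport the tangent vector from mu0 to mu
    alpha = -lorentz_inner(mu0, mu)
    u = v + lorentz_inner(mu - alpha * mu0, v) / (alpha + 1.0) * (mu0 + mu)
    # 3) push onto the manifold with the exponential map at mu
    norm_u = np.sqrt(max(lorentz_inner(u, u), 1e-12))
    return np.cosh(norm_u) * mu + np.sinh(norm_u) * u / norm_u

rng = np.random.default_rng(0)
t = 0.7
mu = np.array([np.cosh(t), np.sinh(t), 0.0])      # a point on H^2
x = sample_wrapped_normal(mu, 0.5, rng)
# every sample satisfies <x, x>_L = -1, i.e. it lies on the hyperboloid
```

Because each step (Gaussian sampling, parallel transport, exponential map) is differentiable with a tractable density change, the construction supports gradient-based learning, which is the paper's point.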

Representation Learning
