Search Results for author: Hiroaki Mikami

Found 3 papers, 1 paper with code

A Scaling Law for Syn-to-Real Transfer: How Much Is Your Pre-training Effective?

no code implementations · 29 Sep 2021 · Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, Shin-ichi Maeda, Kohei Hayashi

Synthetic-to-real transfer learning is a framework in which a synthetically generated dataset is used to pre-train a model to improve its performance on real vision tasks.

Image Generation · Transfer Learning
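The paper's central question lends itself to a quick illustration: scaling-law studies typically model fine-tuning performance as a saturating power law in the pre-training dataset size. The sketch below fits such a generic form with SciPy; the data points, the functional form, and the initial guesses are illustrative assumptions, not the paper's measured law.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical measurements: fine-tuning test loss after pre-training on
# n synthetic images. The numbers are made up for illustration.
n = np.array([1e3, 1e4, 1e5, 1e6])
loss = np.array([1.14, 0.75, 0.53, 0.41])

def power_law(n, a, alpha, c):
    # Generic saturating power law: loss decays as n^-alpha toward a floor c.
    return a * n ** (-alpha) + c

(a, alpha, c), _ = curve_fit(power_law, n, loss, p0=(5.0, 0.25, 0.2))
print(f"fit: loss(n) = {a:.2f} * n^-{alpha:.2f} + {c:.2f}")

# Extrapolate: predicted loss if we pre-trained on 10x more synthetic data.
print(f"predicted loss at n=1e7: {power_law(1e7, a, alpha, c):.3f}")
```

Fitting and extrapolating a curve like this is one way to answer "how much is your pre-training effective?" for a given budget of synthetic data.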

A Scaling Law for Synthetic-to-Real Transfer: How Much Is Your Pre-training Effective?

1 code implementation · 25 Aug 2021 · Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, Shin-ichi Maeda, Kohei Hayashi

Synthetic-to-real transfer learning is a framework in which a synthetically generated dataset is used to pre-train a model to improve its performance on real vision tasks.

Image Generation · Transfer Learning
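As a concrete picture of the framework both versions of this paper describe, here is a minimal two-phase sketch in PyTorch: pre-train on a large synthetic dataset, then fine-tune on a smaller real one. The random stand-in data, placeholder CNN, and hyperparameters are assumptions for illustration, not the paper's setup.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def make_dataset(n, num_classes=10):
    # Stand-in for an image dataset: 32x32 RGB tensors with random labels.
    x = torch.randn(n, 3, 32, 32)
    y = torch.randint(0, num_classes, (n,))
    return TensorDataset(x, y)

def train(model, loader, epochs, lr):
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()

model = nn.Sequential(  # small CNN as a placeholder backbone
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 10),
)

# Phase 1: pre-train on the (large) synthetic dataset.
synthetic = DataLoader(make_dataset(4096), batch_size=128, shuffle=True)
train(model, synthetic, epochs=2, lr=0.1)

# Phase 2: fine-tune on the (small) real dataset, typically at a lower LR.
real = DataLoader(make_dataset(512), batch_size=64, shuffle=True)
train(model, real, epochs=2, lr=0.01)
```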

Massively Distributed SGD: ImageNet/ResNet-50 Training in a Flash

no code implementations · 13 Nov 2018 · Hiroaki Mikami, Hisahiro Suganuma, Pongsakorn U-chupala, Yoshiki Tanaka, Yuichi Kageyama

Scaling distributed deep learning to a massive GPU cluster is challenging due to the instability of large mini-batch training and the overhead of gradient synchronization.
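The gradient-synchronization pattern the abstract refers to can be shown in a few lines: each worker computes gradients on its own mini-batch shard, and the gradients are averaged across workers (all-reduce) before every update. The sketch below runs two CPU processes with PyTorch's gloo backend; the model, data, and world size are illustrative stand-ins, not the paper's setup.

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn

def worker(rank, world_size):
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    torch.manual_seed(0)             # identical model init on every worker
    model = nn.Linear(10, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    torch.manual_seed(rank + 1)      # different data shard per worker

    for step in range(3):
        x = torch.randn(32, 10)      # this rank's mini-batch shard
        loss = model(x).pow(2).mean()
        opt.zero_grad()
        loss.backward()
        # Gradient synchronization: average gradients across all workers.
        # (DistributedDataParallel does this automatically and overlaps it
        # with the backward pass; it is written out here for clarity.)
        for p in model.parameters():
            dist.all_reduce(p.grad, op=dist.ReduceOp.SUM)
            p.grad /= world_size
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    mp.spawn(worker, args=(2,), nprocs=2)
```

The effective mini-batch size is the per-worker batch times the number of workers, which is why training at massive cluster scale pushes into the large-batch regime the abstract calls unstable.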
