Search Results for author: Pongsakorn U-chupala

Found 2 papers, 1 paper with code

Cuttlefish: Low-Rank Model Training without All the Tuning

1 code implementation • 4 May 2023 • Hongyi Wang, Saurabh Agarwal, Pongsakorn U-chupala, Yoshiki Tanaka, Eric P. Xing, Dimitris Papailiopoulos

Cuttlefish leverages the observation that after a few epochs of full-rank training, the stable rank (i.e., an approximation of the true rank) of each layer stabilizes at a constant value.
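For context, the stable rank has a standard closed form: the squared Frobenius norm of a matrix divided by its squared spectral norm. The sketch below is illustrative only, not Cuttlefish's released code; the function name and toy matrices are assumptions made for the example.

```python
import numpy as np

def stable_rank(weight: np.ndarray) -> float:
    """Stable rank: squared Frobenius norm over squared spectral norm."""
    singular_values = np.linalg.svd(weight, compute_uv=False)
    frobenius_sq = np.sum(singular_values ** 2)
    spectral_sq = singular_values[0] ** 2  # largest singular value, squared
    return float(frobenius_sq / spectral_sq)

# Toy example: a nearly low-rank matrix has a small stable rank.
rng = np.random.default_rng(0)
low_rank = rng.standard_normal((256, 8)) @ rng.standard_normal((8, 256))
noisy = low_rank + 0.01 * rng.standard_normal((256, 256))
print(stable_rank(noisy))  # close to the underlying rank of 8
```

Per the abstract, it is the stabilization of this quantity across epochs that signals when a layer's rank has settled, which is what makes automatic low-rank training feasible without manual tuning.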

Massively Distributed SGD: ImageNet/ResNet-50 Training in a Flash

no code implementations • 13 Nov 2018 • Hiroaki Mikami, Hisahiro Suganuma, Pongsakorn U-chupala, Yoshiki Tanaka, Yuichi Kageyama

Scaling distributed deep learning to a massive GPU cluster is challenging due to the instability of large mini-batch training and the overhead of gradient synchronization.
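To illustrate the gradient-synchronization step that the abstract identifies as overhead, here is a minimal data-parallel all-reduce sketch in PyTorch. This is a generic pattern, not the paper's system; the single-process gloo setup exists only so the snippet runs without a cluster.

```python
import os
import torch
import torch.distributed as dist

def synchronize_gradients(model: torch.nn.Module) -> None:
    # One all-reduce per parameter tensor; at thousands of GPUs this
    # communication can dominate the time of each training step.
    world_size = dist.get_world_size()
    for param in model.parameters():
        if param.grad is not None:
            dist.all_reduce(param.grad, op=dist.ReduceOp.SUM)
            param.grad /= world_size  # average gradients across workers

if __name__ == "__main__":
    # Single-process gloo group so the sketch runs standalone.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)
    model = torch.nn.Linear(4, 4)
    model(torch.randn(2, 4)).sum().backward()
    synchronize_gradients(model)
    dist.destroy_process_group()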
