Search Results for author: Satoki Ishikawa

Found 2 papers, 1 paper with code

On the Parameterization of Second-Order Optimization Effective Towards the Infinite Width

no code implementations · 19 Dec 2023 · Satoki Ishikawa, Ryo Karakida

Second-order optimization has been developed to accelerate the training of deep neural networks, and it is being applied to increasingly large-scale models.

ASDL: A Unified Interface for Gradient Preconditioning in PyTorch

2 code implementations · 8 May 2023 · Kazuki Osawa, Satoki Ishikawa, Rio Yokota, Shigang Li, Torsten Hoefler

Gradient preconditioning is a key technique for integrating second-order information into gradients, improving and extending gradient-based learning algorithms.
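The general pattern the abstract describes is rescaling each gradient by an (inverse) curvature estimate before the parameter update, i.e. w ← w − lr · P⁻¹g. A minimal sketch of this idea with a diagonal (Adagrad-style) preconditioner is below; the function name and setup are hypothetical illustrations, not the ASDL interface.

```python
import numpy as np

def diag_preconditioned_sgd(grad_fn, w0, lr=0.5, eps=1e-8, steps=500):
    """Toy diagonal gradient preconditioning (Adagrad-style).

    Each raw gradient g is divided elementwise by the square root of the
    accumulated squared gradients, a cheap diagonal curvature proxy, before
    the update. This is a hypothetical helper for illustration only, not
    the ASDL API.
    """
    w = np.asarray(w0, dtype=float).copy()
    accum = np.zeros_like(w)                 # running sum of squared gradients
    for _ in range(steps):
        g = grad_fn(w)
        accum += g * g                       # diagonal second-moment estimate
        w -= lr * g / (np.sqrt(accum) + eps)  # preconditioned step
    return w

# Badly conditioned quadratic f(w) = 0.5 * w^T diag(1, 100) w:
# the preconditioner equalizes the effective step size across coordinates.
scales = np.array([1.0, 100.0])
w_opt = diag_preconditioned_sgd(lambda w: scales * w, [1.0, 1.0])
```

On this ill-conditioned example, plain SGD would need a learning rate small enough for the stiff coordinate, slowing the other; the per-coordinate rescaling is what makes preconditioned methods attractive at scale.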
