Search Results for author: Pedram Zamirai

Found 2 papers, 0 papers with code

Revisiting BFloat16 Training

no code implementations • 1 Jan 2021 • Pedram Zamirai, Jian Zhang, Christopher R. Aberger, Christopher De Sa

We ask: can we do pure 16-bit training, which requires only 16-bit compute units, while still matching the model accuracy attained by 32-bit training?
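For readers unfamiliar with the setup, the following is a minimal PyTorch sketch of what "pure" 16-bit training means; it is an illustration of the general idea, not the authors' implementation. Weights, activations, and gradients all live in bfloat16, so only 16-bit compute units are exercised. The model shape and hyperparameters are arbitrary.

```python
# Pure bfloat16 training sketch (illustrative only, not the paper's code):
# parameters, activations, and gradients are all kept in 16-bit.
import torch
import torch.nn as nn

model = nn.Linear(512, 10).to(torch.bfloat16)    # parameters stored in bf16
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(32, 512, dtype=torch.bfloat16)   # bf16 inputs
y = torch.randint(0, 10, (32,))

loss = nn.functional.cross_entropy(model(x), y)  # forward pass in bf16
loss.backward()                                  # gradients computed in bf16
opt.step()                                       # update applied to bf16 weights
opt.zero_grad()
```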

Revisiting BFloat16 Training

no code implementations • 13 Oct 2020 • Pedram Zamirai, Jian Zhang, Christopher R. Aberger, Christopher De Sa

State-of-the-art generic low-precision training algorithms use a mix of 16-bit and 32-bit precision, creating the folklore that 16-bit hardware compute units alone are not enough to maximize model accuracy.
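For contrast with the pure 16-bit sketch above, here is a minimal sketch of the mixed 16/32-bit scheme this excerpt refers to, again illustrative rather than the paper's method: master weights and optimizer state stay in fp32, while the forward pass runs in bfloat16 under PyTorch's autocast.

```python
# Mixed-precision training sketch (illustrative only): fp32 master weights,
# bf16 compute for the forward pass via autocast.
import torch
import torch.nn as nn

model = nn.Linear(512, 10)                       # fp32 master weights
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(32, 512)
y = torch.randint(0, 10, (32,))

with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    loss = nn.functional.cross_entropy(model(x), y)  # forward compute in bf16
loss.backward()                                  # gradients land in fp32
opt.step()                                       # fp32 weight update
opt.zero_grad()
```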
