no code implementations • 23 Jan 2024 • Daniel Nichols, Joshua H. Davis, Zhaojun Xie, Arjun Rajaram, Abhinav Bhatele
Large language models are becoming an increasingly popular tool for software development.
no code implementations • 29 Jun 2023 • Daniel Nichols, Aniruddha Marathe, Harshitha Menon, Todd Gamblin, Abhinav Bhatele
In this paper, we show how large language models (LLMs) can be applied to tasks specific to high performance and scientific codes.
no code implementations • 9 Nov 2021 • Daniel Nichols, Siddharth Singh, Shu-Huai Lin, Abhinav Bhatele
This phenomenon has spurred the development of algorithms for distributed training of neural networks across large numbers of hardware accelerators.
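The most common such algorithm is data-parallel training: each accelerator computes gradients on its own data shard, and an all-reduce averages them so every worker applies the same update. The sketch below simulates that pattern in plain Python on a toy least-squares model; it is an illustration of the general technique, not the specific algorithm studied in the paper.

```python
# Minimal simulation of data-parallel training: each "worker" computes
# a gradient on its own shard, an all-reduce averages the gradients,
# and every worker applies the identical update.

def local_gradient(weights, shard):
    # Toy gradient for a 1-D least-squares model y = w*x on one shard.
    w = weights[0]
    g = 0.0
    for x, y in shard:
        g += 2.0 * (w * x - y) * x
    return [g / len(shard)]

def all_reduce_mean(grads_per_worker):
    # Element-wise average across workers (a simulated MPI-style all-reduce).
    n = len(grads_per_worker)
    return [sum(g[i] for g in grads_per_worker) / n
            for i in range(len(grads_per_worker[0]))]

def train_step(weights, shards, lr=0.01):
    grads = [local_gradient(weights, s) for s in shards]
    avg = all_reduce_mean(grads)
    return [w - lr * g for w, g in zip(weights, avg)]

# Two workers, data drawn from y = 3x; the weight converges toward 3.
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = [0.0]
for _ in range(200):
    w = train_step(w, shards)
print(round(w[0], 2))  # → 3.0
```

In a real system the all-reduce is a single collective call (e.g., `MPI_Allreduce` or NCCL's `ncclAllReduce`) over GPU buffers, but the averaging semantics are the same.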
no code implementations • 23 Nov 2020 • Rick Archibald, Edmond Chow, Eduardo D'Azevedo, Jack Dongarra, Markus Eisenbach, Rocco Febbo, Florent Lopez, Daniel Nichols, Stanimire Tomov, Kwai Wong, Junqi Yin
This paper discusses the necessities of an HPC deep learning framework and how those needs can be provided (e.g., as in MagmaDNN) through a deep integration with existing HPC libraries, such as MAGMA and its modular memory management, MPI, cuBLAS, cuDNN, MKL, and HIP.
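A recurring design in such frameworks is a thin dispatch layer that routes core kernels (e.g., GEMM) to whichever vendor library is available: MAGMA or cuBLAS on GPUs, MKL on CPUs. The following is a hypothetical Python sketch of that pattern; the names `register_backend` and `dispatch` are illustrative and do not correspond to MagmaDNN's actual API.

```python
# Hypothetical sketch of backend dispatch in an HPC DL framework:
# core ops are registered per backend with a priority, and each call
# routes to the highest-priority backend that implements the op.
# Names are illustrative, not MagmaDNN's API.

_backends = {}

def register_backend(name, priority, ops):
    _backends[name] = (priority, ops)

def dispatch(op, *args):
    # Pick the highest-priority backend that implements this op.
    for name, (_, ops) in sorted(_backends.items(),
                                 key=lambda kv: -kv[1][0]):
        if op in ops:
            return ops[op](*args)
    raise NotImplementedError(op)

def cpu_gemm(a, b):
    # Naive triple-loop GEMM standing in for an MKL/BLAS call.
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k))
             for j in range(m)] for i in range(n)]

# A GPU backend (MAGMA/cuBLAS) would register with higher priority
# when a device is detected; here only the CPU fallback exists.
register_backend("cpu", priority=0, ops={"gemm": cpu_gemm})

c = dispatch("gemm", [[1, 2], [3, 4]], [[5, 6], [7, 8]])
print(c)  # → [[19, 22], [43, 50]]
```

The priority ordering is what lets a framework prefer accelerated libraries transparently while keeping a portable fallback, which is one way the "deep integration" described above can stay modular.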