Search Results for author: Guojing Cong

Found 9 papers, 0 papers with code

Optimizing Distributed Training on Frontier for Large Language Models

no code implementations20 Dec 2023 Sajal Dash, Isaac Lyngaas, Junqi Yin, Xiao Wang, Romain Egele, Guojing Cong, Feiyi Wang, Prasanna Balaprakash

For training the 175-billion-parameter and 1-trillion-parameter models, we achieved $100\%$ weak scaling efficiency on 1024 and 3072 MI250X GPUs, respectively.

Computational Efficiency
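
The $100\%$ weak scaling efficiency figure refers to a standard metric rather than anything defined in this listing. A minimal sketch of how that metric is conventionally computed, with purely illustrative numbers that are not taken from the paper, is:

    # Weak scaling: the per-GPU workload is fixed, so ideal throughput grows
    # linearly with GPU count; efficiency is achieved / ideal throughput.
    def weak_scaling_efficiency(throughput, n_gpus, base_throughput, base_gpus):
        ideal = base_throughput * (n_gpus / base_gpus)
        return throughput / ideal

    # Illustrative values only: a run that scales perfectly from 8 to 1024 GPUs.
    print(weak_scaling_efficiency(throughput=2048.0, n_gpus=1024,
                                  base_throughput=16.0, base_gpus=8))  # -> 1.0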

DeepSpeed4Science Initiative: Enabling Large-Scale Scientific Discovery through Sophisticated AI System Technologies

no code implementations6 Oct 2023 Shuaiwen Leon Song, Bonnie Kruft, Minjia Zhang, Conglong Li, Shiyang Chen, Chengming Zhang, Masahiro Tanaka, Xiaoxia Wu, Jeff Rasley, Ammar Ahmad Awan, Connor Holmes, Martin Cai, Adam Ghanem, Zhongzhu Zhou, Yuxiong He, Pete Luferenko, Divya Kumar, Jonathan Weyn, Ruixiong Zhang, Sylwester Klocek, Volodymyr Vragov, Mohammed AlQuraishi, Gustaf Ahdritz, Christina Floristean, Cristina Negri, Rao Kotamarthi, Venkatram Vishwanath, Arvind Ramanathan, Sam Foreman, Kyle Hippe, Troy Arcomano, Romit Maulik, Maxim Zvyagin, Alexander Brace, Bin Zhang, Cindy Orozco Bohorquez, Austin Clyde, Bharat Kale, Danilo Perez-Rivera, Heng Ma, Carla M. Mann, Michael Irvin, J. Gregory Pauloski, Logan Ward, Valerie Hayot, Murali Emani, Zhen Xie, Diangen Lin, Maulik Shukla, Ian Foster, James J. Davis, Michael E. Papka, Thomas Brettin, Prasanna Balaprakash, Gina Tourassi, John Gounley, Heidi Hanson, Thomas E Potok, Massimiliano Lupo Pasini, Kate Evans, Dan Lu, Dalton Lunga, Junqi Yin, Sajal Dash, Feiyi Wang, Mallikarjun Shankar, Isaac Lyngaas, Xiao Wang, Guojing Cong, Pei Zhang, Ming Fan, Siyan Liu, Adolfy Hoisie, Shinjae Yoo, Yihui Ren, William Tang, Kyle Felker, Alexey Svyatkovskiy, Hang Liu, Ashwin Aji, Angela Dalton, Michael Schulte, Karl Schulz, Yuntian Deng, Weili Nie, Josh Romero, Christian Dallago, Arash Vahdat, Chaowei Xiao, Thomas Gibbs, Anima Anandkumar, Rick Stevens

In the upcoming decade, deep learning may revolutionize the natural sciences, enhancing our capacity to model and predict natural occurrences.

Prediction of $\textrm{CO}_2$ Adsorption in Nano-Pores with Graph Neural Networks

no code implementations22 Aug 2022 Guojing Cong, Anshul Gupta, Rodrigo Neumann, Maira de Bayser, Mathias Steiner, Breanndán Ó Conchúir

We investigate the graph-based convolutional neural network approach for predicting and ranking gas adsorption properties of crystalline Metal-Organic Framework (MOF) adsorbents for application in post-combustion capture of $\textrm{CO}_2$.
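
The listing names the approach but not the architecture. The following is only a generic sketch of a graph convolutional regression model for per-MOF property prediction, written with PyTorch Geometric; the layer sizes and feature choices are hypothetical, not the authors' model.

    import torch
    from torch_geometric.nn import GCNConv, global_mean_pool

    class MOFAdsorptionGCN(torch.nn.Module):
        """Node features -> message passing -> pooled graph embedding -> scalar target."""
        def __init__(self, num_node_features, hidden=64):
            super().__init__()
            self.conv1 = GCNConv(num_node_features, hidden)
            self.conv2 = GCNConv(hidden, hidden)
            self.readout = torch.nn.Linear(hidden, 1)  # e.g. predicted CO2 uptake

        def forward(self, x, edge_index, batch):
            x = torch.relu(self.conv1(x, edge_index))
            x = torch.relu(self.conv2(x, edge_index))
            x = global_mean_pool(x, batch)             # one embedding per MOF graph
            return self.readout(x).squeeze(-1)

Ranking candidate adsorbents then amounts to sorting MOFs by the predicted target.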

AI-aided multiscale modeling of physiologically-significant blood clots

no code implementations25 May 2022 Yicong Zhu, Changnian Han, Peng Zhang, Guojing Cong, James R. Kozloski, Chih-Chieh Yang, Leili Zhang, Yuefan Deng

We have developed an AI-aided multiple time stepping (AI-MTS) algorithm and multiscale modeling framework (AI-MSM) and implemented them on the Summit-like supercomputer, AIMOS.
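
The abstract does not spell out the algorithm, so the sketch below only illustrates the general multiple-time-stepping idea it builds on: expensive force terms are refreshed less often than cheap ones, with a stand-in learned criterion (refresh_model) deciding when an early refresh is needed. All names and constants here are hypothetical, not the paper's AI-MTS.

    DT_SMALL, SUBSTEPS = 1e-3, 4   # illustrative inner step size and substep count

    def mts_step(pos, vel, fast_force, slow_force, refresh_model):
        slow = slow_force(pos)                  # expensive term, evaluated rarely
        for _ in range(SUBSTEPS):
            vel = vel + DT_SMALL * (fast_force(pos) + slow)
            pos = pos + DT_SMALL * vel
            if refresh_model(pos, vel):         # learned decision (stand-in)
                slow = slow_force(pos)          # refresh early when flagged
        return pos, vel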

Accelerating Data Loading in Deep Neural Network Training

no code implementations2 Oct 2019 Chih-Chieh Yang, Guojing Cong

Our model suggests that I/O rate limits the scalability of distributed training, which inspires us to design a locality-aware data loading method.
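
The listing states the conclusion but not the mechanism. A minimal sketch of locality-aware sharding, assuming samples have already been staged on node-local storage and using synthetic file names, is:

    import os

    def local_shard(all_samples, rank, world_size):
        """Give each worker a contiguous shard so it reads from node-local
        storage instead of the shared parallel filesystem."""
        per_rank = len(all_samples) // world_size
        start = rank * per_rank
        return all_samples[start:start + per_rank]

    # Hypothetical usage with launcher-provided environment variables and
    # synthetic file names standing in for a staged dataset directory.
    rank = int(os.environ.get("RANK", "0"))
    world_size = int(os.environ.get("WORLD_SIZE", "1"))
    my_files = local_shard([f"sample_{i:05d}.bin" for i in range(1024)],
                           rank, world_size)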

A Distributed Hierarchical SGD Algorithm with Sparse Global Reduction

no code implementations12 Mar 2019 Fan Zhou, Guojing Cong

Reducing communication when training large-scale machine learning applications on distributed platforms remains a significant challenge.
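
The title suggests two levels of reduction: frequent local averaging plus an infrequent ("sparse") global one. The sketch below, a single-process NumPy simulation with made-up shapes and periods, only illustrates that general pattern, not the paper's exact algorithm.

    import numpy as np

    def hierarchical_average(params, node_of, step, global_period=8):
        """params: (workers, dims) array; node_of: node id per worker."""
        params = np.asarray(params, dtype=float)
        node_of = np.asarray(node_of)
        for node in np.unique(node_of):
            members = node_of == node
            params[members] = params[members].mean(axis=0)  # cheap intra-node reduce
        if step % global_period == 0:
            params[:] = params.mean(axis=0)                 # infrequent global reduce
        return params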

On the convergence properties of a $K$-step averaging stochastic gradient descent algorithm for nonconvex optimization

no code implementations3 Aug 2017 Fan Zhou, Guojing Cong

We establish convergence results for K-AVG with nonconvex objectives and explain why the $K$-step delay is necessary and leads to better performance than traditional parallel stochastic gradient descent, which is a special case of K-AVG with $K=1$.
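
The scheme itself is easy to state: each of P workers takes K independent SGD steps, then all iterates are averaged. A single-process simulation on a synthetic quadratic objective (all hyperparameters illustrative) is:

    import numpy as np

    rng = np.random.default_rng(0)
    P, K, lr, dim = 8, 4, 0.1, 10            # workers, local steps, step size, dimension

    w = np.ones(dim)                          # shared starting point
    for _ in range(50):                       # communication rounds
        local = np.tile(w, (P, 1))
        for _ in range(K):                    # K local SGD steps per worker
            grads = local + 0.1 * rng.standard_normal(local.shape)  # noisy grad of 0.5*||w||^2
            local -= lr * grads
        w = local.mean(axis=0)                # average once per round; K=1 is plain parallel SGD
    print(np.linalg.norm(w))                  # shrinks toward the optimum at 0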
