no code implementations • 5 Feb 2024 • Nathaniel Hudson, J. Gregory Pauloski, Matt Baughman, Alok Kamatar, Mansi Sakarvadia, Logan Ward, Ryan Chard, André Bauer, Maksim Levental, Wenyi Wang, Will Engler, Owen Price Skelly, Ben Blaiszik, Rick Stevens, Kyle Chard, Ian Foster
Deep learning methods are transforming research, enabling new techniques, and ultimately leading to new discoveries.
1 code implementation • 25 Oct 2023 • Mansi Sakarvadia, Arham Khan, Aswathy Ajith, Daniel Grzenda, Nathaniel Hudson, André Bauer, Kyle Chard, Ian Foster
Transformer-based Large Language Models (LLMs) are the state-of-the-art for natural language tasks.
1 code implementation • 11 Sep 2023 • Mansi Sakarvadia, Aswathy Ajith, Arham Khan, Daniel Grzenda, Nathaniel Hudson, André Bauer, Kyle Chard, Ian Foster
Answering multi-hop reasoning questions requires retrieving and synthesizing information from diverse sources.
no code implementations • 28 Aug 2023 • Samir Rajani, Dario Dematties, Nathaniel Hudson, Kyle Chard, Nicola Ferrier, Rajesh Sankaran, Peter Beckman
Despite this, recent work has demonstrated that data can be reconstructed from the locally trained model updates that are communicated across the network.
no code implementations • 28 Apr 2023 • Omer Rana, Theodoros Spyridopoulos, Nathaniel Hudson, Matt Baughman, Kyle Chard, Ian Foster, Aftab Khan
Hierarchical Federated Learning is likely to be a key enabler for a wide range of applications, such as smart farming and smart energy management, as it can improve performance and reduce costs, whilst also enabling FL workflows to be deployed in environments that are not well-suited to traditional FL.