Search Results for author: Kyle Chard

Found 22 papers, 9 papers with code

Combining Language and Graph Models for Semi-structured Information Extraction on the Web

no code implementations21 Feb 2024 Zhi Hong, Kyle Chard, Ian Foster

Relation extraction is an efficient way of mining the extraordinary wealth of human knowledge on the Web.

Language Modelling, Relation +1
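
As a toy illustration of the extraction task only (not the combined language-and-graph model the paper proposes), the sketch below flattens an already-scraped, infobox-style web record into (subject, relation, object) triples; the record and relation names are invented.

    # Toy relation extraction from a semi-structured record; the record is hypothetical.
    record = {
        "entity": "Aspirin",
        "infobox": {"Formula": "C9H8O4", "Melting point": "136 C"},
    }

    def to_triples(rec):
        """Flatten an infobox-style record into (subject, relation, object) triples."""
        return [(rec["entity"], rel, val) for rel, val in rec["infobox"].items()]

    for triple in to_triples(record):
        print(triple)
    # ('Aspirin', 'Formula', 'C9H8O4')
    # ('Aspirin', 'Melting point', '136 C')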

Comprehensive Exploration of Synthetic Data Generation: A Survey

no code implementations4 Jan 2024 André Bauer, Simon Trapp, Michael Stenger, Robert Leppich, Samuel Kounev, Mark Leznik, Kyle Chard, Ian Foster

This work surveys 417 Synthetic Data Generation (SDG) models over the last decade, providing a comprehensive overview of model types, functionality, and improvements.

Decision Making, Model Selection +2

Adversarial Predictions of Data Distributions Across Federated Internet-of-Things Devices

no code implementations28 Aug 2023 Samir Rajani, Dario Dematties, Nathaniel Hudson, Kyle Chard, Nicola Ferrier, Rajesh Sankaran, Peter Beckman

Despite this, recent works have demonstrated that data reconstruction can be done with the locally trained model updates which are communicated across the network.

Federated Learning
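
One well-known reason those shared updates leak information is that, for a fully connected layer and a single training example, the weight gradient is the outer product of the output-error and the input, so the input can be read off by dividing a weight-gradient row by the matching bias gradient. The NumPy sketch below shows that identity; it illustrates gradient leakage in general, not the adversarial prediction method studied in the paper.

    import numpy as np

    # One client, one example, one linear layer y = W x + b.
    rng = np.random.default_rng(0)
    x = rng.normal(size=4)            # private input held by the client
    W = rng.normal(size=(3, 4))
    b = np.zeros(3)

    dL_dy = (W @ x + b) - rng.normal(size=3)   # output-space error signal

    # Gradients the client would ship as its "model update".
    grad_W = np.outer(dL_dy, x)
    grad_b = dL_dy

    # Each row of grad_W equals grad_b[i] * x, so the private input is recoverable.
    x_recovered = grad_W[0] / grad_b[0]
    print(np.allclose(x, x_recovered))         # True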

Hierarchical and Decentralised Federated Learning

no code implementations28 Apr 2023 Omer Rana, Theodoros Spyridopoulos, Nathaniel Hudson, Matt Baughman, Kyle Chard, Ian Foster, Aftab Khan

Hierarchical Federated Learning is likely to be a key enabler for a wide range of applications, such as smart farming and smart energy management, as it can improve performance and reduce costs, whilst also enabling FL workflows to be deployed in environments that are not well-suited to traditional FL.

energy management, Federated Learning

Cloud Services Enable Efficient AI-Guided Simulation Workflows across Heterogeneous Resources

2 code implementations15 Mar 2023 Logan Ward, J. Gregory Pauloski, Valerie Hayot-Sasson, Ryan Chard, Yadu Babuji, Ganesh Sivaraman, Sutanay Choudhury, Kyle Chard, Rajeev Thakur, Ian Foster

Applications that fuse machine learning and simulation can benefit from the use of multiple computing resources, with, for example, simulation codes running on highly parallel supercomputers and AI training and inference tasks on specialized accelerators.

Management
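
A minimal sketch of the pattern described above, with Python's concurrent.futures standing in for the cloud services, supercomputer, and accelerator that the paper actually coordinates; simulate and choose_next are placeholder functions.

    from concurrent.futures import ThreadPoolExecutor, as_completed
    import random

    def simulate(x):
        """Placeholder for a simulation task that would run on an HPC resource."""
        return x, x ** 2 + random.random()

    def choose_next(results):
        """Placeholder for an AI surrogate that would run on a specialized accelerator."""
        return max(results, key=lambda r: r[1])[0]

    # Two pools stand in for two resource types; a real deployment would dispatch
    # tasks to remote schedulers rather than local threads.
    with ThreadPoolExecutor(max_workers=4) as sim_pool, ThreadPoolExecutor(max_workers=1) as ai_pool:
        sims = [sim_pool.submit(simulate, x) for x in range(8)]
        results = [f.result() for f in as_completed(sims)]
        print("next candidate:", ai_pool.submit(choose_next, results).result())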

OpenHLS: High-Level Synthesis for Low-Latency Deep Neural Networks for Experimental Science

no code implementations13 Feb 2023 Maksim Levental, Arham Khan, Ryan Chard, Kazutomo Yoshii, Kyle Chard, Ian Foster

In many experiment-driven scientific domains, such as high-energy physics, materials science, and cosmology, high data rate experiments impose hard constraints on data acquisition systems: collected data must either be indiscriminately stored for post-processing and analysis, thereby necessitating large storage capacity, or accurately filtered in real-time, thereby necessitating low-latency processing.

Low-latency processing
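
The trade-off can be made concrete with a toy streaming loop in which each sample must be kept or dropped within a fixed per-sample deadline; the threshold, deadline, and scoring function are invented, and the paper's actual contribution (compiling DNNs to low-latency hardware via high-level synthesis) is not modeled here.

    import random
    import time

    LATENCY_BUDGET_S = 1e-3    # hypothetical per-sample deadline
    THRESHOLD = 0.95           # hypothetical "interesting event" cutoff

    def score(sample):
        """Stand-in for a low-latency model that flags samples worth storing."""
        return sample

    kept = dropped = missed = 0
    for _ in range(10_000):
        sample = random.random()
        start = time.perf_counter()
        keep = score(sample) > THRESHOLD
        if time.perf_counter() - start > LATENCY_BUDGET_S:
            missed += 1
        if keep:
            kept += 1
        else:
            dropped += 1

    print(f"kept {kept}, dropped {dropped}, deadline misses {missed}")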

Globus Automation Services: Research process automation across the space-time continuum

no code implementations19 Aug 2022 Ryan Chard, Jim Pruyne, Kurt McKee, Josh Bryan, Brigitte Raumann, Rachana Ananthakrishnan, Kyle Chard, Ian Foster

We report here on new services within the Globus research data management platform that enable the specification of diverse research processes as reusable sets of actions, \emph{flows}, and the execution of such flows in heterogeneous research environments.

Management
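
A hedged sketch of the "reusable set of actions" idea: a flow written as an ordered list of named actions that a tiny in-process runner executes over a shared state. The action names and runner are hypothetical and are not the Globus Flows definition language or service.

    # Hypothetical, simplified notion of a flow: an ordered list of named actions.
    def transfer(state):
        state["transferred"] = True
        return state

    def analyze(state):
        state["result"] = "ok" if state.get("transferred") else "missing data"
        return state

    FLOW = [("TransferInputs", transfer), ("RunAnalysis", analyze)]

    def run_flow(flow, state):
        for name, action in flow:
            print(f"running action: {name}")
            state = action(state)
        return state

    print(run_flow(FLOW, {"dataset": "/path/to/data"}))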

FAIR principles for AI models with a practical application for accelerated high energy diffraction microscopy

1 code implementation1 Jul 2022 Nikil Ravi, Pranshu Chaturvedi, E. A. Huerta, Zhengchun Liu, Ryan Chard, Aristana Scourtas, K. J. Schmidt, Kyle Chard, Ben Blaiszik, Ian Foster

A concise and measurable set of FAIR (Findable, Accessible, Interoperable and Reusable) principles for scientific data is transforming the state-of-practice for data management and stewardship, supporting and enabling discovery and innovation.

Management

The Diminishing Returns of Masked Language Models to Science

no code implementations23 May 2022 Zhi Hong, Aswathy Ajith, Gregory Pauloski, Eamon Duede, Kyle Chard, Ian Foster

Transformer-based masked language models such as BERT, trained on general corpora, have shown impressive performance on downstream tasks.

Language Modelling
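
One quick way to see what a general-corpus masked model has and has not absorbed is to probe it with fill-in-the-blank prompts. The sketch below uses the Hugging Face transformers fill-mask pipeline with bert-base-uncased (requires the transformers and torch packages plus a model download); the prompts are invented and this is not the paper's evaluation protocol.

    # Requires: pip install transformers torch
    from transformers import pipeline

    fill = pipeline("fill-mask", model="bert-base-uncased")

    # A generic prompt vs. a scientific one; general-corpus models tend to do
    # worse on the latter, which is the gap the paper studies.
    for prompt in [
        "The capital of France is [MASK].",
        "The bandgap of silicon is about 1.1 [MASK].",
    ]:
        top = fill(prompt, top_k=3)
        print(prompt, "->", [t["token_str"] for t in top])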

Ultrafast Focus Detection for Automated Microscopy

no code implementations26 Aug 2021 Maksim Levental, Ryan Chard, Kyle Chard, Ian Foster, Gregg A. Wildenberg

Technological advancements in modern scientific instruments, such as scanning electron microscopes (SEMs), have significantly increased data acquisition rates and image resolutions, enabling new questions to be explored. However, the resulting data volumes and velocities, combined with automated experiments, are quickly overwhelming scientists, as crucial steps, such as reviewing image focus, still require human intervention.

Semantic Segmentation
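
A standard classical baseline for "is this image in focus?" is the variance of the Laplacian: sharp images carry more high-frequency detail, so the variance is higher. The OpenCV sketch below illustrates that baseline only; it is not the ultrafast method proposed in the paper, and the synthetic images are invented.

    # Requires: pip install opencv-python numpy
    import cv2
    import numpy as np

    def focus_score(gray_image: np.ndarray) -> float:
        """Variance of the Laplacian: higher means sharper (more high-frequency detail)."""
        return cv2.Laplacian(gray_image, cv2.CV_64F).var()

    # Synthetic demo: a sharp random texture vs. a blurred copy of it.
    sharp = (np.random.rand(256, 256) * 255).astype(np.uint8)
    blurry = cv2.GaussianBlur(sharp, (15, 15), 0)

    print("sharp :", focus_score(sharp))
    print("blurry:", focus_score(blurry))  # noticeably lower score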

KAISA: An Adaptive Second-Order Optimizer Framework for Deep Neural Networks

3 code implementations4 Jul 2021 J. Gregory Pauloski, Qi Huang, Lei Huang, Shivaram Venkataraman, Kyle Chard, Ian Foster, Zhao Zhang

Kronecker-factored Approximate Curvature (K-FAC) has recently been shown to converge faster in deep neural network (DNN) training than stochastic gradient descent (SGD); however, K-FAC's larger memory footprint hinders its applicability to large models.
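
For a single fully connected layer, K-FAC approximates the curvature block as a Kronecker product of the input covariance A and the output-gradient covariance G, so the preconditioned update can be computed as G^-1 @ grad_W @ A^-1 without materializing the full matrix. The NumPy sketch below shows that one step with made-up dimensions and damping; KAISA's memory/communication trade-offs and distributed layout are not modeled.

    import numpy as np

    rng = np.random.default_rng(0)
    batch, d_in, d_out = 32, 8, 4

    a = rng.normal(size=(batch, d_in))    # layer inputs (activations)
    g = rng.normal(size=(batch, d_out))   # gradients w.r.t. layer outputs
    grad_W = g.T @ a / batch              # gradient of the loss w.r.t. W (d_out x d_in)

    # Kronecker factors of the curvature block, plus Tikhonov damping.
    damping = 1e-2
    A = a.T @ a / batch + damping * np.eye(d_in)
    G = g.T @ g / batch + damping * np.eye(d_out)

    # Preconditioned update: equivalent (up to vec convention) to applying
    # (A kron G)^-1 to the flattened gradient.
    update = np.linalg.solve(G, grad_W) @ np.linalg.inv(A)
    W = rng.normal(size=(d_out, d_in))
    W -= 0.1 * update                     # one K-FAC-style step with learning rate 0.1
    print(update.shape)                   # (4, 8)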

AI- and HPC-enabled Lead Generation for SARS-CoV-2: Models and Processes to Extract Druglike Molecules Contained in Natural Language Text

1 code implementation12 Jan 2021 Zhi Hong, J. Gregory Pauloski, Logan Ward, Kyle Chard, Ben Blaiszik, Ian Foster

Researchers worldwide are seeking to repurpose existing drugs or discover new drugs to counter the disease caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).

named-entity-recognition, Named Entity Recognition +1
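
As a toy stand-in for the named-entity-recognition step (not the trained models the paper uses), the sketch below tags tokens that match a tiny dictionary of molecule names; the sentence and dictionary are invented.

    import re

    # Hypothetical mini-gazetteer; the paper trains NER models rather than using a list.
    MOLECULES = {"remdesivir", "quercetin", "ritonavir"}

    def tag_molecules(text):
        tokens = re.findall(r"[A-Za-z0-9\-]+", text)
        return [(tok, "MOLECULE" if tok.lower() in MOLECULES else "O") for tok in tokens]

    sentence = "Docking studies suggest remdesivir and quercetin bind the main protease."
    print([t for t in tag_molecules(sentence) if t[1] == "MOLECULE"])
    # [('remdesivir', 'MOLECULE'), ('quercetin', 'MOLECULE')]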

Towards Online Steering of Flame Spray Pyrolysis Nanoparticle Synthesis

1 code implementation16 Oct 2020 Maksim Levental, Ryan Chard, Joseph A. Libera, Kyle Chard, Aarthi Koripelly, Jakob R. Elias, Marcus Schwarting, Ben Blaiszik, Marius Stan, Santanu Chaudhuri, Ian Foster

Flame Spray Pyrolysis (FSP) is a manufacturing technique to mass produce engineered nanoparticles for applications in catalysis, energy materials, composites, and more.

Targeting SARS-CoV-2 with AI- and HPC-enabled Lead Generation: A First Data Release

1 code implementation28 May 2020 Yadu Babuji, Ben Blaiszik, Tom Brettin, Kyle Chard, Ryan Chard, Austin Clyde, Ian Foster, Zhi Hong, Shantenu Jha, Zhuozhao Li, Xuefeng Liu, Arvind Ramanathan, Yi Ren, Nicholaus Saint, Marcus Schwarting, Rick Stevens, Hubertus van Dam, Rick Wagner

Researchers across the globe are seeking to rapidly repurpose existing drugs or discover new drugs to counter the novel coronavirus disease (COVID-19) caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).

BIG-bench Machine Learning

funcX: A Federated Function Serving Fabric for Science

no code implementations7 May 2020 Ryan Chard, Yadu Babuji, Zhuozhao Li, Tyler Skluzacek, Anna Woodard, Ben Blaiszik, Ian Foster, Kyle Chard

These new approaches must enable computation to be mobile, so that, for example, it can occur near data, be triggered by events (e.g., arrival of new data), be offloaded to specialized accelerators, or run remotely where resources are available.

Distributed, Parallel, and Cluster Computing
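
A minimal sketch of the "register a function once, invoke it wherever resources are available" pattern that funcX provides, using an in-process registry as a stand-in for the real service; the endpoint names and client functions here are hypothetical, not the funcX SDK.

    import uuid

    # Hypothetical in-process stand-in for a function-serving fabric.
    _functions = {}
    _endpoints = {"gpu-cluster": {}, "edge-device": {}}

    def register_function(func):
        func_id = str(uuid.uuid4())
        _functions[func_id] = func
        return func_id

    def run(func_id, endpoint, *args, **kwargs):
        # A real fabric would ship the serialized function to a remote endpoint;
        # here we just execute it locally and record where it "ran".
        result = _functions[func_id](*args, **kwargs)
        _endpoints[endpoint][func_id] = result
        return result

    def preprocess(values):
        return [v * 2 for v in values]

    fid = register_function(preprocess)
    print(run(fid, "edge-device", [1, 2, 3]))  # [2, 4, 6]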

Deep Learning for Multi-Messenger Astrophysics: A Gateway for Discovery in the Big Data Era

no code implementations1 Feb 2019 Gabrielle Allen, Igor Andreoni, Etienne Bachelet, G. Bruce Berriman, Federica B. Bianco, Rahul Biswas, Matias Carrasco Kind, Kyle Chard, Minsik Cho, Philip S. Cowperthwaite, Zachariah B. Etienne, Daniel George, Tom Gibbs, Matthew Graham, William Gropp, Anushri Gupta, Roland Haas, E. A. Huerta, Elise Jennings, Daniel S. Katz, Asad Khan, Volodymyr Kindratenko, William T. C. Kramer, Xin Liu, Ashish Mahabal, Kenton McHenry, J. M. Miller, M. S. Neubauer, Steve Oberlin, Alexander R. Olivas Jr, Shawn Rosofsky, Milton Ruiz, Aaron Saxton, Bernard Schutz, Alex Schwing, Ed Seidel, Stuart L. Shapiro, Hongyu Shen, Yue Shen, Brigitta M. Sipőcz, Lunan Sun, John Towns, Antonios Tsokaros, Wei Wei, Jack Wells, Timothy J. Williams, JinJun Xiong, Zhizhen Zhao

We discuss key aspects to realize this endeavor, namely (i) the design and exploitation of scalable and computationally efficient AI algorithms for Multi-Messenger Astrophysics; (ii) cyberinfrastructure requirements to numerically simulate astrophysical sources, and to process and interpret Multi-Messenger Astrophysics data; (iii) management of gravitational wave detections and triggers to enable electromagnetic and astro-particle follow-ups; (iv) a vision to harness future developments of machine and deep learning and cyberinfrastructure resources to cope with the scale of discovery in the Big Data Era; (v) and the need to build a community that brings domain experts together with data scientists on equal footing to maximize and accelerate discovery in the nascent field of Multi-Messenger Astrophysics.

Astronomy, Management

DLHub: Model and Data Serving for Science

no code implementations27 Nov 2018 Ryan Chard, Zhuozhao Li, Kyle Chard, Logan Ward, Yadu Babuji, Anna Woodard, Steve Tuecke, Ben Blaiszik, Michael J. Franklin, Ian Foster

Here we present the Data and Learning Hub for science (DLHub), a multi-tenant system that provides both model repository and serving capabilities with a focus on science applications.

Distributed Computing
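
A toy sketch of the model-repository-plus-serving idea: publish a named, versioned callable, then serve predictions by name. The registry, names, and metadata fields are invented and do not reflect DLHub's actual APIs.

    # Hypothetical in-memory registry illustrating "publish a model, then serve it by name".
    REGISTRY = {}

    def publish(name, version, model_fn, metadata=None):
        REGISTRY[(name, version)] = {"fn": model_fn, "meta": metadata or {}}

    def serve(name, version, inputs):
        return REGISTRY[(name, version)]["fn"](inputs)

    # Publish a trivial "model" and run an inference request against it.
    publish("formation-energy", "1.0", lambda xs: [0.1 * sum(x) for x in xs],
            metadata={"framework": "none", "author": "example"})
    print(serve("formation-energy", "1.0", [[1.0, 2.0], [3.0, 4.0]]))  # approximately [0.3, 0.7]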
