4 code implementations • ICLR 2022 • Yulun Wu, Mikaela Cashman, Nicholas Choma, Érica T. Prates, Verónica G. Melesse Vergara, Manesh Shah, Andrew Chen, Austin Clyde, Thomas S. Brettin, Wibe A. de Jong, Neeraj Kumar, Martha S. Head, Rick L. Stevens, Peter Nugent, Daniel A. Jacobson, James B. Brown
We developed Distilled Graph Attention Policy Network (DGAPN), a reinforcement learning model to generate novel graph-structured chemical representations that optimize user-defined objectives by efficiently navigating a physically constrained domain.
2 code implementations • 11 Mar 2021 • Xiangyang Ju, Daniel Murnane, Paolo Calafiura, Nicholas Choma, Sean Conlon, Steve Farrell, Yaoyuan Xu, Maria Spiropulu, Jean-Roch Vlimant, Adam Aurisano, Jeremy Hewes, Giuseppe Cerati, Lindsey Gray, Thomas Klijnsma, Jim Kowalkowski, Markus Atkinson, Mark Neubauer, Gage DeZoort, Savannah Thais, Aditi Chauhan, Alex Schuy, Shih-Chieh Hsu, Alex Ballow, and Alina Lazar
The Exa.TrkX project has applied geometric learning concepts such as metric learning and graph neural networks to HEP particle tracking.
1 code implementation • 17 Sep 2018 • Nicholas Choma, Federico Monti, Lisa Gerhardt, Tomasz Palczewski, Zahra Ronaghi, Prabhat, Wahid Bhimji, Michael M. Bronstein, Spencer R. Klein, Joan Bruna
Tasks involving the analysis of geometric (graph- and manifold-structured) data have recently gained prominence in the machine learning community, giving birth to a rapidly developing field of geometric deep learning.
no code implementations • 30 Jun 2020 • Nicholas Choma, Daniel Murnane, Xiangyang Ju, Paolo Calafiura, Sean Conlon, Steven Farrell, Prabhat, Giuseppe Cerati, Lindsey Gray, Thomas Klijnsma, Jim Kowalkowski, Panagiotis Spentzouris, Jean-Roch Vlimant, Maria Spiropulu, Adam Aurisano, Jeremy Hewes, Aristeidis Tsaris, Kazuhiro Terao, Tracy Usher
Detector information can be associated with nodes and edges, enabling a GNN to propagate the embedded parameters around the graph and predict node-, edge- and graph-level observables.
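The idea described here can be illustrated with a minimal message-passing sketch. This is a toy example in plain NumPy, not the project's actual code: all array names, the aggregation rule, and the three readouts are illustrative assumptions showing how one propagation step can feed node-, edge-, and graph-level predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

num_nodes, feat_dim = 5, 4
x = rng.normal(size=(num_nodes, feat_dim))          # node features (e.g. detector hit info)
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 4]])  # directed edge list (src, dst)

# One round of message passing: each destination node aggregates
# the features of its source neighbours.
messages = np.zeros_like(x)
for src, dst in edges:
    messages[dst] += x[src]
h = np.tanh(x + messages)                           # updated node embeddings

# Hypothetical readouts at the three levels mentioned above:
node_scores = h.sum(axis=1)                              # node-level observable
edge_scores = np.array([h[s] @ h[d] for s, d in edges])  # edge-level (e.g. same-track score)
graph_score = h.mean()                                   # graph-level observable

print(node_scores.shape, edge_scores.shape, graph_score.shape)
```

A real tracking pipeline would replace the fixed aggregation with learned message and update functions and train the readouts against labelled hits, but the data flow — embed, propagate along edges, read out at each level — is the same.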
no code implementations • 21 Oct 2022 • Xiangyang Ju, Yunsong Wang, Daniel Murnane, Nicholas Choma, Steven Farrell, Paolo Calafiura
Many artificial intelligence (AI) devices have been developed to accelerate the training and inference of neural network models.