Search Results for author: Miles Cranmer

Found 28 papers, 22 papers with code

AstroCLIP: A Cross-Modal Foundation Model for Galaxies

1 code implementation · 4 Oct 2023 · Liam Parker, Francois Lanusse, Siavash Golkar, Leopoldo Sarra, Miles Cranmer, Alberto Bietti, Michael Eickenberg, Geraud Krawezik, Michael McCabe, Ruben Ohana, Mariel Pettee, Bruno Regaldo-Saint Blancard, Tiberiu Tesileanu, Kyunghyun Cho, Shirley Ho

These embeddings can then be used, without any model fine-tuning, for a variety of downstream tasks including (1) accurate in-modality and cross-modality semantic similarity search, (2) photometric redshift estimation, (3) galaxy property estimation from both images and spectra, and (4) morphology classification.

Contrastive Learning · Morphology classification +4
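
The cross-modal similarity search enabled by these shared embeddings reduces to a nearest-neighbour lookup in the joint space. A minimal sketch of that idea (not the AstroCLIP code; the `cosine_search` helper and array shapes are illustrative assumptions):

```python
import numpy as np

def cosine_search(query_emb, gallery_embs, top_k=3):
    """Return indices of the top_k gallery embeddings most similar to the
    query.  Because images and spectra share one embedding space, the same
    routine serves both in-modality and cross-modality search."""
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    sims = g @ q                                # cosine similarity per item
    return np.argsort(sims)[::-1][:top_k]       # highest similarity first

# Toy shared space: 5 objects with 8-dimensional embeddings.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(5, 8))
query = gallery[2] + 0.01 * rng.normal(size=8)  # a near-duplicate of object 2
print(cosine_search(query, gallery))            # object 2 ranks first
```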

Reusability report: Prostate cancer stratification with diverse biologically-informed neural architectures

1 code implementation · 28 Sep 2023 · Christian Pedersen, Tiberiu Tesileanu, Tinghui Wu, Siavash Golkar, Miles Cranmer, Zijun Zhang, Shirley Ho

This suggests that different neural architectures are sensitive to different aspects of the data, an important yet under-explored challenge for clinical prediction tasks.

Symbolic Regression on FPGAs for Fast Machine Learning Inference

no code implementations · 6 May 2023 · Ho Fung Tsoi, Adrian Alan Pol, Vladimir Loncar, Ekaterina Govorkova, Miles Cranmer, Sridhara Dasu, Peter Elmer, Philip Harris, Isobel Ojalvo, Maurizio Pierini

The high-energy physics community is investigating the potential of deploying machine-learning-based solutions on Field-Programmable Gate Arrays (FPGAs) to enhance physics sensitivity while still meeting data processing time constraints.

Neural Architecture Search · regression +1

Interpretable Machine Learning for Science with PySR and SymbolicRegression.jl

3 code implementations · 2 May 2023 · Miles Cranmer

PySR was developed to democratize and popularize symbolic regression for the sciences, and is built on a high-performance distributed back-end, a flexible search algorithm, and interfaces with several deep learning packages.

Interpretable Machine Learning · regression +1
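
The core idea behind symbolic regression — propose candidate expressions, score them against data, keep the best — can be illustrated with a brute-force toy in plain Python. This is a conceptual sketch only, not PySR's actual evolutionary algorithm or API; the primitive set and constants are arbitrary choices:

```python
import itertools, math

# Tiny expression space: c * f(x) + d over a few primitives and constants.
UNARY = {"id": lambda x: x, "sq": lambda x: x * x, "cos": math.cos}
CONSTS = [1.0, 2.0, 3.0]

def search(xs, ys):
    """Score every candidate expression by mean squared error, keep the best."""
    best = (float("inf"), None)
    for (fname, f), c, d in itertools.product(UNARY.items(), CONSTS, CONSTS):
        mse = sum((c * f(x) + d - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
        if mse < best[0]:
            best = (mse, f"{c}*{fname}(x) + {d}")
    return best

xs = [0.1 * i for i in range(20)]
ys = [2.0 * x * x + 1.0 for x in xs]   # hidden ground truth: 2*x^2 + 1
mse, expr = search(xs, ys)
print(expr)                            # prints 2.0*sq(x) + 1.0
```

PySR replaces this brute-force loop with a regularized evolutionary search over vastly larger expression spaces on a distributed back-end, which is what makes the approach practical for real scientific data.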

Learning Integrable Dynamics with Action-Angle Networks

1 code implementation · 24 Nov 2022 · Ameya Daigavane, Arthur Kosmala, Miles Cranmer, Tess Smidt, Shirley Ho

Here, we propose an alternative construction for learned physical simulators that are inspired by the concept of action-angle coordinates from classical mechanics for describing integrable systems.

Numerical Integration

Hierarchical Inference of the Lensing Convergence from Photometric Catalogs with Bayesian Graph Neural Networks

1 code implementation · 15 Nov 2022 · Ji Won Park, Simon Birrer, Madison Ueland, Miles Cranmer, Adriano Agnello, Sebastian Wagner-Carena, Philip J. Marshall, Aaron Roodman, The LSST Dark Energy Science Collaboration

For each test set of 1,000 sightlines, the BGNN infers the individual $\kappa$ posteriors, which we combine in a hierarchical Bayesian model to yield constraints on the hyperparameters governing the population.

Graph Neural Network
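
The hierarchical step can be illustrated with the simplest conjugate case: if each sightline's $\kappa$ posterior were Gaussian and the population were Gaussian with known width, the hyper-posterior for the population mean is a precision-weighted average. A toy sketch under those simplifying assumptions (not the BGNN pipeline itself; all numbers are made up):

```python
import numpy as np

def infer_population_mean(kappa_means, kappa_sigmas, tau):
    """Hyper-posterior mean and std for the population mean mu, assuming
    Gaussian per-sightline posteriors N(kappa_i, sigma_i^2), a Gaussian
    population kappa ~ N(mu, tau^2), and a flat prior on mu."""
    w = 1.0 / (kappa_sigmas**2 + tau**2)        # precision of each sightline
    mu = np.sum(w * kappa_means) / np.sum(w)
    return mu, np.sqrt(1.0 / np.sum(w))

# Toy population: 1,000 sightlines drawn from N(0.02, 0.01^2), each
# "measured" with a broad per-sightline posterior of width 0.05.
rng = np.random.default_rng(1)
true_mu, tau = 0.02, 0.01
kappas = rng.normal(true_mu, tau, size=1000)
sigmas = np.full(1000, 0.05)
obs = kappas + rng.normal(0.0, sigmas)
mu_hat, mu_err = infer_population_mean(obs, sigmas, tau)
print(mu_hat, mu_err)   # mu recovered far more tightly than any one sightline
```

Combining many weak individual posteriors this way is what lets the population-level constraint beat any single sightline.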

Normalizing Flows for Hierarchical Bayesian Analysis: A Gravitational Wave Population Study

1 code implementation · 15 Nov 2022 · David Ruhe, Kaze Wong, Miles Cranmer, Patrick Forré

We propose parameterizing the population distribution of the gravitational wave population modeling framework (Hierarchical Bayesian Analysis) with a normalizing flow.
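
The flow supplies a flexible, trainable density for the population. In the simplest one-layer case (a single affine transform of a standard normal), the change-of-variables formula and even the maximum-likelihood fit are available in closed form; real flows stack many invertible layers and train by gradient descent. A toy numpy sketch under that one-layer assumption, with made-up numbers:

```python
import numpy as np

def affine_flow_logpdf(x, shift, log_scale):
    """Log-density of x under x = shift + exp(log_scale) * z, z ~ N(0, 1).
    Change of variables: log p(x) = log N(z | 0, 1) - log_scale."""
    z = (x - shift) * np.exp(-log_scale)
    return -0.5 * (z**2 + np.log(2 * np.pi)) - log_scale

# Toy "population" of source masses; for one affine layer the maximum-
# likelihood parameters are just the sample mean and log standard deviation.
rng = np.random.default_rng(2)
masses = rng.normal(35.0, 5.0, size=4000)
shift, log_scale = masses.mean(), np.log(masses.std())
print(shift, np.exp(log_scale))   # close to the true (35, 5)
```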

A Neural Network Subgrid Model of the Early Stages of Planet Formation

no code implementations · 8 Nov 2022 · Thomas Pfeil, Miles Cranmer, Shirley Ho, Philip J. Armitage, Tilman Birnstiel, Hubert Klahr

Planet formation is a multi-scale process in which the coagulation of $\mathrm{\mu m}$-sized dust grains in protoplanetary disks is strongly influenced by the hydrodynamic processes on scales of astronomical units ($\approx 1.5\times 10^8\,\mathrm{km}$).

Computational Efficiency

Automated discovery of interpretable gravitational-wave population models

1 code implementation · 25 Jul 2022 · Kaze W. K. Wong, Miles Cranmer

This demonstrates a strategy to automatically discover interpretable population models in the ever-growing GW catalog, which can potentially be applied to other astrophysical phenomena.

Symbolic Regression

Robust Simulation-Based Inference in Cosmology with Bayesian Neural Networks

1 code implementation · 18 Jul 2022 · Pablo Lemos, Miles Cranmer, Muntazir Abidi, ChangHoon Hahn, Michael Eickenberg, Elena Massara, David Yallup, Shirley Ho

Simulation-based inference (SBI) is rapidly establishing itself as a standard machine learning technique for analyzing data in cosmological surveys.

Density Estimation

Predicting the Thermal Sunyaev-Zel'dovich Field using Modular and Equivariant Set-Based Neural Networks

1 code implementation · 28 Feb 2022 · Leander Thiele, Miles Cranmer, William Coulton, Shirley Ho, David N. Spergel

We train neural networks on the IllustrisTNG-300 cosmological simulation to predict the continuous electron pressure field in galaxy clusters from gravity-only simulations.

Rediscovering orbital mechanics with machine learning

no code implementations · 4 Feb 2022 · Pablo Lemos, Niall Jeffrey, Miles Cranmer, Shirley Ho, Peter Battaglia

We then use symbolic regression to discover an analytical expression for the force law implicitly learned by the neural network, which our results show is equivalent to Newton's law of gravitation.

BIG-bench Machine Learning · Graph Neural Network +1

Augmenting astrophysical scaling relations with machine learning: application to reducing the Sunyaev-Zeldovich flux-mass scatter

1 code implementation · 4 Jan 2022 · Digvijay Wadekar, Leander Thiele, Francisco Villaescusa-Navarro, J. Colin Hill, Miles Cranmer, David N. Spergel, Nicholas Battaglia, Daniel Anglés-Alcázar, Lars Hernquist, Shirley Ho

Using SR on the data from the IllustrisTNG hydrodynamical simulation, we find a new proxy for cluster mass which combines $Y_\mathrm{SZ}$ and concentration of ionized gas ($c_\mathrm{gas}$): $M \propto Y_\mathrm{conc}^{3/5} \equiv Y_\mathrm{SZ}^{3/5} (1-A\, c_\mathrm{gas})$.

Symbolic Regression
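
Taking the quoted relation at face value, the proxy is cheap to evaluate. In this sketch $A$ is a placeholder constant, not the value fitted in the paper:

```python
def mass_proxy(y_sz, c_gas, A=0.5):
    """Cluster mass proxy from the abstract: M ∝ Y_SZ^(3/5) * (1 - A*c_gas),
    i.e. Y_conc^(3/5).  A = 0.5 is a placeholder, not the fitted value."""
    return y_sz ** 0.6 * (1.0 - A * c_gas)

# Two toy clusters with identical Y_SZ but different gas concentration:
# the more concentrated cluster gets a lower inferred mass.
ratio = mass_proxy(1.0e-5, 0.1) / mass_proxy(1.0e-5, 0.4)
print(ratio)   # mass ratio M1/M2, about 1.19
```

The correction term is what reduces the Y–M scatter relative to the plain $M \propto Y_\mathrm{SZ}^{3/5}$ scaling relation.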

Learned Coarse Models for Efficient Turbulence Simulation

1 code implementation · 31 Dec 2021 · Kimberly Stachenfeld, Drummond B. Fielding, Dmitrii Kochkov, Miles Cranmer, Tobias Pfaff, Jonathan Godwin, Can Cui, Shirley Ho, Peter Battaglia, Alvaro Sanchez-Gonzalez

We show that our proposed model can simulate turbulent dynamics more accurately than classical numerical solvers at comparably low resolutions across various scientifically relevant metrics.

Learned Simulators for Turbulence

no code implementations · ICLR 2022 · Kim Stachenfeld, Drummond Buschman Fielding, Dmitrii Kochkov, Miles Cranmer, Tobias Pfaff, Jonathan Godwin, Can Cui, Shirley Ho, Peter Battaglia, Alvaro Sanchez-Gonzalez

We show that our proposed model can simulate turbulent dynamics more accurately than classical numerical solvers at the same low resolutions across various scientifically relevant metrics.

Unsupervised Resource Allocation with Graph Neural Networks

1 code implementation · 17 Jun 2021 · Miles Cranmer, Peter Melchior, Brian Nord

We present an approach for maximizing a global utility function by learning how to allocate resources in an unsupervised way.

Astronomy · Evolutionary Algorithms

A Deep Learning Approach for Active Anomaly Detection of Extragalactic Transients

1 code implementation · 22 Mar 2021 · V. Ashley Villar, Miles Cranmer, Edo Berger, Gabriella Contardo, Shirley Ho, Griffin Hosseinzadeh, Joshua Yao-Yu Lin

There is a shortage of multi-wavelength and spectroscopic followup capabilities given the number of transient and variable astrophysical events discovered through wide-field, optical surveys such as the upcoming Vera C. Rubin Observatory.

Anomaly Detection

A Bayesian neural network predicts the dissolution of compact planetary systems

2 code implementations · 11 Jan 2021 · Miles Cranmer, Daniel Tamayo, Hanno Rein, Peter Battaglia, Samuel Hadden, Philip J. Armitage, Shirley Ho, David N. Spergel

Our model, trained directly from short N-body time series of raw orbital elements, is more than two orders of magnitude more accurate at predicting instability times than analytical estimators, while also reducing the bias of existing machine learning algorithms by nearly a factor of three.

BIG-bench Machine Learning · Time Series Analysis

Predicting the long-term stability of compact multiplanet systems

1 code implementation · 13 Jul 2020 · Daniel Tamayo, Miles Cranmer, Samuel Hadden, Hanno Rein, Peter Battaglia, Alysa Obertas, Philip J. Armitage, Shirley Ho, David Spergel, Christian Gilbertson, Naireen Hussain, Ari Silburt, Daniel Jontof-Hutter, Kristen Menou

Our Stability of Planetary Orbital Configurations Klassifier (SPOCK) predicts stability using physically motivated summary statistics measured in integrations of the first $10^4$ orbits, thus achieving speed-ups of up to $10^5$ over full simulations.

Earth and Planetary Astrophysics

Meta-Learning for One-Class Classification with Few Examples using Order-Equivariant Network

1 code implementation · 8 Jul 2020 · Ademola Oladosu, Tony Xu, Philip Ekfeldt, Brian A. Kelly, Miles Cranmer, Shirley Ho, Adrian M. Price-Whelan, Gabriella Contardo

This paper presents a meta-learning framework for few-shot One-Class Classification (OCC) at test time, a setting where labeled examples are available only for the positive class and no supervision is given for the negative class.

Astronomy · General Classification +3

Discovering Symbolic Models from Deep Learning with Inductive Biases

3 code implementations · NeurIPS 2020 · Miles Cranmer, Alvaro Sanchez-Gonzalez, Peter Battaglia, Rui Xu, Kyle Cranmer, David Spergel, Shirley Ho

The technique works as follows: we first encourage sparse latent representations when we train a GNN in a supervised setting, then we apply symbolic regression to components of the learned model to extract explicit physical relations.

Symbolic Regression
