
An application of machine learning techniques to galaxy cluster mass estimation using the MACSIS simulations

Machine learning (ML) techniques, in particular supervised regression algorithms, are a promising new way to use multiple observables to predict a cluster's mass or other key features. To investigate this approach we use the MACSIS sample of simulated hydrodynamical galaxy clusters to train a variety of ML models, mimicking different datasets...
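As a rough illustration of the supervised-regression setup described in the abstract, the sketch below trains a random-forest regressor to map mock cluster observables (here, gas temperature, X-ray luminosity, and velocity dispersion) to log halo mass. The observable names, scaling relations, and scatter are hypothetical placeholders for illustration only; they are not the MACSIS data or the specific models evaluated in the paper.

    # Minimal sketch of supervised regression for cluster mass estimation.
    # The mock observables and scaling relations below are hypothetical,
    # standing in for quantities measured from simulated clusters.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(42)
    n_clusters = 2000

    # True log10 masses, uniform over a plausible massive-cluster range.
    log_mass = rng.uniform(14.0, 15.5, n_clusters)

    # Mock observables from simple power-law scalings with log-normal scatter.
    log_temp = 0.6 * (log_mass - 14.0) + rng.normal(0.0, 0.05, n_clusters)    # temperature
    log_lum = 1.3 * (log_mass - 14.0) + rng.normal(0.0, 0.15, n_clusters)     # X-ray luminosity
    log_sigma = 0.33 * (log_mass - 14.0) + rng.normal(0.0, 0.04, n_clusters)  # velocity dispersion

    X = np.column_stack([log_temp, log_lum, log_sigma])
    y = log_mass

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    # Train a random-forest regressor on the mock training set.
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    # Evaluate: RMS error in dex on held-out clusters.
    pred = model.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    print(f"RMS error: {rmse:.3f} dex in log10(M)")

The same pattern extends to any combination of observables: swapping the feature matrix for quantities available in a given survey is one way to mimic different datasets, as the abstract describes.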
