An application of machine learning techniques to galaxy cluster mass estimation using the MACSIS simulations

19 Oct 2018 · Thomas J. Armitage, Scott T. Kay, David J. Barnes

Machine learning (ML) techniques, in particular supervised regression algorithms, are a promising new way to use multiple observables to predict a cluster's mass or other key features. To investigate this approach we use the MACSIS sample of simulated hydrodynamical galaxy clusters to train a variety of ML models, mimicking different datasets...
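The approach described in the abstract — supervised regression mapping several cluster observables to mass — can be sketched as follows. This is a minimal illustration using scikit-learn with synthetic mock data, not the MACSIS simulation outputs; the feature names (velocity dispersion, richness, gas temperature) and the toy scaling relation generating the targets are assumptions chosen only to make the example self-contained.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic mock catalogue: each cluster has a few observables and a
# log10 mass drawn from a toy power-law scaling with small scatter.
# These numbers are illustrative only, not MACSIS data.
n = 500
sigma_v = rng.uniform(500.0, 1500.0, n)   # velocity dispersion [km/s]
richness = rng.uniform(20.0, 200.0, n)    # galaxy count proxy
temp = rng.uniform(2.0, 12.0, n)          # X-ray temperature [keV]
log_mass = (14.0
            + 0.5 * np.log10(sigma_v / 1000.0)
            + 0.3 * np.log10(richness / 100.0)
            + 0.2 * np.log10(temp / 5.0)
            + rng.normal(0.0, 0.05, n))   # intrinsic scatter [dex]

X = np.column_stack([sigma_v, richness, temp])
X_train, X_test, y_train, y_test = train_test_split(
    X, log_mass, test_size=0.25, random_state=0)

# Train a supervised regression model (random forest here; the paper
# compares several ML regressors, this is just one representative choice).
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Evaluate the scatter in recovered log-mass on held-out clusters.
pred = model.predict(X_test)
scatter = np.std(pred - y_test)
print(f"RMS scatter in log10(M): {scatter:.3f} dex")
```

In practice the features would be drawn from the mock observations of the simulated clusters (different feature subsets mimicking different survey datasets), and the scatter of predicted versus true mass is the figure of merit compared across models.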
