no code implementations • 22 Nov 2022 • Sotiris Anagnostidis, Arne Thomsen, Tomasz Kacprzak, Tilman Tröster, Luca Biggio, Alexandre Refregier, Thomas Hofmann
In this work, we aim to improve upon two-point statistics by employing a PointNet-like neural network to regress the values of the cosmological parameters directly from point cloud data.
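The key idea of a PointNet-style architecture is a shared per-point feature network followed by a permutation-invariant pooling step, so the prediction does not depend on the ordering of the points. The following is a minimal numpy sketch of such a forward pass; the layer sizes, weights, and the two-parameter regression head are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_mlp(x, weights):
    """Apply the same small MLP to every point (sizes are illustrative)."""
    for W, b in weights:
        x = np.maximum(x @ W + b, 0.0)  # ReLU
    return x

# Toy point cloud: 1000 points with 3 spatial coordinates each.
points = rng.standard_normal((1000, 3))

# Shared per-point feature extractor: 3 -> 64 -> 128.
feat_weights = [(rng.standard_normal((3, 64)) * 0.1, np.zeros(64)),
                (rng.standard_normal((64, 128)) * 0.1, np.zeros(128))]
features = shared_mlp(points, feat_weights)   # shape (1000, 128)

# Permutation-invariant aggregation: max-pool over all points.
global_feature = features.max(axis=0)         # shape (128,)

# Hypothetical regression head mapping the pooled feature to two
# cosmological parameters (e.g. Omega_m and sigma_8).
head = rng.standard_normal((128, 2)) * 0.1
params_pred = global_feature @ head           # shape (2,)
```

Because the max-pooling is symmetric in the points, shuffling the rows of `points` leaves `params_pred` unchanged, which is the property that makes this architecture suitable for point cloud inputs.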
1 code implementation • 13 Jan 2021 • Maximilian von Wietersheim-Kramsta, Benjamin Joachimi, Jan Luca van den Busch, Catherine Heymans, Hendrik Hildebrandt, Marika Asgari, Tilman Tröster, Angus H. Wright
For BOSS-like lenses, we forecast a contribution of the magnification bias to the GGL signal between the multipole moments, $\ell$, of 100 and 4600 with a cumulative signal-to-noise ratio between 0.1 and 1.1 for sources from the Kilo-Degree Survey (KiDS), between 0.4 and 2.0 for sources from the Hyper Suprime-Cam survey (HSC), and between 0.3 and 2.8 for ESA Euclid-like source samples.
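A cumulative signal-to-noise ratio over a multipole range is conventionally accumulated in quadrature, $(S/N)^2 = \sum_\ell (C_\ell/\sigma_\ell)^2$, assuming independent multipoles. A hedged numpy sketch with entirely made-up power-law spectra (the actual signal and covariance come from the survey modelling, not from these toy forms):

```python
import numpy as np

# Multipole range quoted in the abstract.
ell = np.arange(100, 4601)

# Hypothetical magnification-bias signal and per-multipole uncertainty;
# the power-law shapes are placeholders, not the paper's spectra.
signal = 1e-6 * (ell / 1000.0) ** -0.5
noise = 1e-4 * (ell / 1000.0) ** -1.0

# Cumulative S/N, adding independent multipoles in quadrature.
sn_cumulative = np.sqrt(np.cumsum((signal / noise) ** 2))
```

By construction `sn_cumulative` is monotonically increasing with the upper $\ell$ limit, which is why the forecast is quoted as a cumulative figure over the full range 100 to 4600.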
Cosmology and Nongalactic Astrophysics
3 code implementations • 3 Sep 2020 • Alexander Mead, Samuel Brieden, Tilman Tröster, Catherine Heymans
We present an updated version of the HMcode augmented halo model that can be used to make accurate predictions of the non-linear matter power spectrum over a wide range of cosmologies.
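For context, the halo-model framework that HMcode augments decomposes the matter power spectrum into one- and two-halo terms; in standard notation (with $n(M)$ the halo mass function, $b(M)$ the halo bias, $\bar\rho$ the mean matter density, and $u(k,M)$ the normalised halo profile in Fourier space):

$$
P(k) = P_{2\mathrm{h}}(k) + P_{1\mathrm{h}}(k),
$$
$$
P_{1\mathrm{h}}(k) = \int_0^\infty \mathrm{d}M\, n(M) \left(\frac{M}{\bar\rho}\right)^2 |u(k,M)|^2,
$$
$$
P_{2\mathrm{h}}(k) \approx \left[\int_0^\infty \mathrm{d}M\, n(M)\, b(M)\, \frac{M}{\bar\rho}\, u(k,M)\right]^2 P_{\mathrm{lin}}(k).
$$

HMcode modifies this vanilla model with fitted ingredients, including a smoothed transition between the two terms of the schematic form $P = (P_{2\mathrm{h}}^{\alpha} + P_{1\mathrm{h}}^{\alpha})^{1/\alpha}$; the precise parametrisation is calibrated against simulations and is described in the paper itself.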
Cosmology and Nongalactic Astrophysics
2 code implementations • 3 Jul 2020 • Benjamin Giblin, Catherine Heymans, Marika Asgari, Hendrik Hildebrandt, Henk Hoekstra, Benjamin Joachimi, Arun Kannawadi, Konrad Kuijken, Chieh-An Lin, Lance Miller, Tilman Tröster, Jan Luca van den Busch, Angus H. Wright, Maciej Bilicki, Chris Blake, Jelte de Jong, Andrej Dvornik, Thomas Erben, Fedor Getman, Nicola R. Napolitano, Peter Schneider, HuanYuan Shan
We present weak lensing shear catalogues from the fourth data release of the Kilo-Degree Survey, KiDS-1000, spanning 1006 square degrees of deep and high-resolution imaging.
Cosmology and Nongalactic Astrophysics
1 code implementation • 28 Mar 2019 • Tilman Tröster, Cameron Ferguson, Joachim Harnois-Déraps, Ian G. McCarthy
We train two deep generative models, a variational auto-encoder and a generative adversarial network, on pairs of matter density and pressure slices from the BAHAMAS hydrodynamical simulation.
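The variational auto-encoder half of this setup maps an input slice to a latent distribution, samples from it with the reparameterisation trick, and decodes to the target field. A minimal numpy forward-pass sketch, with random stand-ins for the BAHAMAS density and pressure slices and illustrative layer sizes (all names and dimensions here are assumptions for exposition):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy flattened "matter density" slice; in the paper these are 2D slices
# from the BAHAMAS hydrodynamical simulation, here just random stand-ins.
n_pix = 16 * 16
density = rng.standard_normal(n_pix)

# Encoder: density -> latent mean and log-variance (sizes illustrative).
latent_dim = 8
W_mu = rng.standard_normal((n_pix, latent_dim)) * 0.05
W_logvar = rng.standard_normal((n_pix, latent_dim)) * 0.05
mu = density @ W_mu
logvar = density @ W_logvar

# Reparameterisation trick: z = mu + sigma * eps keeps sampling
# differentiable with respect to the encoder parameters.
eps = rng.standard_normal(latent_dim)
z = mu + np.exp(0.5 * logvar) * eps

# Decoder: latent vector -> predicted "pressure" slice.
W_dec = rng.standard_normal((latent_dim, n_pix)) * 0.05
pressure_pred = z @ W_dec
```

Training would minimise a reconstruction loss against the paired pressure slice plus a KL term on $(\mu, \log\sigma^2)$; the GAN variant instead pits a generator producing pressure slices against a discriminator on real/fake pairs.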