no code implementations • 2 Feb 2024 • Zhe Jiao, Martin Keller-Ressel
It has repeatedly been observed that loss minimization by stochastic gradient descent (SGD) leads to heavy-tailed distributions of neural network parameters.
no code implementations • 11 May 2023 • Pascal Mettes, Mina Ghadimi Atigh, Martin Keller-Ressel, Jeffrey Gu, Serena Yeung
In this paper, we provide a categorization and in-depth overview of current literature on hyperbolic learning for computer vision.
no code implementations • 24 Mar 2023 • Martin Keller-Ressel, Felix Sachse
Using the concept of envelopes we show how to divide the state space $\mathbb{R}^2$ of the two-factor Vasicek model into regions of identical term-structure shape.
no code implementations • 29 Sep 2022 • Martin Keller-Ressel
In liquid option markets, W-shaped implied volatility curves have occasionally been observed.
no code implementations • 27 Jul 2022 • Martin Keller-Ressel
We derive analytic expressions for the variance-optimal hedging strategy and its mean-square hedging error in the lognormal SABR and in the rough Bergomi model.
no code implementations • 14 Jul 2022 • Martin Keller-Ressel, Stephanie Nargang
We introduce L-hydra (landmarked hyperbolic distance recovery and approximation), a method for embedding network- or distance-based data into hyperbolic space, which requires only the distance measurements to a few 'landmark nodes'.
1 code implementation • NeurIPS 2021 • Mina Ghadimi Atigh, Martin Keller-Ressel, Pascal Mettes
To be able to compute proximities to ideal prototypes, we introduce the penalised Busemann loss.
no code implementations • 15 Oct 2020 • Martin Keller-Ressel
We introduce Hyperbolic Prototype Learning, a type of supervised learning, where class labels are represented by ideal points (points at infinity) in hyperbolic space.
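A minimal sketch (not the paper's implementation) of the quantity underlying this approach: the Busemann function of the Poincaré ball, which measures proximity of an interior point to an ideal point on the boundary. The penalty term of the penalised Busemann loss is omitted here.

```python
import numpy as np

def busemann(x, p):
    # Busemann function of the Poincare ball at an ideal point p on the
    # boundary (||p|| = 1), evaluated at an interior point x (||x|| < 1):
    #   B_p(x) = log( ||p - x||^2 / (1 - ||x||^2) ).
    # Smaller values mean x lies "closer" to the ideal point p.
    return np.log(np.sum((p - x) ** 2) / (1.0 - np.sum(x ** 2)))
```

At the origin the Busemann function vanishes, and it decreases as a point moves along the geodesic ray towards its ideal prototype, which is what makes it usable as a classification loss with labels at infinity.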
no code implementations • 1 May 2020 • Martin Keller-Ressel, Stephanie Nargang
Based on data from the European banking stress tests of 2014 and 2016 and the transparency exercise of 2018, we demonstrate for the first time that the latent geometry of financial networks can be well-represented by a geometry of negative curvature, i.e., by hyperbolic geometry.
no code implementations • 13 Aug 2019 • Martin Keller-Ressel
We provide a full classification of all attainable term structure shapes in the two-factor Vasicek model of interest rates.
no code implementations • 21 Mar 2019 • Martin Keller-Ressel, Stephanie Nargang
We introduce hydra (hyperbolic distance recovery and approximation), a new method for embedding network- or distance-based data into hyperbolic space.
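For context, a minimal sketch of the hyperboloid (Lorentz) model of hyperbolic space in which such embeddings can be expressed; this illustrates the distance being recovered, not the hydra algorithm itself.

```python
import numpy as np

def lift(z):
    # Lift a Euclidean point z to the hyperboloid x_0^2 - ||z||^2 = 1.
    return np.concatenate(([np.sqrt(1.0 + np.dot(z, z))], z))

def lorentz_distance(u, v):
    # Hyperbolic distance on the hyperboloid model:
    #   d(u, v) = arccosh(-<u, v>_L),
    # with Lorentzian inner product <u, v>_L = -u_0 v_0 + u_1 v_1 + ...
    mink = -u[0] * v[0] + np.dot(u[1:], v[1:])
    # Clip guards against arccosh arguments slightly below 1 from rounding.
    return np.arccosh(np.clip(-mink, 1.0, None))
```

An embedding method of this kind seeks coordinates whose pairwise `lorentz_distance` values match the observed distance matrix as closely as possible.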
1 code implementation • 21 Nov 2017 • Björn Böttcher, Martin Keller-Ressel, René L. Schilling
Distance covariance is a quantity to measure the dependence of two random vectors.
Probability, Statistics Theory
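The sample statistic can be sketched in a few lines (a simplified illustration following the standard Székely-Rizzo-Bakirov construction, not this paper's code): double-center the pairwise distance matrices of the two samples and average their entrywise product.

```python
import numpy as np

def distance_covariance(x, y):
    # Sample distance covariance of two equal-length samples.
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    if x.ndim == 1:
        x = x[:, None]  # treat a 1-d sample as n points in R^1
    if y.ndim == 1:
        y = y[:, None]
    # Pairwise Euclidean distance matrices.
    a = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    b = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1)
    # Double-centering: subtract row and column means, add the grand mean.
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    # Distance covariance is the square root of the mean entrywise product.
    return np.sqrt(np.maximum(np.mean(A * B), 0.0))
```

Unlike ordinary covariance, the population version vanishes if and only if the two random vectors are independent, which is what makes it attractive as a dependence measure.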
1 code implementation • 21 Nov 2017 • Björn Böttcher, Martin Keller-Ressel, René L. Schilling
We introduce two new measures for the dependence of $n \ge 2$ random variables: distance multivariance and total distance multivariance.
Probability, Statistics Theory (MSC: 62H20, 60E10, 62G10, 62G15, 62G20)