
no code implementations • 7 Jan 2022 • Erdinc Akyildirim, Matteo Gambara, Josef Teichmann, Syang Zhou

In this case, we are able to identify pump and dump attempts organized on social networks with F1 scores up to 88% by means of our unsupervised learning algorithm, thus achieving results that are close to the state-of-the-art in the field based on supervised learning.
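To make the detection task concrete, here is a minimal sketch of one way to flag abnormal market activity: a rolling z-score detector on trading volume. This is an illustrative stand-in only, not the paper's unsupervised algorithm; the function name and thresholds are hypothetical.

```python
import numpy as np

def flag_anomalies(volume, window=24, z_thresh=5.0):
    """Flag indices where volume spikes far above its trailing mean.

    Illustrative rolling z-score detector (hypothetical, not the
    paper's method). `volume` is a 1-D array of per-interval volumes.
    """
    volume = np.asarray(volume, dtype=float)
    flags = []
    for t in range(window, len(volume)):
        hist = volume[t - window:t]          # trailing window only
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and (volume[t] - mu) / sigma > z_thresh:
            flags.append(t)
    return flags

# Quiet volume with one injected spike at t=50.
rng = np.random.default_rng(0)
vol = rng.uniform(90, 110, size=100)
vol[50] = 1000.0
print(flag_anomalies(vol))  # the injected spike at t=50 is flagged
```

A real pump-and-dump detector would combine several such signals (price, volume, order flow, social-network activity) rather than a single volume series.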

no code implementations • 2 Jan 2022 • Enea Monzio Compagnoni, Luca Biggio, Antonio Orvieto, Thomas Hofmann, Josef Teichmann

Time series analysis is a widespread task in Natural Sciences, Social Sciences, and Engineering.

no code implementations • 31 Dec 2021 • Jakob Heiss, Josef Teichmann, Hanna Wutte

We prove in this paper that optimizing wide ReLU neural networks (NNs) with at least one hidden layer using l2-regularization on the parameters enforces multi-task learning due to representation learning, also in the limit of infinite width.

no code implementations • NeurIPS Workshop DLDE 2021 • Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann

We consider the question whether the time evolution of controlled differential equations on general state spaces can be arbitrarily well approximated by (regularized) regressions on features generated themselves through randomly chosen dynamical systems of moderately high dimension.

2 code implementations • 28 Apr 2021 • Calypso Herrera, Florian Krach, Pierre Ruyssen, Josef Teichmann

This paper presents new machine learning approaches to approximate the solution of optimal stopping problems.
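For context, the classical regression-based baseline that such machine learning approaches build on is least-squares Monte Carlo (Longstaff–Schwartz): regress the continuation value on features of the state and exercise when the immediate payoff exceeds it. The sketch below prices a Bermudan put this way with polynomial features; it is the textbook baseline, not the paper's neural method.

```python
import numpy as np

def lsm_put_price(s0=100.0, strike=100.0, r=0.05, sigma=0.2,
                  T=1.0, steps=50, n_paths=20000, seed=0):
    """Bermudan put via least-squares Monte Carlo (Longstaff-Schwartz).

    Classical regression baseline with degree-2 polynomial features,
    not the paper's neural approach.
    """
    rng = np.random.default_rng(seed)
    dt = T / steps
    disc = np.exp(-r * dt)
    z = rng.standard_normal((n_paths, steps))
    log_s = np.log(s0) + np.cumsum(
        (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    s = np.exp(log_s)                          # paths at times dt, ..., T
    cf = np.maximum(strike - s[:, -1], 0.0)    # cashflow if held to expiry
    for t in range(steps - 2, -1, -1):
        cf *= disc                             # discount one step back
        itm = strike - s[:, t] > 0.0           # regress on ITM paths only
        if itm.sum() > 10:
            coeffs = np.polyfit(s[itm, t], cf[itm], 2)
            cont = np.polyval(coeffs, s[itm, t])   # continuation estimate
            exercise = strike - s[itm, t]
            idx = np.where(itm)[0]
            cf[idx[exercise > cont]] = exercise[exercise > cont]
    return disc * cf.mean()

price = lsm_put_price()
print(round(price, 2))  # close to the American put value (~6.1 here)
```

Replacing the polynomial regression with a neural network is the natural next step, which is roughly the direction such papers pursue.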

1 code implementation • 26 Feb 2021 • Jakob Heiss, Jakob Weissteiner, Hanna Wutte, Sven Seuken, Josef Teichmann

To address this, we introduce a new approach for capturing model uncertainty for NNs, which we call Neural Optimization-based Model Uncertainty (NOMU).

no code implementations • 3 Feb 2021 • Blanka Horvath, Josef Teichmann, Zan Zuric

We investigate the performance of the Deep Hedging framework under training paths beyond the (finite-dimensional) Markovian setup.

no code implementations • 3 Feb 2021 • Nicolas Curin, Michael Kettler, Xi Kleisinger-Yu, Vlatka Komaric, Thomas Krabichler, Josef Teichmann, Hanna Wutte

To the best of our knowledge, the application of deep learning in the field of quantitative risk management is still a relatively recent phenomenon.

no code implementations • 1 Jan 2021 • Jakob Heiss, Alexis Stockinger, Josef Teichmann

We introduce a new Reduction Algorithm which makes use of the properties of ReLU neurons to significantly reduce the number of neurons in a trained Deep Neural Network.
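One ReLU property that permits exact reduction is positive homogeneity: relu(a·z) = a·relu(z) for a > 0, so a hidden neuron whose incoming weights (including bias) are a positive multiple of another's is redundant and can be folded into it. The sketch below performs only this one merge step on a toy two-layer net; it is a minimal illustration, not the paper's full Reduction Algorithm.

```python
import numpy as np

def merge_parallel_neurons(W1, b1, W2):
    """Merge hidden ReLU neurons whose incoming weights (incl. bias)
    are positive multiples of each other. Exact: relu(a*z) = a*relu(z)
    for a > 0. Toy sketch, not the paper's full algorithm."""
    aug = np.hstack([W1, b1[:, None]])      # incoming weights + bias
    keep, out_w = [], []
    for i in range(aug.shape[0]):
        merged = False
        for k, j in enumerate(keep):
            denom = aug[j]
            if np.all(denom == 0):
                continue
            nz = denom != 0
            r = aug[i][nz] / denom[nz]
            if r[0] > 0 and np.allclose(r, r[0]) and np.allclose(aug[i][~nz], 0):
                out_w[k] = out_w[k] + r[0] * W2[:, i]  # fold into kept neuron
                merged = True
                break
        if not merged:
            keep.append(i)
            out_w.append(W2[:, i].copy())
    return W1[keep], b1[keep], np.stack(out_w, axis=1)

# Toy net: neuron 1 is exactly 2x neuron 0 (bias included).
W1 = np.array([[1.0, 2.0], [2.0, 4.0], [0.5, -1.0]])
b1 = np.array([0.1, 0.2, 0.0])
W2 = np.array([[1.0, 0.5, -2.0]])
W1r, b1r, W2r = merge_parallel_neurons(W1, b1, W2)
print(W1r.shape[0])  # 3 hidden neurons reduced to 2, same function
```

The reduced network computes exactly the same function: the redundant neuron's outgoing weight is rescaled and added to the kept neuron's.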

no code implementations • 17 Sep 2020 • Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann

A new explanation of the geometric nature of the reservoir computing phenomenon is presented.

no code implementations • 10 Sep 2020 • Thomas Krabichler, Josef Teichmann

To the best of our knowledge, the application of deep learning in the field of quantitative risk management is still a relatively recent phenomenon.

no code implementations • 24 Jun 2020 • Paul Friedrich, Josef Teichmann

The Kyle model describes how an equilibrium of order sizes and security prices naturally arises between a trader with insider information and the price providing market maker as they interact through a series of auctions.

Computational Finance • Trading and Market Microstructure
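In the one-period Kyle (1985) model the equilibrium has closed forms: with prior value variance Σ0 and noise-trader volatility σ_u, the insider trades x = β(v − p0) with β = σ_u/√Σ0, and the market maker prices p = p0 + λ(x + u) with λ = √Σ0/(2σ_u). A quick way to see that λ is the right pricing slope is to check by simulation that it equals the regression coefficient of the true value on total order flow:

```python
import numpy as np

# One-period Kyle equilibrium: beta = sigma_u / sqrt(Sigma0),
# lam = sqrt(Sigma0) / (2 * sigma_u). Simulation check that lam is
# the Bayesian regression slope of value on total order flow.
rng = np.random.default_rng(42)
p0, Sigma0, sigma_u = 10.0, 4.0, 1.5
beta = sigma_u / np.sqrt(Sigma0)
lam = np.sqrt(Sigma0) / (2 * sigma_u)

n = 200_000
v = p0 + np.sqrt(Sigma0) * rng.standard_normal(n)  # true asset value
u = sigma_u * rng.standard_normal(n)               # noise-trader flow
y = beta * (v - p0) + u                            # total order flow
slope = np.cov(v, y)[0, 1] / np.var(y)             # E[v | y] slope
print(lam, slope)  # the two nearly coincide
```

The dynamic version in the paper replaces this single auction by a sequence of auctions, but the same informational logic drives each round.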

no code implementations • 16 Jun 2020 • Matteo Gambara, Josef Teichmann

Consistent Recalibration (CRC) models have been introduced to capture, in the necessary generality, the dynamic features of term structures of derivatives' prices.

2 code implementations • ICLR 2021 • Calypso Herrera, Florian Krach, Josef Teichmann

We introduce the Neural Jump ODE (NJ-ODE) that provides a data-driven approach to learn, continuously in time, the conditional expectation of a stochastic process.
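The forward pass of such a model is schematically simple: between observations a latent state evolves by an ODE, at each observation it jumps via a discrete update, and a readout maps the latent state to the estimate. The sketch below shows this skeleton with hypothetical toy functions in place of the trained networks; with latent decay dh/dt = −h and reset-to-observation jumps, it reproduces the conditional expectation of a zero-mean Ornstein–Uhlenbeck process.

```python
import numpy as np

def njode_trajectory(t_grid, obs, f, jump, readout, h0):
    """Schematic NJ-ODE-style forward pass: Euler steps of the latent
    ODE dh/dt = f(h) between observations, a jump update at each
    observation time. `f`, `jump`, `readout` are toy stand-ins for the
    trained networks (hypothetical, not the paper's models)."""
    h = np.array(h0, dtype=float)
    out = []
    t_prev = t_grid[0]
    for t in t_grid:
        h = h + (t - t_prev) * f(h)    # Euler step of the latent ODE
        if t in obs:
            h = jump(h, obs[t])        # discrete update at observation
        out.append(readout(h))
        t_prev = t
    return np.array(out)

# Observe x=1.0 at t=0; the estimate then decays like exp(-t),
# matching E[X_t | X_0 = 1] for a zero-mean OU process.
t_grid = np.linspace(0.0, 1.0, 101)
traj = njode_trajectory(t_grid, {0.0: 1.0},
                        f=lambda h: -h,
                        jump=lambda h, x: np.array([x]),
                        readout=lambda h: h[0],
                        h0=[0.0])
print(round(traj[-1], 3))  # decays toward exp(-1)
```

In the actual model all three maps are neural networks trained jointly so that the readout tracks the conditional expectation for general processes.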

1 code implementation • 5 May 2020 • Christa Cuchiero, Wahid Khosrawi, Josef Teichmann

We propose a fully data-driven approach to calibrate local stochastic volatility (LSV) models, circumventing in particular the ad hoc interpolation of the volatility surface.

1 code implementation • 28 Apr 2020 • Calypso Herrera, Florian Krach, Anastasis Kratsios, Pierre Ruyssen, Josef Teichmann

The robust PCA of covariance matrices plays an essential role when isolating key explanatory features.
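For orientation, here is the classical (non-robust) baseline that robust PCA hardens against outliers and heavy tails: eigendecompose a sample covariance matrix and read off the top component. On clean one-factor data the leading eigenvector recovers the factor loadings; robustness matters precisely when this clean setting breaks down.

```python
import numpy as np

# Classical (non-robust) PCA of a covariance matrix: the baseline that
# robust variants improve on. Returns are driven by one common factor;
# the top eigenvector of the sample covariance recovers its loadings.
rng = np.random.default_rng(7)
n_assets, n_obs = 8, 5000
factor = rng.standard_normal(n_obs)
loadings = rng.uniform(0.5, 1.5, n_assets)
returns = np.outer(factor, loadings) + 0.1 * rng.standard_normal((n_obs, n_assets))

cov = np.cov(returns, rowvar=False)             # 8 x 8 sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
top = eigvecs[:, -1]                            # principal direction
corr = abs(top @ loadings) / np.linalg.norm(loadings)
print(round(corr, 3))  # close to 1: top component matches the factor
```

A few contaminated observations can pull `top` far from the true loadings, which is the failure mode robust PCA is designed to isolate.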

no code implementations • 27 Apr 2020 • Calypso Herrera, Florian Krach, Josef Teichmann

We estimate the Lipschitz constants of the gradient of a deep neural network and the network itself with respect to the full set of parameters.
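A standard, generally loose upper bound on the Lipschitz constant of a feedforward network with 1-Lipschitz activations (e.g. ReLU) is the product of the layers' spectral norms. The sketch below computes this bound and checks empirically that finite-difference slopes never exceed it; it is the textbook bound, not the sharper estimates a dedicated paper would develop.

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Product of spectral norms of the weight matrices: an upper bound
    on the Lipschitz constant of a feedforward net with 1-Lipschitz
    activations. Standard and often loose."""
    return float(np.prod([np.linalg.norm(W, ord=2) for W in weights]))

rng = np.random.default_rng(1)
Ws = [rng.standard_normal((16, 4)),
      rng.standard_normal((16, 16)),
      rng.standard_normal((1, 16))]

def net(x):
    """Small ReLU net: R^4 -> R."""
    h = x
    for W in Ws[:-1]:
        h = np.maximum(W @ h, 0.0)
    return (Ws[-1] @ h)[0]

bound = lipschitz_upper_bound(Ws)
# Empirical check: directional finite-difference slopes stay below it.
x = rng.standard_normal((1000, 4))
d = rng.standard_normal((1000, 4))
d /= np.linalg.norm(d, axis=1, keepdims=True)
eps = 1e-4
slopes = np.array([abs(net(xi + eps * di) - net(xi)) / eps
                   for xi, di in zip(x, d)])
print(slopes.max(), "<=", bound)
```

The gap between the largest observed slope and the product bound illustrates why tighter, data-dependent estimates are valuable.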

no code implementations • 7 Nov 2019 • Jakob Heiss, Josef Teichmann, Hanna Wutte

These observations motivate us to analyze properties of the neural networks found by gradient descent initialized close to zero, an initialization frequently employed to perform the training task.

2 code implementations • 8 Feb 2018 • Hans Bühler, Lukas Gonon, Josef Teichmann, Ben Wood

We present a framework for hedging a portfolio of derivatives in the presence of market frictions such as transaction costs, market impact, liquidity constraints or risk limits using modern deep reinforcement machine learning methods.

Computational Finance • Numerical Analysis • Optimization and Control • Probability • Risk Management • 91G60, 65K99
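The objective such a framework minimizes can be shown without any training: simulate price paths, run a hedging strategy with proportional transaction costs, and evaluate a tail-risk measure of the terminal hedging loss. The sketch below evaluates this objective for the classical Black–Scholes delta strategy on a short-dated call; a trained neural strategy would be chosen to reduce the same quantity (all parameters here are illustrative).

```python
import math
import numpy as np

def cvar(losses, alpha=0.95):
    """Average of the worst (1 - alpha) fraction of losses."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

def bs_delta(s, k, r, sig, tau):
    """Black-Scholes call delta (r = 0 here for simplicity)."""
    d1 = (np.log(s / k) + (r + 0.5 * sig**2) * tau) / (sig * np.sqrt(tau))
    return 0.5 * (1.0 + np.vectorize(math.erf)(d1 / math.sqrt(2.0)))

rng = np.random.default_rng(3)
s0, k, r, sig, T, steps, n = 100.0, 100.0, 0.0, 0.2, 0.25, 50, 20000
cost = 0.001                                    # proportional friction
dt = T / steps
z = rng.standard_normal((n, steps))
s = s0 * np.exp(np.cumsum(-0.5 * sig**2 * dt + sig * math.sqrt(dt) * z, axis=1))
s = np.hstack([np.full((n, 1), s0), s])         # prepend spot at t=0

pnl = np.zeros(n)
pos = np.zeros(n)
for t in range(steps):
    new_pos = bs_delta(s[:, t], k, r, sig, T - t * dt)
    pnl -= (new_pos - pos) * s[:, t]            # rebalance the hedge
    pnl -= cost * np.abs(new_pos - pos) * s[:, t]  # pay transaction costs
    pos = new_pos
pnl += pos * s[:, -1]                           # liquidate the hedge
pnl -= np.maximum(s[:, -1] - k, 0.0)            # pay the call payoff
loss = -pnl
print(round(cvar(loss), 2))  # tail risk a trained strategy would reduce
```

With frictions, the delta strategy is no longer optimal; minimizing this expected-shortfall-type objective over neural strategies is what distinguishes the deep approach from classical replication.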
