Search Results for author: Florian Krach

Found 8 papers, 7 papers with code

Robust Utility Optimization via a GAN Approach

1 code implementation • 22 Mar 2024 • Florian Krach, Josef Teichmann, Hanna Wutte

Lastly, we uncover that our generative approach for learning optimal (non-)robust investments under trading costs generates universally applicable alternatives to well-known asymptotic strategies of idealized settings.

Generative Adversarial Network

Regret-Optimal Federated Transfer Learning for Kernel Regression with Applications in American Option Pricing

1 code implementation • 8 Sep 2023 • Xuwei Yang, Anastasis Kratsios, Florian Krach, Matheus Grasselli, Aurelien Lucchi

We propose an optimal iterative scheme for federated transfer learning, where a central planner has access to datasets ${\cal D}_1,\dots,{\cal D}_N$ for the same learning model $f_{\theta}$.

Adversarial Robustness • regression +1
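The setting of a central planner with access to datasets ${\cal D}_1,\dots,{\cal D}_N$ for the same model $f_{\theta}$ can be illustrated with a deliberately naive baseline — this is not the paper's regret-optimal iterative scheme, just plain federated averaging of local kernel ridge regressors; all parameter values and the toy regression task are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=5.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def local_krr(X, y, lam=1e-3, gamma=5.0):
    # Each agent fits kernel ridge regression on its own dataset D_i.
    alpha = np.linalg.solve(rbf_kernel(X, X, gamma) + lam * np.eye(len(X)), y)
    return lambda Xq: rbf_kernel(Xq, X, gamma) @ alpha

rng = np.random.default_rng(0)
datasets = []
for _ in range(3):  # N = 3 agents with private samples of the same task
    X = rng.uniform(-1, 1, (40, 1))
    datasets.append((X, np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(40)))

models = [local_krr(X, y) for X, y in datasets]
Xq = np.linspace(-1, 1, 5)[:, None]
# The central planner aggregates by averaging the local predictors.
f_bar = sum(m(Xq) for m in models) / len(models)
```

Averaging predictors ignores the regret and transfer aspects the paper optimizes for; it only shows the data layout the abstract describes.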

Extending Path-Dependent NJ-ODEs to Noisy Observations and a Dependent Observation Framework

1 code implementation • 24 Jul 2023 • William Andersson, Jakob Heiss, Florian Krach, Josef Teichmann

The Path-Dependent Neural Jump Ordinary Differential Equation (PD-NJ-ODE) is a model for predicting continuous-time stochastic processes with irregular and incomplete observations.

Time Series

Optimal Estimation of Generic Dynamics by Path-Dependent Neural Jump ODEs

1 code implementation • 28 Jun 2022 • Florian Krach, Marc Nübel, Josef Teichmann

This paper studies the problem of forecasting general stochastic processes using a path-dependent extension of the Neural Jump ODE (NJ-ODE) framework (Herrera et al., 2021).

Time Series • Time Series Analysis

Optimal Stopping via Randomized Neural Networks

2 code implementations • 28 Apr 2021 • Calypso Herrera, Florian Krach, Pierre Ruyssen, Josef Teichmann

This paper presents the benefits of using randomized neural networks instead of standard basis functions or deep neural networks to approximate the solutions of optimal stopping problems.

BIG-bench Machine Learning
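The idea of replacing standard basis functions by randomized neural networks can be sketched with a toy Longstaff–Schwartz-style backward induction for a Bermudan put: the hidden layer has frozen random weights and only the linear readout is fitted by least squares at each exercise date. All market parameters and network sizes below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate geometric Brownian motion paths (toy parameters).
n_paths, n_steps, T, r, sigma, S0, K = 5000, 10, 1.0, 0.05, 0.2, 1.0, 1.0
dt = T / n_steps
Z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * Z, axis=1))
S = np.hstack([np.full((n_paths, 1), S0), S])

def random_features(x, W, b):
    # Randomized single-hidden-layer network: weights W, b stay frozen;
    # only the linear readout on top of these features is trained.
    return np.tanh(np.outer(x, W) + b)

n_hidden = 50
W, b = rng.standard_normal(n_hidden), rng.standard_normal(n_hidden)

payoff = lambda s: np.maximum(K - s, 0.0)
cash = payoff(S[:, -1])  # value if never exercised before maturity
for t in range(n_steps - 1, 0, -1):
    cash *= np.exp(-r * dt)                    # discount one step back
    itm = payoff(S[:, t]) > 0                  # regress on in-the-money paths
    Phi = random_features(S[itm, t], W, b)
    beta, *_ = np.linalg.lstsq(Phi, cash[itm], rcond=None)
    cont = Phi @ beta                          # continuation value estimate
    ex = payoff(S[itm, t]) > cont              # exercise when immediate payoff wins
    idx = np.where(itm)[0][ex]
    cash[idx] = payoff(S[idx, t])

price = np.exp(-r * dt) * cash.mean()
```

Because the hidden weights are never trained, each regression is a cheap linear least-squares solve — the computational benefit the abstract points to.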

Neural Jump Ordinary Differential Equations: Consistent Continuous-Time Prediction and Filtering

2 code implementations • ICLR 2021 • Calypso Herrera, Florian Krach, Josef Teichmann

We introduce the Neural Jump ODE (NJ-ODE) that provides a data-driven approach to learn, continuously in time, the conditional expectation of a stochastic process.

Time Series • Time Series Analysis
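The NJ-ODE architecture can be sketched structurally: a latent state flows according to a neural ODE between observation times and jumps at each observation, with a readout standing in for the conditional-expectation estimate. The weights below are fixed random stand-ins for the trained networks of the paper, so this only illustrates the flow-and-jump mechanics, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)
d_h = 8  # latent dimension (illustrative)

# Random stand-ins for the paper's trained networks.
Wf = rng.standard_normal((d_h, d_h)) / np.sqrt(d_h)   # ODE vector field
Wj = rng.standard_normal((d_h, d_h)) / np.sqrt(d_h)   # jump network
wx = rng.standard_normal(d_h)                         # how an observation enters
wo = rng.standard_normal(d_h) / np.sqrt(d_h)          # linear readout

def njode_forward(obs_times, obs_values, t_end, dt=0.01):
    """Latent h flows by an ODE between observations and jumps at each
    observation time; wo @ h plays the role of the model's estimate of
    E[X_t | observations so far]. Observation times must lie on the grid."""
    obs = {round(float(t), 6): float(x) for t, x in zip(obs_times, obs_values)}
    h = np.zeros(d_h)
    times, preds = [], []
    for k in range(int(round(t_end / dt)) + 1):
        t = round(k * dt, 6)
        if t in obs:
            h = np.tanh(Wj @ h + wx * obs[t])   # jump update at an observation
        times.append(t)
        preds.append(wo @ h)
        h = h + dt * np.tanh(Wf @ h)            # Euler step of the neural ODE
    return np.array(times), np.array(preds)

ts, ys = njode_forward([0.1, 0.35, 0.6], [1.0, 0.4, -0.2], t_end=1.0)
```

Before the first observation the latent state is zero, so the prediction path only starts moving once data arrives — the "continuously in time" prediction between irregular observations is exactly the ODE flow.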

Denise: Deep Robust Principal Component Analysis for Positive Semidefinite Matrices

1 code implementation • 28 Apr 2020 • Calypso Herrera, Florian Krach, Anastasis Kratsios, Pierre Ruyssen, Josef Teichmann

The robust PCA of covariance matrices plays an essential role when isolating key explanatory features.
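The decomposition Denise targets — a PSD matrix split into a low-rank part plus a sparse part — is classically computed by principal component pursuit. The sketch below uses the standard inexact-ALM iteration rather than Denise's learned deep approach, purely to show what "robust PCA of a covariance matrix" means; the toy matrix and all constants are illustrative:

```python
import numpy as np

def svt(M, tau):
    # Singular-value thresholding: shrink singular values by tau.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def robust_pca(M, n_iter=200):
    """Classic principal component pursuit (inexact ALM):
    find M ~ L + S with L low-rank and S sparse."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))
    mu = m * n / (4.0 * np.abs(M).sum())
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)  # soft threshold
        Y += mu * (M - L - S)
        mu = min(mu * 1.05, 1e7)   # gradually enforce M = L + S
    return L, S

# Toy PSD input: low-rank covariance plus a few large sparse entries.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
L_true = A @ A.T
S_true = np.zeros((20, 20))
S_true.flat[rng.choice(400, 15, replace=False)] = 5.0
M = L_true + S_true
L_hat, S_hat = robust_pca(M)
```

Denise's contribution is to replace this per-matrix iterative solve with a single learned network pass; the optimization above is the baseline it compares against conceptually.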

Local Lipschitz Bounds of Deep Neural Networks

no code implementations • 27 Apr 2020 • Calypso Herrera, Florian Krach, Josef Teichmann

The Lipschitz constant is an important quantity that arises in analysing the convergence of gradient-based optimization methods.
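A standard global Lipschitz upper bound for a feed-forward network with 1-Lipschitz activations (ReLU, tanh) is the product of the layers' spectral norms — the often-loose baseline against which tighter local bounds, such as those studied in the paper, are measured. The network shape and weights below are illustrative:

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Product of layer spectral norms: a global (typically loose) Lipschitz
    upper bound for a feed-forward net with 1-Lipschitz activations."""
    return float(np.prod([np.linalg.norm(W, 2) for W in weights]))

# Toy 3-layer net: 8 -> 16 -> 8 -> 1 (random illustrative weights).
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((16, 8)),
      rng.standard_normal((8, 16)),
      rng.standard_normal((1, 8))]
bound = lipschitz_upper_bound(Ws)
```

For the purely linear composition, the true Lipschitz constant is the spectral norm of the weight product, which the layer-wise product always dominates — the gap between the two is one way to see why local bounds can be much tighter.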
