Search Results for author: Adeline Fermanian

Found 10 papers, 6 papers with code

Multivariate Online Linear Regression for Hierarchical Forecasting

no code implementations • 22 Feb 2024 • Massil Hihat, Guillaume Garrigos, Adeline Fermanian, Simon Bussy

In this paper, we consider a deterministic online linear regression model where we allow the responses to be multivariate.

Regression
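
As a rough illustration of the sequential protocol described in the abstract (not the algorithm analysed in the paper), the sketch below runs a recursive ridge forecaster with a matrix-valued parameter, predicting a vector response before each new observation; the helper name online_ridge, the regulariser lam and the synthetic data are illustrative choices only.

# Minimal sketch: online (recursive) ridge regression with multivariate
# responses y_t in R^k and features x_t in R^d.  A generic forecaster for
# illustration, not the specific algorithm studied in the paper.
import numpy as np

def online_ridge(X, Y, lam=1.0):
    """Sequentially predict Y[t] from X[t], updating after each round."""
    T, d = X.shape
    k = Y.shape[1]
    A = lam * np.eye(d)          # regularised Gram matrix: sum_s x_s x_s^T + lam*I
    B = np.zeros((d, k))         # accumulated cross term: sum_s x_s y_s^T
    preds = np.zeros((T, k))
    for t in range(T):
        W = np.linalg.solve(A, B)        # current parameter matrix (d, k)
        preds[t] = X[t] @ W              # predict before seeing y_t
        A += np.outer(X[t], X[t])        # update statistics with the new pair
        B += np.outer(X[t], Y[t])
    return preds

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
W_true = rng.normal(size=(5, 3))
Y = X @ W_true + 0.1 * rng.normal(size=(200, 3))
preds = online_ridge(X, Y)
print("last-round squared error:", np.sum((preds[-1] - Y[-1]) ** 2))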

Non-asymptotic Analysis of Biased Adaptive Stochastic Approximation

no code implementations • 5 Feb 2024 • Sobihan Surendran, Antoine Godichon-Baggioni, Adeline Fermanian, Sylvain Le Corff

This paper provides a comprehensive non-asymptotic analysis of SGD with biased gradients and adaptive steps for convex and non-convex smooth functions.
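
A toy sketch of the setting, assuming a simple quadratic objective: Adagrad-style adaptive steps driven by a gradient oracle that is both noisy and biased. The bias and noise levels are arbitrary; this only illustrates the setup, not the paper's non-asymptotic analysis.

# Toy illustration: stochastic approximation with adaptive (Adagrad-style)
# steps and a biased, noisy gradient oracle for f(x) = ||x||^2.
import numpy as np

rng = np.random.default_rng(0)

def f_grad(x):
    return 2.0 * x                       # exact gradient of f(x) = ||x||^2

def biased_oracle(x, bias=0.05, noise=0.1):
    return f_grad(x) + bias + noise * rng.normal(size=x.shape)

x = np.ones(10)
accum = np.zeros_like(x)                 # accumulated squared gradients
eta, eps = 0.5, 1e-8
for t in range(2000):
    g = biased_oracle(x)
    accum += g ** 2
    x -= eta / np.sqrt(accum + eps) * g  # adaptive, coordinate-wise step size
print("final ||x|| (the bias prevents exact convergence to 0):", np.linalg.norm(x))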

Dynamical Survival Analysis with Controlled Latent States

no code implementations • 30 Jan 2024 • Linus Bleistein, Van-Tuan Nguyen, Adeline Fermanian, Agathe Guilloux

We consider the task of learning individual-specific intensities of counting processes from a set of static variables and irregularly sampled time series.

Management, Survival Analysis +1

New directions in the applications of rough path theory

no code implementations • 9 Feb 2023 • Adeline Fermanian, Terry Lyons, James Morrill, Cristopher Salvi

This article provides a concise overview of some of the recent advances in the application of rough path theory to machine learning.

Learning the Dynamics of Sparsely Observed Interacting Systems

1 code implementation • 27 Jan 2023 • Linus Bleistein, Adeline Fermanian, Anne-Sophie Jannot, Agathe Guilloux

We address the problem of learning the dynamics of an unknown non-parametric system linking a target and a feature time series.

Time Series, Time Series Analysis

Scaling ResNets in the Large-depth Regime

1 code implementation • 14 Jun 2022 • Pierre Marion, Adeline Fermanian, Gérard Biau, Jean-Philippe Vert

With standard i.i.d. initializations, the only non-trivial dynamics is for $\alpha_L = 1/\sqrt{L}$ (other choices lead either to explosion or to identity mapping).
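
A quick numerical sketch of that statement, assuming a toy residual update $x_{k+1} = x_k + \alpha_L V_k \tanh(x_k)$ with i.i.d. Gaussian weights; it only illustrates the qualitative behaviour of the three scalings, not the paper's precise result.

# Compare output norms of a deep residual chain under three scalings alpha_L.
# In this toy, alpha_L = 1 makes the output grow with depth, alpha_L = 1/L
# stays essentially at the input, and alpha_L = 1/sqrt(L) sits in between.
import numpy as np

rng = np.random.default_rng(0)
d, L = 64, 1000
x0 = rng.normal(size=d) / np.sqrt(d)     # input of roughly unit norm

for label, alpha in [("1", 1.0), ("1/sqrt(L)", 1 / np.sqrt(L)), ("1/L", 1.0 / L)]:
    x = x0.copy()
    for _ in range(L):
        V = rng.normal(size=(d, d)) / np.sqrt(d)   # i.i.d. Gaussian layer weights
        x = x + alpha * V @ np.tanh(x)             # scaled residual update
    print(f"alpha_L = {label:10s}  ||x_L|| = {np.linalg.norm(x):.3e}")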

Framing RNN as a kernel method: A neural ODE approach

1 code implementation • NeurIPS 2021 • Adeline Fermanian, Pierre Marion, Jean-Philippe Vert, Gérard Biau

Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature.
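
A minimal sketch of that viewpoint, assuming a residual update $h_{k+1} = h_k + \frac{1}{n}\,\sigma(W h_k + U x_k + b)$, i.e. an Euler discretisation of a neural ODE driven by the input sequence; the weights and readout below are arbitrary toy choices, and the signature correspondence itself is not reproduced here.

# A residual RNN read as the Euler discretisation of a neural ODE:
# the hidden state moves by a small step (1/n) * f(h, x) at each input.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, n = 3, 16, 100
W = rng.normal(size=(d_hidden, d_hidden)) / np.sqrt(d_hidden)
U = rng.normal(size=(d_hidden, d_in)) / np.sqrt(d_in)
b = np.zeros(d_hidden)

x_seq = rng.normal(size=(n, d_in))                    # input sequence (discretised path)
h = np.zeros(d_hidden)
for x in x_seq:
    h = h + (1.0 / n) * np.tanh(W @ h + U @ x + b)    # residual / Euler step
output = h.mean()                                      # a (uniform) linear readout of the final state
print("final hidden-state norm:", np.linalg.norm(h), "readout:", output)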

Functional linear regression with truncated signatures

1 code implementation • 15 Jun 2020 • Adeline Fermanian

We place ourselves in a functional regression setting and propose a novel methodology for regressing a real output on vector-valued functional covariates.

Methodology 62R10 (Primary), 60L10 (Secondary)
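
A minimal sketch of the general pipeline, assuming a depth-2 truncation: compute the signature of each functional covariate (levels 1 and 2, via Chen's identity for piecewise-linear paths) and fit a ridge regression on those features. The helper signature_depth2 and the synthetic data are illustrative, not the paper's estimator.

# Truncated-signature features + ridge regression on synthetic paths.
import numpy as np

def signature_depth2(path):
    """Depth-2 signature of a piecewise-linear path of shape (n_points, d)."""
    d = path.shape[1]
    s1 = np.zeros(d)            # level 1: total increment
    s2 = np.zeros((d, d))       # level 2: iterated integrals
    for delta in np.diff(path, axis=0):
        s2 += np.outer(s1, delta) + 0.5 * np.outer(delta, delta)  # Chen's identity
        s1 += delta
    return np.concatenate([s1, s2.ravel()])

rng = np.random.default_rng(0)
paths = rng.normal(size=(100, 50, 3)).cumsum(axis=1)        # 100 random walks in R^3
features = np.stack([signature_depth2(p) for p in paths])   # shape (100, 3 + 9)
y = features @ rng.normal(size=features.shape[1]) + 0.1 * rng.normal(size=100)

lam = 1.0
G = features.T @ features + lam * np.eye(features.shape[1])
beta = np.linalg.solve(G, features.T @ y)                    # ridge estimate
r2 = 1 - np.sum((y - features @ beta) ** 2) / np.sum((y - y.mean()) ** 2)
print("in-sample R^2:", r2)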

Embedding and learning with signatures

1 code implementation • 29 Nov 2019 • Adeline Fermanian

It is shown that a specific embedding, called lead-lag, is systematically the strongest performer across all datasets and algorithms considered.
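
For reference, a minimal sketch of the lead-lag construction on a scalar series: the series is turned into a two-dimensional path whose "lead" and "lag" coordinates are updated alternately, and the signature of this path is then used as a feature vector (signature computation not shown; the interleaving convention varies across references).

# Lead-lag embedding of a 1-d time series into a 2-d path.
import numpy as np

def lead_lag(x):
    """Lead-lag path of shape (2*len(x) - 1, 2) for a 1-d series x."""
    x = np.asarray(x, dtype=float)
    points = [(x[0], x[0])]
    for i in range(1, len(x)):
        points.append((x[i], x[i - 1]))   # the lead coordinate moves first...
        points.append((x[i], x[i]))       # ...then the lag coordinate catches up
    return np.array(points)

print(lead_lag([1.0, 2.0, 0.5]))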
