1 code implementation • 8 May 2024 • Nicolas Boullé, Matthew J. Colbrook
Koopman operators are infinite-dimensional operators that linearize nonlinear dynamical systems, facilitating the study of their spectral properties and enabling the prediction of the time evolution of observable quantities.
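The linearization idea can be sketched with extended dynamic mode decomposition (EDMD), a standard finite-dimensional Koopman approximation: lift states through a dictionary of observables, then fit a linear map between consecutive lifted snapshots by least squares. This is an illustrative sketch, not the paper's method; the dictionary and toy system below are assumptions.

```python
import numpy as np

def edmd_koopman(snapshots, dictionary):
    """Approximate the Koopman operator on a finite dictionary of
    observables via extended dynamic mode decomposition (EDMD)."""
    X = np.array([dictionary(x) for x in snapshots[:-1]]).T  # lifted states
    Y = np.array([dictionary(x) for x in snapshots[1:]]).T   # lifted successors
    # Least-squares fit: K maps observables at time t to time t + 1
    return Y @ np.linalg.pinv(X)

# Toy example: a linear map x -> A x is captured exactly by a linear dictionary,
# so the EDMD matrix recovers the spectrum of A.
A = np.array([[0.9, 0.1], [0.0, 0.8]])
x = np.array([1.0, 1.0])
traj = [x]
for _ in range(20):
    x = A @ x
    traj.append(x)

K = edmd_koopman(traj, dictionary=lambda x: x)  # identity (linear) dictionary
eigs = np.sort(np.linalg.eigvals(K).real)
print(eigs)  # recovers the eigenvalues of A: [0.8, 0.9]
```

Richer dictionaries (polynomials, radial basis functions) extend the same least-squares fit to genuinely nonlinear dynamics, at the cost of spectral pollution that the paper's analysis addresses.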
1 code implementation • 1 Feb 2024 • Toni J. B. Liu, Nicolas Boullé, Raphaël Sarfati, Christopher J. Earls
Pretrained large language models (LLMs) are surprisingly effective at performing zero-shot tasks, including time-series forecasting.
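Zero-shot forecasting with an LLM hinges on serializing the numeric series into text the model can continue token by token, then decoding the continuation back into numbers. Below is a minimal sketch of such an encode/decode pair; the digit-spacing scheme and fixed-precision format are assumptions for illustration (it also assumes nonnegative values), not the paper's exact tokenization.

```python
def serialize(series, decimals=2, sep=" , "):
    """Render a numeric series as a text prompt, digit by digit,
    so a language model can continue it one token at a time.
    Assumes nonnegative values for simplicity."""
    def fmt(x):
        digits = f"{x:.{decimals}f}".replace(".", "")
        return " ".join(digits)  # space digits so each maps to its own token
    return sep.join(fmt(x) for x in series)

def deserialize(text, decimals=2, sep=" , "):
    """Invert serialize: parse a model continuation back into floats."""
    return [int(chunk.replace(" ", "")) / 10**decimals
            for chunk in text.split(sep)]

prompt = serialize([1.23, 4.56])
print(prompt)                # "1 2 3 , 4 5 6"
print(deserialize(prompt))   # [1.23, 4.56]
```

In a full pipeline, `prompt` would be fed to a pretrained LLM and the sampled continuation passed through `deserialize` to obtain the forecast.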
1 code implementation • 31 Jan 2024 • Nicolas Boullé, Diana Halikias, Samuel E. Otto, Alex Townsend
There is a mystery at the heart of operator learning: how can one recover a non-self-adjoint operator from data without probing the adjoint?
no code implementations • 6 Jan 2024 • Nicolas Boullé, Matthew J. Colbrook
We show that, under suitable conditions, the eigenvalues and eigenfunctions of HDMD converge to the spectral properties of the underlying Koopman operator.
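The object under study can be sketched numerically: Hankel DMD delay-embeds a scalar time series into a Hankel matrix and applies DMD, yielding eigenvalues that approximate Koopman eigenvalues. The toy signal and single delay below are assumptions chosen so the answer is checkable, not the paper's setting.

```python
import numpy as np

def hankel_dmd(signal, delays):
    """Hankel DMD: apply dynamic mode decomposition to a
    delay-embedded scalar time series."""
    n = len(signal) - delays
    H = np.array([signal[i:i + n] for i in range(delays + 1)])  # Hankel matrix
    X, Y = H[:, :-1], H[:, 1:]
    K = Y @ np.linalg.pinv(X)  # finite-dimensional Koopman approximation
    return np.linalg.eigvals(K)

# For a pure oscillation x(t) = cos(w t), the Koopman eigenvalues on the
# relevant subspace are exp(±i w dt); one delay suffices to recover them.
w, dt = 2.0, 0.1
t = np.arange(0.0, 20.0, dt)
eigs = hankel_dmd(np.cos(w * t), delays=1)
print(np.angle(eigs))  # ≈ ±w*dt = ±0.2, on the unit circle
```

Convergence results of the kind stated above concern what happens to these eigenvalues as the number of delays and snapshots grows.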
no code implementations • 22 Dec 2023 • Nicolas Boullé, Alex Townsend
We explain the types of problems and PDEs amenable to operator learning, discuss various neural network architectures, and explain how to employ numerical PDE solvers effectively.
1 code implementation • 24 Feb 2023 • Nicolas Boullé, Diana Halikias, Alex Townsend
PDE learning is an emerging field that combines physics and machine learning to recover unknown physical systems from experimental data.
1 code implementation • 28 Oct 2022 • Nicolas Boullé
Finally, theoretical results on Green's functions and rational NNs are combined to design a human-understandable deep learning method for discovering Green's functions from data.
no code implementations • 27 Apr 2022 • Nicolas Boullé, Seick Kim, Tianyi Shi, Alex Townsend
Neural operators are a popular technique in scientific machine learning to learn a mathematical model of the behavior of unknown physical systems from data.
no code implementations • ICLR 2022 • Nicolas Boullé, Alex Townsend
The randomized singular value decomposition (SVD) is a popular and effective algorithm for computing a near-best rank $k$ approximation of a matrix $A$ using matrix-vector products with standard Gaussian vectors.
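The algorithm in question fits in a few lines: sketch the range of $A$ with Gaussian test vectors, orthonormalize, and compute a small deterministic SVD on the projected matrix. This is the standard randomized SVD template (Halko–Martinsson–Tropp); the oversampling value is a common default, not something prescribed by the paper.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, seed=None):
    """Near-best rank-k approximation of A using only matrix-vector
    products with standard Gaussian vectors."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + oversample))  # Gaussian sketch
    Q, _ = np.linalg.qr(A @ Omega)           # orthonormal basis for range(A @ Omega)
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k]

# An exactly rank-5 matrix is recovered to near machine precision
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 150))
U, s, Vt = randomized_svd(A, k=5, seed=0)
print(np.linalg.norm(A - U @ (s[:, None] * Vt)))  # ≈ 0
```

The paper's question is what happens when the Gaussian test vectors are replaced by samples from other (possibly correlated) distributions, which the hidden constants in the standard analysis do not cover.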
2 code implementations • 1 May 2021 • Nicolas Boullé, Christopher J. Earls, Alex Townsend
There is an opportunity for deep learning to revolutionize science and technology by revealing its findings in a human-interpretable manner.
no code implementations • 4 Feb 2021 • Ada J. Ellingsrud, Nicolas Boullé, Patrick E. Farrell, Marie E. Rognes
Mathematical modelling of ionic electrodiffusion and water movement is emerging as a powerful avenue of investigation to provide new physiological insight into brain homeostasis.
Numerical Analysis • Computational Engineering, Finance, and Science
no code implementations • 31 Jan 2021 • Nicolas Boullé, Alex Townsend
Given input-output pairs of an elliptic partial differential equation (PDE) in three dimensions, we derive the first theoretically rigorous scheme for learning the associated Green's function $G$.
3 code implementations • NeurIPS 2020 • Nicolas Boullé, Yuji Nakatsukasa, Alex Townsend
We consider neural networks with rational activation functions.
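A rational activation replaces the usual fixed nonlinearity with a trainable ratio of polynomials $r(x) = P(x)/Q(x)$. The sketch below shows the forward evaluation; the type-(3, 2) degrees follow the common choice for rational networks, but the coefficient values are illustrative assumptions, not the paper's fitted initialization.

```python
import numpy as np

def rational_activation(x, p, q):
    """Rational activation r(x) = P(x)/Q(x). `p` holds the numerator
    coefficients (highest degree first); `q` holds the denominator
    coefficients with the constant term fixed at 1, which keeps Q
    away from zero at initialization."""
    P = np.polyval(p, x)
    Q = np.polyval(np.append(q, 1.0), x)   # Q(x) = q[0]*x^m + ... + 1
    return P / Q

# Type-(3, 2) rational function with illustrative coefficients:
# P(x) = 0.5x^3 + x^2 + 0.5x,  Q(x) = x^2 + 1
p = [0.5, 1.0, 0.5, 0.0]
q = [1.0, 0.0]
x = np.linspace(-2, 2, 5)
print(rational_activation(x, p, q))
```

In a rational network, `p` and `q` are trained alongside the weights of each layer, which is what gives these activations their extra approximation power over fixed nonlinearities such as ReLU.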
no code implementations • 26 Jul 2019 • Nicolas Boullé, Vassilios Dallas, Yuji Nakatsukasa, D. Samaddar
We use standard deep neural networks to classify univariate time series generated by discrete and continuous dynamical systems based on their chaotic or non-chaotic behaviour.