21 Jun 2023 • L. Storm, H. Linander, J. Bec, K. Gustavsson, B. Mehlig
We compute how small input perturbations affect the output of deep neural networks, exploring an analogy between deep networks and dynamical systems, where the growth or decay of local perturbations is characterised by finite-time Lyapunov exponents.
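The dynamical-systems analogy can be made concrete: for a deep feed-forward network, the growth of an infinitesimal input perturbation over the layers is governed by the singular values of the input-output Jacobian, and finite-time Lyapunov exponents follow from their logarithms divided by the depth. A minimal sketch under assumed details (a random tanh network; the paper's actual architecture and setup may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_with_jacobian(x, weights):
    """Propagate x through a tanh network, accumulating the
    input-output Jacobian layer by layer via the chain rule."""
    J = np.eye(len(x))
    for W in weights:
        x = np.tanh(W @ x)
        D = np.diag(1.0 - x**2)  # tanh'(z) = 1 - tanh(z)^2
        J = D @ W @ J
    return x, J

n, depth = 20, 10
# Hypothetical random weights with 1/sqrt(n) scaling (an assumption for illustration)
weights = [rng.normal(0.0, 1.0 / np.sqrt(n), (n, n)) for _ in range(depth)]
x0 = rng.normal(size=n)

_, J = forward_with_jacobian(x0, weights)
sigma = np.linalg.svd(J, compute_uv=False)
ftle = np.log(sigma) / depth  # finite-time Lyapunov exponents over `depth` layers
print("largest FTLE:", ftle[0])  # > 0: local perturbations grow; < 0: they decay
```

The sign of the largest exponent indicates whether a small input perturbation is amplified or damped by the time it reaches the output, which is the quantity the abstract refers to.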
26 Nov 2022 • H. Linander, O. Balabanov, H. Yang, B. Mehlig
Here we show that prediction accuracy depends on both epistemic and aleatoric uncertainty in an intricate fashion that cannot be understood in terms of marginalized uncertainty distributions alone.
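A standard way to separate the two kinds of uncertainty for a classifier is the information-theoretic decomposition over an ensemble: the entropy of the mean prediction (total uncertainty) splits into the mean entropy of the members (aleatoric) plus the mutual information between prediction and model (epistemic). This is a generic sketch, not necessarily the estimator used in the paper:

```python
import numpy as np

def uncertainty_decomposition(probs):
    """probs: (M, C) array of class probabilities from M ensemble members.
    Returns (total, aleatoric, epistemic) uncertainty in nats."""
    eps = 1e-12  # guard against log(0)
    mean_p = probs.mean(axis=0)
    total = -np.sum(mean_p * np.log(mean_p + eps))                      # entropy of the mean
    aleatoric = -np.mean(np.sum(probs * np.log(probs + eps), axis=1))   # mean member entropy
    epistemic = total - aleatoric                                       # mutual information
    return total, aleatoric, epistemic

# Members agree on a diffuse prediction: aleatoric dominates, epistemic ~ 0
agree = np.array([[0.5, 0.5], [0.5, 0.5], [0.5, 0.5]])
# Members disagree confidently: epistemic uncertainty is large
disagree = np.array([[0.99, 0.01], [0.01, 0.99], [0.5, 0.5]])

print(uncertainty_decomposition(agree))
print(uncertainty_decomposition(disagree))
```

The two mock inputs illustrate the abstract's point: both cases can have similar total uncertainty while the epistemic/aleatoric split, and hence the expected accuracy, differs, so marginal distributions of either component alone are not enough.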