no code implementations • 6 Apr 2024 • Gianluca Detommaso, Martin Bertran, Riccardo Fogliato, Aaron Roth
This paper proposes the use of "multicalibration" to yield interpretable and reliable confidence scores for outputs generated by large language models (LLMs).
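Roughly, multicalibration repeatedly looks for a (group, confidence-bin) cell whose empirical accuracy deviates from the predicted confidence and patches the scores in that cell toward the observed frequency. Below is a minimal sketch of that recipe, not the paper's implementation; the names `p` (raw confidence scores in [0, 1]), `y` (binary correctness labels), and `groups` (boolean membership masks, possibly overlapping) are assumptions for illustration.

```python
# A minimal multicalibration-style post-hoc adjustment sketch (illustrative only).
import numpy as np

def multicalibrate(p, y, groups, n_bins=10, tol=0.01, max_iters=100):
    p = p.astype(float).copy()
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    for _ in range(max_iters):
        worst_gap = 0.0
        for g in groups:                       # boolean mask per group
            idx = np.digitize(p, bins[1:-1])   # confidence bin of each score
            for b in range(n_bins):
                cell = g & (idx == b)
                if cell.sum() < 20:            # skip tiny cells
                    continue
                gap = y[cell].mean() - p[cell].mean()
                worst_gap = max(worst_gap, abs(gap))
                if abs(gap) > tol:
                    # patch the cell toward its empirical accuracy
                    p[cell] = np.clip(p[cell] + gap, 0.0, 1.0)
        if worst_gap <= tol:
            break
    return p
```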
1 code implementation • 8 Feb 2023 • Gianluca Detommaso, Alberto Gasparin, Michele Donini, Matthias Seeger, Andrew Gordon Wilson, Cedric Archambeau
We present Fortuna, an open-source library for uncertainty quantification in deep learning.
no code implementations • 17 Jul 2022 • Gianluca Detommaso, Alberto Gasparin, Andrew Wilson, Cedric Archambeau
As we move away from the data, the predictive uncertainty should increase, since a wide variety of explanations remains consistent with the limited information available.
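A standard illustration of this behaviour, not the method proposed in the paper, is exact Gaussian-process regression: the posterior variance is small near the observed inputs and reverts to the prior variance far from them. The inputs, kernel, and hyperparameters below are made up for the illustration.

```python
# GP regression with an RBF kernel: predictive variance grows away from the data.
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
x_train = rng.uniform(-2.0, 2.0, size=20)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(20)
x_test = np.linspace(-8.0, 8.0, 200)

noise = 0.1**2
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
L = np.linalg.cholesky(K)
K_s = rbf(x_test, x_train)

alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
mean = K_s @ alpha
v = np.linalg.solve(L, K_s.T)
var = np.diag(rbf(x_test, x_test)) - np.sum(v**2, axis=0)

# Far from the training interval [-2, 2] the variance approaches the prior
# variance (1.0): many explanations remain consistent with the data there.
print(var[:5], var[95:105], var[-5:])
```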
no code implementations • 17 Jun 2021 • Gianluca Detommaso, Michael Brückner, Philip Schulz, Victor Chernozhukov
We extend the definition of the marginal causal effect to the continuous treatment setting and develop a novel characterization of causal bias in the framework of structural causal models.
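A hedged toy example of causal bias, not the paper's characterization: in a linear structural causal model with an unobserved confounder U affecting both a continuous treatment T and the outcome Y, the naive observational slope of Y on T mixes the causal effect with the confounding path, while adjusting for U recovers it. All coefficients below are invented for the illustration.

```python
# Linear SCM with a confounder: naive regression slope vs. adjusted slope.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
u = rng.standard_normal(n)                        # unobserved confounder
t = 0.8 * u + rng.standard_normal(n)              # continuous treatment
y = 2.0 * t + 1.5 * u + rng.standard_normal(n)    # true causal effect of t is 2.0

# Naive slope of y on t absorbs the confounding path t <- u -> y.
naive_slope = np.polyfit(t, y, 1)[0]

# Regressing on both t and u recovers the causal coefficient.
X = np.column_stack([t, u, np.ones(n)])
adjusted_slope = np.linalg.lstsq(X, y, rcond=None)[0][0]

print("naive slope:   ", round(naive_slope, 2))    # biased upward, about 2.7 here
print("adjusted slope:", round(adjusted_slope, 2)) # about 2.0
```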
1 code implementation • 25 May 2019 • Jakob Kruse, Gianluca Detommaso, Ullrich Köthe, Robert Scheichl
Many recent invertible neural architectures are based on coupling-block designs, in which the variables are divided into two subsets that serve as inputs to an easily invertible (usually affine) triangular transformation.
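A minimal sketch of such an affine coupling block, assuming the input is split into two halves and using a tiny made-up conditioner network: the first half passes through unchanged, so the scale and shift applied to the second half can be recomputed exactly when inverting.

```python
# Affine coupling block: forward map and its exact inverse (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((2, 8)), np.zeros(8)
W2, b2 = rng.standard_normal((8, 4)), np.zeros(4)

def conditioner(x1):
    # small MLP producing a log-scale s and shift t for the second half
    h = np.tanh(x1 @ W1 + b1)
    out = h @ W2 + b2
    return out[..., :2], out[..., 2:]

def forward(x):
    x1, x2 = x[..., :2], x[..., 2:]
    s, t = conditioner(x1)
    y2 = x2 * np.exp(s) + t              # triangular, easily invertible map
    return np.concatenate([x1, y2], axis=-1)

def inverse(y):
    y1, y2 = y[..., :2], y[..., 2:]
    s, t = conditioner(y1)               # x1 == y1, so s and t are recoverable
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=-1)

x = rng.standard_normal((5, 4))
assert np.allclose(inverse(forward(x)), x)
```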
1 code implementation • 23 Jan 2019 • Gianluca Detommaso, Hanne Hoitzing, Tiangang Cui, Ardavan Alamir
Bayesian online changepoint detection (BOCPD) (Adams & MacKay, 2007) offers a rigorous and viable way to identify changepoints in complex systems.
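A compact sketch of the BOCPD run-length recursion, assuming Gaussian observations with known variance and a conjugate normal prior on the mean; the hazard rate, priors, and synthetic data below are illustrative, not values from the paper.

```python
# BOCPD run-length posterior for a Gaussian stream with a mean shift.
import numpy as np

def bocpd(data, hazard=1 / 100, mu0=0.0, kappa0=1.0, obs_var=1.0):
    T = len(data)
    R = np.zeros((T + 1, T + 1))          # R[t, r] = p(run length r at time t)
    R[0, 0] = 1.0
    mu, kappa = np.array([mu0]), np.array([kappa0])
    for t, x in enumerate(data, start=1):
        # Predictive density of x under each run-length hypothesis
        # (prior variance obs_var / kappa plus observation noise).
        pred_var = obs_var * (1.0 + 1.0 / kappa)
        pred = np.exp(-0.5 * (x - mu) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
        growth = R[t - 1, :t] * pred * (1 - hazard)   # run length grows by one
        cp = (R[t - 1, :t] * pred * hazard).sum()     # a changepoint resets it to zero
        R[t, 1:t + 1] = growth
        R[t, 0] = cp
        R[t] /= R[t].sum()
        # Conjugate update of the per-run-length posterior over the mean.
        mu = np.concatenate([[mu0], (kappa * mu + x) / (kappa + 1)])
        kappa = np.concatenate([[kappa0], kappa + 1])
    return R

# Toy usage: the most probable run length should drop shortly after t = 100,
# where the mean of the synthetic stream shifts.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
print(bocpd(data).argmax(axis=1)[95:110])
```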
1 code implementation • NeurIPS 2018 • Gianluca Detommaso, Tiangang Cui, Alessio Spantini, Youssef Marzouk, Robert Scheichl
Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback-Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent in a reproducing kernel Hilbert space.
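The sketch below implements the vanilla SVGD update described in this excerpt, not the paper's own method: each particle moves along the kernelized functional gradient of the KL divergence, which combines attraction toward high-density regions with a kernel-gradient repulsion term. The RBF bandwidth uses the common median heuristic, and the target and step size are illustrative only.

```python
# One SVGD step for a set of particles, plus a toy Gaussian-target example.
import numpy as np

def svgd_step(particles, grad_log_p, step=0.1):
    n = len(particles)
    diff = particles[:, None, :] - particles[None, :, :]
    d2 = np.sum(diff ** 2, axis=-1)
    h = np.median(d2) / np.log(n + 1) + 1e-8           # median heuristic
    K = np.exp(-d2 / h)                                # RBF kernel matrix
    grad_K = -2.0 / h * K[:, :, None] * diff           # grad_K[j, i] = d k(x_j, x_i) / d x_j
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ grad_log_p(particles) + grad_K.sum(axis=0)) / n
    return particles + step * phi

# Toy target: 2-D standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, size=(200, 2))
for _ in range(500):
    x = svgd_step(x, lambda p: -p)
print(x.mean(axis=0), x.std(axis=0))   # should approach mean 0 and std 1
```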