Search Results for author: Gianluca Detommaso

Found 7 papers, 4 papers with code

Multicalibration for Confidence Scoring in LLMs

no code implementations • 6 Apr 2024 • Gianluca Detommaso, Martin Bertran, Riccardo Fogliato, Aaron Roth

This paper proposes the use of "multicalibration" to yield interpretable and reliable confidence scores for outputs generated by large language models (LLMs).

Benchmarking • Question Answering
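
The abstract above refers to multicalibration: requiring calibration to hold simultaneously within many (possibly overlapping) subgroups of the data, not just on average. A minimal sketch of a generic iterative-patching multicalibrator follows — this is an illustration of the general idea, not the paper's exact algorithm, and the function name and parameters are hypothetical:

```python
import numpy as np

def multicalibrate(scores, labels, groups, n_bins=10, tol=0.02, max_iter=50):
    """Iteratively patch confidence scores so that, within every
    (group, score-bin) cell, the mean score matches the empirical
    accuracy. `groups` is a list of boolean membership masks.
    Generic sketch of multicalibration, not the paper's algorithm."""
    s = scores.astype(float).copy()
    for _ in range(max_iter):
        worst = 0.0
        for g in groups:
            for b in range(n_bins):
                # cell: members of group g whose score falls in bin b
                cell = g & (np.floor(s * n_bins).clip(0, n_bins - 1) == b)
                if cell.sum() < 5:  # skip cells too small to estimate
                    continue
                gap = labels[cell].mean() - s[cell].mean()
                if abs(gap) > tol:
                    # shift the whole cell so its mean score matches accuracy
                    s[cell] = np.clip(s[cell] + gap, 0.0, 1.0)
                    worst = max(worst, abs(gap))
        if worst <= tol:  # no cell needed patching this pass
            break
    return s
```

Each patch sets a cell's mean score equal to its empirical accuracy; the loop repeats because patched points may move into other bins.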

Uncertainty Calibration in Bayesian Neural Networks via Distance-Aware Priors

no code implementations • 17 Jul 2022 • Gianluca Detommaso, Alberto Gasparin, Andrew Wilson, Cedric Archambeau

As we move away from the data, the predictive uncertainty should increase, since many different explanations are consistent with the limited available information.

regression
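
The behaviour described above — predictive uncertainty reverting to the prior far from the training data — is exactly what a Gaussian process posterior with a stationary kernel exhibits, which makes a GP a convenient reference point for the distance-aware property the paper targets in Bayesian neural networks. A minimal GP sketch (generic; not the paper's prior construction):

```python
import numpy as np

def gp_predict_std(X_train, X_test, length=1.0, noise=1e-2):
    """Posterior predictive std of a zero-mean GP with an RBF kernel
    on 1-D inputs. Far from X_train the kernel terms vanish and the
    std reverts to the prior value (1 here)."""
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-0.5 * d2 / length**2)
    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = k(X_test, X_train)
    Kss = k(X_test, X_test)
    # posterior covariance: prior minus the data-explained part
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return np.sqrt(np.clip(np.diag(cov), 0.0, None))
```

Evaluating inside versus far outside the training range shows the small-near-data, large-far-from-data pattern the abstract describes.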

Causal Bias Quantification for Continuous Treatments

no code implementations • 17 Jun 2021 • Gianluca Detommaso, Michael Brückner, Philip Schulz, Victor Chernozhukov

We extend the definition of the marginal causal effect to the continuous treatment setting and develop a novel characterization of causal bias in the framework of structural causal models.

Selection bias
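
To make causal bias for a continuous treatment concrete, here is a hypothetical linear structural causal model in which the naive observational slope differs from the true causal effect by the confounding contribution, while backdoor adjustment recovers it. All coefficients are illustrative and not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Hypothetical linear SCM: U confounds a continuous treatment T and outcome Y.
U = rng.normal(size=n)
T = 1.0 * U + rng.normal(size=n)            # treatment depends on confounder
Y = 2.0 * T + 3.0 * U + rng.normal(size=n)  # true causal effect of T is 2.0

# Naive observational slope: biased by the backdoor path T <- U -> Y.
# Here Cov(T, Y)/Var(T) = (2*Var(T) + 3*Cov(T, U))/Var(T) = 3.5, not 2.0.
naive = np.cov(T, Y)[0, 1] / np.var(T)

# Backdoor adjustment: regress Y on (T, U); the T coefficient recovers 2.0.
X = np.column_stack([T, U, np.ones(n)])
adjusted = np.linalg.lstsq(X, Y, rcond=None)[0][0]
```

The gap `naive - adjusted` (about 1.5 here) is the causal bias of the unadjusted estimate.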

HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference

1 code implementation • 25 May 2019 • Jakob Kruse, Gianluca Detommaso, Ullrich Köthe, Robert Scheichl

Many recent invertible neural architectures are based on coupling block designs, where the variables are divided into two subsets which serve as inputs to an easily invertible (usually affine) triangular transformation.

Bayesian Inference • Density Estimation
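
The coupling design described in the abstract can be sketched in a few lines: one subset of the variables passes through unchanged and parameterizes an affine map of the other subset, so the block is trivially invertible. A generic sketch with linear "subnetworks" for brevity — this illustrates the coupling idea the paper builds on, not the HINT architecture itself:

```python
import numpy as np

class AffineCoupling:
    """Minimal affine coupling block: split x into (x1, x2); x1 passes
    through unchanged and parameterizes an invertible affine map of x2."""
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        half = dim // 2
        # linear stand-ins for the scale/translation subnetworks
        self.W_s = 0.1 * rng.normal(size=(half, dim - half))
        self.W_t = 0.1 * rng.normal(size=(half, dim - half))

    def forward(self, x):
        half = self.W_s.shape[0]
        x1, x2 = x[:half], x[half:]
        s, t = self.W_s.T @ x1, self.W_t.T @ x1
        return np.concatenate([x1, x2 * np.exp(s) + t])

    def inverse(self, y):
        half = self.W_s.shape[0]
        y1, y2 = y[:half], y[half:]
        s, t = self.W_s.T @ y1, self.W_t.T @ y1  # same s, t since y1 == x1
        return np.concatenate([y1, (y2 - t) * np.exp(-s)])
```

Because `x1` is unchanged, the inverse can recompute the same scale and shift, and the Jacobian is triangular with an easy determinant.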

Stein Variational Online Changepoint Detection with Applications to Hawkes Processes and Neural Networks

1 code implementation • 23 Jan 2019 • Gianluca Detommaso, Hanne Hoitzing, Tiangang Cui, Ardavan Alamir

Bayesian online changepoint detection (BOCPD) (Adams & MacKay, 2007) offers a rigorous and viable way to identify changepoints in complex systems.
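
For reference, the BOCPD recursion of Adams & MacKay maintains a posterior over the current "run length" (time since the last changepoint), split at each step into a growth term and a changepoint term. A textbook sketch for Gaussian data with known observation variance and a constant hazard rate — the paper's contribution (Stein variational inference within this framework) is not reproduced here:

```python
import numpy as np

def bocpd_runlength(data, hazard=1 / 50, mu0=0.0, var0=4.0, var_x=1.0):
    """Minimal BOCPD (Adams & MacKay, 2007) run-length filter for
    Gaussian data with known observation variance var_x and a
    conjugate Gaussian prior N(mu0, var0) on the segment mean.
    Returns R, where R[t, r] = P(run length r at time t)."""
    T = len(data)
    R = np.zeros((T + 1, T + 1))
    R[0, 0] = 1.0
    mu, var = np.array([mu0]), np.array([var0])  # per-run-length posteriors
    for t, x in enumerate(data):
        # predictive probability of x under each run-length hypothesis
        pred_var = var + var_x
        pred = np.exp(-0.5 * (x - mu) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
        growth = R[t, : t + 1] * pred * (1 - hazard)   # run continues
        cp = (R[t, : t + 1] * pred * hazard).sum()     # run resets to 0
        R[t + 1, 1 : t + 2] = growth
        R[t + 1, 0] = cp
        R[t + 1] /= R[t + 1].sum()
        # conjugate Gaussian mean update for each surviving run length
        new_var = 1.0 / (1.0 / var + 1.0 / var_x)
        new_mu = new_var * (mu / var + x / var_x)
        mu = np.concatenate([[mu0], new_mu])
        var = np.concatenate([[var0], new_var])
    return R
```

After a mean shift, the posterior mass quickly collapses onto run length zero and then tracks the new segment's length.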

A Stein variational Newton method

1 code implementation • NeurIPS 2018 • Gianluca Detommaso, Tiangang Cui, Alessio Spantini, Youssef Marzouk, Robert Scheichl

Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback-Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent on a reproducing kernel Hilbert space.

Variational Inference
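
The SVGD update mentioned in the abstract combines a kernel-smoothed score term, which drives particles toward high target density, with a repulsive kernel-gradient term, which keeps them spread out. A first-order sketch with a fixed-bandwidth RBF kernel — the paper's Newton-type second-order acceleration is not included:

```python
import numpy as np

def svgd_update(particles, grad_logp, bandwidth=1.0):
    """One SVGD step (Liu & Wang, 2016) with an RBF kernel.
    particles: (n, d) array; grad_logp: maps (n, d) -> (n, d) scores.
    Returns the functional-gradient direction for each particle."""
    diff = particles[:, None, :] - particles[None, :, :]   # (n, n, d)
    sq = (diff ** 2).sum(-1)
    K = np.exp(-sq / (2 * bandwidth**2))                   # (n, n)
    grads = grad_logp(particles)                           # (n, d)
    attraction = K @ grads                                 # kernel-weighted scores
    repulsion = (K[:, :, None] * diff).sum(axis=1) / bandwidth**2
    return (attraction + repulsion) / len(particles)
```

Iterating `particles += step * svgd_update(...)` against a Gaussian score pushes the particle cloud toward the target mean while the repulsion term prevents collapse to a point estimate.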
