Search Results for author: Niccolò Dalmasso

Found 18 papers, 6 papers with code

Fair Coresets via Optimal Transport

no code implementations9 Nov 2023 Zikai Xiong, Niccolò Dalmasso, Shubham Sharma, Freddy Lecue, Daniele Magazzeni, Vamsi K. Potluru, Tucker Balch, Manuela Veloso

In this work, we present fair Wasserstein coresets (FWC), a novel coreset approach which generates fair synthetic representative samples along with sample-level weights to be used in downstream learning tasks.

Clustering · Decision Making +1
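The weighted-representative idea in the abstract above can be illustrated with a toy downstream statistic: a few weighted samples stand in for a full dataset in any weighted empirical average. This is illustration only; FWC itself chooses the synthetic samples and weights by minimizing a Wasserstein distance subject to fairness constraints, and the numbers below are hypothetical.

```python
def weighted_mean(samples, weights):
    """Downstream use of a weighted coreset: a weighted empirical average
    (of a loss, a statistic, ...) over the representatives replaces the
    average over the full data."""
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, samples)) / total

full = [1.0, 1.0, 1.0, 1.0, 5.0]           # hypothetical full dataset
coreset, weights = [1.0, 5.0], [4.0, 1.0]  # two representatives + weights
# The weighted coreset mean reproduces the full-data mean (1.8).
assert weighted_mean(coreset, weights) == sum(full) / len(full)
```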

FairWASP: Fast and Optimal Fair Wasserstein Pre-processing

no code implementations31 Oct 2023 Zikai Xiong, Niccolò Dalmasso, Alan Mishler, Vamsi K. Potluru, Tucker Balch, Manuela Veloso

FairWASP can therefore be used to construct datasets that can be fed into any classification method, not just methods that accept sample weights.

Fairness

Deep Gaussian Mixture Ensembles

no code implementations12 Jun 2023 Yousef El-Laham, Niccolò Dalmasso, Elizabeth Fons, Svitlana Vyetrenko

This work introduces a novel probabilistic deep learning technique called deep Gaussian mixture ensembles (DGMEs), which enables accurate quantification of both epistemic and aleatoric uncertainty.

Probabilistic Deep Learning
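The epistemic/aleatoric split mentioned in the abstract above follows the law of total variance for a mixture of Gaussian predictions: within-member variance is aleatoric, spread of member means is epistemic. Below is a minimal sketch under that assumption, with hypothetical per-member predictions; the actual DGMEs train deep networks whose outputs parameterize the mixture components.

```python
def decompose_uncertainty(mus, sigma2s):
    """Given per-member Gaussian predictions (mean, variance) for one input,
    return (mixture_mean, aleatoric, epistemic) via the law of total variance."""
    m = len(mus)
    mean = sum(mus) / m
    aleatoric = sum(sigma2s) / m                         # avg within-member variance
    epistemic = sum((mu - mean) ** 2 for mu in mus) / m  # variance of member means
    return mean, aleatoric, epistemic

mean, alea, epis = decompose_uncertainty([1.0, 1.2, 0.8], [0.05, 0.04, 0.06])
```

Members that agree on the mean but predict wide Gaussians yield high aleatoric and low epistemic uncertainty; disagreeing members yield the reverse.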

Fast Learning of Multidimensional Hawkes Processes via Frank-Wolfe

no code implementations12 Dec 2022 Renbo Zhao, Niccolò Dalmasso, Mohsen Ghassemi, Vamsi K. Potluru, Tucker Balch, Manuela Veloso

Hawkes processes have recently risen to the forefront of tools for modeling and generating sequential event data.

Epidemiology
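For readers unfamiliar with the model class named above: the core object of a Hawkes process is its conditional intensity, which adds a self-exciting kick after every past event. A minimal sketch with the standard exponential kernel follows (parameter values are illustrative; the paper's contribution is a Frank-Wolfe algorithm for learning the multidimensional parameters, not this evaluation).

```python
import math

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.0):
    """Conditional intensity lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))
    over past events t_i < t: a baseline rate plus decaying excitation."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)
```

With no history the intensity is just the baseline `mu`; each event raises it by `alpha`, decaying at rate `beta`.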

Online Learning for Mixture of Multivariate Hawkes Processes

no code implementations16 Aug 2022 Mohsen Ghassemi, Niccolò Dalmasso, Simran Lamba, Vamsi K. Potluru, Sameena Shah, Tucker Balch, Manuela Veloso

Online learning of Hawkes processes has received increasing attention in the last couple of years, especially for modeling a network of actors.

Differentially Private Learning of Hawkes Processes

no code implementations27 Jul 2022 Mohsen Ghassemi, Eleonora Kreačić, Niccolò Dalmasso, Vamsi K. Potluru, Tucker Balch, Manuela Veloso

Hawkes processes have recently gained increasing attention from the machine learning community for their versatility in modeling event sequence data.

Fair When Trained, Unfair When Deployed: Observable Fairness Measures are Unstable in Performative Prediction Settings

no code implementations10 Feb 2022 Alan Mishler, Niccolò Dalmasso

These measures are sensitive to distribution shift: a predictor which is trained to satisfy one of these fairness definitions may become unfair if the distribution changes.

Counterfactual Fairness
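The instability described in the abstract above is easy to reproduce with an observable measure such as demographic parity: a fixed classifier can satisfy the measure on the training distribution and violate it after a shift. The scores and threshold below are hypothetical, chosen only to make the effect visible.

```python
def dp_gap(scores_a, scores_b, threshold=0.5):
    """Demographic-parity gap: |P(pred = 1 | group A) - P(pred = 1 | group B)|
    for a fixed-threshold classifier applied to each group's scores."""
    rate = lambda s: sum(x >= threshold for x in s) / len(s)
    return abs(rate(scores_a) - rate(scores_b))

# Fair on the training distribution: both groups have a 2/3 positive rate...
train_gap = dp_gap([0.2, 0.6, 0.8], [0.3, 0.55, 0.9])
# ...unfair after group B's score distribution shifts (positive rate drops to 1/3).
shift_gap = dp_gap([0.2, 0.6, 0.8], [0.1, 0.2, 0.9])
```

The predictor itself never changes; only the distribution it is applied to does, which is exactly the performative-prediction setting the paper studies.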

Likelihood-Free Frequentist Inference: Bridging Classical Statistics and Machine Learning for Reliable Simulator-Based Inference

2 code implementations8 Jul 2021 Niccolò Dalmasso, Luca Masserano, David Zhao, Rafael Izbicki, Ann B. Lee

In this work, we propose a unified and modular inference framework that bridges classical statistics and modern machine learning, providing (i) a practical approach to the Neyman construction of confidence sets with frequentist finite-sample coverage for any value of the unknown parameters; and (ii) interpretable diagnostics that estimate the empirical coverage across the entire parameter space.

Open-Ended Question Answering
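The Neyman construction referenced in the abstract above inverts a family of hypothesis tests: the confidence set collects every parameter value whose test is not rejected at the observed data. A minimal sketch with a known-likelihood z-test as the stand-in test statistic follows; the paper's framework replaces this with statistics and critical values estimated from simulator output.

```python
import math

def neyman_confidence_set(x_obs, grid, sigma=1.0, n=1, z_crit=1.959963984540054):
    """Invert a level-0.05 test over a parameter grid: keep every theta that
    is NOT rejected at x_obs. Here the test is |x_obs - theta| / se <= z_crit,
    i.e. a simple Gaussian-mean z-test (z_crit is the 97.5% normal quantile)."""
    se = sigma / math.sqrt(n)
    return [theta for theta in grid if abs(x_obs - theta) / se <= z_crit]
```

By construction, the resulting set has the test's coverage guarantee at every parameter value, which is the property the paper extends to likelihood-free settings.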

HECT: High-Dimensional Ensemble Consistency Testing for Climate Models

no code implementations8 Oct 2020 Niccolò Dalmasso, Galen Vincent, Dorit Hammerling, Ann B. Lee

Climate models play a crucial role in understanding the effect of environmental and man-made changes on climate to help mitigate climate risks and inform governmental decisions.

Vocal Bursts Intensity Prediction

Confidence Sets and Hypothesis Testing in a Likelihood-Free Inference Setting

2 code implementations ICML 2020 Niccolò Dalmasso, Rafael Izbicki, Ann B. Lee

In this paper, we present $\texttt{ACORE}$ (Approximate Computation via Odds Ratio Estimation), a frequentist approach to LFI that first formulates the classical likelihood ratio test (LRT) as a parametrized classification problem, and then uses the equivalence of tests and confidence sets to build confidence regions for parameters of interest.

Two-sample testing
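The classification-to-odds step named in the abstract above can be sketched in a few lines: a probabilistic classifier's output p that an observation was simulated at a given parameter converts to odds p/(1-p), and the log-ratio of odds at two parameter values plays the role of the log-likelihood ratio. The probabilities below are hypothetical placeholders; ACORE estimates them by training a parametrized classifier on simulated data and then aggregates over observations and the parameter space.

```python
import math

def acore_statistic(prob_theta0, prob_theta1):
    """Log odds ratio at one observation: classifier probability p that x was
    simulated at theta becomes odds p / (1 - p); the LRT analogue is the
    log-ratio of the odds at the two parameter values being compared."""
    odds = lambda p: p / (1.0 - p)
    return math.log(odds(prob_theta0) / odds(prob_theta1))
```

When the classifier cannot tell the two parameter values apart (p = 0.5 for both), the statistic is zero, mirroring a likelihood ratio of one.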

Explicit Group Sparse Projection with Applications to Deep Learning and NMF

no code implementations9 Dec 2019 Riyasat Ohib, Nicolas Gillis, Niccolò Dalmasso, Sameena Shah, Vamsi K. Potluru, Sergey Plis

Instead, in our approach we set the sparsity level for the whole set explicitly and simultaneously project a group of vectors, with the sparsity level of each vector tuned automatically.

Network Pruning
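The joint-budget idea in the abstract above can be illustrated with a hard top-k projection over the whole group: a single sparsity budget is spent across all vectors at once, so each vector's own sparsity falls out of the joint selection. This is a simplified stand-in; the paper works with a Hoyer-style continuous sparsity measure rather than a hard entry count.

```python
def group_topk_project(vectors, k):
    """Keep the k largest-magnitude entries across the entire group of vectors,
    zeroing the rest. One explicit group-level budget; per-vector sparsity is
    determined automatically by where the large entries happen to live."""
    flat = [(abs(v), i, j) for i, vec in enumerate(vectors) for j, v in enumerate(vec)]
    keep = {(i, j) for _, i, j in sorted(flat, reverse=True)[:k]}
    return [[v if (i, j) in keep else 0.0 for j, v in enumerate(vec)]
            for i, vec in enumerate(vectors)]
```

With budget k = 2 over two 2-vectors, both kept entries may land in different vectors, so one vector can end up denser than another without any per-vector tuning.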

Conditional Density Estimation Tools in Python and R with Applications to Photometric Redshifts and Likelihood-Free Cosmological Inference

5 code implementations30 Aug 2019 Niccolò Dalmasso, Taylor Pospisil, Ann B. Lee, Rafael Izbicki, Peter E. Freeman, Alex I. Malz

We provide sample code in $\texttt{Python}$ and $\texttt{R}$ as well as examples of applications to photometric redshift estimation and likelihood-free cosmological inference via CDE.

Astronomy · Density Estimation +2
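To make the CDE task above concrete, here is a toy kernel-based conditional density estimator f(y | x): training pairs are weighted by how close their x is to the query point, and the weighted kernels in y are normalized into a density. This is a deliberately simple stand-in; the released packages implement far more capable estimators (e.g. FlexCode and RFCDE) in Python and R.

```python
import math

def kernel_cde(x0, y_grid, data, hx=0.5, hy=0.5):
    """Nadaraya-Watson-style estimate of f(y | x = x0) on a grid of y values,
    from (x, y) training pairs, with Gaussian kernels of bandwidth hx, hy."""
    K = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
    wx = [K((x - x0) / hx) for x, _ in data]          # closeness weights in x
    Z = sum(wx)
    return [sum(w * K((y - yi) / hy) / hy for w, (_, yi) in zip(wx, data)) / Z
            for y in y_grid]                          # normalized mixture in y
```

Because each y-kernel integrates to one and the x-weights are normalized, the returned curve integrates to (approximately) one over a wide enough grid, as a conditional density should.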

A Flexible Pipeline for Prediction of Tropical Cyclone Paths

1 code implementation20 Jun 2019 Niccolò Dalmasso, Robin Dunn, Benjamin LeRoy, Chad Schafer

Hurricanes and, more generally, tropical cyclones (TCs) are rare, complex natural phenomena of both scientific and public interest.

Applications

Validation of Approximate Likelihood and Emulator Models for Computationally Intensive Simulations

1 code implementation27 May 2019 Niccolò Dalmasso, Ann B. Lee, Rafael Izbicki, Taylor Pospisil, Ilmun Kim, Chieh-An Lin

At the heart of our approach is a two-sample test that quantifies the quality of the fit at fixed parameter values, and a global test that assesses goodness-of-fit across simulation parameters.
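The two-sample test at the heart of the approach described above can be sketched generically with a permutation test: if simulator output and emulator output come from the same distribution, relabeling the pooled samples should not change the test statistic much. The mean-difference statistic below is a generic stand-in, not the paper's specific statistic.

```python
import random

def permutation_two_sample_test(x, y, n_perm=2000, seed=0):
    """Permutation p-value for H0: x and y are drawn from the same distribution,
    using the absolute difference of sample means as the statistic."""
    rng = random.Random(seed)
    obs = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # relabel under H0
        a, b = pooled[:len(x)], pooled[len(x):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)              # add-one correction
```

Running this at many fixed parameter values, and then combining across parameters, mirrors the local-then-global structure the abstract describes.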

Clarifying the Hubble constant tension with a Bayesian hierarchical model of the local distance ladder

1 code implementation30 Jun 2017 Stephen M. Feeney, Daniel J. Mortlock, Niccolò Dalmasso

Estimates of the Hubble constant, $H_0$, from the distance ladder and the cosmic microwave background (CMB) differ at the $\sim$3-$\sigma$ level, indicating a potential issue with the standard $\Lambda$CDM cosmology.

Cosmology and Nongalactic Astrophysics
