no code implementations • 29 Dec 2023 • Vamsi K. Potluru, Daniel Borrajo, Andrea Coletta, Niccolò Dalmasso, Yousef El-Laham, Elizabeth Fons, Mohsen Ghassemi, Sriram Gopalakrishnan, Vikesh Gosai, Eleonora Kreačić, Ganapathy Mani, Saheed Obitayo, Deepak Paramanand, Natraj Raman, Mikhail Solonin, Srijan Sood, Svitlana Vyetrenko, Haibei Zhu, Manuela Veloso, Tucker Balch
Synthetic data has made tremendous strides in various commercial settings, including finance, healthcare, and virtual reality.
no code implementations • 9 Nov 2023 • Zikai Xiong, Niccolò Dalmasso, Shubham Sharma, Freddy Lecue, Daniele Magazzeni, Vamsi K. Potluru, Tucker Balch, Manuela Veloso
Data distillation and coresets have emerged as popular approaches for handling large-scale datasets: they generate a smaller, representative set of samples for downstream learning tasks.
no code implementations • 31 Oct 2023 • Zikai Xiong, Niccolò Dalmasso, Alan Mishler, Vamsi K. Potluru, Tucker Balch, Manuela Veloso
FairWASP can therefore be used to construct datasets that can be fed into any classification method, not just methods that accept sample weights.
no code implementations • 12 Jun 2023 • Yousef El-Laham, Niccolò Dalmasso, Elizabeth Fons, Svitlana Vyetrenko
This work introduces a novel probabilistic deep learning technique called deep Gaussian mixture ensembles (DGMEs), which enables accurate quantification of both epistemic and aleatoric uncertainty.
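The epistemic/aleatoric split in such ensembles follows the law of total variance: averaging each member's mixture variance captures aleatoric uncertainty, while the spread of the member means captures epistemic uncertainty. A minimal numpy sketch of that decomposition (illustrative only; the function names and per-member mixture parameterization are assumptions, not the DGME implementation):

```python
import numpy as np

def mixture_moments(weights, means, variances):
    """Mean and variance of a 1-D Gaussian mixture (law of total variance)."""
    mean = np.sum(weights * means)
    var = np.sum(weights * (variances + means**2)) - mean**2
    return mean, var

def decompose_uncertainty(ensemble):
    """ensemble: list of (weights, means, variances), one tuple per member.
    Aleatoric = average within-member mixture variance;
    epistemic = variance of the member means across the ensemble."""
    member_means, member_vars = zip(*(mixture_moments(*m) for m in ensemble))
    aleatoric = float(np.mean(member_vars))
    epistemic = float(np.var(member_means))
    return aleatoric, epistemic
```

With a single ensemble member the epistemic term collapses to zero, which matches the intuition that model disagreement is what signals epistemic uncertainty.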
no code implementations • 12 Dec 2022 • Renbo Zhao, Niccolò Dalmasso, Mohsen Ghassemi, Vamsi K. Potluru, Tucker Balch, Manuela Veloso
Hawkes processes have recently become a leading tool for modeling and generating sequential event data.
no code implementations • 16 Aug 2022 • Mohsen Ghassemi, Niccolò Dalmasso, Simran Lamba, Vamsi K. Potluru, Sameena Shah, Tucker Balch, Manuela Veloso
Online learning of Hawkes processes has received increasing attention in recent years, especially for modeling networks of actors.
no code implementations • 27 Jul 2022 • Mohsen Ghassemi, Eleonora Kreačić, Niccolò Dalmasso, Vamsi K. Potluru, Tucker Balch, Manuela Veloso
Hawkes processes have recently gained increasing attention from the machine learning community for their versatility in modeling event sequence data.
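A Hawkes process is specified by a conditional intensity that jumps after every event and then decays; with an exponential kernel, $\lambda(t) = \mu + \sum_{t_i < t} \alpha e^{-\beta (t - t_i)}$. A hedged sketch of simulating such a process via Ogata-style thinning (illustrative of the model class only, not the estimation methods in these papers):

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate a univariate Hawkes process with intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
    on [0, horizon] via thinning against a dominating constant rate."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while True:
        # The intensity just after t exceeds lambda(t) by at most alpha,
        # so this is a valid (conservative) dominating rate.
        lam_bar = mu + alpha + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob lambda(t)/lam_bar
            events.append(t)
    return events
```

With `alpha = 0` this reduces to a homogeneous Poisson process; `alpha / beta < 1` keeps the self-exciting process stable.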
no code implementations • 10 Feb 2022 • Alan Mishler, Niccolò Dalmasso
These measures are sensitive to distribution shift: a predictor which is trained to satisfy one of these fairness definitions may become unfair if the distribution changes.
2 code implementations • 8 Jul 2021 • Niccolò Dalmasso, Luca Masserano, David Zhao, Rafael Izbicki, Ann B. Lee
In this work, we propose a modular inference framework that bridges classical statistics and modern machine learning to provide (i) a practical approach for constructing confidence sets with near finite-sample validity at any value of the unknown parameters, and (ii) interpretable diagnostics for estimating empirical coverage across the entire parameter space.
no code implementations • 8 Oct 2020 • Niccolò Dalmasso, Galen Vincent, Dorit Hammerling, Ann B. Lee
Climate models play a crucial role in understanding the effect of environmental and man-made changes on climate to help mitigate climate risks and inform governmental decisions.
no code implementations • 7 Oct 2020 • Trey McNeely, Niccolò Dalmasso, Kimberly M. Wood, Ann B. Lee
Tropical cyclone (TC) intensity forecasts are ultimately issued by human forecasters.
2 code implementations • ICML 2020 • Niccolò Dalmasso, Rafael Izbicki, Ann B. Lee
In this paper, we present $\texttt{ACORE}$ (Approximate Computation via Odds Ratio Estimation), a frequentist approach to likelihood-free inference (LFI) that first formulates the classical likelihood ratio test (LRT) as a parametrized classification problem, and then uses the equivalence of tests and confidence sets to build confidence regions for parameters of interest.
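Given a classifier-derived odds estimate $O(\theta, x) \approx f_\theta(x)/g(x)$ against a reference distribution $g$, the resulting statistic compares the log-odds at the null parameter to the grid-maximized log-odds. A minimal sketch (the function name and the grid approximation of the sup are assumptions for illustration, not the paper's exact implementation):

```python
import numpy as np

def acore_statistic(odds_fn, data, theta_null, theta_grid):
    """Log-odds at the null minus the maximized log-odds over a grid,
    mirroring a likelihood ratio statistic with estimated odds.
    odds_fn(theta, x): estimated f_theta(x) / g(x), e.g. p/(1-p)
    from a probabilistic classifier trained on simulated data."""
    def log_odds(theta):
        return float(np.sum(np.log(odds_fn(theta, data))))
    return log_odds(theta_null) - max(log_odds(th) for th in theta_grid)
```

Inverting the test (collecting every null value whose statistic exceeds a critical value) then yields the confidence region, matching the tests-and-confidence-sets equivalence described above.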
no code implementations • 9 Dec 2019 • Riyasat Ohib, Nicolas Gillis, Niccolò Dalmasso, Sameena Shah, Vamsi K. Potluru, Sergey Plis
Instead, our approach sets the sparsity level for the whole set explicitly and projects a group of vectors simultaneously, with the sparsity level of each vector tuned automatically.
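A single sparsity budget shared across a set of vectors can be illustrated by keeping only the globally largest-magnitude entries, so that each vector's individual sparsity emerges from the shared budget rather than being fixed per vector. A hedged numpy sketch (a simple top-k selection for illustration, not the paper's projection algorithm):

```python
import numpy as np

def group_sparse_project(X, k):
    """Keep the k largest-magnitude entries across the whole matrix X
    (rows = vectors), zeroing the rest; the per-row sparsity levels
    then fall out automatically from the global budget."""
    flat = np.abs(X).ravel()
    if k >= flat.size:
        return X.copy()
    top_idx = np.argpartition(flat, -k)[-k:]  # indices of the k largest |entries|
    mask = np.zeros(flat.size, dtype=bool)
    mask[top_idx] = True
    return np.where(mask.reshape(X.shape), X, 0.0)
```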
no code implementations • 18 Oct 2019 • Matteo Sordello, Niccolò Dalmasso, Hangfeng He, Weijie Su
This paper proposes SplitSGD, a new dynamic learning rate schedule for stochastic optimization.
5 code implementations • 30 Aug 2019 • Niccolò Dalmasso, Taylor Pospisil, Ann B. Lee, Rafael Izbicki, Peter E. Freeman, Alex I. Malz
We provide sample code in $\texttt{Python}$ and $\texttt{R}$ as well as examples of applications to photometric redshift estimation and likelihood-free cosmological inference via CDE.
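Conditional density estimates can be scored with a CDE loss of the form $\mathbb{E}[\int \hat f(z \mid x)^2 \, dz] - 2\,\mathbb{E}[\hat f(z \mid x)]$ evaluated at the true $z$, which is estimable from held-out data without knowing the true density. An illustrative grid-based version (the interface is an assumption, not the exact API of the released packages):

```python
import numpy as np

def _trapezoid(y, x):
    """Trapezoidal rule along the last axis."""
    return np.sum(0.5 * (y[..., 1:] + y[..., :-1]) * np.diff(x), axis=-1)

def cde_loss(cde_grid, z_grid, z_true):
    """Empirical CDE loss (up to an additive constant):
    mean_i [ integral of f_hat(z|x_i)^2 dz ] - 2 * mean_i [ f_hat(z_true_i|x_i) ],
    with f_hat evaluated on a shared z grid (one row per observation)."""
    term1 = np.mean(_trapezoid(cde_grid**2, z_grid))
    # evaluate each estimated density at the grid point nearest its true z
    nearest = np.argmin(np.abs(z_grid[None, :] - z_true[:, None]), axis=1)
    term2 = np.mean(cde_grid[np.arange(len(z_true)), nearest])
    return term1 - 2.0 * term2
```

Lower values indicate a better fit; the loss is minimized in expectation by the true conditional density, which is what makes it usable for model selection.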
1 code implementation • 20 Jun 2019 • Niccolò Dalmasso, Robin Dunn, Benjamin LeRoy, Chad Schafer
Hurricanes and, more generally, tropical cyclones (TCs) are rare, complex natural phenomena of both scientific and public interest.
1 code implementation • 27 May 2019 • Niccolò Dalmasso, Ann B. Lee, Rafael Izbicki, Taylor Pospisil, Ilmun Kim, Chieh-An Lin
At the heart of our approach is a two-sample test that quantifies the quality of the fit at fixed parameter values, and a global test that assesses goodness-of-fit across simulation parameters.
1 code implementation • 30 Jun 2017 • Stephen M. Feeney, Daniel J. Mortlock, Niccolò Dalmasso
Estimates of the Hubble constant, $H_0$, from the distance ladder and the cosmic microwave background (CMB) differ at the $\sim$3-$\sigma$ level, indicating a potential issue with the standard $\Lambda$CDM cosmology.