no code implementations • ICML 2018 • Daniel Sheldon, Kevin Winner, Debora Sujono
We develop nested automatic differentiation (AD) algorithms for exact inference and learning in integer latent variable models.
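For a flavor of what nested differentiation buys here, the following is a minimal sketch (my own illustration, not the authors' implementation): a count distribution's pmf can be read off from repeated derivatives of its probability generating function.

```python
# Minimal sketch: recover a Poisson pmf by repeatedly differentiating its
# probability generating function f(s) = E[s^Y] = exp(lam * (s - 1)),
# using P(Y = k) = f^(k)(0) / k!.
import jax
import jax.numpy as jnp
from math import factorial

lam = 3.0

def pgf(s):
    return jnp.exp(lam * (s - 1.0))

def pmf_via_autodiff(k):
    f = pgf
    for _ in range(k):                 # k-th derivative via nested differentiation
        f = jax.grad(f)
    return f(0.0) / factorial(k)

print(pmf_via_autodiff(2))             # ~0.224, matching exp(-3) * 3**2 / 2!
```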
no code implementations • 14 Jun 2017 • Garrett Bernstein, Ryan McKenna, Tao Sun, Daniel Sheldon, Michael Hay, Gerome Miklau
We investigate the problem of learning discrete, undirected graphical models in a differentially private way.
no code implementations • 1 Dec 2016 • Xiaojian Wu, Akshat Kumar, Daniel Sheldon, Shlomo Zilberstein
We therefore address the robust river network design problem where the goal is to optimize river connectivity for fish movement by removing barriers.
no code implementations • 4 Mar 2015 • Luke Vilnis, David Belanger, Daniel Sheldon, Andrew McCallum
Many inference problems in structured prediction are naturally solved by augmenting a tractable dependency structure with complex, non-local auxiliary objectives.
no code implementations • 14 Apr 2016 • Garrett Bernstein, Daniel Sheldon
We develop a new, simpler method of moments estimator that bypasses this problem and is consistent under noisy observations.
no code implementations • 20 May 2014 • Li-Ping Liu, Daniel Sheldon, Thomas G. Dietterich
The Collective Graphical Model (CGM) models a population of independent and identically distributed individuals when only collective statistics (i.e., counts of individuals) are observed.
no code implementations • NeurIPS 2018 • Justin Domke, Daniel Sheldon
Recent work used importance sampling ideas for better variational bounds on likelihoods.
no code implementations • ICML 2017 • Garrett Bernstein, Ryan McKenna, Tao Sun, Daniel Sheldon, Michael Hay, Gerome Miklau
A naive learning algorithm that uses the noisy sufficient statistics “as is” outperforms general-purpose differentially private learning algorithms.
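As a toy illustration of the "as is" idea, the sketch below uses an assumed Bernoulli model rather than an undirected graphical model (my own example, not the authors' code): privatize the sufficient statistic with the Laplace mechanism and plug it straight into the ordinary MLE.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.binomial(1, 0.3, size=1000)       # sensitive binary records
epsilon = 0.5
sensitivity = 1.0                             # one record changes the count by at most 1

noisy_count = data.sum() + rng.laplace(scale=sensitivity / epsilon)
theta_naive = np.clip(noisy_count / len(data), 0.0, 1.0)   # noisy statistic used "as is"
print(theta_naive)
```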
no code implementations • NeurIPS 2019 • Justin Domke, Daniel Sheldon
Recent work in variational inference (VI) uses ideas from Monte Carlo estimation to tighten the lower bounds on the log-likelihood that are used as objectives.
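The kind of bound being tightened is the importance-weighted lower bound log p(x) >= E[log (1/K) sum_k p(x, z_k)/q(z_k)]; here is a small numpy sketch on a toy Gaussian model (my own example, not the paper's code) in which larger K tightens the bound in expectation.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_joint(x, z):                    # p(z) = N(0, 1), p(x | z) = N(z, 1)
    return -0.5 * (z**2 + (x - z)**2) - np.log(2 * np.pi)

def log_q(z):                           # imperfect proposal: the prior N(0, 1)
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

def iw_bound(x, K):
    z = rng.normal(size=K)
    logw = log_joint(x, z) - log_q(z)
    return np.logaddexp.reduce(logw) - np.log(K)

print(iw_bound(1.0, 1), iw_bound(1.0, 100))   # true log p(1.0) is about -1.52
```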
no code implementations • 24 Apr 2020 • Zezhou Cheng, Saadia Gabriel, Pankaj Bhambhani, Daniel Sheldon, Subhransu Maji, Andrew Laughlin, David Winkler
The US weather radar archive holds detailed information about biological phenomena in the atmosphere over the last 20 years.
no code implementations • 14 Jun 2020 • Cecilia Ferrando, Shufan Wang, Daniel Sheldon
The goal of this paper is to develop a practical and general-purpose approach to construct confidence intervals for differentially private parametric estimation.
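One general-purpose recipe in this spirit is a parametric-bootstrap-style loop; the sketch below makes simplifying assumptions of its own (binary data, a Laplace-noised mean) and is not code from the paper: re-simulate the entire private pipeline from the fitted model and read the interval off the simulated estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n, epsilon = 500, 1.0

def private_mean(x):                    # a mean of bits has sensitivity 1/n
    return x.mean() + rng.laplace(scale=1.0 / (n * epsilon))

data = rng.binomial(1, 0.3, size=n)     # sensitive data
theta_hat = float(np.clip(private_mean(data), 0.0, 1.0))

# Re-simulate data and privacy noise from the fitted model, then take quantiles.
boot = [private_mean(rng.binomial(1, theta_hat, size=n)) for _ in range(2000)]
lo, hi = np.quantile(boot, [0.025, 0.975])
print(theta_hat, (lo, hi))
```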
no code implementations • NeurIPS 2020 • Abhinav Agrawal, Daniel Sheldon, Justin Domke
The combination of these algorithmic components significantly advances the state of the art in "out of the box" variational inference.
no code implementations • 23 Jun 2020 • Edmond Cunningham, Renos Zabounidis, Abhinav Agrawal, Ina Fiterau, Daniel Sheldon
In this paper we introduce noisy injective flows (NIF), a generalization of normalizing flows that can go across dimensions.
no code implementations • 31 Dec 2020 • Shiv Shankar, Daniel Sheldon, Tao Sun, John Pickering, Thomas G. Dietterich
However, it will remove intrinsic variability if the variables are dependent, and therefore does not apply to many situations, including modeling of species counts that are controlled by common causes.
no code implementations • 28 Jan 2021 • Mohit Yadav, Daniel Sheldon, Cameron Musco
Structured kernel interpolation (SKI) is among the most scalable methods: by placing inducing points on a dense grid and using structured matrix algebra, SKI achieves per-iteration time of O(n + m log m) for approximate inference.
no code implementations • 3 Jul 2021 • Shiv Shankar, Daniel Sheldon
Field observations form the basis of many scientific studies, especially in ecological and social sciences.
no code implementations • AABI Symposium 2022 • Javier Burroni, Kenta Takatsu, Justin Domke, Daniel Sheldon
We propose the use of U-statistics to reduce variance for gradient estimation in importance-weighted variational inference.
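Roughly, the idea as I read it (toy sketch below, not the authors' code, and shown on the bound itself rather than its gradient): with M > K log-weights in hand, average the K-sample estimator over all size-K subsets, keeping the same expectation while reducing variance.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

def iw_estimate(logw):                        # log of the average importance weight
    return np.logaddexp.reduce(logw) - np.log(len(logw))

M, K = 8, 3
logw = rng.normal(size=M)                     # stand-in log importance weights
single_draw = iw_estimate(logw[:K])
u_statistic = np.mean([iw_estimate(logw[list(s)]) for s in combinations(range(M), K)])
print(single_draw, u_statistic)
```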
no code implementations • 13 Apr 2023 • Javier Burroni, Justin Domke, Daniel Sheldon
We present a novel approach for black-box VI that bypasses the difficulties of stochastic gradient ascent, including the task of selecting step-sizes.
1 code implementation • 23 May 2023 • Mohit Yadav, Daniel Sheldon, Cameron Musco
Structured kernel interpolation (SKI) accelerates Gaussian process (GP) inference by interpolating the kernel covariance function using a dense grid of inducing points, whose corresponding kernel matrix is highly structured and thus amenable to fast linear algebra.
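The core approximation is K(X, X) ~ W @ K_grid @ W.T, where W holds sparse interpolation weights onto the grid; the 1-D toy below uses dense matrices for readability and is my own illustration, not the paper's code.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, size=200))          # n data points
grid = np.linspace(0, 1, 50)                      # m inducing points on a regular grid
h = grid[1] - grid[0]

# Linear-interpolation weights: each x contributes to its two neighboring grid points.
idx = np.clip(np.searchsorted(grid, x) - 1, 0, len(grid) - 2)
frac = (x - grid[idx]) / h
W = np.zeros((len(x), len(grid)))
W[np.arange(len(x)), idx] = 1 - frac
W[np.arange(len(x)), idx + 1] = frac

K_exact = rbf(x, x)
K_ski = W @ rbf(grid, grid) @ W.T                 # structured, grid-based approximation
print(np.abs(K_exact - K_ski).max())              # small interpolation error
```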
no code implementations • 5 Jun 2023 • Gustavo Perez, Subhransu Maji, Daniel Sheldon
Many modern applications use computer vision to detect and count objects in massive image collections.
no code implementations • 8 Dec 2023 • Gustavo Perez, Daniel Sheldon, Grant van Horn, Subhransu Maji
Human feedback on pairwise similarities can be used to improve the clustering, but existing approaches do not guarantee accurate count estimates.
1 code implementation • 12 Mar 2024 • Miguel Fuentes, Brett Mullins, Ryan McKenna, Gerome Miklau, Daniel Sheldon
This technique allows for public data to be included in a graphical-model-based mechanism.
1 code implementation • NeurIPS 2018 • Garrett Bernstein, Daniel Sheldon
The study of private inference has been sparked by growing concern about analyzing data that stems from sensitive sources.
1 code implementation • 1 Feb 2023 • Jinlin Lai, Javier Burroni, Hui Guan, Daniel Sheldon
Hamiltonian Monte Carlo (HMC) is a powerful algorithm to sample latent variables from Bayesian models.
1 code implementation • NeurIPS 2019 • Garrett Bernstein, Daniel Sheldon
Linear regression is an important tool across many fields that work with sensitive human-sourced data.
1 code implementation • 30 Sep 2021 • Jinlin Lai, Justin Domke, Daniel Sheldon
We reveal that the marginal particle filter is obtained from sequential Monte Carlo by applying Rao-Blackwellization operations, which sacrifices the trajectory information for reduced variance and differentiability.
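Concretely, in a marginal particle filter the new weight sums the transition density over all previous particles rather than conditioning on a single sampled ancestor; below is a one-step sketch for a 1-D linear-Gaussian model (my own toy construction, not the paper's code).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
x_prev = rng.normal(size=N)                       # previous particles
logw_prev = np.full(N, -np.log(N))                # previous normalized log-weights

def log_norm(v, mean, var):
    return -0.5 * (v - mean) ** 2 / var - 0.5 * np.log(2 * np.pi * var)

y_t = 0.7     # assumed model: x_t = 0.9 x_{t-1} + N(0, 0.25); y_t = x_t + N(0, 0.1)

# Independent Gaussian proposal roughly matched to the predicted state.
m = 0.9 * np.average(x_prev, weights=np.exp(logw_prev))
x_new = rng.normal(m, 1.0, size=N)

# Marginal weight: likelihood * (mixture transition density) / proposal density.
log_mix = np.array([np.logaddexp.reduce(logw_prev + log_norm(xi, 0.9 * x_prev, 0.25))
                    for xi in x_new])
logw_new = log_norm(y_t, x_new, 0.1) + log_mix - log_norm(x_new, m, 1.0)
logw_new -= np.logaddexp.reduce(logw_new)         # normalize; no ancestor index is kept
```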
1 code implementation • ICCV 2021 • Cheng Gu, Erik Learned-Miller, Daniel Sheldon, Guillermo Gallego, Pia Bideau
In particular, we model the aligned data as a spatio-temporal Poisson point process.
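For reference, a Poisson point process with intensity lam has log-likelihood sum_i log lam(s_i) - integral of lam; the snippet below evaluates that generic formula on synthetic 2-D points (the intensity and data are stand-ins, not the paper's model of aligned event-camera data).

```python
import numpy as np

rng = np.random.default_rng(0)
events = rng.uniform(0, 1, size=(50, 2))          # stand-in (space, time) event locations

def lam(s):                                       # an assumed smooth intensity over the unit square
    return 5.0 + 20.0 * np.exp(-10.0 * ((s[..., 0] - 0.5) ** 2 + (s[..., 1] - 0.5) ** 2))

grid = np.stack(np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100)), axis=-1)
integral = lam(grid).mean() * 1.0                 # Riemann estimate of the integral
loglik = np.log(lam(events)).sum() - integral
print(loglik)
```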
1 code implementation • CVPR 2019 • Zezhou Cheng, Matheus Gadelha, Subhransu Maji, Daniel Sheldon
The deep image prior was recently introduced as a prior for natural images.
4 code implementations • 26 Jan 2019 • Ryan McKenna, Daniel Sheldon, Gerome Miklau
Many privacy mechanisms reveal high-level information about a data distribution through noisy measurements.
1 code implementation • NeurIPS 2021 • Ryan McKenna, Siddhant Pradhan, Daniel Sheldon, Gerome Miklau
Private-PGM is a recent approach that uses graphical models to represent the data distribution, with complexity proportional to that of exact marginal inference in a graphical model with structure determined by the co-occurrence of variables in the noisy measurements.