We show that the marginal particle filter can be obtained from sequential Monte Carlo by applying Rao-Blackwellization operations, which sacrifice trajectory information in exchange for reduced variance and differentiability.
Private-PGM is a recent approach that uses graphical models to represent the data distribution, with complexity proportional to that of exact marginal inference in a graphical model with structure determined by the co-occurrence of variables in the noisy measurements.
Structured kernel interpolation (SKI) is among the most scalable methods: by placing inducing points on a dense grid and using structured matrix algebra, SKI achieves per-iteration time of O(n + m log m) for approximate inference.
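The core SKI approximation can be illustrated with a toy sketch: the exact kernel matrix is approximated as K ≈ W K_grid Wᵀ, where K_grid is the kernel evaluated on a dense grid of inducing points and W holds sparse interpolation weights. The code below is a minimal 1-D illustration with linear interpolation and an RBF kernel; all names and settings are illustrative, not taken from any SKI implementation.

```python
# Toy 1-D sketch of the SKI approximation K ≈ W K_grid W^T
# (illustrative only; real SKI exploits Kronecker/Toeplitz structure of K_grid).
import numpy as np

def rbf(a, b, lengthscale=0.5):
    """Exact RBF kernel matrix between point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def interp_weights(x, grid):
    """Linear interpolation weights: each row has two nonzeros."""
    W = np.zeros((len(x), len(grid)))
    h = grid[1] - grid[0]
    for i, xi in enumerate(x):
        j = min(int((xi - grid[0]) / h), len(grid) - 2)
        t = (xi - grid[j]) / h
        W[i, j] = 1.0 - t
        W[i, j + 1] = t
    return W

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=50)       # n training inputs
grid = np.linspace(0, 1, 100)        # m inducing points on a dense grid
W = interp_weights(x, grid)

K_exact = rbf(x, x)
K_ski = W @ rbf(grid, grid) @ W.T    # structured approximation
err = np.abs(K_exact - K_ski).max()  # small when the grid is dense
```

Because W is sparse and K_grid has exploitable structure (Toeplitz in 1-D, Kronecker across dimensions), matrix-vector products with K_ski cost O(n + m log m) rather than O(n²), which is where the per-iteration complexity in the sentence above comes from.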
However, it removes intrinsic variability when the variables are dependent, and therefore does not apply in many settings, including the modeling of species counts driven by common causes.
The combination of these algorithmic components significantly advances the state of the art in "out of the box" variational inference.
The goal of this paper is to develop a practical and general-purpose approach to construct confidence intervals for differentially private parametric estimation.
The US weather radar archive holds detailed information about biological phenomena in the atmosphere over the last 20 years.
Recent work in variational inference (VI) uses ideas from Monte Carlo estimation to tighten the lower bounds on the log-likelihood that are used as objectives.
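The mechanism behind these tighter bounds is that averaging several importance weights inside the logarithm yields a value at least as large as averaging log-weights outside it (Jensen's inequality). The snippet below checks this numerically on a handful of arbitrary illustrative weights; it is a toy demonstration of the inequality, not any specific paper's estimator.

```python
# Jensen's inequality behind multi-sample ("importance weighted") bounds:
# log(mean(w)) >= mean(log(w)) for any positive weights w.
import numpy as np

w = np.array([0.2, 1.5, 0.7, 3.0, 0.9])   # K = 5 illustrative importance weights
single_sample_bound = np.log(w).mean()     # standard ELBO-style estimate
multi_sample_bound = np.log(w.mean())      # K-sample bound: tighter
```

As K grows, the average of the weights inside the log concentrates around its expectation, so the bound approaches the true log-likelihood.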
We develop nested automatic differentiation (AD) algorithms for exact inference and learning in integer latent variable models.
A naive learning algorithm that uses the noisy sufficient statistics “as is” outperforms general-purpose differentially private learning algorithms.
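To make the "as is" strategy concrete, here is a hedged sketch for the simplest case: estimating a Bernoulli parameter from a count released under epsilon-differential privacy via the Laplace mechanism, then plugging the noisy statistic directly into the maximum-likelihood formula. All names and settings are illustrative assumptions, not taken from the paper.

```python
# Naive plug-in estimation from a Laplace-noised sufficient statistic
# (illustrative Bernoulli example; not the paper's actual algorithm).
import numpy as np

rng = np.random.default_rng(42)
x = rng.integers(0, 2, size=1000)     # private binary records
true_count = x.sum()                  # sufficient statistic for the Bernoulli model

epsilon = 1.0
sensitivity = 1.0                     # one record changes the count by at most 1
noisy_count = true_count + rng.laplace(scale=sensitivity / epsilon)

# Treat the noisy statistic as if it were exact (clip to the valid range).
theta_hat = float(np.clip(noisy_count / len(x), 0.0, 1.0))
```

Since the Laplace noise has scale 1/epsilon regardless of the sample size, its effect on the estimated proportion shrinks like 1/n, which is why the naive estimator can be surprisingly competitive.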
We investigate the problem of learning discrete, undirected graphical models in a differentially private way.
We therefore address the robust river network design problem where the goal is to optimize river connectivity for fish movement by removing barriers.
Many inference problems in structured prediction are naturally solved by augmenting a tractable dependency structure with complex, non-local auxiliary objectives.
The Collective Graphical Model (CGM) models a population of independent and identically distributed individuals when only collective statistics (i.e., counts of individuals) are observed.
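The observation model can be illustrated with a toy two-variable example: individuals are i.i.d. draws from a simple joint distribution, but the observer sees only the aggregate contingency table, never individual records. The model and all parameter values below are illustrative assumptions.

```python
# Toy illustration of the CGM observation setting: i.i.d. individuals,
# but only collective counts are observed (parameters are illustrative).
import numpy as np

rng = np.random.default_rng(1)
n = 500                                 # population size
p_a = 0.3                               # P(A = 1)
p_b_given_a = np.array([0.2, 0.8])      # P(B = 1 | A = a)

# Individual-level data (hidden from the observer in the CGM setting).
a = (rng.random(n) < p_a).astype(int)
b = (rng.random(n) < p_b_given_a[a]).astype(int)

# The observed collective statistics: a 2x2 table of counts.
counts = np.zeros((2, 2), dtype=int)
np.add.at(counts, (a, b), 1)
```

Inference in the CGM then reasons about the distribution over individual-level configurations consistent with these counts, rather than conditioning on individual records directly.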