Search Results for author: Vikash K. Mansinghka

Found 33 papers, 14 papers with code

Sequential Monte Carlo Learning for Time Series Structure Discovery

1 code implementation • 13 Jul 2023 • Feras A. Saad, Brian J. Patton, Matthew D. Hoffman, Rif A. Saurous, Vikash K. Mansinghka

This paper presents a new approach to automatically discovering accurate models of complex time series data.

Time Series

From Word Models to World Models: Translating from Natural Language to the Probabilistic Language of Thought

1 code implementation • 22 Jun 2023 • Lionel Wong, Gabriel Grand, Alexander K. Lew, Noah D. Goodman, Vikash K. Mansinghka, Jacob Andreas, Joshua B. Tenenbaum

Our architecture integrates two computational tools that have not previously come together: we model thinking with probabilistic programs, an expressive representation for commonsense reasoning; and we model meaning construction with large language models (LLMs), which support broad-coverage translation from natural language utterances to code expressions in a probabilistic programming language.

Probabilistic Programming, Relational Reasoning

Differentiating Metropolis-Hastings to Optimize Intractable Densities

1 code implementation • 13 Jun 2023 • Gaurav Arya, Ruben Seyer, Frank Schäfer, Kartik Chandra, Alexander K. Lew, Mathieu Huot, Vikash K. Mansinghka, Jonathan Ragan-Kelley, Christopher Rackauckas, Moritz Schauer

We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers, allowing us to differentiate through probabilistic inference, even if the model has discrete components within it.
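For readers unfamiliar with the kernel in question, a plain random-walk Metropolis-Hastings step looks like the sketch below; the accept/reject branch is exactly the discrete choice that makes naive differentiation fail. The target density and step size here are illustrative, not taken from the paper.

```python
import math
import random

def metropolis_hastings(log_density, x0, n_steps=10000, step=0.5, rng=random.Random(0)):
    """Random-walk Metropolis-Hastings over a 1-D target given by log_density."""
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)        # symmetric Gaussian proposal
        log_alpha = log_density(proposal) - log_density(x)
        if math.log(rng.random()) < log_alpha:     # discrete accept/reject choice
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: a standard normal, log p(x) = -x^2/2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(samples) / len(samples)
```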

Sequential Monte Carlo Steering of Large Language Models using Probabilistic Programs

2 code implementations • 5 Jun 2023 • Alexander K. Lew, Tan Zhi-Xuan, Gabriel Grand, Vikash K. Mansinghka

Even after fine-tuning and reinforcement learning, large language models (LLMs) can be difficult, if not impossible, to control reliably with prompts alone.

Language Modelling, Probabilistic Programming +1
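Abstracting away any real LLM, the SMC loop behind this kind of steering can be sketched as a particle filter over partial sequences: propose the next token from the model, reweight by a constraint potential, and resample. The toy two-token "model" and the no-"bb" constraint below are invented for illustration and are not from the paper.

```python
import random

def smc_steer(step_dist, potential, n_particles=200, horizon=5, rng=random.Random(0)):
    """Generic SMC over sequences: propose from the model, weight by a potential, resample."""
    particles = [[] for _ in range(n_particles)]
    for _ in range(horizon):
        weights = []
        for p in particles:
            dist = step_dist(p)                       # next-token distribution given prefix
            tokens, probs = zip(*dist.items())
            p.append(rng.choices(tokens, probs)[0])
            weights.append(potential(p))              # soft constraint on the prefix
        if sum(weights) == 0:
            raise RuntimeError("all particles violate the constraint")
        # Multinomial resampling proportional to the weights.
        particles = [list(rng.choices(particles, weights)[0]) for _ in range(n_particles)]
    return particles

# Toy model: uniform over {"a", "b"}; constraint: prefix must never contain "bb".
model = lambda prefix: {"a": 0.5, "b": 0.5}
no_double_b = lambda p: 0.0 if "bb" in "".join(p) else 1.0
out = smc_steer(model, no_double_b)
```

Because resampling only ever duplicates particles with nonzero weight, every surviving sequence satisfies the constraint at every step, without touching the model's proposal distribution.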

$\omega$PAP Spaces: Reasoning Denotationally About Higher-Order, Recursive Probabilistic and Differentiable Programs

no code implementations • 21 Feb 2023 • Mathieu Huot, Alexander K. Lew, Vikash K. Mansinghka, Sam Staton

We introduce a new setting, the category of $\omega$PAP spaces, for reasoning denotationally about expressive differentiable and probabilistic programming languages.

Probabilistic Programming

Abstract Interpretation for Generalized Heuristic Search in Model-Based Planning

no code implementations • 5 Aug 2022 • Tan Zhi-Xuan, Joshua B. Tenenbaum, Vikash K. Mansinghka

Domain-general model-based planners often derive their generality by constructing search heuristics through the relaxation or abstraction of symbolic world models.

Solving the Baby Intuitions Benchmark with a Hierarchically Bayesian Theory of Mind

no code implementations • 4 Aug 2022 • Tan Zhi-Xuan, Nishad Gothoskar, Falk Pollok, Dan Gutfreund, Joshua B. Tenenbaum, Vikash K. Mansinghka

To facilitate the development of new models to bridge the gap between machine and human social intelligence, the recently proposed Baby Intuitions Benchmark (arXiv:2102.11938) provides a suite of tasks designed to evaluate commonsense reasoning about agents' goals and actions that even young infants exhibit.

Few-Shot Learning, Imitation Learning

Recursive Monte Carlo and Variational Inference with Auxiliary Variables

1 code implementation • 5 Mar 2022 • Alexander K. Lew, Marco Cusumano-Towner, Vikash K. Mansinghka

A key design constraint when implementing Monte Carlo and variational inference algorithms is that it must be possible to cheaply and exactly evaluate the marginal densities of proposal distributions and variational families.

Astronomy, Stochastic Optimization +1
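The design constraint the abstract refers to is easiest to see in self-normalized importance sampling, where computing every weight requires an exact evaluation of the proposal's density q(x), not merely a sample from it. A minimal sketch, with an illustrative Gaussian target and proposal (not from the paper):

```python
import math
import random

def importance_sample(log_p, log_q, sample_q, n=50000, rng=random.Random(0)):
    """Self-normalized importance sampling estimate of E_p[x] using proposal q.

    log_q must be evaluable exactly at every sample -- the constraint the
    paper relaxes by introducing auxiliary variables."""
    xs = [sample_q(rng) for _ in range(n)]
    logw = [log_p(x) - log_q(x) for x in xs]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]        # numerically stabilized weights
    return sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)

# Illustrative: target N(1, 1), proposal N(0, 2); normalizing constants cancel.
log_p = lambda x: -0.5 * (x - 1.0) ** 2
log_q = lambda x: -0.5 * (x / 2.0) ** 2 - math.log(2.0)
est = importance_sample(log_p, log_q, lambda rng: rng.gauss(0.0, 2.0))
```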

Estimators of Entropy and Information via Inference in Probabilistic Models

no code implementations • 24 Feb 2022 • Feras A. Saad, Marco Cusumano-Towner, Vikash K. Mansinghka

Estimating information-theoretic quantities such as entropy and mutual information is central to many problems in statistics and machine learning, but challenging in high dimensions.

Variational Inference
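When log p(x) can be evaluated pointwise, the baseline estimator is the Monte Carlo average of -log p over samples from p; the Gaussian example below is illustrative only (the paper targets the harder case where such evaluations are intractable and must themselves be approximated by inference).

```python
import math
import random

def mc_entropy(log_p, sample_p, n=100000, rng=random.Random(0)):
    """Monte Carlo estimate of differential entropy H(p) = E_p[-log p(x)]."""
    return sum(-log_p(sample_p(rng)) for _ in range(n)) / n

# Illustrative: N(0, 1), whose true entropy is 0.5 * log(2 * pi * e) ~= 1.4189.
sigma = 1.0
log_p = lambda x: -0.5 * (x / sigma) ** 2 - 0.5 * math.log(2 * math.pi * sigma ** 2)
est = mc_entropy(log_p, lambda rng: rng.gauss(0.0, sigma))
```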

Hierarchical Infinite Relational Model

1 code implementation • 16 Aug 2021 • Feras A. Saad, Vikash K. Mansinghka

This paper describes the hierarchical infinite relational model (HIRM), a new probabilistic generative model for noisy, sparse, and heterogeneous relational data.

Attribute, Density Estimation

Modeling the Mistakes of Boundedly Rational Agents Within a Bayesian Theory of Mind

no code implementations • 24 Jun 2021 • Arwa Alanqary, Gloria Z. Lin, Joie Le, Tan Zhi-Xuan, Vikash K. Mansinghka, Joshua B. Tenenbaum

Here, we extend the Bayesian Theory of Mind framework to model boundedly rational agents who may have mistaken goals, plans, and actions.

Game of Chess

SPPL: Probabilistic Programming with Fast Exact Symbolic Inference

1 code implementation • 7 Oct 2020 • Feras A. Saad, Martin C. Rinard, Vikash K. Mansinghka

We present the Sum-Product Probabilistic Language (SPPL), a new probabilistic programming language that automatically delivers exact solutions to a broad range of probabilistic inference queries.

Fairness, Probabilistic Programming +1
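The flavor of exact inference query SPPL answers can be illustrated by brute-force enumeration of a tiny discrete model; the rain/sprinkler model below is a standard textbook example, not from the paper, and SPPL's compilation to sum-product expressions is far more general and efficient than this sketch.

```python
from fractions import Fraction
from itertools import product

def exact_posterior():
    """Exact P(rain = 1 | wet = 1) for: rain ~ Bernoulli(1/5),
    sprinkler ~ Bernoulli(1/2), wet = rain or sprinkler."""
    num = Fraction(0)
    den = Fraction(0)
    for rain, sprinkler in product([0, 1], repeat=2):
        pr = (Fraction(1, 5) if rain else Fraction(4, 5)) * Fraction(1, 2)
        wet = rain or sprinkler
        if wet:                 # condition on the observation wet = 1
            den += pr
            if rain:
                num += pr
    return num / den            # exact rational answer, no sampling error

posterior = exact_posterior()   # = 1/3
```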

PClean: Bayesian Data Cleaning at Scale with Domain-Specific Probabilistic Programming

1 code implementation • 23 Jul 2020 • Alexander K. Lew, Monica Agrawal, David Sontag, Vikash K. Mansinghka

Data cleaning is naturally framed as probabilistic inference in a generative model of ground-truth data and likely errors, but the diversity of real-world error patterns and the hardness of inference make Bayesian approaches difficult to automate.

Probabilistic Programming

Automating Involutive MCMC using Probabilistic and Differentiable Programming

2 code implementations • 20 Jul 2020 • Marco Cusumano-Towner, Alexander K. Lew, Vikash K. Mansinghka

Involutive MCMC is a unifying mathematical construction for MCMC kernels that generalizes many classic and state-of-the-art MCMC algorithms, from reversible jump MCMC to kernels based on deep neural networks.

Computation
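A minimal instance of the construction uses the involution f(x, v) = (x + v, -v) with a Gaussian auxiliary variable; since f is its own inverse and |det Jf| = 1, the acceptance ratio reduces to p(x')q(v') / (p(x)q(v)), which recovers random-walk Metropolis and illustrates the "generalizes classic algorithms" claim. The target below is illustrative.

```python
import math
import random

def involutive_mcmc(log_p, x0, n_steps=20000, rng=random.Random(0)):
    """Involutive MCMC with involution f(x, v) = (x + v, -v), v ~ N(0, 1)."""
    log_q = lambda v: -0.5 * v * v               # N(0, 1) up to a constant
    x = x0
    samples = []
    for _ in range(n_steps):
        v = rng.gauss(0.0, 1.0)                  # sample the auxiliary variable
        x_new, v_new = x + v, -v                 # apply the involution
        log_alpha = (log_p(x_new) + log_q(v_new)) - (log_p(x) + log_q(v))
        if math.log(rng.random()) < log_alpha:   # MH accept/reject (|det Jf| = 1)
            x = x_new
        samples.append(x)
    return samples

# Illustrative target: N(2, 0.5^2), log p(x) = -2 (x - 2)^2 up to a constant.
chain = involutive_mcmc(lambda x: -2.0 * (x - 2.0) ** 2, x0=0.0)
```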

Online Bayesian Goal Inference for Boundedly-Rational Planning Agents

1 code implementation • 13 Jun 2020 • Tan Zhi-Xuan, Jordyn L. Mann, Tom Silver, Joshua B. Tenenbaum, Vikash K. Mansinghka

These models are specified as probabilistic programs, allowing us to represent and perform efficient Bayesian inference over an agent's goals and internal planning processes.

Bayesian Inference

Bayesian Synthesis of Probabilistic Programs for Automatic Data Modeling

no code implementations • 14 Jul 2019 • Feras A. Saad, Marco F. Cusumano-Towner, Ulrich Schaechtle, Martin C. Rinard, Vikash K. Mansinghka

These techniques work with probabilistic domain-specific data modeling languages that capture key properties of a broad class of data generating processes, using Bayesian inference to synthesize probabilistic programs in these modeling languages given observed data.

Probabilistic Programming, Time Series +1

A Family of Exact Goodness-of-Fit Tests for High-Dimensional Discrete Distributions

no code implementations • 26 Feb 2019 • Feras A. Saad, Cameron E. Freer, Nathanael L. Ackerman, Vikash K. Mansinghka

Unlike most existing test statistics, the proposed test statistic is distribution-free and its exact (non-asymptotic) sampling distribution is known in closed form.

Using probabilistic programs as proposals

no code implementations • 11 Jan 2018 • Marco F. Cusumano-Towner, Vikash K. Mansinghka

Monte Carlo inference has asymptotic guarantees, but can be slow when using generic proposals.

Probabilistic Programming

Temporally-Reweighted Chinese Restaurant Process Mixtures for Clustering, Imputing, and Forecasting Multivariate Time Series

1 code implementation • 18 Oct 2017 • Feras A. Saad, Vikash K. Mansinghka

We apply the technique to challenging forecasting and imputation tasks using seasonal flu data from the US Centers for Disease Control and Prevention, demonstrating superior forecasting accuracy and competitive imputation accuracy as compared to multiple widely used baselines.

Clustering, Imputation +2

AIDE: An algorithm for measuring the accuracy of probabilistic inference algorithms

no code implementations • NeurIPS 2017 • Marco F. Cusumano-Towner, Vikash K. Mansinghka

This paper introduces the auxiliary inference divergence estimator (AIDE), an algorithm for measuring the accuracy of approximate inference algorithms.

Variational Inference

Probabilistic programs for inferring the goals of autonomous agents

1 code implementation • 17 Apr 2017 • Marco F. Cusumano-Towner, Alexey Radul, David Wingate, Vikash K. Mansinghka

Intelligent systems sometimes need to infer the probable goals of people, cars, and robots, based on partial observations of their motion.

Encapsulating models and approximate inference programs in probabilistic modules

no code implementations • 14 Dec 2016 • Marco F. Cusumano-Towner, Vikash K. Mansinghka

This paper introduces the probabilistic module interface, which allows encapsulation of complex probabilistic models with latent variables alongside custom stochastic approximate inference machinery, and provides a platform-agnostic abstraction barrier separating the model internals from the host probabilistic inference system.

Measuring the non-asymptotic convergence of sequential Monte Carlo samplers using probabilistic programming

no code implementations • 7 Dec 2016 • Marco F. Cusumano-Towner, Vikash K. Mansinghka

A key limitation of sampling algorithms for approximate inference is that it is difficult to quantify their approximation error.

Probabilistic Programming

A Probabilistic Programming Approach To Probabilistic Data Analysis

no code implementations • NeurIPS 2016 • Feras Saad, Vikash K. Mansinghka

Probabilistic techniques are central to data analysis, but different approaches can be challenging to apply, combine, and compare.

BIG-bench Machine Learning, Clustering +1

Quantifying the probable approximation error of probabilistic inference programs

no code implementations • 31 May 2016 • Marco F. Cusumano-Towner, Vikash K. Mansinghka

This paper introduces a new technique for quantifying the approximation error of a broad class of probabilistic inference programs, including ones based on both variational and Monte Carlo approaches.

Probabilistic Programming with Gaussian Process Memoization

no code implementations • 17 Dec 2015 • Ulrich Schaechtle, Ben Zinberg, Alexey Radul, Kostas Stathis, Vikash K. Mansinghka

Gaussian Processes (GPs) are widely used tools in statistics, machine learning, robotics, computer vision, and scientific computation.

Bayesian Optimization, Gaussian Processes +4

JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes

no code implementations • 1 Mar 2015 • Jonathan H. Huggins, Karthik Narasimhan, Ardavan Saeedi, Vikash K. Mansinghka

We derive the small-variance asymptotics for parametric and nonparametric MJPs for both directly observed and hidden state models.

Inverse Graphics with Probabilistic CAD Models

no code implementations • 4 Jul 2014 • Tejas D. Kulkarni, Vikash K. Mansinghka, Pushmeet Kohli, Joshua B. Tenenbaum

We show that it is possible to solve challenging, real-world 3D vision problems by approximate inference in generative models for images based on rendering the outputs of probabilistic CAD (PCAD) programs.

3D Human Pose Estimation, Object

Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs

no code implementations • NeurIPS 2013 • Vikash K. Mansinghka, Tejas D. Kulkarni, Yura N. Perov, Joshua B. Tenenbaum

The idea of computer vision as the Bayesian inverse problem to computer graphics has a long history and an appealing elegance, but it has proved difficult to directly implement.

Probabilistic Programming

ClusterCluster: Parallel Markov Chain Monte Carlo for Dirichlet Process Mixtures

no code implementations • 8 Apr 2013 • Dan Lovell, Jonathan Malmaud, Ryan P. Adams, Vikash K. Mansinghka

Applied to mixture modeling, our approach enables the Dirichlet process to simultaneously learn clusters that describe the data and superclusters that define the granularity of parallelization.

Density Estimation, Time Series +1
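The Dirichlet process prior underlying such mixture models can be simulated via the Chinese restaurant process: item i joins an existing cluster k with probability proportional to its size, or opens a new cluster with probability proportional to a concentration parameter alpha. The sketch below is the plain sequential CRP, not the parallel scheme from the paper.

```python
import random

def chinese_restaurant_process(n, alpha, rng=random.Random(0)):
    """Sample a partition of n items from a CRP(alpha) prior."""
    sizes = []                               # sizes[k] = number of items in cluster k
    assignments = []
    for _ in range(n):
        weights = sizes + [alpha]            # existing clusters, plus a fresh one
        k = rng.choices(range(len(weights)), weights)[0]
        if k == len(sizes):
            sizes.append(1)                  # open a new cluster
        else:
            sizes[k] += 1
        assignments.append(k)
    return assignments

# The number of clusters grows like alpha * log(n) in expectation.
labels = chinese_restaurant_process(1000, alpha=2.0)
```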
