Search Results for author: Vikash Mansinghka

Found 29 papers, 5 papers with code

Understanding Epistemic Language with a Bayesian Theory of Mind

no code implementations • 21 Aug 2024 • Lance Ying, Tan Zhi-Xuan, Lionel Wong, Vikash Mansinghka, Joshua B. Tenenbaum

How do people understand and evaluate claims about others' beliefs, even though these beliefs cannot be directly observed?


Infinite Ends from Finite Samples: Open-Ended Goal Inference as Top-Down Bayesian Filtering of Bottom-Up Proposals

no code implementations • 23 Jul 2024 • Tan Zhi-Xuan, Gloria Kang, Vikash Mansinghka, Joshua B. Tenenbaum

The space of human goals is tremendously vast; yet from just a few moments of watching a scene or reading a story, we seem to spontaneously infer a range of plausible motivations for the people and characters involved.

Bayesian Inference

Partially Observable Task and Motion Planning with Uncertainty and Risk Awareness

no code implementations • 15 Mar 2024 • Aidan Curtis, George Matheos, Nishad Gothoskar, Vikash Mansinghka, Joshua Tenenbaum, Tomás Lozano-Pérez, Leslie Pack Kaelbling

We propose a strategy for TAMP with Uncertainty and Risk Awareness (TAMPURA) that is capable of efficiently solving long-horizon planning problems with initial-state and action outcome uncertainty, including problems that require information gathering and avoiding undesirable and irreversible outcomes.

Motion Planning · Task and Motion Planning

Pragmatic Instruction Following and Goal Assistance via Cooperative Language-Guided Inverse Planning

1 code implementation • 27 Feb 2024 • Tan Zhi-Xuan, Lance Ying, Vikash Mansinghka, Joshua B. Tenenbaum

Our agent assists a human by modeling them as a cooperative planner who communicates joint plans to the assistant. It then performs multimodal Bayesian inference over the human's goal from actions and language, using large language models (LLMs) to evaluate the likelihood of an instruction given a hypothesized plan.

Bayesian Inference · Instruction Following
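The multimodal inference described above can be sketched abstractly: the posterior over goals combines a prior, an action likelihood, and an instruction likelihood. The goal names and all numbers below are toy stand-ins; the paper derives the likelihoods from a cooperative planner model and an LLM, respectively.

```python
# Toy sketch of multimodal Bayesian goal inference (hypothetical numbers;
# the paper uses a planner model and an LLM in place of these tables).
goals = ["get_gem", "get_key"]
prior = {"get_gem": 0.5, "get_key": 0.5}

# P(observed actions | goal): stand-in for the cooperative planner model.
action_lik = {"get_gem": 0.2, "get_key": 0.7}
# P(instruction "grab the key" | goal): stand-in for the LLM likelihood.
instr_lik = {"get_gem": 0.1, "get_key": 0.8}

unnorm = {g: prior[g] * action_lik[g] * instr_lik[g] for g in goals}
z = sum(unnorm.values())
posterior = {g: p / z for g, p in unnorm.items()}
print(posterior)  # mass concentrates on "get_key"
```

Because both evidence channels favor the same goal, the posterior sharpens more than either channel would alone.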

Grounding Language about Belief in a Bayesian Theory-of-Mind

no code implementations • 16 Feb 2024 • Lance Ying, Tan Zhi-Xuan, Lionel Wong, Vikash Mansinghka, Joshua Tenenbaum

In this paper, we take a step towards an answer by grounding the semantics of belief statements in a Bayesian theory-of-mind. By modeling how humans jointly infer coherent sets of goals, beliefs, and plans that explain an agent's actions, then evaluating statements about the agent's beliefs against these inferences via epistemic logic, our framework provides a conceptual role semantics for belief. This explains the gradedness and compositionality of human belief attributions, as well as their intimate connection with goals and plans.

Attribute

Inferring the Goals of Communicating Agents from Actions and Instructions

no code implementations • 28 Jun 2023 • Lance Ying, Tan Zhi-Xuan, Vikash Mansinghka, Joshua B. Tenenbaum

When humans cooperate, they frequently coordinate their activity through both verbal communication and non-verbal actions, using this information to infer a shared goal and plan.

SBI: A Simulation-Based Test of Identifiability for Bayesian Causal Inference

no code implementations • 23 Feb 2021 • Sam Witty, David Jensen, Vikash Mansinghka

This paper introduces simulation-based identifiability (SBI), a procedure for testing the identifiability of queries in Bayesian causal inference approaches that are implemented as probabilistic programs.

Causal Inference · Experimental Design +1
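The core idea of such an identifiability test can be illustrated with a toy model (this example and its query functions are illustrative, not taken from the paper): simulate data from the model and check whether the data moves the posterior over a query away from its prior. Here the observation only reveals the XOR of two latent bits, so a query about one bit alone is unidentifiable while a query about the XOR is identifiable.

```python
# Toy model: latent bits t1, t2 with a uniform prior; we only observe y = t1 ^ t2.
# Simplified SBI-style check: compute the exact posterior over a query and
# see whether the simulated observation moved it away from the prior.
def posterior_over_query(y, query):
    support = [(t1, t2) for t1 in (0, 1) for t2 in (0, 1)]
    consistent = [t for t in support if t[0] ^ t[1] == y]
    probs = {}
    for t in consistent:
        q = query(t)
        probs[q] = probs.get(q, 0.0) + 1.0 / len(consistent)
    return probs

y = 1  # a simulated observation
p_t1 = posterior_over_query(y, lambda t: t[0])          # query: t1 alone
p_xor = posterior_over_query(y, lambda t: t[0] ^ t[1])  # query: t1 ^ t2

print(p_t1)   # stays at the uniform prior -> t1 is not identifiable
print(p_xor)  # collapses onto the observed value -> identifiable
```

The paper's procedure applies this kind of prior-vs-posterior comparison to queries against models expressed as probabilistic programs, where exact enumeration is replaced by simulation.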

Online Bayesian Goal Inference for Boundedly Rational Planning Agents

no code implementations • NeurIPS 2020 • Tan Zhi-Xuan, Jordyn Mann, Tom Silver, Josh Tenenbaum, Vikash Mansinghka

These models are specified as probabilistic programs, allowing us to represent and perform efficient Bayesian inference over an agent's goals and internal planning processes.

Bayesian Inference
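The online aspect of this inference can be sketched as sequential Bayesian filtering: the posterior over goals is re-normalized after each observed action. The goal names and likelihood numbers below are made up; in the paper the action likelihoods come from models of boundedly rational planners written as probabilistic programs.

```python
# Minimal sketch of online goal inference: the posterior over goals is
# updated after each observed action (likelihoods here are illustrative).
goals = ["door", "gem"]
posterior = {"door": 0.5, "gem": 0.5}

# P(action | goal) for a stream of observed actions (made-up numbers).
action_stream = [
    {"door": 0.6, "gem": 0.4},
    {"door": 0.7, "gem": 0.2},
    {"door": 0.8, "gem": 0.1},
]

for lik in action_stream:
    unnorm = {g: posterior[g] * lik[g] for g in goals}
    z = sum(unnorm.values())
    posterior = {g: p / z for g, p in unnorm.items()}

print(posterior)  # belief has shifted strongly toward "door"
```

Each update reuses the previous posterior as the new prior, which is what makes the inference online rather than batch.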

Deep Involutive Generative Models for Neural MCMC

no code implementations • 26 Jun 2020 • Span Spanbauer, Cameron Freer, Vikash Mansinghka

We introduce deep involutive generative models, a new architecture for deep generative modeling, and use them to define Involutive Neural MCMC, a new approach to fast neural MCMC.

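The involutive construction can be sketched with a hand-written involution rather than a neural network (the map, target, and all constants below are illustrative): a deterministic map f on an extended state (x, v) with f(f(x, v)) = (x, v) and unit Jacobian yields a valid Metropolis-Hastings kernel.

```python
import math, random

# Sketch of an involutive MCMC kernel targeting a standard normal.
# The map f(x, v) = (x + v, -v) is an involution (f∘f = identity)
# with |det Jacobian| = 1, so the auxiliary-variable MH ratio applies.
# Involutive *Neural* MCMC replaces f with a learned involutive network.
def log_p(x):      # unnormalized log target: N(0, 1)
    return -0.5 * x * x

def log_q(v):      # auxiliary distribution: N(0, 1)
    return -0.5 * v * v

def step(x, rng):
    v = rng.gauss(0.0, 1.0)
    x_new, v_new = x + v, -v          # apply the involution
    log_accept = (log_p(x_new) + log_q(v_new)) - (log_p(x) + log_q(v))
    if math.log(rng.random()) < log_accept:
        return x_new
    return x

rng = random.Random(0)
x, samples = 3.0, []
for _ in range(20000):
    x = step(x, rng)
    samples.append(x)

mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))  # ≈ 0 and ≈ 1 for N(0, 1)
```

With this particular involution the kernel reduces to random-walk Metropolis; the value of the framework is that far richer involutions, including learned ones, satisfy the same correctness conditions.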

Bayesian causal inference via probabilistic program synthesis

no code implementations • 30 Oct 2019 • Sam Witty, Alexander Lew, David Jensen, Vikash Mansinghka

This approach makes it straightforward to incorporate data from atomic interventions, as well as shift interventions, variance-scaling interventions, and other interventions that modify causal structure.

Causal Inference · Probabilistic Programming +1

Real-time Approximate Bayesian Computation for Scene Understanding

no code implementations • 22 May 2019 • Javier Felip, Nilesh Ahuja, David Gómez-Gutiérrez, Omesh Tickoo, Vikash Mansinghka

The underlying generative models are built from realistic simulation software, wrapped in a Bayesian error model for the gap between simulation outputs and real data.

Scene Understanding

Probabilistic Data Analysis with Probabilistic Programming

1 code implementation • 18 Aug 2016 • Feras Saad, Vikash Mansinghka

This paper introduces composable generative population models (CGPMs), a computational abstraction that extends directed graphical models and can be used to describe and compose a broad class of probabilistic data analysis techniques.

Clustering · Dimensionality Reduction +1

BayesDB: A probabilistic programming system for querying the probable implications of data

no code implementations • 15 Dec 2015 • Vikash Mansinghka, Richard Tibbetts, Jay Baxter, Pat Shafto, Baxter Eaves

Is it possible to make statistical inference broadly accessible to non-statisticians without sacrificing mathematical rigor or inference quality?

Probabilistic Programming

CrossCat: A Fully Bayesian Nonparametric Method for Analyzing Heterogeneous, High Dimensional Data

1 code implementation • 3 Dec 2015 • Vikash Mansinghka, Patrick Shafto, Eric Jonas, Cap Petschulat, Max Gasner, Joshua B. Tenenbaum

CrossCat infers multiple non-overlapping views of the data, each consisting of a subset of the variables, and uses a separate nonparametric mixture to model each view.

Bayesian Inference · Common Sense Reasoning +1
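The "separate nonparametric mixture per view" relies on a prior over partitions with an unbounded number of clusters. A minimal sketch of that ingredient is the Chinese restaurant process, shown here in isolation (this is a generic CRP sampler, not CrossCat's full inference, and the parameter values are illustrative):

```python
import random

# Sketch of the Chinese-restaurant-process prior that underlies
# nonparametric mixtures like the per-view models in CrossCat.
def crp_partition(n, alpha, rng):
    assignments = []
    counts = []                            # customers per table (cluster)
    for _ in range(n):
        probs = counts + [alpha]           # existing tables, plus a new one
        r = rng.random() * sum(probs)
        for k, p in enumerate(probs):
            r -= p
            if r <= 0:
                break
        if k == len(counts):
            counts.append(1)               # open a new table
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments

rng = random.Random(0)
part = crp_partition(100, alpha=1.0, rng=rng)
print(max(part) + 1, "clusters for 100 customers")
```

CrossCat uses one such partition over the variables (to form views) and, within each view, another over the rows, so the number of clusters is inferred rather than fixed in advance.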

A New Approach to Probabilistic Programming Inference

no code implementations • 3 Jul 2015 • Frank Wood, Jan Willem van de Meent, Vikash Mansinghka

We introduce and demonstrate a new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo.

Probabilistic Programming
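Particle Markov chain Monte Carlo builds on sequential Monte Carlo, whose core loop can be sketched on a toy state-space model (the model, noise scales, and particle count below are illustrative, and only the filtering step is shown, not the surrounding MCMC):

```python
import math, random

# Bootstrap particle filter sketch for a toy Gaussian random-walk model:
#   x_t ~ N(x_{t-1}, 1),  y_t ~ N(x_t, 1).
# Particle MCMC methods run filters like this inside an MCMC loop;
# here we just demonstrate the filtering step on synthetic data.
rng = random.Random(1)

def simulate(T):
    xs, ys, x = [], [], 0.0
    for _ in range(T):
        x = x + rng.gauss(0, 1)
        xs.append(x)
        ys.append(x + rng.gauss(0, 1))
    return xs, ys

def particle_filter(ys, n=500):
    particles = [0.0] * n
    for y in ys:
        # propagate particles through the transition model
        particles = [x + rng.gauss(0, 1) for x in particles]
        # weight by the observation likelihood (unnormalized Gaussian)
        weights = [math.exp(-0.5 * (y - x) ** 2) for x in particles]
        # multinomial resampling
        particles = rng.choices(particles, weights=weights, k=n)
    return sum(particles) / n  # posterior mean of the final state

xs, ys = simulate(20)
estimate = particle_filter(ys)
print(round(estimate, 2), round(xs[-1], 2))  # estimate tracks the true state
```

In the probabilistic-programming setting, the transition and observation models above are replaced by the random choices and observations of an arbitrary program execution.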

Picture: A Probabilistic Programming Language for Scene Perception

no code implementations • CVPR 2015 • Tejas D. Kulkarni, Pushmeet Kohli, Joshua B. Tenenbaum, Vikash Mansinghka

Recent progress on probabilistic modeling and statistical learning, coupled with the availability of large training datasets, has led to remarkable progress in computer vision.

3D Human Pose Estimation · 3D Object Reconstruction +2

Automatic Inference for Inverting Software Simulators via Probabilistic Programming

no code implementations • 31 May 2015 • Ardavan Saeedi, Vlad Firoiu, Vikash Mansinghka

Models of complex systems are often formalized as sequential software simulators: computationally intensive programs that iteratively build up probable system configurations given parameters and initial conditions.

Probabilistic Programming

Particle Gibbs with Ancestor Sampling for Probabilistic Programs

no code implementations • 27 Jan 2015 • Jan-Willem van de Meent, Hongseok Yang, Vikash Mansinghka, Frank Wood

Particle Markov chain Monte Carlo techniques rank among current state-of-the-art methods for probabilistic program inference.

Probabilistic Programming

Sublinear-Time Approximate MCMC Transitions for Probabilistic Programs

no code implementations • 6 Nov 2014 • Yutian Chen, Vikash Mansinghka, Zoubin Ghahramani

Probabilistic programming languages can simplify the development of machine learning techniques, but only if inference is sufficiently scalable.

Probabilistic Programming

Venture: a higher-order probabilistic programming platform with programmable inference

no code implementations • 1 Apr 2014 • Vikash Mansinghka, Daniel Selsam, Yura Perov

Like Church, probabilistic models and inference problems in Venture are specified via a Turing-complete, higher-order probabilistic language descended from Lisp.

Probabilistic Programming · Variational Inference

Variational Particle Approximations

no code implementations • 24 Feb 2014 • Ardavan Saeedi, Tejas D. Kulkarni, Vikash Mansinghka, Samuel Gershman

Like Monte Carlo, DPVI can handle multiple modes, and yields exact results in a well-defined limit.

Spike Sorting · Variational Inference

Building fast Bayesian computing machines out of intentionally stochastic, digital parts

no code implementations • 20 Feb 2014 • Vikash Mansinghka, Eric Jonas

Here we show how to build fast Bayesian computing machines using intentionally stochastic, digital parts, narrowing this efficiency gap by multiple orders of magnitude.

Bayesian Inference

Church: a language for generative models

no code implementations • 13 Jun 2012 • Noah Goodman, Vikash Mansinghka, Daniel M. Roy, Keith Bonawitz, Joshua B. Tenenbaum

We introduce Church, a universal language for describing stochastic generative processes.

Clustering
