Search Results for author: Josh Tenenbaum

Found 30 papers, 5 papers with code

Modeling human intention inference in continuous 3D domains by inverse planning and body kinematics

no code implementations 2 Dec 2021 Yingdong Qian, Marta Kryven, Tao Gao, Hanbyul Joo, Josh Tenenbaum

We describe the Generative Body Kinematics model, which predicts human intention inference in this domain using Bayesian inverse planning and inverse body kinematics.
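
A minimal sketch of the inverse-planning idea behind this line of work: infer a posterior over goals from an observed trajectory via Bayes' rule. The 1-D world and Boltzmann-rational step likelihood below are illustrative assumptions, not the paper's continuous 3D model.

```python
import math

def inverse_planning_posterior(trajectory, goals, prior, likelihood):
    """Posterior over goals given an observed trajectory:
    P(goal | traj) is proportional to P(traj | goal) * P(goal)."""
    unnorm = {g: prior[g] * likelihood(trajectory, g) for g in goals}
    z = sum(unnorm.values())
    return {g: w / z for g, w in unnorm.items()}

def likelihood(traj, goal):
    # Boltzmann-rational agent: steps that reduce the distance to the goal
    # are exponentially more likely (beta is an inverse temperature).
    beta = 2.0
    logp = 0.0
    for a, b in zip(traj, traj[1:]):
        progress = abs(a - goal) - abs(b - goal)  # > 0 if the step got closer
        logp += beta * progress
    return math.exp(logp)

# Toy domain: an agent on a line moving toward one of two goal positions.
goals = [0.0, 10.0]
prior = {0.0: 0.5, 10.0: 0.5}
post = inverse_planning_posterior([5.0, 6.0, 7.0, 8.0], goals, prior, likelihood)
```

Observing three consecutive steps toward 10.0 makes that goal's posterior dominate, which is the core of inverse planning: actions are evidence about intentions.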

A Bayesian-Symbolic Approach to Reasoning and Learning in Intuitive Physics

no code implementations NeurIPS 2021 Kai Xu, Akash Srivastava, Dan Gutfreund, Felix Sosa, Tomer Ullman, Josh Tenenbaum, Charles Sutton

In this paper, we propose a Bayesian-symbolic framework (BSP) for physical reasoning and learning that is close to human-level sample-efficiency and accuracy.

Bayesian Inference Bilevel Optimization +1

Online Bayesian Goal Inference for Boundedly Rational Planning Agents

no code implementations NeurIPS 2020 Tan Zhi-Xuan, Jordyn Mann, Tom Silver, Josh Tenenbaum, Vikash Mansinghka

These models are specified as probabilistic programs, allowing us to represent and perform efficient Bayesian inference over an agent's goals and internal planning processes.

Bayesian Inference
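
The online flavor of this inference can be sketched as a sequential Bayesian filter over goals: each new observation reweights the posterior incrementally. The binary left/right goal space and the 0.8/0.2 step likelihood are toy assumptions for illustration only.

```python
def online_update(posterior, obs, step_likelihood):
    """One incremental Bayesian update: multiply in the likelihood of the
    newest observation under each goal, then renormalize."""
    unnorm = {g: p * step_likelihood(obs, g) for g, p in posterior.items()}
    z = sum(unnorm.values())
    return {g: w / z for g, w in unnorm.items()}

def step_likelihood(step, goal_direction):
    # A step in the goal's direction is more likely than one away from it.
    return 0.8 if (step > 0) == (goal_direction > 0) else 0.2

posterior = {+1: 0.5, -1: 0.5}  # goal to the right vs. to the left
for step in [+1, +1, -1, +1]:   # mostly rightward movement, one noisy step
    posterior = online_update(posterior, step, step_likelihood)
```

After four observations the rightward goal dominates despite one contradictory step, showing how online inference stays robust to boundedly rational (noisy) behavior.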

Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modelling

no code implementations 25 Oct 2020 Akash Srivastava, Yamini Bansal, Yukun Ding, Cole Hurwitz, Kai Xu, Bernhard Egger, Prasanna Sattigeri, Josh Tenenbaum, David D. Cox, Dan Gutfreund

Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.

Disentanglement
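
The posterior penalty the abstract refers to can be sketched as a beta-VAE-style objective: reconstruction error plus a scaled KL term pushing the diagonal-Gaussian posterior toward a standard normal. This is a generic illustration of that family of penalties, not the paper's exact multi-stage objective.

```python
import math

def gaussian_kl_to_standard_normal(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over dimensions.
    Penalizing this term encourages statistically independent latents."""
    return 0.5 * sum(
        math.exp(lv) + m * m - 1.0 - lv for m, lv in zip(mu, logvar)
    )

def vae_style_loss(recon_error, mu, logvar, beta=4.0):
    # beta > 1 trades reconstruction quality for disentanglement pressure,
    # which is exactly the tension this paper's multi-stage approach targets.
    return recon_error + beta * gaussian_kl_to_standard_normal(mu, logvar)

# A standard-normal posterior incurs zero penalty.
kl_zero = gaussian_kl_to_standard_normal([0.0, 0.0], [0.0, 0.0])
```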

Program Synthesis with Pragmatic Communication

no code implementations NeurIPS 2020 Yewen Pu, Kevin Ellis, Marta Kryven, Josh Tenenbaum, Armando Solar-Lezama

Given a specification, we score a candidate program both on its consistency with the specification, and also on whether a rational speaker would choose this particular specification to communicate that program.

Program Synthesis
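
The "rational speaker" scoring can be sketched with a rational-speech-acts-style recursion over a tiny hypothetical domain of arithmetic programs and input/output specs (the programs, specs, and uniform literal listener below are all illustrative assumptions):

```python
def pragmatic_listener(spec, programs, specs, meaning):
    """Rank candidate programs by combining consistency with the spec and the
    probability that a rational speaker would pick this spec to convey them.
    meaning(program, spec) -> True iff the program satisfies the spec."""
    def literal_listener(s):
        consistent = [p for p in programs if meaning(p, s)]
        return {p: 1.0 / len(consistent) for p in consistent}

    def speaker(p):
        # A rational speaker prefers specs that most strongly pick out the
        # intended program for a literal listener.
        weights = {s: literal_listener(s).get(p, 0.0) for s in specs}
        z = sum(weights.values())
        return {s: w / z for s, w in weights.items()} if z else {}

    scores = {p: speaker(p).get(spec, 0.0) for p in programs if meaning(p, spec)}
    z = sum(scores.values())
    return {p: v / z for p, v in scores.items()}

# Toy domain: three programs, all consistent with the single example (2, 4).
programs = {"double": lambda x: 2 * x, "square": lambda x: x * x,
            "add2": lambda x: x + 2}
specs = [(2, 4), (3, 6), (3, 9)]

def meaning(name, s):
    return programs[name](s[0]) == s[1]

ranked = pragmatic_listener((2, 4), list(programs), specs, meaning)
```

The pragmatic listener prefers "add2": a speaker who meant "double" or "square" had a more discriminating spec available, so choosing (2, 4) is stronger evidence for the program with no better alternative.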

CZ-GEM: A Framework for Disentangled Representation Learning

no code implementations ICLR 2020 Akash Srivastava, Yamini Bansal, Yukun Ding, Bernhard Egger, Prasanna Sattigeri, Josh Tenenbaum, David D. Cox, Dan Gutfreund

In this work, we tackle a slightly more intricate scenario where the observations are generated from a conditional distribution of some known control variate and some latent noise variate.

Disentanglement

ObjectNet: A large-scale bias-controlled dataset for pushing the limits of object recognition models

no code implementations NeurIPS 2019 Andrei Barbu, David Mayo, Julian Alverio, William Luo, Christopher Wang, Dan Gutfreund, Josh Tenenbaum, Boris Katz

Although we focus on object recognition here, data with controls can be gathered at scale using automated tools throughout machine learning to generate datasets that exercise models in new ways, thus providing valuable feedback to researchers.

Ranked #4 on Image Classification on ObjectNet (using extra training data)

Image Classification Object Recognition

Modeling Expectation Violation in Intuitive Physics with Coarse Probabilistic Object Representations

1 code implementation NeurIPS 2019 Kevin Smith, Lingjie Mei, Shunyu Yao, Jiajun Wu, Elizabeth Spelke, Josh Tenenbaum, Tomer Ullman

We also present a new test set for measuring violations of physical expectations, using a range of scenarios derived from developmental psychology.

Scene Understanding

Write, Execute, Assess: Program Synthesis with a REPL

no code implementations NeurIPS 2019 Kevin Ellis, Maxwell Nye, Yewen Pu, Felix Sosa, Josh Tenenbaum, Armando Solar-Lezama

We present a neural program synthesis approach integrating components which write, execute, and assess code to navigate the search space of possible programs.

Program Synthesis

Few-Shot Bayesian Imitation Learning with Logical Program Policies

no code implementations 12 Apr 2019 Tom Silver, Kelsey R. Allen, Alex K. Lew, Leslie Pack Kaelbling, Josh Tenenbaum

We propose an expressive class of policies, a strong but general prior, and a learning algorithm that, together, can learn interesting policies from very few examples.

Bayesian Inference Imitation Learning +1

Residual Policy Learning

1 code implementation 15 Dec 2018 Tom Silver, Kelsey Allen, Josh Tenenbaum, Leslie Kaelbling

In these tasks, reinforcement learning from scratch remains data-inefficient or intractable, but learning a residual on top of the initial controller can yield substantial improvements.

reinforcement-learning
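
The residual idea can be sketched in a few lines: act with a hand-designed controller plus a learned correction, so learning starts from competent behavior instead of from scratch. The proportional controller and the stand-in "learned" residual below are toy assumptions.

```python
def make_residual_policy(controller, residual):
    """Residual policy: the final action is the initial controller's action
    plus a learned correction evaluated on the same state."""
    return lambda state: controller(state) + residual(state)

# Toy example: a proportional controller that undershoots the needed action,
# plus a residual that corrects the gain (in practice, a trained network).
controller = lambda err: 0.5 * err   # initial, imperfect controller
residual = lambda err: 0.5 * err     # learned correction (stand-in)
policy = make_residual_policy(controller, residual)
```

Because the residual only has to model the controller's error, the learning problem it faces is much easier than learning the full policy, which is where the data-efficiency gain comes from.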

End-to-End Differentiable Physics for Learning and Control

1 code implementation NeurIPS 2018 Filipe de Avila Belbute-Peres, Kevin Smith, Kelsey Allen, Josh Tenenbaum, J. Zico Kolter

We present a differentiable physics engine that can be integrated as a module in deep neural networks for end-to-end learning.
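
The "physics as a differentiable module" idea can be sketched without a neural network: Euler-integrate a simple dynamical system while propagating the derivative of the final state with respect to a parameter through every step, then use that gradient for control. The point mass with linear drag is an illustrative assumption, far simpler than the paper's contact-handling engine.

```python
def simulate(v0, steps=10, dt=0.1, drag=0.1):
    """Euler-integrate a point mass with linear drag; returns the final
    position and d(position)/d(v0), accumulated by the chain rule through
    each integration step (a hand-rolled stand-in for autodiff)."""
    x, v = 0.0, v0
    dx_dv0, dv_dv0 = 0.0, 1.0
    for _ in range(steps):
        x += v * dt
        dx_dv0 += dv_dv0 * dt
        v -= drag * v * dt
        dv_dv0 -= drag * dv_dv0 * dt
    return x, dx_dv0

def solve_for_target(target, lr=0.5, iters=200):
    """Gradient-descend the initial velocity so the simulated endpoint hits
    the target -- end-to-end learning through the physics module."""
    v0 = 0.0
    for _ in range(iters):
        x, dx = simulate(v0)
        v0 -= lr * 2.0 * (x - target) * dx  # gradient of (x - target)^2
    return v0

v_star = solve_for_target(1.0)
```

The same pattern, with the simulator's gradients flowing into upstream network parameters, is what makes the engine usable as a layer in end-to-end training.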

Visual Object Networks: Image Generation with Disentangled 3D Representations

1 code implementation NeurIPS 2018 Jun-Yan Zhu, Zhoutong Zhang, Chengkai Zhang, Jiajun Wu, Antonio Torralba, Josh Tenenbaum, Bill Freeman

The VON not only generates images that are more realistic than the state-of-the-art 2D image synthesis methods but also enables many 3D operations such as changing the viewpoint of a generated image, shape and texture editing, linear interpolation in texture and shape space, and transferring appearance across different objects and viewpoints.

Image Generation

Learning to Exploit Stability for 3D Scene Parsing

no code implementations NeurIPS 2018 Yilun Du, Zhijian Liu, Hector Basevi, Ales Leonardis, Bill Freeman, Josh Tenenbaum, Jiajun Wu

We first show that applying physics supervision to an existing scene understanding model increases performance, produces more stable predictions, and allows training to an equivalent performance level with fewer annotated training examples.

Scene Understanding Translation

Learning Libraries of Subroutines for Neurally-Guided Bayesian Program Induction

no code implementations NeurIPS 2018 Kevin Ellis, Lucas Morales, Mathias Sablé-Meyer, Armando Solar-Lezama, Josh Tenenbaum

Successful approaches to program induction require a hand-engineered domain-specific language (DSL), constraining the space of allowed programs and imparting prior knowledge of the domain.

Program induction

At Human Speed: Deep Reinforcement Learning with Action Delay

no code implementations 16 Oct 2018 Vlad Firoiu, Tina Ju, Josh Tenenbaum

There has been a recent explosion in the capabilities of game-playing artificial intelligence.

Board Games reinforcement-learning

Logical Rule Induction and Theory Learning Using Neural Theorem Proving

no code implementations 6 Sep 2018 Andres Campero, Aldo Pareja, Tim Klinger, Josh Tenenbaum, Sebastian Riedel

Our approach is neuro-symbolic in the sense that the rule predicates and core facts are given dense vector representations.

Automated Theorem Proving

A Computational Model of Commonsense Moral Decision Making

no code implementations 12 Jan 2018 Richard Kim, Max Kleiman-Weiner, Andres Abeliuk, Edmond Awad, Sohan Dsouza, Josh Tenenbaum, Iyad Rahwan

We introduce a new computational model of moral decision making, drawing on a recent theory of commonsense moral learning via social dynamics.

Autonomous Vehicles Decision Making

Learning to See Physics via Visual De-animation

no code implementations NeurIPS 2017 Jiajun Wu, Erika Lu, Pushmeet Kohli, Bill Freeman, Josh Tenenbaum

At the core of our system is a physical world representation that is first recovered by a perception module and then utilized by physics and graphics engines.

Future prediction

Shape and Material from Sound

no code implementations NeurIPS 2017 Zhoutong Zhang, Qiujia Li, Zhengjia Huang, Jiajun Wu, Josh Tenenbaum, Bill Freeman

Hearing an object falling onto the ground, humans can recover rich information including its rough shape, material, and falling height.

Physical problem solving: Joint planning with symbolic, geometric, and dynamic constraints

no code implementations 25 Jul 2017 Ilker Yildirim, Tobias Gerstenberg, Basil Saeed, Marc Toussaint, Josh Tenenbaum

In Experiment 2, we asked participants online to judge whether they think the person in the lab used one or two hands.

Sampling for Bayesian Program Learning

no code implementations NeurIPS 2016 Kevin Ellis, Armando Solar-Lezama, Josh Tenenbaum

Towards learning programs from data, we introduce the problem of sampling programs from posterior distributions conditioned on that data.

Program Synthesis
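
Posterior sampling of programs can be sketched by enumeration over a small hypothesis space: weight each candidate by prior times likelihood on the observed input/output pairs, then sample in proportion. The three arithmetic candidates and the fixed-noise likelihood model are toy assumptions; the paper's contribution is doing this at scale with solver-based techniques rather than enumeration.

```python
import math
import random

def posterior_sample_programs(data, programs, prior, n=1000, noise=0.1, seed=0):
    """Sample program indices from P(program | data), which is proportional
    to P(program) * P(data | program), under a simple noise model where each
    example output is wrong with probability `noise`."""
    def log_likelihood(prog):
        return sum(
            math.log(1.0 - noise) if prog(x) == y else math.log(noise)
            for x, y in data
        )
    weights = [prior[i] * math.exp(log_likelihood(p))
               for i, p in enumerate(programs)]
    rng = random.Random(seed)
    return rng.choices(range(len(programs)), weights=weights, k=n)

# Toy hypothesis space and data consistent with "double".
programs = [lambda x: x + 1, lambda x: 2 * x, lambda x: x * x]
prior = [1.0, 1.0, 1.0]  # uniform (unnormalized)
data = [(1, 2), (2, 4), (3, 6)]
samples = posterior_sample_programs(data, programs, prior)
```

Almost all samples land on index 1 ("double"), since it explains every example while the alternatives each explain only one.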

Softstar: Heuristic-Guided Probabilistic Inference

no code implementations NeurIPS 2015 Mathew Monfort, Brenden M. Lake, Brian Ziebart, Patrick Lucey, Josh Tenenbaum

Recent machine learning methods for sequential behavior prediction estimate the motives of behavior rather than the behavior itself.

Galileo: Perceiving Physical Object Properties by Integrating a Physics Engine with Deep Learning

no code implementations NeurIPS 2015 Jiajun Wu, Ilker Yildirim, Joseph J. Lim, Bill Freeman, Josh Tenenbaum

Humans demonstrate remarkable abilities to predict physical events in dynamic scenes, and to infer the physical properties of objects from static images.

Scene Understanding

Unsupervised Learning by Program Synthesis

no code implementations NeurIPS 2015 Kevin Ellis, Armando Solar-Lezama, Josh Tenenbaum

We introduce an unsupervised learning algorithm that combines probabilistic modeling with solver-based techniques for program synthesis. We apply our techniques to both a visual learning domain and a language learning problem, showing that our algorithm can learn many visual concepts from only a few examples and that it can recover some English inflectional morphology. Taken together, these results give both a new approach to unsupervised learning of symbolic compositional structures, and a technique for applying program synthesis tools to noisy data.

Program Synthesis

One-shot learning by inverting a compositional causal process

no code implementations NeurIPS 2013 Brenden M. Lake, Ruslan R. Salakhutdinov, Josh Tenenbaum

People can learn a new visual class from just one example, yet machine learning algorithms typically require hundreds or thousands of examples to tackle the same problems.

General Classification One-Shot Learning
