no code implementations • 17 Oct 2023 • Leonardo Hernandez Cano, Yewen Pu, Robert D. Hawkins, Josh Tenenbaum, Armando Solar-Lezama
Compared to learning from demonstrations or experiences, programmatic learning allows the machine to acquire a novel skill as soon as the program is written, and, by building a library of programs, a machine can quickly learn how to perform complex tasks.
no code implementations • 26 May 2023 • Kartik Chandra, Tzu-Mao Li, Josh Tenenbaum, Jonathan Ragan-Kelley
Great storytellers know how to take us on a journey.
1 code implementation • 4 Oct 2022 • Zhijing Jin, Sydney Levine, Fernando Gonzalez, Ojasv Kamal, Maarten Sap, Mrinmaya Sachan, Rada Mihalcea, Josh Tenenbaum, Bernhard Schölkopf
Using a state-of-the-art large language model (LLM) as a basis, we propose a novel moral chain of thought (MORALCOT) prompting strategy that combines the strengths of LLMs with theories of moral reasoning developed in cognitive science to predict human moral judgments.
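For intuition, a minimal sketch of a chain-of-thought prompting scheme in this spirit follows; the sub-question wording and the `query_llm` callable are hypothetical stand-ins, not the paper's actual MoralCoT prompt.

```python
# A minimal MoralCoT-style sketch. The sub-questions are illustrative;
# the paper's prompt wording differs. `query_llm` is any text-completion
# function you supply.

def build_moralcot_prompt(scenario: str) -> str:
    """Ask for intermediate moral-reasoning steps before a final judgment."""
    return (
        f"Scenario: {scenario}\n"
        "Q1: What rule or norm applies here?\n"
        "Q2: What is the purpose of that rule?\n"
        "Q3: Would breaking the rule in this case serve or defeat that purpose?\n"
        "Final answer: Is the action morally permissible? Answer Yes or No."
    )

def judge(scenario: str, query_llm) -> str:
    # The model reasons step by step, then commits to a judgment.
    return query_llm(build_moralcot_prompt(scenario))
```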
no code implementations • 2 Dec 2021 • Yingdong Qian, Marta Kryven, Tao Gao, Hanbyul Joo, Josh Tenenbaum
We describe the Generative Body Kinematics model, which predicts human intention inference in this domain using Bayesian inverse planning and inverse body kinematics.
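Bayesian inverse planning inverts a forward model of goal-directed movement. A minimal sketch, with a toy distance-based likelihood standing in for the paper's body-kinematics forward model:

```python
# Infer a hidden goal from an observed movement by inverting a forward
# model of how each goal would generate movement (toy 1-D version).
import math

def inverse_planning(observation, goals, likelihood, prior):
    """Posterior over goals: P(g | obs) is proportional to P(obs | g) P(g)."""
    scores = {g: likelihood(observation, g) * prior[g] for g in goals}
    total = sum(scores.values())
    return {g: s / total for g, s in scores.items()}

# Toy likelihood: a reach is more probable the closer it ends to the goal
# (a stand-in for the paper's inverse-body-kinematics likelihood).
def reach_likelihood(obs, goal):
    return math.exp(-abs(obs - goal))

posterior = inverse_planning(0.8, goals=[0.0, 1.0],
                             likelihood=reach_likelihood,
                             prior={0.0: 0.5, 1.0: 0.5})
```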
no code implementations • NeurIPS 2021 • Kai Xu, Akash Srivastava, Dan Gutfreund, Felix Sosa, Tomer Ullman, Josh Tenenbaum, Charles Sutton
In this paper, we propose a Bayesian-symbolic framework (BSP) for physical reasoning and learning that is close to human-level sample-efficiency and accuracy.
no code implementations • 16 Apr 2021 • Matthias Hofer, Tuan Anh Le, Roger Levy, Josh Tenenbaum
Humans have the ability to rapidly understand rich combinatorial concepts from limited data.
no code implementations • NeurIPS 2020 • Tan Zhi-Xuan, Jordyn Mann, Tom Silver, Josh Tenenbaum, Vikash Mansinghka
These models are specified as probabilistic programs, allowing us to represent and perform efficient Bayesian inference over an agent's goals and internal planning processes.
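A minimal sketch of the sequential Bayesian update at the heart of such goal inference; `plan_prob` is an assumed planner likelihood, not the paper's probabilistic-program machinery:

```python
# One step of online goal inference: reweight goal hypotheses by how well
# a planner's predicted action matches the action actually observed.

def update_goal_posterior(posterior, observed_action, state, plan_prob):
    """Bayes rule over goals: P(g | a, s) is proportional to P(a | s, g) P(g).
    `plan_prob(action, state, goal)` is an assumed planner likelihood."""
    weighted = {g: p * plan_prob(observed_action, state, g)
                for g, p in posterior.items()}
    total = sum(weighted.values()) or 1.0
    return {g: w / total for g, w in weighted.items()}
```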
no code implementations • 25 Oct 2020 • Akash Srivastava, Yamini Bansal, Yukun Ding, Cole Hurwitz, Kai Xu, Bernhard Egger, Prasanna Sattigeri, Josh Tenenbaum, David D. Cox, Dan Gutfreund
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
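In the simplest variants, the penalty is an up-weighted KL term pushing the posterior toward a factorized prior. A hedged PyTorch sketch (beta-VAE style; the aggregate-posterior methods the sentence refers to differ in detail):

```python
# VAE objective with a weighted KL penalty that encourages statistically
# independent latent factors (beta > 1 strengthens the pressure).
import torch

def disentangling_loss(recon_loss, mu, logvar, beta=4.0):
    # Gaussian KL(q(z|x) || N(0, I)), summed over latent dimensions.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1)
    # Up-weighting the KL penalizes the posterior toward the factorized
    # prior, which is what drives the latent factors apart.
    return recon_loss + beta * kl.mean()
```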
no code implementations • NeurIPS 2020 • Yewen Pu, Kevin Ellis, Marta Kryven, Josh Tenenbaum, Armando Solar-Lezama
Given a specification, we score a candidate program both on its consistency with the specification and on whether a rational speaker would choose this particular specification to communicate that program.
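A minimal sketch of this pragmatic scoring (RSA-style, with a uniform speaker; the paper's model is more elaborate):

```python
# Rank programs not just by consistency with the examples, but by whether
# a rational speaker would have chosen those examples to convey them.

def consistent_with(program, spec, check):
    # "Literal" score: does the program satisfy every example in the spec?
    return all(check(program, ex) for ex in spec)

def pragmatic_score(program, spec, all_specs, check):
    # P_speaker(spec | program): among the specs this program satisfies,
    # how likely is a rational (here: uniform) speaker to pick this one?
    if not consistent_with(program, spec, check):
        return 0.0
    n_satisfied = sum(consistent_with(program, s, check) for s in all_specs)
    return 1.0 / n_satisfied
```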
no code implementations • ICLR 2020 • Akash Srivastava, Yamini Bansal, Yukun Ding, Bernhard Egger, Prasanna Sattigeri, Josh Tenenbaum, David D. Cox, Dan Gutfreund
In this work, we tackle a slightly more intricate scenario where the observations are generated from a conditional distribution of some known control variate and some latent noise variate.
no code implementations • NeurIPS 2019 • Andrei Barbu, David Mayo, Julian Alverio, William Luo, Christopher Wang, Dan Gutfreund, Josh Tenenbaum, Boris Katz
Although we focus on object recognition here, data with controls can be gathered at scale throughout machine learning using automated tools, generating datasets that exercise models in new ways and thus providing valuable feedback to researchers.
Ranked #50 on Image Classification on ObjectNet (using extra training data)
1 code implementation • NeurIPS 2019 • Kevin Smith, Lingjie Mei, Shunyu Yao, Jiajun Wu, Elizabeth Spelke, Josh Tenenbaum, Tomer Ullman
We also present a new test set for measuring violations of physical expectations, using a range of scenarios derived from developmental psychology.
no code implementations • NeurIPS 2019 • Kevin Ellis, Maxwell Nye, Yewen Pu, Felix Sosa, Josh Tenenbaum, Armando Solar-Lezama
We present a neural program synthesis approach integrating components which write, execute, and assess code to navigate the search space of possible programs.
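A minimal sketch of the write–execute–assess loop; in the paper these components are learned, whereas here they are placeholder callables you would supply:

```python
# Search program space by repeatedly writing a candidate, running it,
# and scoring the result against the specification.

def synthesize(spec, propose, execute, assess, budget=1000):
    best, best_score = None, float("-inf")
    for _ in range(budget):
        candidate = propose(spec, best)     # "write" a candidate program
        outputs = execute(candidate, spec)  # run it on the spec's inputs
        score = assess(outputs, spec)       # judge progress toward the spec
        if score > best_score:
            best, best_score = candidate, score
        if best_score == 1.0:               # all examples satisfied
            break
    return best
```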
no code implementations • 12 Apr 2019 • Tom Silver, Kelsey R. Allen, Alex K. Lew, Leslie Pack Kaelbling, Josh Tenenbaum
We propose an expressive class of policies, a strong but general prior, and a learning algorithm that, together, can learn interesting policies from very few examples.
1 code implementation • 15 Dec 2018 • Tom Silver, Kelsey Allen, Josh Tenenbaum, Leslie Kaelbling
In these tasks, reinforcement learning from scratch remains data-inefficient or intractable, but learning a residual on top of the initial controller can yield substantial improvements.
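The core idea admits a very small sketch: keep the hand-designed controller fixed and learn only an additive correction on top of it.

```python
# Residual policy learning: the final action is the base controller's
# action plus a learned residual; only the residual is trained.

class ResidualPolicy:
    def __init__(self, base_controller, residual_net):
        self.base = base_controller   # fixed initial controller pi_0(s)
        self.residual = residual_net  # learned correction f_theta(s)

    def act(self, state):
        # Gradients flow only into `residual` during training, so the
        # policy can never be much worse than the base controller early on.
        return self.base(state) + self.residual(state)
```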
1 code implementation • NeurIPS 2018 • Filipe de Avila Belbute-Peres, Kevin Smith, Kelsey Allen, Josh Tenenbaum, J. Zico Kolter
We present a differentiable physics engine that can be integrated as a module in deep neural networks for end-to-end learning.
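A toy sketch of why a differentiable engine composes with deep networks: if every simulation step is built from differentiable operations, gradients flow from a downstream loss back into physical parameters. (This point-mass example is illustrative; the paper's engine solves contact dynamics as a differentiable LCP.)

```python
# Unrolled semi-implicit Euler steps for a point mass; autograd then
# gives the gradient of a downstream loss with respect to the mass.
import torch

mass = torch.tensor(2.0, requires_grad=True)
pos, vel = torch.tensor(0.0), torch.tensor(0.0)
force, dt = torch.tensor(1.0), 0.1

for _ in range(10):
    vel = vel + (force / mass) * dt
    pos = pos + vel * dt

loss = (pos - torch.tensor(0.4)) ** 2  # match an observed final position
loss.backward()                        # d(loss)/d(mass) via autograd
```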
no code implementations • NeurIPS 2018 • Kevin Ellis, Lucas Morales, Mathias Sablé-Meyer, Armando Solar-Lezama, Josh Tenenbaum
Successful approaches to program induction require a hand-engineered domain-specific language (DSL), constraining the space of allowed programs and imparting prior knowledge of the domain.
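To make "hand-engineered DSL" concrete, here is a minimal sketch of one: a few typed primitives that bound the space of candidate programs. The primitives are hypothetical; the paper's contribution is learning such libraries rather than writing them by hand.

```python
# A tiny DSL: programs are compositions of named primitives, enumerated
# shortest-first. The DSL both constrains and biases the search space.
import itertools

PRIMITIVES = {
    "inc": lambda x: x + 1,
    "dbl": lambda x: x * 2,
    "neg": lambda x: -x,
}

def enumerate_programs(max_len=3):
    for n in range(1, max_len + 1):
        for names in itertools.product(PRIMITIVES, repeat=n):
            def prog(x, names=names):
                for name in names:
                    x = PRIMITIVES[name](x)
                return x
            yield names, prog
```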
1 code implementation • NeurIPS 2018 • Jun-Yan Zhu, Zhoutong Zhang, Chengkai Zhang, Jiajun Wu, Antonio Torralba, Josh Tenenbaum, Bill Freeman
The VON not only generates images that are more realistic than those produced by state-of-the-art 2D image synthesis methods, but also enables many 3D operations, such as changing the viewpoint of a generated image, editing shape and texture, interpolating linearly in texture and shape space, and transferring appearance across different objects and viewpoints.
no code implementations • NeurIPS 2018 • Yilun Du, Zhijian Liu, Hector Basevi, Ales Leonardis, Bill Freeman, Josh Tenenbaum, Jiajun Wu
We first show that applying physics supervision to an existing scene understanding model increases performance, produces more stable predictions, and allows training to an equivalent performance level with fewer annotated training examples.
no code implementations • 16 Oct 2018 • Vlad Firoiu, Tina Ju, Josh Tenenbaum
There has been a recent explosion in the capabilities of game-playing artificial intelligence.
no code implementations • 6 Sep 2018 • Andres Campero, Aldo Pareja, Tim Klinger, Josh Tenenbaum, Sebastian Riedel
Our approach is neuro-symbolic in the sense that the rule predicates and core facts are given dense vector representations.
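A minimal sketch of what that representation buys: facts are scored by a smooth function of the embeddings, so similar predicates get similar scores and learned rules can generalize. (The simple multiplicative scorer below is illustrative; the paper's model differs.)

```python
# Predicates and entities as dense vectors; a fact's plausibility is a
# differentiable score over those vectors (soft unification).
import numpy as np

rng = np.random.default_rng(0)
embed = {name: rng.normal(size=8) for name in
         ["parent", "grandparent", "alice", "bob", "carol"]}

def fact_score(pred, subj, obj):
    # Nearby predicate vectors yield nearby scores, letting gradient
    # descent discover which rules explain the core facts.
    return float(embed[pred] @ (embed[subj] * embed[obj]))
```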
1 code implementation • NeurIPS 2018 • DJ Strouse, Max Kleiman-Weiner, Josh Tenenbaum, Matt Botvinick, David Schwab
We show how to optimize these regularizers in a way that is easy to integrate with policy gradient reinforcement learning.
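The integration is lightweight: the regularizer enters as one extra differentiable term in the policy-gradient surrogate loss. A hedged sketch (the `info_reg` estimator itself is assumed given):

```python
# Policy-gradient loss plus a weighted information regularizer.
import torch

def pg_loss_with_regularizer(log_probs, advantages, info_reg, beta=0.1):
    # Standard policy-gradient surrogate ...
    pg = -(log_probs * advantages).mean()
    # ... minus a weighted information term: beta > 0 rewards revealing
    # goal information to other agents, beta < 0 rewards concealing it.
    return pg - beta * info_reg
```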
no code implementations • 12 Jan 2018 • Richard Kim, Max Kleiman-Weiner, Andres Abeliuk, Edmond Awad, Sohan Dsouza, Josh Tenenbaum, Iyad Rahwan
We introduce a new computational model of moral decision making, drawing on a recent theory of commonsense moral learning via social dynamics.
no code implementations • NeurIPS 2017 • Zhoutong Zhang, Qiujia Li, Zhengjia Huang, Jiajun Wu, Josh Tenenbaum, Bill Freeman
Hearing an object falling onto the ground, humans can recover rich information including its rough shape, material, and falling height.
no code implementations • NeurIPS 2017 • Jiajun Wu, Erika Lu, Pushmeet Kohli, Bill Freeman, Josh Tenenbaum
At the core of our system is a physical world representation that is first recovered by a perception module and then utilized by physics and graphics engines.
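The shape of that pipeline fits in a few lines; all three components below are placeholders for the paper's learned and engineered modules:

```python
# Perception inverts an image into an object-level physical state; a
# physics engine rolls the state forward; a graphics engine re-renders it.

def predict_future_frame(image, perceive, simulate, render, dt=0.1):
    state = perceive(image)       # de-render: image -> objects, poses, velocities
    future = simulate(state, dt)  # physics engine advances the state
    return render(future)         # graphics engine renders the prediction
```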
no code implementations • 25 Jul 2017 • Ilker Yildirim, Tobias Gerstenberg, Basil Saeed, Marc Toussaint, Josh Tenenbaum
In Experiment 2, we asked participants online to judge whether they think the person in the lab used one or two hands.
no code implementations • NeurIPS 2016 • Eric Schulz, Josh Tenenbaum, David K. Duvenaud, Maarten Speekenbrink, Samuel J. Gershman
How do people learn about complex functional structure?
no code implementations • NeurIPS 2016 • Kevin Ellis, Armando Solar-Lezama, Josh Tenenbaum
Towards learning programs from data, we introduce the problem of sampling programs from posterior distributions conditioned on that data.
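One standard way to draw such samples is a Metropolis-Hastings walk over program space; a minimal sketch, with placeholder `mutate` (assumed symmetric) and `log_posterior` functions:

```python
# MCMC over programs: propose a local edit, accept or reject based on
# how well the edited program explains the data.
import math
import random

def sample_programs(init, data, mutate, log_posterior, steps=1000):
    current = init
    cur_lp = log_posterior(current, data)
    samples = []
    for _ in range(steps):
        proposal = mutate(current)  # random local edit (symmetric proposal)
        prop_lp = log_posterior(proposal, data)
        if math.log(random.random()) < prop_lp - cur_lp:
            current, cur_lp = proposal, prop_lp  # accept
        samples.append(current)
    return samples
```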
no code implementations • NeurIPS 2015 • Mathew Monfort, Brenden M. Lake, Brian Ziebart, Patrick Lucey, Josh Tenenbaum
Recent machine learning methods for sequential behavior prediction estimate the motives of behavior rather than the behavior itself.
no code implementations • NeurIPS 2015 • Kevin Ellis, Armando Solar-Lezama, Josh Tenenbaum
We introduce an unsupervised learning algorithm that combines probabilistic modeling with solver-based techniques for program synthesis. We apply our techniques to both a visual learning domain and a language learning problem, showing that our algorithm can learn many visual concepts from only a few examples and that it can recover some English inflectional morphology. Taken together, these results give both a new approach to unsupervised learning of symbolic compositional structures, and a technique for applying program synthesis tools to noisy data.
no code implementations • NeurIPS 2015 • Jiajun Wu, Ilker Yildirim, Joseph J. Lim, Bill Freeman, Josh Tenenbaum
Humans demonstrate remarkable abilities to predict physical events in dynamic scenes, and to infer the physical properties of objects from static images.
no code implementations • NeurIPS 2013 • Brenden M. Lake, Ruslan R. Salakhutdinov, Josh Tenenbaum
People can learn a new visual class from just one example, yet machine learning algorithms typically require hundreds or thousands of examples to tackle the same problems.