Mental representations of objects reflect the ways in which we interact with them

22 Jun 2020  ·  Ka Chun Lam, Francisco Pereira, Maryam Vaziri-Pashkam, Kristin Woodard, Emalie McMahon

In order to interact with objects in our environment, humans rely on an understanding of the actions that can be performed on them, as well as of their properties. When considering concrete motor actions, this knowledge has been called the object affordance. Can this notion be generalized to any type of interaction that one can have with an object? In this paper we introduce a method to represent objects in a space where each dimension corresponds to a broad mode of interaction, based on verb selectional preferences in text corpora. This object embedding makes it possible to predict human judgments of verb applicability to objects better than a variety of alternative approaches. Furthermore, we show that the dimensions in this space can be used to predict categorical and functional dimensions in a state-of-the-art mental representation of objects, derived solely from human judgments of object similarity. These results suggest that interaction knowledge accounts for a large part of the mental representation of objects.
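The abstract does not spell out how the verb-based dimensions are scored, so the sketch below is only one plausible reading, not the authors' implementation: it uses positive pointwise mutual information (PPMI) over verb-object co-occurrence counts as a stand-in for corpus-based selectional-preference scores, turning each object into a vector whose dimensions are verbs (modes of interaction). The counts, verbs, objects, and function names (`pair_counts`, `ppmi`) are illustrative assumptions; a real pipeline would extract verb-object pairs from a dependency-parsed corpus.

```python
# Minimal illustrative sketch (not the authors' code): embed objects as vectors
# over verbs, weighted by a corpus-derived selectional-preference score.
from collections import defaultdict
import math

# Hypothetical verb-object co-occurrence counts (placeholder for counts
# extracted from a dependency-parsed text corpus).
pair_counts = {
    ("eat", "apple"): 120, ("cut", "apple"): 45, ("drive", "apple"): 1,
    ("eat", "car"): 2,     ("cut", "car"): 3,    ("drive", "car"): 300,
}

verbs = sorted({v for v, _ in pair_counts})
objects = sorted({o for _, o in pair_counts})

# Marginal counts needed for the PMI computation.
verb_totals, obj_totals, total = defaultdict(int), defaultdict(int), 0
for (v, o), c in pair_counts.items():
    verb_totals[v] += c
    obj_totals[o] += c
    total += c

def ppmi(verb, obj):
    """Positive PMI, used here as a simple stand-in for a selectional-preference score."""
    c = pair_counts.get((verb, obj), 0)
    if c == 0:
        return 0.0
    pmi = math.log((c * total) / (verb_totals[verb] * obj_totals[obj]))
    return max(pmi, 0.0)

# Each object becomes a vector over verbs, i.e. over broad modes of interaction.
embedding = {o: [ppmi(v, o) for v in verbs] for o in objects}

for o, vec in embedding.items():
    print(o, [round(x, 2) for x in vec])
```

Under this toy setup, "apple" loads on eat/cut and "car" on drive, and such vectors could then be compared against human verb-applicability judgments; the actual scoring and evaluation procedure is described in the paper itself.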
