Knowledge Representation for Conceptual, Motivational, and Affective Processes in Natural Language Communication

26 Sep 2022 · Seng-Beng Ho, Zhaoxia Wang, Boon-Kiat Quek, Erik Cambria

Natural language communication is an intricate process. The speaker usually begins with an intention and motivation for what is to be communicated and what effects are expected from the communication, while taking the listener's mental model into consideration to construct an appropriate sentence. The listener likewise has to interpret what the speaker means and respond accordingly, also with the speaker's mental state in mind. To do this successfully, conceptual, motivational, and affective processes have to be represented appropriately to drive the language generation and understanding processes. Big-data approaches to language processing have succeeded in applications such as chatbots and machine translation. However, in human-robot collaborative social communication, and in using natural language to deliver precise instructions to robots, a deeper representation of the conceptual, motivational, and affective processes is needed. This paper capitalizes on the UGALRS (Unified General Autonomous and Language Reasoning System) framework and the CD+ (Conceptual Representation Plus) representational scheme to illustrate how social communication through language is supported by a knowledge representation scheme that handles conceptual, motivational, and affective processes in a deep and general way. Though only a small set of concepts, motivations, and emotions is treated in this paper, its main contribution is in articulating a general framework of knowledge representation and processing that links these aspects together in serving the purpose of natural language communication for an intelligent system.
