Symbol Grounding as Social, Situated Construction of Meaning in Human-Robot Interaction
Kruijff, Geert-Jan M.
The paper views the issue of “symbol grounding” from the viewpoint of the construction of meaning between humans and robots, in the context of a collaborative activity. This concerns a core aspect of the formation of common ground: the construction of meaning between actors as a conceptual representation which is believed to be mutually understood as referring to a particular aspect of reality. The problem in this construction is that experience is inherently subjective—and, more specifically, that humans and robots experience and understand reality in fundamentally different ways. There is an inherent asymmetry between the actors involved. The paper focuses on how this asymmetry can be reflected logically, and particularly in the underlying model theory. The point is to make it possible for a robot to reason explicitly about such asymmetry in understanding, to consider possibilities for alignment to deal with it, and to establish (from its viewpoint) a level of intersubjective or mutual understanding. Key to the approach taken in the paper is to consider conceptual representations to be formulas over propositions which are grounded in proofs, as reasoned explanations of experience. This shifts the focus from a notion of “truth” to a notion of judgment—judgments which can be subjectively right and still intersubjectively wrong (faultless disagreement), and which can evolve over time (updates, revision). The result is an approach which accommodates both asymmetric agency and social sentience, modelling symbol grounding in human-robot interaction as social, situated construction over time.
KI - Künstliche Intelligenz: Vol. 27, No. 2