
Interpretation of pointing gestures

Oliver Herbort

When we want to direct the attention of another person to a specific location or object, we often use pointing gestures. Pointing gestures are ubiquitous in communication between humans and are becoming increasingly relevant in human-computer and human-robot interaction. They facilitate interactions by complementing or replacing speech when it is difficult to indicate objects verbally (e.g. pointing at a specific star in the night sky). To understand such pointing gestures, an observer has to determine – among other things – which spatial location a pointer wants to indicate. Interestingly, how people extract the pointed-at position from another person's pointing gesture is not well understood. In the proposed project, we systematically examine how important variables affect the spatial interpretation of pointing gestures in a highly controlled virtual reality environment. We expect that this will enable us to predict accurately how a given pointing gesture will be interpreted by a specific observer and which pointing gesture is best suited to single out a specific object or location. The results may be applied to improve embodied interaction in real-life, virtual, or robotic systems, and provide a testbed for basic mechanisms of human perception. This research is partly supported by the German Research Foundation (DFG).

Herbort, O., & Kunde, W. (2018). How to point and to interpret pointing gestures? Instructions can reduce pointer-observer misunderstandings. Psychological Research, 82(2), 395-406. doi:10.1007/s00426-016-0824-8

Herbort, O., & Kunde, W. (2016). Spatial (mis-)interpretation of pointing gestures to distal referents. Journal of Experimental Psychology: Human Perception and Performance, 42(1), 78-89. doi:10.1037/xhp0000126