Listing by keyword "human-robot-interaction"
1 - 2 of 2
- Journal article: Communicating Robotic Help Requests: Effects of Eye-Expressions, LED-Lights and Polite Language (i-com: Vol. 19, No. 2, 2020) Westhoven, Martin; Grinten, Tim van der. In this paper we report results from a web- and video-based study on the perception of a request for help from a robot head. Colored lights, eye-expressions, and the politeness of the language used were varied. We measured effects on expression identification, hedonic user experience, perceived politeness, and help intention. Additionally, sociodemographic data, a 'face blindness' questionnaire, and negative attitudes towards robots were collected to control for possible influences on the dependent variables. A total of n = 139 participants were included in the analysis. In this paper, the focus is placed on interaction effects and on the influence of covariates. Significant effects on help intention were found for the interaction of LED lighting and eye-expressions and for the interaction of language and eye-expressions. Expression identification is significantly influenced by the interaction of LED lighting and eye-expressions. Several significant effects of the covariates were found, both direct and in interaction with the independent variables. In particular, negative attitudes towards robots significantly influence help intention and perceived politeness. The results provide information on the effect of different design choices for help-requesting robots.
- Workshop contribution: Parameterized Facial Animation for Socially Interactive Robots (Mensch und Computer 2015 – Proceedings, 2015) Wittig, Steffen; Rätsch, Matthias; Kloos, Uwe. Socially interactive robots with human-like speech synthesis and recognition, coupled with humanoid appearance, are an important subject of robotics and artificial intelligence research. Modern solutions have matured enough to provide simple services to human users. To make the interaction with them as fast and intuitive as possible, researchers strive to create transparent interfaces close to human-human interaction. Because facial expressions play a central role in human-human communication, robot faces have been implemented with varying degrees of human-likeness and expressiveness. We propose a way to implement a program that believably animates changing facial expressions and allows influencing them via inter-process communication based on an emotion model. This can be used to create a screen-based virtual face for a robotic system with an inviting appearance that stimulates users to seek interaction with the robot.