Why does the robot only select men? How women and men perceive autonomous social robots that have a gender bias

dc.contributor.authorBüttner, Sebastian Thomas
dc.contributor.authorGoudarzi, Maral
dc.contributor.authorPrilla, Michael
dc.date.accessioned2024-10-08T15:13:00Z
dc.date.available2024-10-08T15:13:00Z
dc.date.issued2024
dc.description.abstractFuture social robots will act autonomously in the world. Autonomous behavior is usually realized by using AI models built with real-world data, which often reflect existing inequalities and prejudices in society. Even if designers do not intend it, there is a risk that robots will be developed that discriminate against certain users, e.g. based on gender. In this work, we investigate the implications of a gender-biased robot that disadvantages women, a bias that unfortunately is often reported in AI. Our experiment shows that both men and women perceive the gender-biased robot as unfair. However, our work indicates that women are more aware that a gender bias causes this unfairness. We also show that gender bias results in the robot being perceived differently: while the gender bias led to lower likability and intelligence ratings by women, men seem to lose trust in the robot when it behaves unfairly.
dc.identifier.doi10.1145/3670653.3677492
dc.identifier.urihttps://dl.gi.de/handle/20.500.12116/44865
dc.language.isoen
dc.pubPlaceNew York, NY, USA
dc.publisherAssociation for Computing Machinery
dc.relation.ispartofProceedings of Mensch und Computer 2024
dc.subjectExperiment
dc.subjectFairness
dc.subjectGender Bias
dc.subjectHuman-Robot Interaction
dc.subjectSocial Robot
dc.titleWhy does the robot only select men? How women and men perceive autonomous social robots that have a gender bias
dc.typeText/Conference Paper
gi.citation.startPage479–484
gi.conference.locationKarlsruhe, Germany

Files