Why does the robot only select men? How women and men perceive autonomous social robots that have a gender bias
dc.contributor.author | Büttner, Sebastian Thomas
dc.contributor.author | Goudarzi, Maral
dc.contributor.author | Prilla, Michael
dc.date.accessioned | 2024-10-08T15:13:00Z
dc.date.available | 2024-10-08T15:13:00Z
dc.date.issued | 2024
dc.description.abstract | Future social robots will act autonomously in the world. Autonomous behavior is usually realized with AI models built from real-world data, which often reflect existing inequalities and prejudices in society. Even if designers do not intend it, there is a risk of developing robots that discriminate against certain users, e.g. based on gender. In this work, we investigate the implications of a gender-biased robot that disadvantages women, a bias that is unfortunately often reported in AI. Our experiment shows that both men and women perceive the gender-biased robot as unfair. However, our work indicates that women are more aware that a gender bias causes this unfairness. We also show that gender bias changes how the robot is perceived: while the bias resulted in lower likability and intelligence ratings by women, men seem to lose trust in the robot if it behaves unfairly. | en
dc.identifier.doi | 10.1145/3670653.3677492
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/44865
dc.language.iso | en
dc.pubPlace | New York, NY, USA
dc.publisher | Association for Computing Machinery
dc.relation.ispartof | Proceedings of Mensch und Computer 2024
dc.subject | Experiment
dc.subject | Fairness
dc.subject | Gender Bias
dc.subject | Human-Robot Interaction
dc.subject | Social Robot
dc.title | Why does the robot only select men? How women and men perceive autonomous social robots that have a gender bias | en
dc.type | Text/Conference Paper
gi.citation.startPage | 479–484
gi.conference.location | Karlsruhe, Germany |