Conference Paper

Why does the robot only select men? How women and men perceive autonomous social robots that have a gender bias

Document type

Text/Conference Paper

Additional information

Date

2024

Publisher

Association for Computing Machinery

Abstract

Future social robots will act autonomously in the world. Autonomous behavior is usually realized with AI models built from real-world data, which often reflect existing inequalities and prejudices in society. Even if designers do not intend it, there is a risk of developing robots that discriminate against certain users, e.g. based on gender. In this work, we investigate the implications of a gender-biased robot that disadvantages women, a bias that is unfortunately often reported in AI. Our experiment shows that both men and women perceive the gender-biased robot as unfair. However, our work indicates that women are more aware that a gender bias causes this unfairness. We also show that gender bias leads to the robot being perceived differently: while the gender bias resulted in lower likability and intelligence ratings by women, men seem to lose trust in the robot if it behaves unfairly.

Description

Büttner, Sebastian Thomas; Goudarzi, Maral; Prilla, Michael (2024): Why does the robot only select men? How women and men perceive autonomous social robots that have a gender bias. In: Proceedings of Mensch und Computer 2024, Karlsruhe, Germany. Association for Computing Machinery, pp. 479–484. DOI: 10.1145/3670653.3677492.
