Listing by keyword "Gender Bias"
- Conference paper: Untraining Ethnocentric Biases about Gender Roles: A Preliminary Empirical Study Presenting Art as Stimulus (Mensch und Computer 2022 - Tagungsband, 2022). Albaghli, Reem; Chang, Chaz; Almahmeed, Sarah; Attar, Nada.
  Human interaction with art, and how ethnocentric and gender biases apply in this context through human-computer interface design, has been poorly studied to date. This study leverages art as a stimulus to untrain gender bias. The interface presents digital representations of a database of 19th-century Middle Eastern paintings by European artists. We categorized images that portrayed females or males in an intellectualized state, e.g., reading or playing musical instruments (Category FI or MI), and paintings of females only posing in the picture (Category FO). We cropped the original images and gave participants two unrelated choices, asking them to select the best fit from their own perspective. We ran an experiment with three blocks: in the first block, each participant was randomly shown an equal number of images from all categories; in the second block, we showed only Category FI (female intellectuals); and the third block repeated the first. Across female and male participants from four English-speaking countries, we found a bias towards non-intellectual images of females, which diminished after the middle (training) block. This study offers quantitative insight into measuring biases, thoughtful interaction with art as stimulus, and how we can begin to untrain these ethnocentric and gender biases.
- Conference paper: Why does the robot only select men? How women and men perceive autonomous social robots that have a gender bias (Proceedings of Mensch und Computer 2024, 2024). Büttner, Sebastian Thomas; Goudarzi, Maral; Prilla, Michael.
  Future social robots will act autonomously in the world. Autonomous behavior is usually realized with AI models built from real-world data, which often reflect existing inequalities and prejudices in society. Even if designers do not intend it, there is a risk that robots will be developed that discriminate against certain users, e.g. based on gender. In this work, we investigate the implications of a gender-biased robot that disadvantages women, an AI bias that is unfortunately often reported. Our experiment shows that both men and women perceive the gender-biased robot as unfair. However, our work indicates that women are more aware that a gender bias causes this unfairness. We also show that gender bias results in the robot being perceived differently: while the gender bias led to lower likability and intelligence ratings by women, men seem to lose trust in a robot that behaves unfairly.