Listing by keyword "algorithmic bias"
1 - 2 of 2
- Conference paper: Data Leakage Through Click Data in Virtual Learning Environments (20. Fachtagung Bildungstechnologien (DELFI), 2022). Hartmann, Johanna; Heuer, Hendrik; Breiter, Andreas.
  Unsupervised machine learning techniques are increasingly used to cluster students based on their activity in virtual learning environments. It is commonly assumed that clusters formed by click data merely represent the actions of users and do not allow inferring personal information about individual users. Based on an analysis of 18,660 students and 5.56 million data points from the Open University Learning Analytics Dataset, we show that clusters trained on "raw" click data are highly correlated with personal information such as student success, course specifics, and student demographics. Our analysis demonstrates that these clusters allow conclusions about demographic variables such as previous education and the affluence of the residential area. Our investigation shows that apparently objective click data can leak private attributes. The paper discusses the implications of this for the design of virtual learning environments, especially considering the legal requirements posed by the principle of data minimization of the EU GDPR.
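The leakage analysis described in the abstract can be illustrated with a minimal sketch (not the paper's code): clustering synthetic "click counts" and then checking how strongly the resulting cluster labels agree with a hidden demographic attribute. All data, the attribute, and the parameters here are hypothetical assumptions for illustration.

```python
# Hypothetical sketch of the leakage idea: cluster raw click counts,
# then measure agreement between clusters and a private attribute.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort of 200 students; a binary attribute (e.g. prior
# education, assumed here) shifts the average click count, mimicking
# the correlation the paper reports on real data.
attribute = rng.integers(0, 2, size=200)                      # hidden label
clicks = rng.poisson(lam=np.where(attribute == 1, 60, 20)).astype(float)
X = clicks.reshape(-1, 1)

def kmeans(X, k=2, iters=50, seed=0):
    """Plain k-means on raw 1-D feature vectors."""
    init_rng = np.random.default_rng(seed)
    centers = X[init_rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center (1-D distance).
        labels = np.argmin(np.abs(X - centers.T), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

clusters = kmeans(X)

# If the clusters carried no personal information, agreement with the
# attribute would sit near chance (0.5 for two balanced groups).
agreement = max((clusters == attribute).mean(),
                (clusters != attribute).mean())
print(f"cluster/attribute agreement: {agreement:.2f}")
```

On this synthetic data the "objective" clusters recover the private attribute almost perfectly, which is the shape of the finding the abstract describes for real click data.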
- Text document: 'Not all algorithms!' Lessons from the Private Sector on Mitigating Gender Discrimination (INFORMATIK 2022, 2022). Winkler, Mareike; Köhne, Sonja; Klöpper, Miriam.
  In the public sector, the use of algorithmic decision-making (ADM) systems can be directly linked to crucial state assistance, such as welfare benefits. Prominent examples, such as an algorithm of the Public Employment Service Austria that predicted below-average placement chances for women, underline the high risks of systematic gender discrimination. The use of ADM is rather novel in the public sector. The private sector, on the other hand, can draw on a relative wealth of experience in adopting such algorithms and dealing with algorithmic gender discrimination, for example in recruiting. Based on empirical examples, our paper 1) explores how gender is currently considered in the development of ADM for the public sector, 2) highlights the potential risks of algorithmic gender discrimination, and 3) analyzes how the public sector can learn from the experience of the private sector in mitigating these risks.