Title: ‘Not all algorithms!’ Lessons from the Private Sector on Mitigating Gender Discrimination
Authors: Winkler, Mareike; Köhne, Sonja; Klöpper, Miriam
Editors: Demmler, Daniel; Krupka, Daniel; Federrath, Hannes
Date: 2022-09-28
Year: 2022
Language: English
ISBN: 978-3-88579-720-3
ISSN: 1617-5468
DOI: 10.18420/inf2022_110
URL: https://dl.gi.de/handle/20.500.12116/39484
Keywords: eGov; algorithmic decision-making; automation bias; algorithmic bias; gender discrimination

Abstract: In the public sector, the use of algorithmic decision-making (ADM) systems can be directly linked to crucial state assistance, such as welfare benefits. Prominent examples, such as an algorithm used by the Public Employment Service Austria that predicted below-average placement chances for women, underline the high risk of systematic gender discrimination. The use of ADM is rather novel in the public sector. The private sector, on the other hand, can draw on a relative wealth of experience in adopting such algorithms and dealing with algorithmic gender discrimination, for example in recruiting. Based on empirical examples, our paper 1) explores how gender is currently considered in the development of ADM for the public sector, 2) highlights the potential risks of algorithmic gender discrimination, and 3) analyzes how the public sector can learn from the private sector's experience in mitigating these risks.