
Ethics of Data Work

This research project is about developing fair working systems for data workers and ethical guidelines for researchers.


AI systems are powered by people who usually work under precarious conditions. This is particularly true of data work, clickwork, or crowdwork, i.e., the work involved in checking algorithms and creating datasets to train and validate AI systems. This work is often outsourced by both commercial companies and academic institutions, such as universities and research institutes, to advance AI research or to use AI for research purposes. Globally, between 154 and 435 million data workers enable much of today's machine learning, and the growth of this profession shows no signs of slowing down.


Despite numerous scientific and journalistic publications addressing the harms of exploitative data labor, and despite workers campaigning for better working conditions, examples of best practice and ethical guidelines for working with data workers are still lacking, even in the standard guidelines for good scientific practice. Some data work providers make ethics part of their marketing. However, the adoption of more ethical practices often depends on the circumstances: who provides the budgets, who sets the research priorities, and who is responsible for implementation. Academic clients in particular lack guidance on how to outsource data work responsibly and ethically.


This project has two objectives: First, we want to investigate how to create equitable work systems and environments for data workers that put workers' experience and expertise at the center. Second, we want to use these findings to develop ethical guidelines for researchers.

Duration: May - December 2024

Cooperation Partners: Caroline Sinders, design researcher; Krystal Kauffman, data worker, lead organizer at Turkopticon; Dylan Baker, research engineer at DAIR; Marc Pohl, UX/product designer and data work researcher

Funding: Weizenbaum Institute