
Within and Without: Analysing Offshored Content Moderation to India

Screening user-generated content at scale is a commercialised practice, typically required by large social media platforms and e-commerce sites. This moderation is a multi-tiered process that draws on the digital labour of contracted workers in offshored locations in the Global South and is overseen by a small section of the clients’ policy teams, often based in the Global North. Weizenbaum researcher Sana Ahmad shares insights from her work on the social media content moderation industry and its offshored business processes in India.

“When I was working on live streaming, we could escalate the videos to the California office who would then notify the local intelligence within 24 hours with all details, considering there was a serious danger (such as suicide attempt) to the person uploading the video or to someone else. I have also seen videos where people are just sitting and talking and then killed themselves. What can I say? It is my job to safeguard the societies through these social media platforms.”

- An interview excerpt from a contracted content moderator in India.

Much of the discussion regarding social media has increasingly pivoted towards content moderation. This is hardly surprising, considering the contested role of social media in affecting democratic election outcomes, providing space for right-wing movements, and enabling disinformation. International scrutiny of how social media platforms moderate user-generated content has found expression in legal regulation, notably Germany’s Network Enforcement Act (NetzDG), India’s draft bill on intermediary liability, Iran’s demands for the relocation of distribution networks such as the messaging app Telegram, and Indonesia’s pressure on the Chinese social media company ByteDance to set up content moderation units in the country. Analyses of content moderation have largely been framed between two dominant positions: freedom of expression on the one hand, and the curtailment of speech for national or commercial interests on the other. The production model of content moderation, however, and the labour industry lurking behind it, have remained obscured.

The Platform Metaphor

There are several reasons that come to mind. But one that really stands out is the lack of clarity on what a platform actually is and how it is supposed to function. At a court hearing in the United States last year, Facebook defended its right to retain or delete certain content as falling within its rights as a ‘publisher’. This is an interesting turn of events, considering that Mark Zuckerberg has from the outset defined the product as the Facebook platform. For what it’s worth, this definitional fallacy is not limited to Facebook. In a recent article, ‘The Platform Excuse is Dying’, the author observes: “If the concept of a platform sounds confused, that’s actually the power of the metaphor”. The ‘platform’ metaphor, associated with social networking sites since the mid-2000s, has allowed social media companies to tip-toe around definitional categories while scaling up enormously and generating value from the collection, monitoring and monetisation of user-generated data.

Content moderation is one of the functional layers of social media platforms, and it appears all the more nebulous once the labour question is touched upon. In my research on this subject, I have been engaged with answering questions about the production model of content moderation, the offshored work processes, and the labour conditions of contracted moderators in India. These are extremely difficult questions to answer. It is not just conceptual ambiguity but also methodological uncertainty that permeates the grounds on which new global divisions of labour can be studied. From gaining access to the field, to securing interviews, to being subjected to gendered behaviours and biases from some interviewees, it has been challenging to trace new patterns of business practices in content moderation. What literature is available on the subject has mostly been concentrated in a few regions of the Global North.

Offshoring Content Moderation to India

India has long maintained its position in the global offshored services market, and the country’s offshored information and communications technology services sector is steadily adapting to demands for new expertise. This has come to include content moderation, a back-end, non-voice business process situated within the operations departments of Information Technology and Business Process Outsourcing (IT-BPO) offices. Workers, bound by quality and quantity targets, review flagged user-generated content such as text, images, GIFs, and videos, and decide whether to allow or delete it according to guidelines provided by the social media clients. Given the scale of the process, these decisions must often be made within a few seconds and require thorough knowledge of manuals and guidelines, very high concentration, as well as detachment from the sensitive and often psychologically distressing content.

Three and a half months of steady fieldwork in India did not just result in 38 hour-long interviews with a diverse range of participants, including content moderators, contracting companies, trade unions, civil society organisations and local social media companies; it also refined the lens through which I had expected to study content moderation in the field. For starters, content moderation is a rarely used business term in India, something I was constantly confronted with while trying to reach interviewees, many of whom were unfamiliar with the term. Added to this was the dilemma of whether to address content moderation as a novel type of work or as an archetype of the call centre service industry with advanced analytics.

Towards a Starting Point

What resulted was a series of multi-layered examinations of ICT business models, disclosures from the field, and discourses on content moderation. This is a long and arduous process that will occupy my foreseeable future: analysing why content moderation is an expanding business process in India’s service industry, what kinds of skills are required for moderating these platforms, and whether they can be replaced by automated technologies. I may not be able to produce definitive answers, but I can at least contribute towards a starting point.

Sana Ahmad is a PhD student in Research Group 1 "Work in Highly Automated, Digital-Hybrid Processes" at the Weizenbaum Institute.