The group investigates the “information ecosystem” on search engines and social media platforms to understand how algorithmic systems are designed and how audiences perceive them. It analyzes how information-filtering algorithms are organized and how they relate to disruptive influences such as Russian online propaganda. Moreover, it studies how platforms reconcile market-driven algorithmic curation with their role as de facto gatekeepers of information and the social responsibility this entails.
The work of the research group is structured around three elements of the information ecosystem: online content, platforms’ algorithms, and online audiences. To analyze these elements, the group employs a mixed-method research design that combines qualitative and quantitative content analysis, agent-based algorithmic auditing, and qualitative interviews.
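One of the methods named above, agent-based algorithmic auditing, can be illustrated with a minimal sketch: several simulated agents with different browsing histories issue the same query, and divergence between their result lists signals history-based personalization. The ranking function below is a hypothetical stand-in for a platform's opaque algorithm (a real audit would drive instrumented browser agents against an actual platform); all names are illustrative.

```python
import random

# Toy corpus standing in for a platform's pool of content items.
CORPUS = [f"article_{i}" for i in range(50)]

def toy_ranker(query, history, k=10):
    """Hypothetical stand-in for an opaque platform ranking algorithm:
    deterministically reorders the corpus based on query and agent history,
    so agents with identical histories see identical results."""
    seed = hash((query, tuple(sorted(history)))) % (2 ** 32)
    rng = random.Random(seed)
    ranked = CORPUS[:]
    rng.shuffle(ranked)
    return ranked[:k]

def jaccard(a, b):
    """Overlap between two result lists (1.0 = identical sets)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def audit(query, agents):
    """Each agent issues the same query; pairwise overlap below 1.0
    indicates that results are personalized by history."""
    results = {name: toy_ranker(query, hist) for name, hist in agents.items()}
    names = list(results)
    overlaps = {
        (x, y): jaccard(results[x], results[y])
        for i, x in enumerate(names)
        for y in names[i + 1:]
    }
    return results, overlaps

# Agent profiles: a fresh account and two accounts with the same history.
agents = {
    "fresh_profile": [],
    "news_reader": ["article_1", "article_2"],
    "news_reader_copy": ["article_1", "article_2"],
}
results, overlaps = audit("election news", agents)
```

In an actual audit, the agents would be automated browser sessions trained on distinct content diets, and the comparison would run over real search or feed outputs rather than a simulated ranker.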
News content is traditionally mediated through journalistic practice: journalists filter information, classifying, organizing, and presenting it to the public in a concise and structured format. Media organizations thereby set the agenda and frame stories for their audiences. Today, alongside this traditional form of journalistic mediation, we are increasingly confronted with algorithmic mediation: on digital platforms such as Facebook and Twitter, content is filtered not by humans but by automated systems. The two forms of mediation now overlap in a historically unprecedented way, creating new, decentralized forms of message creation and dissemination that are difficult to navigate. The group explores the mechanisms of content curation in the present information ecosystem and analyzes the relationship between news, propaganda, and algorithmically mediated content.
Digital platforms such as Google and Facebook filter, select, and present information using algorithms driven by Artificial Intelligence (AI). The design of these systems is poorly understood: private technology firms rarely disclose the decisions that underpin their prioritization and filtering mechanisms, making scientific investigation particularly difficult for independent researchers. Yet digital platforms hold profound power over the distribution of information and disinformation in societies around the world, while their algorithms have an imperfect and often ad hoc design that contributes to political polarization, misinformation, and filter bubbles. The group explores the design of algorithmic systems in relation to filtering or blocking propaganda content. At the same time, it examines the normative underpinnings of platforms’ decisions to select or block information, given that platforms by now carry the social responsibility of de facto media organizations.
Search engine and social media users are a key element of the “information ecosystem” because they actively take part in the dissemination of information as well as disinformation. As private tech platforms are profit-oriented firms, they design algorithmic systems in ways that allow them to manipulate audiences in order to attract advertisers. This creates vulnerabilities in the information ecosystem, because political actors such as the Russian government use tools that originate in digital marketing, for instance bots, trolls, or clickbait, to target specific social communities. The group analyzes the target audiences of Russian propaganda on social media to understand which discourses attract them and why. At the same time, it studies how aware these audiences are of algorithmic information mediation.
Research Group Lead
Research Group Coordinator