This week (July 1–5, 2024) I facilitated, together with Richard Rogers, a project on coordinated inauthentic behaviour on Facebook at the Digital Methods Summer School and Data Sprint 2024.
In recent years, a growing number of studies have analysed coordinated communication networks across social media platforms. These networks coordinate multiple social media accounts to influence and manipulate both the user base and the platforms themselves (Chan, 2022). The phenomenon is linked with several problematic activities, including the propagation of disinformation (Giglietto et al., 2020) and more extensive state-backed information operations.
Coordinated behaviour can be undertaken by both covert and less covert actors with varying objectives, ranging from mimicking organic engagement with and support for content to distributing content across social media pages through multi-actor broadcasting and re-posting. In this project, we analyse the diverse array of coordination forms, highlighting the analytical ambiguity of approaches that measure inauthenticity through platform research affordances such as (timed) link-sharing and reposting.
Coordinated inauthentic behaviour campaigns on social media are driven by actors (and perhaps bots) pushing the same or related content in synchrony so that it gains virality, or at least reaches a threshold of interactions and impressions indicating a degree of popularity. The primary purpose is ambient: to ‘flood the space’ and thereby exert, or appear to exert, a large measure of influence. The broader aim, for state and other political actors, may well be to develop a full-fledged counter-programme that accrues symbolic power and asserts political dominance (McIntyre, 2018).
Research into coordinated inauthentic behaviour has demonstrated its operational and geographical breadth, but also its platform dependence and the field’s orientation toward single-platform studies (Thiele et al., 2023). It has been tied to corona-politics (Magelinski & Carley, 2020), election misinformation (Nizzoli et al., 2021), protest repression in authoritarian regimes (Kulichkina et al., 2024) and cryptocurrency manipulation (Terenzi, 2023). It is far-flung geographically but rather platform-specific in its targeting. Research has described coordinated networks on social media in Australia (Graham et al., 2021), Nigeria (Giglietto et al., 2022), South Korea (Keller et al., 2020), the Philippines (Yu, 2022), and Brazil and France (Gruzd et al., 2022).
The algorithmic architecture of each platform suggests coordination efforts around certain digital objects. For example, coordination on Twitter typically aims to push a hashtag into the trending topics, occasionally with ‘weaponised bots’ (Graham et al., 2021). On Facebook, website URLs are typically placed in posts on Pages and in Groups, where the idea is to elicit emotive reactions, long comment threads and further sharing, which is the platform formula for algorithmic amplification (Merrill & Oremus, 2021). The Facebook Feed is thereby persuaded to elevate these shared links, now charged with emotive currency.
Through a thick description of a set of international coordinated networks on Facebook, this project advances the discourse on coordinated inauthentic behaviour on social media platforms by extending its study beyond influence operations. Drawing on techniques developed for its technical analysis – namely CooRnet (Giglietto, Righetti & Rossi, 2020) – we undertake an empirical study on Facebook that surfaces not only such operations but also activist networking, viral marketing, fan support, analytics-driven publishing and other practices in the service of mobilising attention.
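CooRnet itself is an R package that works on CrowdTangle data. Purely to illustrate the underlying logic it operationalises (flagging accounts that share the same URL within an unusually short time interval), a minimal sketch in Python might look as follows. The account names, URLs and the fixed 60-second window here are hypothetical; the actual package estimates its coordination threshold from the observed distribution of share times rather than fixing it in advance.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical sample data: (account, url, timestamp in seconds).
# This toy sketch only illustrates time-windowed co-sharing,
# not CooRnet's actual implementation.
shares = [
    ("page_a", "http://example.com/story", 0),
    ("page_b", "http://example.com/story", 20),
    ("page_c", "http://example.com/story", 5000),
    ("page_a", "http://example.com/other", 100),
    ("page_b", "http://example.com/other", 130),
]

# Hypothetical fixed threshold; CooRnet derives this from the data.
COORDINATION_WINDOW = 60

def coordinated_pairs(shares, window=COORDINATION_WINDOW):
    """Count how often each pair of accounts shared the same URL within `window` seconds."""
    by_url = defaultdict(list)
    for account, url, ts in shares:
        by_url[url].append((account, ts))
    pair_counts = defaultdict(int)
    for url, posts in by_url.items():
        # Compare every pair of shares of the same URL.
        for (a1, t1), (a2, t2) in combinations(posts, 2):
            if a1 != a2 and abs(t1 - t2) <= window:
                pair_counts[tuple(sorted((a1, a2)))] += 1
    return dict(pair_counts)

# page_a and page_b co-share two URLs within the window; page_c falls outside it.
print(coordinated_pairs(shares))
```

The resulting pair counts can then be read as a weighted coordination network, with accounts as nodes and repeated rapid co-sharing as edges, which is the network that CooRnet-style analyses go on to cluster and interpret.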
Coordinated Inauthentic Behaviour on Facebook, Richard Rogers (UvA), Nicola Righetti (Univ Urbino Carlo Bo), Digital Methods Summer School and Data Sprint 2024