Analyzing Far-Right Telegram Channels as Constituents of Information Autocracy in Russia
This study examines how Russian far-right communities on Telegram shape perceptions of political figures through memes and visual narratives. Far from being passive spectators, these actors co-produce propaganda, blending state-aligned messages with their own extremist framings. In Russia, such groups are central because they articulate the ideological foundations of the war against Ukraine and reflect the regime’s gradual drift toward ultranationalist rhetoric. Drawing on a dataset of 200,000 images from expert-selected far-right Telegram channels, the study employs computer vision and unsupervised clustering to identify memes featuring Russian (Putin, Shoigu) and foreign (Zelensky, Biden, Trump) politicians and to reveal recurrent visual patterns in their representation. Leveraging the scale and temporal depth of this dataset, the analysis uncovers differential patterns of legitimation and delegitimation across actors and over time, patterns that smaller-scale studies cannot detect. Preliminary findings show that far-right memes function as instruments of propaganda co-production: these communities do not simply echo official messages but generate bottom-up narratives of legitimation and delegitimation that align with state ideology. By framing leaders as heroic and opponents as corrupt or weak, far-right actors act as informal co-creators of authoritarian legitimacy within Russia’s informational autocracy.
💡 Research Summary
This paper investigates how Russian far‑right communities on Telegram use visual memes to co‑produce political legitimacy within an “informational autocracy.” Drawing on a novel corpus of roughly 200,000 images collected from public Russian‑language Telegram channels between 2022 and 2025, the authors apply a multi‑stage computer‑vision pipeline (face detection with MTCNN, celebrity recognition via Amazon Rekognition, custom reference collections for less‑well‑represented figures) to automatically label images featuring Vladimir Putin, Sergei Shoigu, Volodymyr Zelensky, Joe Biden, and Donald Trump. Manual validation confirms high labeling accuracy (>90%), and duplicate removal ensures a clean dataset.
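The summary does not specify how duplicate removal is performed. A minimal sketch of one plausible first pass is exact byte-level deduplication via content hashing; near-duplicate memes (recompressed or resized copies) would instead require perceptual hashing. The helper `dedupe_images` is illustrative, not from the paper:

```python
import hashlib
from pathlib import Path

def dedupe_images(paths):
    """Return paths of unique images, dropping exact byte-level duplicates.

    Illustrative sketch: hashes raw file bytes with SHA-256, so only
    bit-identical copies are caught; detecting near-duplicates would
    require perceptual hashing instead.
    """
    seen = set()
    unique = []
    for p in paths:
        digest = hashlib.sha256(Path(p).read_bytes()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(p)
    return unique
```

At the reported corpus scale (~200,000 images), such a hash-based pass is cheap and runs in a single sweep over the files.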
The visual content is then encoded with a ResNet‑50 backbone, reduced with t‑SNE/UMAP, and clustered using density‑based algorithms (DBSCAN, HDBSCAN). Parallel textual analysis of post captions employs Latent Dirichlet Allocation to extract dominant topics. Three overarching framing clusters emerge: (1) heroic‑leadership and sacrifice (predominantly Putin and Shoigu, characterized by military uniforms, red palettes, and triumphant poses); (2) external threat and victimhood (depicting the West or Ukraine as aggressors, often using dark tones and destruction imagery); and (3) corruption/weakness (targeting foreign leaders with symbols of greed, decay, or incompetence).
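The density-based clustering step can be illustrated with a toy DBSCAN over 2-D points. The study reportedly uses library implementations (DBSCAN, HDBSCAN) on ResNet‑50 embeddings reduced with t‑SNE/UMAP; this self-contained sketch exists only to show the mechanics, with density defined by `eps` (neighborhood radius) and `min_pts` (core-point threshold):

```python
import math

def dbscan(points, eps, min_pts):
    """Toy DBSCAN over 2-D points; returns one label per point (-1 = noise).

    Illustrative only: real pipelines would use scikit-learn or HDBSCAN
    on high-dimensional embeddings, not this quadratic-time sketch.
    """
    def neighbors(i):
        return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

    labels = [None] * len(points)  # None = not yet visited
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1  # noise; may later be claimed as a border point
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reclassified as border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neighbors = neighbors(j)
            if len(j_neighbors) >= min_pts:  # j is a core point: expand
                queue.extend(j_neighbors)
    return labels
```

The appeal of density-based methods here is that the number of framing clusters is not fixed in advance, and isolated memes fall out as noise rather than contaminating a cluster.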
Temporal dynamics reveal spikes in heroic framing during the early phase of the 2022 “special operation” and sustained high levels through 2025, while delegitimizing frames against Zelensky, Biden and Trump peak during periods of intensified combat and around the 2023‑2024 election cycles. Network analysis of channel interactions shows a core‑periphery structure: a handful of high‑traffic channels (@war_memes, @karga4, etc.) act as “frame producers,” while numerous smaller channels function as “frame disseminators.” This pattern demonstrates how semi‑autonomous actors amplify state‑aligned narratives without direct governmental control, embodying a bottom‑up co‑production of propaganda.
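The producer/disseminator split described above can be approximated with a simple degree-based heuristic: rank channels by how often their posts are forwarded and take the smallest top-ranked set accounting for most forwarding activity as the core. This is a hypothetical sketch, not the paper's network method; `core_periphery`, its input format, and the `core_share` threshold are all illustrative:

```python
from collections import Counter

def core_periphery(forwards, core_share=0.8):
    """Split channels into a producer core and a disseminator periphery.

    `forwards` is a list of (source_channel, forwarding_channel) pairs.
    Channels are ranked by how often their posts are forwarded; the
    smallest top-ranked set covering `core_share` of all forwards is the
    core. Purely illustrative; the paper's actual method is unspecified.
    """
    out_degree = Counter(src for src, _ in forwards)
    total = sum(out_degree.values())
    core, covered = [], 0
    for channel, degree in out_degree.most_common():
        if covered >= core_share * total:
            break
        core.append(channel)
        covered += degree
    channels = {c for pair in forwards for c in pair}
    periphery = sorted(channels - set(core))
    return core, periphery
```

Formal core-periphery detection (e.g. Borgatti-Everett fitting in a network library) would be the more rigorous route; the heuristic only conveys the intuition of a few "frame producers" feeding many "frame disseminators."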
The study tests three hypotheses: (H1) far‑right Telegram channels reproduce key legitimacy narratives promoted by the Russian state (heroic leadership, victimhood, external threat); (H2) they legitimize domestic leaders while delegitimizing foreign actors; and (H3) visual‑textual elements align with established authoritarian legitimation clusters. Empirical findings largely confirm all three hypotheses, showing that meme production mirrors official discourse but adds a layer of participatory authenticity and humor that enhances resonance among target audiences.
Methodologically, the research contributes by leveraging large‑scale multimodal data and unsupervised learning to uncover visual motifs that would be invisible to text‑only analyses. Limitations include potential bias in face‑recognition models, the subjectivity of channel selection, and the challenge of fully interpreting meme semantics without extensive human coding. The authors suggest future work extending to other platforms (e.g., VK, YouTube), incorporating multilingual analysis, and integrating qualitative meme‑interpretation to deepen understanding of how visual humor functions as a tool of authoritarian legitimation.
In sum, the paper demonstrates that far‑right Telegram memes are not mere echo chambers of state propaganda; they are active, decentralized co‑creators of legitimacy, reshaping the informational landscape of contemporary Russian authoritarianism.