First Monday

Navalny’s direct-casting: Affective attunement and polarization in the online community of the most vocal Russian opposition politician by Aidar Zinnatullin



Abstract
Social media has a significant impact on the process of political polarization. Yet despite a large body of research on polarization and social media in democracies, the study of this relationship in autocracies remains a niche field. This paper describes the content, composition, and behavioral patterns of the discussions that take place on YouTube in the community of Russia’s most vocal opposition politician, Alexei Navalny. Based on a corpus of more than eight million comments, this study provides empirical evidence of the relatively short-term nature of the affective attunement induced by a leader promising social change within an authoritarian context. This is manifested in the observation that periods of high public interest in Navalny’s activities are marked by a significant influx of new audience members into his community; however, the retention rate of this cohort was lower than that of the cohort of commenters who started discussions during periods of lower public interest in Navalny’s activities. This conclusion applies not only to the entire set of commenters but also to pro-government and anti-government users. According to the exploratory text analysis, the most common topics in the discussions were praise of Navalny’s activities, criticism of the government, and encouragement to share videos in order to change the minds of apolitical citizens or pro-government supporters. Finally, one parameter of affective polarization, the degree of toxicity of discussions, is higher in Navalny’s community than on an apolitical celebrity’s YouTube channel, which establishes a baseline for the level of incivility.

Contents

Introduction
Theoretical background
Data and methods
Results
Discussion and conclusion

 


 

Introduction

The “architecture of participation” of the Web 2.0 era (O’Reilly, 2005), centered on users’ interactions with one another, significantly changed not only the media environment but also other public spheres, including politics. In political communication, technological innovations gave rise to new practices of media consumption, which led to a new form of polarization, affective polarization (Iyengar, et al., 2012; Iyengar and Westwood, 2015), with a profound effect on political processes across regime types, including authoritarian states (Nugent, 2020; Enikolopov, et al., 2020, 2018). This study describes the dynamics of political discussions in the community of Alexei Navalny on YouTube, the key medium for promoting the agenda of Russia’s most vocal opposition politician, through the prism of theories of affective publics and affective polarization.

I focus on the extent to which Navalny’s activities lead to the formation of a community of users who engage with his content in the long run. Social media facilitates encounters between people with opposing viewpoints, exacerbating polarization via the process of sorting, in which the major divisions in society become all-encompassing (Törnberg, 2022). It is therefore necessary to understand how different users — those who support Navalny and those who oppose him — interact within a community formed around the leader of the opposition.

Affective polarization is associated with lower expectations about public deliberation, owing to the prevalence of incivility, hate speech, and other forms of identity attack in online public debates (Hwang, et al., 2014; Harel, et al., 2020). My main research question is therefore formulated as follows: What are the characteristics of political discussions in the YouTube community of Alexei Navalny from the perspective of (1) their content; (2) their degree of incivility; (3) the composition of the participants; and, (4) the longevity of participants’ interactions with other community members?

The relevance of such a descriptive study of the nature of communication under authoritarian rule lies in two features of the political regime. First, contemporary authoritarianism is predominantly informational (Guriev and Treisman, 2022), meaning that violence as the main means of maintaining power has been replaced by autocrats’ management of public opinion and electoral procedures. The image of an effective ruler who can provide citizens with an acceptable quality of life is an important factor in autocrats’ legitimacy. This study covers the period from 2013 to 2021, when the political regime established in Russia could be characterized precisely by this definition of informational autocracy. Second, mass protests are a critical juncture for contemporary autocrats because the risk of losing power increases significantly at such moments. The mechanism operates through the loss of citizens’ confidence and a possible split in the elites’ power configuration (Bratton and van de Walle, 1994). Autocrats therefore eagerly seek ways to raise the costs of participation in collective action (Olson, 1971), including suppressing dissent, which becomes a key challenge for social and political change in autocracies. In the era of social media, however, independent activists can address this problem through direct casting (Bastos, Raimundo, and Travitzki, 2013), which allows emotions against the government and ruling class to be created and accumulated in affective publics (Papacharissi, 2014) with the potential to generate communities, a crucial prerequisite for successful collective action.
Alexei Navalny is an example of how these community-generating mechanisms of social media can turn a niche activist conducting anti-corruption investigations into an opposition leader with the infrastructure for organizing countrywide protest (a substantial base for fundraising, a network of regional organizations, popular accounts on social media platforms, and investigative journalism structures to detect corruption among high-ranking officials). Alexei Navalny’s activities, including the media products his team releases, strain the autocratic regime along two key dimensions: (1) they criticize the country’s leadership and (2) they seek to bring people to the streets for rapid political change.

Empirically, I begin with a topical representation of the discussions in the comment section of Navalny’s YouTube channel. Next, I study the identity of commenters, considering (1) the contributions to the discussion made by one-off and prolific commenters; and, (2) the periods of public interest in Navalny during which one-off and prolific commenters joined his community. I then consider the dynamics of the outflow of commenters who first began to comment during a period of high interest in Navalny and of those who began when interest was below average. This retention analysis was carried out for the entire set of commenters and for those who, on their first entry into the discussion, demonstrated partisan (pro-government vs. opposition) cues. Finally, I study toxicity in conversations and contrast it with the universe of comments from an apolitical celebrity channel to establish a baseline for the level of incivility.

The main takeaways from this descriptive research are as follows. First, the prevalent topics in discussions were praise of Navalny’s activity, criticism of the government, and encouragement to share videos in order to change the minds of apolitical citizens or pro-government supporters. Second, one-off commenters (i.e., those who wrote a comment only once and did not contribute to the discussion further) appeared more often during periods of high interest in Navalny, while prolific commenters joined predominantly when public interest in the Russian opposition leader was below average. Although one-off commenters outnumbered those who commented more frequently, the latter contributed eight times more to the production of comments. Third, the cohort of commenters who first engaged with Alexei Navalny’s YouTube content during a period of high public interest in the politician’s activities was less likely to stay in the community over the following 15 months. Those who started to comment when public interest in Navalny was not high lingered longer in the community of Russia’s most vocal opposition politician. This observation applied not only to the entire set of commenters but also to pro-government and anti-government users. Finally, the level of incivility in Navalny’s community was higher than that on an apolitical celebrity’s YouTube channel.

The remainder of this paper is organized as follows. I start with the theoretical background, where I describe a framework for studying Navalny’s community on YouTube. Here, I also present the role of Alexei Navalny in Russian politics during the period under study (2013–2021) and show how YouTube became the main independent platform where different segments of Russia’s politicized communities interact. In the Data and methods section, I describe the data sources and research strategy. I conducted an empirical analysis of the discussions, applying text-as-data techniques, cohort analysis, and a comparison of toxicity levels in conversations with those of an apolitical celebrity community. In the Discussion and conclusion section, I point out the main results of the study, its limitations, and avenues for future research.

 

++++++++++

Theoretical background

Affective publics, affective attunement, and affective polarization in the non-democratic context

Digital technologies based on the interaction of users through the content they produce (the concept of Web 2.0; O’Reilly, 2005) have opened up new prospects for collective participation in politics. This applies not only to new forms of participation, specifically in the online domain (Spaiser, et al., 2017; Morales, 2019; Pan and Siegel, 2019; Miller, 2022), but also to more efficient coordination (Enikolopov, et al., 2020, 2018) and an overall lowering of the barrier to entry into the ranks of public opinion leaders, or influencers, toward whose opinions others begin to orient themselves (Bastos, et al., 2013).

On the other hand, in the new technological circumstances, collective actions have an ad hoc, or sporadic, nature and basically create a “feeling of community” rather than a community itself [1]. Researchers identify such communities as affective publics (Papacharissi, 2014) or affective networks (Dean, 2010). Affect is not the emotion itself but its intensity (Papacharissi, 2015) [2]. One manifestation of this phenomenon can be observed in the process of connective effervescence, when users write emotionally charged messages upon encountering situations that cause a sense of threat (Ventura, et al., 2021).

A stream of political science literature on affective polarization cannot be avoided in any discussion of users’ affect-based engagement with media content. As a result of an emotionally fueled rather than ideologically charged division of the world (us versus them), politics as a process is perceived as a zero-sum game (Levendusky, 2013). In such circumstances, social media platforms become a space for identity formation rather than an arena for deliberation (Törnberg and Uitermark, 2021). New digital technologies facilitate interaction with the other side (Barbera, et al., 2015; Bakshy, et al., 2015; Karlsen, et al., 2017; Bail, et al., 2018; Muddiman and Stroud, 2017; Vaccari and Valeriani, 2021), intensifying existing contradictions even further because they sort existing cleavages and deepen the sense of fundamental differences in society (Törnberg, 2022). Partisanship therefore becomes more encompassing, even though ideological positions do not drift toward a more extreme pole.

Since affective publics are based on disruptions of the dominant political narrative (Papacharissi, 2015), the concept is of great interest for the study of non-democracies, where people are still able to exchange opinions but are restricted in acting collectively. In such circumstances, political activists who are not afraid of vocally criticizing the country’s leadership (Toepfl, 2020) can form communities initially bound by affective attunement and then organize more formally.

Understanding how social media at the disposal of independent politicians helps shape political discourse and frame collective actions in an authoritarian context is of critical importance. First, this is due to the media system of autocracies, where independent politicians and journalists are expelled from traditional media (primarily national TV) and forced to look for alternatives; consequently, they actively exploit social media platforms. Second, the nature of polarization in such regimes is different: it is rooted not in the difference in values underlying the widespread “liberal-conservative” scale but in the “power-opposition” dimension (Urman, 2019). For a better understanding of polarization and the nature of collective action within an authoritarian context, the focus should be placed on social media, because pro-government agents are also present and visible there (Spaiser, et al., 2017; Sanovich, 2017; Sanovich, et al., 2018; Orttung and Nelson, 2018; Stukal, et al., 2022; Sobolev, 2019). This is consistent with the idea of the Internet as a platform for information warfare in pursuit of Russia’s geopolitical goals, for which the authorities both introduce restrictive mechanisms within the country and conduct offensive communication operations outside its borders (Zelenkauskaite, 2022).

Reflection on the phenomenon of affective polarization within authoritarian regimes helps illuminate its nature in a new way. Pro-government rhetoric, usually picked up by a myriad of dependent media outlets (Makhortykh, et al., 2022), including social media channels (Gunitsky, 2015), rests on a simplification of political reality. It provides additional opportunities to polarize society by blaming the West, and actors affiliated with it, for the government’s poor economic performance and policies (Rozenas and Stukal, 2019; Aytaç, 2021) as well as for other problems (Alrababa’h and Blaydes, 2020; Laebens and Öztürk, 2020). Independent politicians also use social media to create and shape the structure of their communities (Papacharissi, 2015). Communities such as affective publics may drive the funnel of polarization further, contributing to the spread of a more toxic environment. Research on the affective formation of communities on social media therefore contributes to a better understanding of polarization, and of political communication in general, within the context of authoritarian regimes.

YouTube as a platform for dissent and Navalny’s role in Russian politics

Due to state control of federal television, from which a significant number of Russians obtain political information, YouTube became the main platform for journalists, activists, and politicians to produce and distribute independent content. Before Russia’s full-scale invasion of Ukraine, the platform offered several ways to profit from content in addition to native advertising integration. In combination with the availability of technical devices for high-quality video recording, these commercialization opportunities significantly raised the quality of media content. Russian YouTube is therefore highly professionalized, outperforming other media in terms of user attention [3]. This mixture of political and technological factors creates counterpublics in opposition to the dominant discourse orchestrated in traditional media by the government. Such counterpublics cultivate their own discursive spaces to challenge the political and social status quo (Wagner, 2002).

On YouTube, Russian users have more opportunities to consume independent, politicized content than on other social media platforms or on TV. This accords with the conception of gatekeeping in media: outsider media actors who cannot get access to traditional media (foremost, TV) begin to practice neither broadcasting nor narrowcasting but direct casting on social media platforms [4]. The popular platform VK.com (Poupin, 2021) suffers from punitive censorship practices: anti-extremist police forces gain career promotion by fabricating cases against those who criticize the authorities or who merely “like” comments containing such criticism [5]. Facebook and Twitter, as platforms where people discuss politics, may be relevant only to a narrow audience, predominantly in Moscow and Saint Petersburg [6] (on Twitter, Navalny’s blog was the most popular for a Russian audience [7]).

The desire for anonymity when expressing political opinions is strong in a non-democratic context (Jardine, 2016). YouTube, as a platform where users pay little attention to their profiles, provides relative anonymity and safety, which allows commenters some freedom to express their thoughts (Halpern and Gibbs, 2013). Moreover, the platform is not Russian in origin, which implies that the Russian government cannot access users’ personal data. Political discussions on Russian YouTube are thus vivid, and video bloggers are free to publish political content.

Navalny benefited the most from this institutional arrangement of direct casting in comparison with other independent activists, becoming the most vocal opposition politician in Russia. His political career started in 2007–2008 with anti-corruption investigations of Russian state-owned corporations. Navalny practiced the “greenmail” tactic: an individual buys a small but sufficient share in a company and then, exercising his rights as a minority shareholder, demands the disclosure of documents on transactions and decisions of top management; in the case of refusal, the shareholder can sue. “Greenmail” is a legal process widely used in the corporate world. However, in Russia, where almost all large state-owned companies have become a source of enrichment for the ruling class, such activity immediately attracted the attention of the authorities, making Navalny’s activities both economic and political [8].

The de facto leadership of Alexei Navalny in the Russian opposition camp has been reflected in several studies. During the 2011–2012 protests in major Russian cities, Navalny was one of the main organizers and vocal politicians in “a rather loose conglomerate of relatively small groups and public figures, with little experience and limited capacity for cooperation” [9]. To prevent a legitimacy crisis and enlarge the scope of activity, the leaders of the movement attempted to transform it into something more formalized and structured, an “organizationally brokered collective action network” (Bennett and Segerberg, 2012). This new structure, called the Coordination Council of the Opposition, was formed via Internet elections in 2012. Without any significant achievements, the group ceased to exist in October 2013 (Toepfl, 2016). However, Navalny successfully exploited the legacy of the 2011–2012 protest movement for free and fair elections in the 2013 Moscow mayoral campaign (Smyth and Soboleva, 2016) and, subsequently, in the 2018 presidential campaign, when a nationwide network of Navalny’s supporters was created (Dollbaum, et al., 2018).

Alexei Navalny was actively criticized for his nationalist views, a mix of an ethnocultural understanding of the concept of “nation” with a strong pro-European orientation (Moen-Larsen, 2014; Laruelle, 2014). By 2020, however, he had become the leader of the entire oppositional movement, promoting nationwide strategies of political action: anti-corruption rallies and tactical voting to deprive the ruling United Russia party of votes in local and regional elections (Turchenko and Golosov, 2020). Navalny’s popularity increased incrementally, at least until his return to Russia in 2021 and almost immediate imprisonment [10].

Such growth in Navalny’s popularity could hardly have been possible without digital platforms such as YouTube, whose Russian user base grew substantially: YouTube has become one of the most widely used social media platforms in the country [11]. It presents a trustworthy medium through which oppositional politicians can promote their content (Litvinenko, 2021). According to Medialogia, a Russian marketing research company [12], Navalny’s YouTube channel was among the most popular Russian channels on the platform (ranging between fourteenth and seventh place across the months of 2020) and was the highest-rated channel specializing in Russian politics.

Navalny’s team produces original content, primarily exposing the corruption schemes the ruling elite exploits to provide itself and its relatives with a luxurious lifestyle. This, in addition to the topic of the limits on political participation and choice that authoritarian authorities impose, provides fertile soil for the “us versus them” opposition and, accordingly, for polarization along the “power-opposition” dimension. I therefore formulated a research question regarding the effectiveness of such work in informing citizens.

RQ1: To what extent is the content produced by Navalny associated with the formation of a community of users who engage with his YouTube channel in the long run?

The literature on political communication in authoritarian regimes shows how sensitive autocrats are to the potential mobilization of their opponents (King, et al., 2014, 2013; Roberts, 2018; Hobbs and Roberts, 2018; Miller, 2022). Most researchers focus on the reactions and strategies of pro-government agents. However, the extent to which the content aimed at mobilizing dissent resonates with potential supporters of the opposition in the long run is not clear. In the case of Russia, researchers record a high level of political apathy among the population (Dollbaum, et al., 2018). In response to this tendency, Navalny’s team promoted the idea of ending the monstrous level of inequality and injustice in the treatment of public goods through collective action (protests, voting for independent candidates in elections, and participating in election campaigns as observers on election days).

Sometimes Navalny’s media content, or events around him, becomes resonant and is covered by media outlets, including those controlled by the state (although for a long time, state-owned TV ignored Navalny). Navalny’s YouTube channel features not only attacks on officials involved in illegal enrichment, disclosures of their corruption schemes, and attacks on an autocratic leader, but also calls for the mobilization of those who disagree with the current status quo. Such topics can increase public interest in Navalny and have the potential to create affective attunement among those who are dissatisfied with what is happening in the country.

Most of those who encounter Navalny’s content for the first time during such periods of high public interest in his personality may not have followed politics closely before and may demonstrate behavior resembling connective effervescence (Ventura, et al., 2021). In other words, these commenters may be angry about what is happening and post comments extensively, but in the long run this will not lead to a closer connection to Navalny, such as the formation of parasocial relationships (Rubin, et al., 1985; Tsiotsou, 2015), in which users actively interact with video content and regularly comment on it (Rihl and Wegener, 2017). Their interaction with Navalny’s community may be episodic. Therefore, the following hypothesis is proposed:

Hypothesis 1A: Engaging with Navalny’s content during a period of high interest in him is less likely to lead to a long-term relationship with his community on YouTube than engaging during a period of low interest in Navalny.

Commenters participating in discussions on Navalny’s YouTube channel represent a conglomerate of different user clusters, distinguishable by (1) practices of interacting with video content (ranging from watching videos without active commenting to prolific commenting); and, (2) political views (from adherents of oppositional views to government supporters). The behavior of the latter group — those who criticize Navalny and thereby shield the ruling elite from criticism — can theoretically be explained, on the one hand, by astroturfing campaigns initiated by government actors (Sanovich, 2017; Sanovich, Stukal, and Tucker, 2018; Stukal, et al., 2022). On the other hand, pro-government positions in discussions can be seen as a manifestation of the sorting underlying affective polarization. According to this line of literature (Barbera, et al., 2015; Bakshy, et al., 2015; Karlsen, et al., 2017; Bail, et al., 2018; Vaccari and Valeriani, 2021), social media facilitates the interaction of users with opposing political positions, throwing them into a political war (Törnberg, 2022). This theoretical reasoning undermines the idea of the all-encompassing nature of echo chambers, filter bubbles, and selective exposure (Stroud, 2007; Pariser, 2011; Guo, et al., 2018; Cinelli, et al., 2021). People on social media, and on the Internet generally, have opportunities to encounter opinions that differ from their own beliefs. When they face the other side, individual positions do not converge; instead, the thought of deep differences with the other side takes root. Polarization therefore becomes encompassing.

One of the most remarkable lines of extant literature in the domain of parasocial relationships argues that such relationships can also be negative (Dibble and Rosaen, 2011). In other words, those who dislike Navalny can subscribe to his channel, and parasocial relationships may appear in this situation as well. Discussions may therefore contain more than pro-Navalny (and anti-government) rhetoric. Given my interest in the phenomenon of affective attunement, I expect that anti-opposition rhetoric will follow a pattern similar to that of Hypothesis 1A. When interest in Navalny’s personality increases, there will also be relative interest from those ready to express opinions criticizing Russia’s most vocal opposition politician. However, this interaction will be sporadic, unlike in the periods of lower public interest in Navalny, when users with a stronger interest in politics and established pro-government views can join the discussions and periodically show up there. Thus, the following hypothesis was formulated:

Hypothesis 1B: Those who start engaging in discussions expressing anti-opposition cues during a period of high interest in Navalny’s personality are less likely to form a long-term relationship with the politician’s community than those who start engaging with Navalny’s content in a less contentious period.

Following the logic described in Hypotheses 1A and 1B, I expect that those who express criticism of the government and enter discussions at a time of increasing interest in Navalny will be less likely to linger on Navalny’s channel in the long run. On the contrary, commenters who began interacting with Navalny’s content during periods of lower interest in his personality will, to a greater extent, form a community of active users. The interest in Navalny among the latter group of commenters is likely generated by more serious and deep concerns about the political processes evolving in Russia, not by sporadic or hype-driven interest. Thus, I proposed the following hypothesis:

Hypothesis 1C: Those who start engaging in discussions using anti-government cues during a period of high interest in Navalny are less likely to form a long-term relationship with the politician’s community than those who start engaging with Navalny’s content in a less contentious period.

Recent reports on the quality of democratic governance show a tendency toward the erosion of democratic institutions worldwide (Repucci and Slipowitz, 2022; Alizada, et al., 2022). Illiberal leaders come to power in democratic countries and undermine the established democratic order. Consequently, society polarizes and political discussion becomes more toxic; the authors of the Varieties of Democracy report even define such polarization as “toxic” [13]. In this context, incivility, hate speech, and other forms of identity attack in online public discussions are associated with lower expectations about public deliberation and are mediated by the perception of a polarized society (Hwang, et al., 2014). According to observations by Davis (2021), anti-public discourse (characterized precisely by hate speech, cyber racism, and the undermining of democratic foundations and the authority of science) has ceased to be an aberrant phenomenon and is reformatting the concept of publicity, quickly mastering the tools of social media.

Toxic discussions are one of the traits of affective polarization (Harel, et al., 2020). Against this background, the sentiment of conversations in the community of the most vocal opposition politician acting within an authoritarian context is of interest. Thus, I formulated a research question about the tone of discussions in Navalny’s community on YouTube.

RQ2: To what extent are the discussions taking place on Navalny’s YouTube channel toxic?

 

++++++++++

Data and methods

Description of data sources

Comments were collected from videos uploaded to the YouTube channel of Alexei Navalny between 2013 and July 2021. The data collection process, based on the YouTube Data Tools service (Rieder, 2015), lasted from November 2020 to July 2021 and covered events such as the poisoning of Navalny and his arrest in January 2021, when he returned to Russia after medical treatment in Berlin. The dataset includes 8,980,313 comments on 407 videos [14]. Information on the period of account registration was obtained through queries to the official YouTube API.
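Such registration queries can be sketched as follows; this is a minimal illustration against the YouTube Data API v3 `channels` endpoint, whose `snippet.publishedAt` field holds the account creation date. The channel IDs and API key below are hypothetical placeholders, and the sketch only builds the request URL rather than performing the network call:

```python
from urllib.parse import urlencode

API_BASE = "https://www.googleapis.com/youtube/v3/channels"

def registration_query(channel_ids, api_key):
    """Build a YouTube Data API v3 request URL asking for the `snippet`
    part of up to 50 channels; snippet.publishedAt in the response is
    the registration date of each account."""
    params = {
        "part": "snippet",
        "id": ",".join(channel_ids),  # the endpoint accepts up to 50 ids per call
        "key": api_key,
    }
    return f"{API_BASE}?{urlencode(params)}"

# Hypothetical channel ids and key, for illustration only.
url = registration_query(["UCabc123", "UCdef456"], "MY_KEY")
```

In practice the commenter IDs extracted from the comment corpus would be batched into groups of 50 and the JSON responses parsed for `snippet.publishedAt`.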

Table 1 presents summary statistics for videos, comments, and commenters. By top-level comments with a thread, I mean top-level comments that open a discussion; threaded comments appear when other commenters post messages under a top-level comment. I also distinguish top-level comments without a thread (those that do not open any discussion). This division of comments is relevant only for one type of further analysis, the toxicity of comments, where it allows me to capture the baseline differences between non-political and political channels on Russian YouTube. All details related to this analysis are presented in a separate subsection.

 

Table 1: Summary statistics of comments on Navalny’s YouTube channel.
Note: * — For the dataset without the video about Putin’s Palace, because the API returns an error for the structure of its comment thread (no distinction between top-level and thread comments).

Data item                             | Number of observations
Comments                              | 8,980,313 (7,985,548* for some tests)
Commenters                            | 1,858,544 (1,570,657* for some tests)
Top-level comments without a thread*  | 5,387,019
Top-level comments with a thread*     | 579,556
Thread comments*                      | 2,018,973
Videos                                | 407

 

Structural topic modeling of comments

To present a general overview of the topics raised by commenters in their posts, I started my empirical analysis with structural topic modeling (STM) of the discussions in the comments section of Navalny’s YouTube channel. This method allows an analyst to explore (1) how word co-occurrences (or topics) are distributed across documents (here, YouTube comments); (2) the correlation of topics with covariates of research interest; and, (3) the prevalence of topics at different levels of a covariate. I included one covariate in my structural topic model: the type of comment (a top-level comment with a thread, a top-level comment without a thread, or a thread comment).

To create a representative sample tractable with the available computing capacity, I extracted a 10 percent random sample clustered on video and comment type. This resulted in 700,242 comments, which decreased to 695,405 after pre-processing (lemmatization; removal of stopwords, punctuation, numbers, and words with fewer than three characters). Words that appeared in fewer than one percent of the comments were also excluded. Although the extant literature suggests removing words common to more than 99 percent of the documents in a corpus (Hopkins and King, 2010; Grimmer and Stewart, 2013; Maier, et al., 2020), I did not do so because otherwise the commonalities in topics are barely discernible.
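The clustered sampling step can be sketched roughly as follows; a minimal stand-in using Python’s standard library, with hypothetical field names (`video_id`, `comment_type`) rather than the actual data schema:

```python
import random
from collections import defaultdict

def clustered_sample(comments, frac=0.10, seed=1):
    """Draw a `frac` random sample within each (video, comment type)
    cluster, mirroring a 10 percent sample clustered on video level
    and comment type."""
    rng = random.Random(seed)
    clusters = defaultdict(list)
    for c in comments:
        clusters[(c["video_id"], c["comment_type"])].append(c)
    sample = []
    for members in clusters.values():
        k = max(1, round(frac * len(members)))  # at least one per cluster
        sample.extend(rng.sample(members, k))
    return sample

# Toy corpus: 100 top-level and 40 thread comments on one video.
comments = (
    [{"video_id": "v1", "comment_type": "top_level", "text": str(i)} for i in range(100)]
    + [{"video_id": "v1", "comment_type": "thread", "text": str(i)} for i in range(40)]
)
sampled = clustered_sample(comments)  # 10 + 4 = 14 comments
```

Sampling within clusters rather than over the pooled corpus keeps every video and comment type represented in proportion to its size.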

Choosing the right number of topics, k, is crucial in topic modeling, as it directly affects the quality and interpretability of the results. Statistically driven ways to identify the number of topics augment researchers’ close reading of representative documents and provide some orientation for human interpretation (Grimmer and Stewart, 2013; Grimmer, et al., 2022). I therefore used two methods to select the number of topics; both rely on statistical properties of the fitted topics to find a suitable k for a given corpus. The first method is based on spectral initialization (Mimno and Lee, 2014), which suggested 61 topics as an orientation for further modeling. The second relies on the Arun2010, CaoJuan2009, Deveaud2014, and Griffiths2004 metrics from the R package LDAtuning (Murzintcev and Chaney, 2020) [15]. To employ this approach, I ran the LDAtuning algorithm for an assumed number of topics ranging from 2 to 100 and found the extremum for the corresponding pairs: minimization (Arun2010 and CaoJuan2009) and maximization (Deveaud2014 and Griffiths2004). The results are presented in Figure B1 in Appendix B. The optimal number of topics ranged between 60 and 70. Thus, I used the 61 topics derived from spectral initialization (Mimno and Lee, 2014), because this number falls within the range identified by the LDAtuning metrics.
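Once the four metric curves have been computed (in the paper this is done with LDAtuning in R), the final selection step amounts to locating each curve's extremum. A schematic Python illustration with made-up metric values:

```python
def optimal_k(metric_values, minimize=True):
    """Return the candidate number of topics k at the extremum of a
    tuning metric: Arun2010 and CaoJuan2009 are minimized, while
    Deveaud2014 and Griffiths2004 are maximized.
    `metric_values` maps k -> metric value."""
    pick = min if minimize else max
    return pick(metric_values, key=metric_values.get)
```

For instance, `optimal_k({50: 0.31, 61: 0.24, 70: 0.29})` picks 61, the k with the smallest metric value.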

Composition of commenters and retention analysis

The question of how much and when Navalny’s YouTube channel creates not just a “feeling of community” (Dean, 2010), but a community itself, should first be considered from the standpoint of how often users enter into discussion. As reported in the literature, clustering users by types of activity provides a more granular understanding of online participation (Zelenkauskaite and Balduccini, 2017). Following this notion, I segment YouTube commenters into two groups: one-offs (those who write a comment only once and do not contribute to further discussion) and prolific commenters (those who comment more than once). This threshold (whether a commenter wrote more than one comment in their history of interaction with Navalny’s content) reveals, albeit partially, the nature of online participation in the community: to what extent is it sporadic or, conversely, systematic?
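This segmentation reduces to counting comments per author over the whole corpus; a minimal sketch (the input format, a flat list of author ids, is an assumption):

```python
from collections import Counter

def segment_commenters(author_ids):
    """Split commenters into one-offs (exactly one comment in the
    whole corpus) and prolific commenters (more than one), given the
    author id attached to every comment."""
    counts = Counter(author_ids)
    one_offs = {a for a, n in counts.items() if n == 1}
    prolific = {a for a, n in counts.items() if n > 1}
    return one_offs, prolific
```

Note that the count is taken over a commenter's entire history of interaction with the channel, not within a single video.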

To address Research Question 1 and test the related Hypotheses 1A, 1B, and 1C, I used cohort analysis. Specifically, I compared the behavioral patterns of commenters who started to engage with Navalny’s content on YouTube during different periods of public interest. Interest in Navalny and his activities was measured using Wikipedia data on views of the Russian-language page dedicated to the politician. Periods of increased interest are those when Navalny’s page was visited more than the average monthly value of 147,549 views (see Figure 1). I present the aggregate number for all agents who visited the page, not just ordinary users; these agents may include bots, crawlers, and third-party services. Appendix A provides a similar graph for visits to Navalny’s page by real users (average monthly visits are 142,865; see Figure A1) and for interest in Navalny according to Google Trends (see Figure A2). Both graphs mirror the trends reflected in the chart for all agents (Figure 1).

I chose the average monthly view count instead of the median for the following reason. The mean is known to be sensitive to outliers; however, I am interested precisely in the phenomenon of affective attunement, when emotions run high and many people take a genuine interest in who Navalny is and what he does. The higher bar set by the mean (147,549 versus the median monthly view count of 74,231) therefore helps to cut off the many periods when the hype around Navalny was muted or absent altogether. During periods of high interest in Navalny, 101 videos were published, whereas 306 videos were published on ordinary days.
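The thresholding itself is straightforward; a sketch with invented view counts (the real series is the monthly Wikipedia pageview data described above):

```python
from statistics import mean

def high_interest_months(monthly_views):
    """Return the months whose page views exceed the overall monthly
    mean (the paper's threshold; the median would set a lower bar).
    `monthly_views` maps a month label to a view count."""
    threshold = mean(monthly_views.values())
    return {m for m, v in monthly_views.items() if v > threshold}
```

Because a single spike pulls the mean upward, only genuinely sharp surges clear the bar, which is exactly the property motivating the choice of the mean over the median.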

 

Interest in Navalny according to visits to his Wikipedia page in Russian, all agents
 
Figure 1: Interest in Navalny according to visits to his Wikipedia page in Russian, all agents.

 

After detecting the periods when interest in Navalny was above average, users were segmented into those who commented on a video at the time of a spike of interest in Navalny’s activities and those who began commenting on regular days. Returning to Research Question 1, I recall that my interest lies in the formation of a community around Navalny’s YouTube channel. YouTube’s API, expectedly, does not provide data on which users watch Navalny’s videos or when they first started doing so. By community, I therefore mean a narrower layer of users, those who comment on videos, because a user’s interaction with a video is recorded precisely by the posting of a comment. That said, I acknowledge that the community may also include those who regularly watch videos produced by Navalny’s team but choose not to engage in discussions in the comments section. However, the lack of relevant data dictates the need to narrow my focus to commenters.

For the cohorts, I selected only commenters who first began to interact with a video within a week of its release. Videos, in turn, were grouped into those published during periods of high and low interest in Navalny. I took a window of one week after the video release rather than the exact day of publication because this captures the effect of the public discussion of Navalny’s investigations or events around him: people who were not familiar with Navalny’s activities could come to his channel to watch a video not immediately after its release, but with a time lag.

The next step in the cohort analysis is to determine the temporal period over which I evaluate the retention of commenters on Alexei Navalny’s YouTube channel. I limited this period to 15 months, which expresses long-term interest in the content produced by Navalny’s team. Retention was assessed monthly for each commenter after their first entry into the discussion. I look at the period when a comment was left, not at the sequence in which videos were released on Navalny’s channel. For instance, those who commented on the release of the film about former Russian prime minister Dmitry Medvedev (March 2017) may leave other comments the following month (April 2017) but under a video that came out earlier (February 2017). In this way, I assume that a community is formed when users return to old records rather than simply following new releases of anti-corruption investigations.
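The calendar-based retention computation can be sketched as follows (a minimal illustration; the `(author_id, month_index)` event format is an assumption, with `month_index` counting calendar months from an arbitrary origin):

```python
from collections import defaultdict

def monthly_retention(events, horizon=15):
    """Calendar-based retention: `events` is a list of
    (author_id, month_index) pairs. Month 0 is each author's first
    comment; the result gives the share of the cohort active at each
    monthly offset up to `horizon`."""
    first = {}
    active = defaultdict(set)
    for author, month in sorted(events, key=lambda e: e[1]):
        first.setdefault(author, month)      # remember first entry
        offset = month - first[author]
        if offset <= horizon:
            active[offset].add(author)
    cohort = len(first)
    return {m: len(active[m]) / cohort for m in range(horizon + 1)}
```

Because offsets are computed from each author's first comment in whatever month it falls, a returning commenter counts as retained regardless of which (old or new) video they comment under.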

In Appendix D, I also replicate the retention analysis over 15 months of commenting activity as such, rather than 15 consecutive calendar months. To clarify, cohort members may fall silent the month after a video is released simply because no videos were released that month, and they may return later when activity on the channel, in terms of releasing new investigations, resumes. In such situations, earlier periods of Navalny’s presence on YouTube contain more uncertainty (as shown in Figures 4–7). Therefore, this factor of interruptions in content production by Navalny’s team must also be considered.

To assess the differences between groups of users who entered discussions during different periods of interest in Navalny, I used 95 percent confidence intervals obtained via bootstrapping with replacement. Owing to the relatively small number of videos in each of the periods considered, 100,000 bootstrap samples were used.
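A percentile bootstrap of this kind can be sketched in a few lines (the idea of resampling per-video retention rates is an assumption about the unit of resampling; the paper only specifies resampling with replacement and 100,000 samples):

```python
import random
from statistics import mean

def bootstrap_ci(values, n_boot=100_000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for the mean of
    `values` (e.g., per-video retention rates), resampling with
    replacement; n_boot=100,000 matches the paper."""
    rng = random.Random(seed)
    stats = sorted(
        mean(rng.choices(values, k=len(values))) for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Non-overlapping intervals for the high-interest and low-interest cohorts then indicate a statistically meaningful difference in retention.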

To test Hypotheses 1B and 1C, I use a keyword-based approach to focus on users who start their engagement with Navalny’s YouTube channel using pro-government or pro-opposition cues. I checked whether a comment contained any words with derogatory references to the government or the opposition. This procedure does not imply counting the overall number of such words in a comment, which would be necessary for scaling commenters’ positions between the two extremes of a government–opposition continuum. Instead, I detected the mere presence of pro-government or pro-opposition cues, because a focus on insults towards the other side fits the idea of affective polarization driven by emotions rather than policy preferences. This task is accompanied by the challenge of choosing words that capture as many instances of politically colored speech as possible. To address this issue, I followed the iterated computer-assisted keyword selection approach suggested by King, et al. (2017). First, I began with several derogatory words targeting Putin and Navalny. This step cuts out many shades of disagreement (especially those expressed in a less uncivil manner), but its advantage is that it reduces the risk of false positives. Derogatory references to Putin and Navalny were identified through a close reading of YouTube comments and of pages on Lurkmore, which serves as an encyclopedia of political discourse on the Russian Internet (Table E1 in Appendix E). Then, I widened the scope of keywords in a snowball sampling manner, checking the sentiment of comments towards both the government and the opposition by reading either a random subset of the comments containing a particular word (when their number was huge) or the whole subset.

It is hardly possible to be completely sure of the correctness of this classification into oppositional and pro-government categories relying only on keywords. For instance, commenters may use their opponents’ cues as a tongue-in-cheek trick to mock the other side. In addition, although I checked as many distortions and obfuscations as possible, the list of derogatory references is not all-encompassing, because commenters are very creative in name-calling and in intentionally distorting the words included in the dictionary. Simply reading individual comments chosen randomly from the entire corpus does not reveal all possible obfuscations and intentional alterations of the major name-calling terms. Thus, there is inevitably some level of uncertainty in the classification of comments as pro-government or pro-opposition. To address this issue, I follow the same bootstrapping procedure described earlier, but with a focus on those who commented on Navalny’s YouTube channel using pro-government or pro-opposition cues. Bootstrapping allows me to incorporate this uncertainty into the estimates with the corresponding confidence intervals.

Comments containing keywords from both dictionaries — pro-government and oppositional — were excluded from the corpus for cohort analysis. Further details on the distinction between pro-government and oppositional discourses can be found in Appendix E.
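The cue-matching logic, including the exclusion of comments matching both dictionaries, can be sketched as follows. The cue lists here are invented placeholders: the actual dictionaries (Table E1 in Appendix E) consist of Russian derogatory references and are far larger.

```python
import re

# Placeholder cue lists for illustration only; the real dictionaries
# are Russian-language and were built by iterated keyword selection.
PRO_GOVERNMENT_CUES = ["navalslur"]   # hypothetical anti-Navalny cue
PRO_OPPOSITION_CUES = ["putinslur"]   # hypothetical anti-Putin cue

def classify_first_comment(text):
    """Label a comment 'pro-government', 'pro-opposition', or None.
    Comments matching both dictionaries are excluded (None), as in
    the paper's cohort construction."""
    t = text.lower()
    gov = any(re.search(re.escape(w), t) for w in PRO_GOVERNMENT_CUES)
    opp = any(re.search(re.escape(w), t) for w in PRO_OPPOSITION_CUES)
    if gov == opp:          # neither cue, or both -> excluded
        return None
    return "pro-government" if gov else "pro-opposition"
```

Substring matching (rather than exact token matching) is one simple way to catch inflected forms, at the cost of occasional false positives; the paper's manual validation step addresses exactly this trade-off.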

Analysis of comment toxicity

To address Research Question 2, I used the toxicity scores provided by the Google Perspective API [16]. Its developers define toxicity as “a rude, disrespectful, or unreasonable comment that is likely to make people leave a discussion” [17]. Perspective API detects incivility in short texts written in several languages, including Russian [18]. When used as a baseline for detecting toxicity in short Russian-language comments across online communities and services, the service outperformed many of the other methods checked by the authors (Bogoradnikova, et al., 2021).

The service offers several toxicity attributes (identity attack, insult, profanity, threat, etc.). I did not supplement the toxicity scores with other Perspective API attributes because the correlations between them across different comment samples were strong (0.8 and higher). Thus, I worked only with the flagship attribute of Perspective API, the general measure of toxicity. Table F1 in Appendix F contains examples of comments, their English translations, and toxicity scores.
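For reference, a toxicity query to the Perspective API has roughly the following shape. This sketch only builds the request body and parses a response dictionary; the actual HTTP call, API-key handling, batching, and rate limiting are omitted.

```python
API_URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
           "comments:analyze")  # queried with ?key=<API_KEY>

def build_request(text, lang="ru"):
    """JSON body for a Perspective API query requesting only the
    flagship TOXICITY attribute, as in the paper."""
    return {
        "comment": {"text": text},
        "languages": [lang],
        "requestedAttributes": {"TOXICITY": {}},
    }

def extract_score(response):
    """Summary toxicity score in [0, 1] from an API response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
```

Requesting additional attributes (e.g., INSULT or THREAT) would only require extra keys under `requestedAttributes`, but, as noted above, their scores correlate strongly with TOXICITY.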

An estimate of toxicity in the comments section of Navalny’s YouTube channel tells us little about incivility without a point of comparison. It is therefore necessary to find a community comparable to Navalny’s channel in terms of audience characteristics (Navalny’s team attracts different segments of Russian society but predominantly a younger audience). The Russian analog of The Late Show with Stephen Colbert, called Evening Urgant, was chosen (6.36 million YouTube subscribers, close to Navalny’s 6.43 million as of winter 2022). However, I eventually found that Urgant’s YouTube channel was no longer available: after the start of Russia’s full-scale invasion of Ukraine, the YouTube administration removed it because it was associated with Channel One, a TV channel sponsored by the Russian government. The content of the channel is thus unavailable, but general statistics can still be retrieved from the YouTube API (views, likes, and the number of comments). Since I started data collection on this apolitical celebrity channel in 2022, before February 24th and before the channel was banned, the discussions in this community do not reflect the latest events related to the Russian-Ukrainian conflict; otherwise, the level of toxicity could have been much higher.

 

++++++++++

Results

Topics in YouTube discussions

The first research question framing this article asks to what extent Navalny’s content led to the formation of a community of users engaging with it through comments over the long run. To answer it, I start with a description of the discussions evolving in the comments section of Navalny’s YouTube channel. Table 2 summarizes the most frequent words, translated into English, for the 10 topics with the highest share in the corpus. The topic labels were assigned after reading representative comments. Table B1 and Figure B2 contain the full list of topics with the seven most frequent words in English and Russian.

Beyond the expected topics of praising Navalny in various forms [19] and cursing the political elite [20], users treat commenting as a way to participate in politics online (Topics 2, 3, 12, and 51; see Table 2), in addition to more traditional off-line modes of expressing discontent with the current state of affairs [21]. Overall, messages appealing to others to share information from Navalny’s videos widely constitute more than 10 percent of comments.

As can be seen from the STM results, commenters treat promoting videos on YouTube (in particular, through the list of trending videos [22]) as a way of informing others about how the existing political regime contradicts their interests and why they should take an active part in collective action. Below is a translation of several comments with the highest scores for Topic 2 (Discussing a video).

 

Table 2: Summary of STM with k (number of topics)=61 for comments, top 10 topics.

Topic   Summary of topic             Seven top words                                                              Share
5       Freedom for Navalny          Navalny, freedom, free, release, political, reality, estimate                5%
3       Promotion of video           top, keep, hooray, fire, go up, let’s go, maintaining                        4%
16      Corruption among elites      Putin, stay (in jail), thief, official, jail, regular, member of parliament  3%
12      Trending the video           trend, comment, show, raise, bring (to the top), send, promote               3%
41      Save Navalny                 help, fear, hope, God, family, open, idiot                                   2%
11      Revolution                   people, revolution, endure, come, enemy, organize, change                    2%
51      Convince others              tell, film, win, politics, together, strongly, opposition                    2%
9       State propagandists          watch, begin, real, Internet, listen, Soloviev, shock                        2%
2       Discussing a video           video, video clip, write, read, conclusion, delete, destiny                  2%
50      Praising Navalny’s activity  investigation, wait, shoot (video), bravo, close, super, awesome             2%

 

One-off vs. prolific commenters

Commenters are a heterogeneous set that includes both users who enter conversations only occasionally and users who actively participate in ongoing discussions. What is the ratio of the two groups among all commenters on Navalny’s YouTube videos? When do the different types of users post comments? Specifically, I divide the time range under scrutiny into periods of high and low interest in Navalny.

First, one-off commenters outnumber those actively involved in the discussions. However, the latter group contributed eight times more comments (8,013,733 versus 966,580; Figure 2).

 

Number of (A) commenters and (B) comments left by one-off and prolific commenters
 
Figure 2: Number of (A) commenters and (B) comments left by one-off and prolific commenters.

 

Second, most of the one-off commenters appeared in Navalny’s community when there was high interest in his activity (503,072 against 304,324) (Table 3).

Third, during a period of high interest in Navalny, prolific commenters posted messages on Navalny’s videos slightly less than during a period of low interest (Table 4).

Here I consider only comments posted within a week of a video’s release, with videos again grouped into those published during periods of high and low interest in Navalny; the number of comments is therefore smaller than in Figure 2. As explained in the methods section, the one-week window rather than the exact day of publication captures the delayed reactions of people who came to the channel during the public discussion of Navalny’s investigations or the events around him; subsetting the corpus to comments posted on the day of the video release would not fully capture such delayed reactions [23].

 

Table 3: Number of comments left by one-off and prolific commenters in the period of high and low interest in Navalny, within a week after a video release.
 By one-off commentersBy prolific commenters
High interest in Navalny503,0723,307,696
Low interest in Navalny304,0323,854,308

 

 

Table 4: Number of prolific commenters and comments they posted depending on the period of interest in Navalny, within a week after a video release.
 High interest in NavalnyLow interest in Navalny
Commenters365,487379,126
Comments3,307,6963,854,308

 

Testing Hypothesis 1A

After presenting these general patterns of user-video interaction, I move on to Hypothesis 1A, which concerns how the level of public interest in Navalny’s activities is associated with the subsequent retention of commenters in the YouTube community of Russia’s opposition leader.

Figure 3 shows an extremely low level of commenter retention over the 15-month period. The share drops to about two percent in the second month and falls below one percent in subsequent months (an average of 0.3 percent for periods of high interest in Navalny and 0.6 percent when this interest was below average; see Tables D1 and D2 in Appendix D). Figure 4 shows a zoomed view of these dynamics. It is worth noting that such low retention may be due to the approach of counting calendar months after the first interaction with the content. Appendix D contains a similar chart in which the 15 months are counted not as calendar months but as months of commenting activity as such. In this case, the 15 months can span a much larger time range, because Navalny’s team sometimes paused video production; this was typical, for example, of 2013–2015, when videos were released less often. Therefore, it also makes sense to look at Figure D1 in Appendix D.

Generally, in the first few months after entering the discussion, those who did so in the period of high interest in Navalny were more active than users from the low-interest group (Figures 3, 4, and D1 in Appendix D). However, members of the latter group remained in the community longer. At the 15-month mark, when the focus is on the act of commenting rather than the calendar sequence, retention for the group of commenters who came during a period of high interest in Navalny was 4.6 percent (the 95 percent bootstrapped confidence interval runs from 4.1 to 5.2 percent; see Table D7 in Appendix D). The second group had a retention of 9 percent (the 95 percent bootstrapped confidence interval is between 8.1 and 10.1 percent; see Table D8 in Appendix D).

 

Retention of different types of commenters, with bootstrapped 95% confidence intervals
 
Figure 3: Retention of different types of commenters, with bootstrapped 95% confidence intervals.

 

Thus, observations from both versions of the retention analysis for the whole set of commenters confirm the hypothesized relationship: engaging with Navalny’s content when interest in him is high is less likely to form a long-term relationship with his YouTube community than engaging during a period of low interest.

 

Retention of different types of commenters, with bootstrapped 95% confidence intervals, zoomed view
 
Figure 4: Retention of different types of commenters, with bootstrapped 95% confidence intervals, zoomed view.

 

Testing Hypothesis 1B

I record the same dynamics when the focus is on commenters who entered the discussion using pro-government and, thus, anti-opposition cues (Figures 5 and D2 in Appendix D). Users from high-interest periods are more likely to leave the community in the long run than their peers from low-interest periods.

 

Retention of commenters using pro-government cues, with bootstrapped 95% confidence intervals
 
Figure 5: Retention of commenters using pro-government cues, with bootstrapped 95% confidence intervals.

 

The confidence intervals overlap even less in the case of pro-government commenters: the difference in retention is statistically significant after the fifth month. In the fifteenth month, the retention of those who came during the period of high interest in Navalny was 3.3 percent (the 95 percent confidence interval lies between 2.7 and 4 percent; see Table D3 in Appendix D), whereas retention in the other group was 10.4 percent (the 95 percent confidence interval ranges from 7.7 to 13.5 percent; see Table D4 in Appendix D). Tables D9 and D10 and Figure D2 in Appendix D present the same retention analysis with the 15 months counted by commenting activity rather than the calendar sequence; these supplementary materials demonstrate the same dynamics as the main analysis. Thus, Hypothesis 1B was confirmed.

Testing Hypothesis 1C

Commenters who make derogatory references to the government in their first post are also retained in Navalny’s community more often when this first interaction occurs during a period of low interest in the politician (Figures 6 and D3 in Appendix D). There is a two-fold difference in the average values for the fifteenth month in favor of the no-hype period (10.8 percent versus 5.6 percent; see Tables D5 and D6 in Appendix D for more information on confidence intervals). The modification of the retention analysis with a focus on the fact of commenting rather than the calendar sequence is presented in Tables D11 and D12 and Figure D3 in Appendix D. Here we see the same trend as for YouTube commenters in general (Hypothesis 1A) and for those who enter discussions using pro-government cues (Hypothesis 1B).

 

Retention of commenters using pro-opposition cues, with bootstrapped 95% confidence intervals
 
Figure 6: Retention of commenters using pro-opposition cues, with bootstrapped 95% confidence intervals.

 

Toxicity of comments

Research Question 2, about the extent of toxicity in discussions taking place on Navalny’s YouTube channel, requires establishing a baseline for the level of incivility and assessing how discussions in Navalny’s community differ from it. Figures 7 and F1 in Appendix F compare the toxicity of discussions on Navalny’s channel with a sample of comments from an apolitical celebrity channel, Evening Urgant. Overall, the level of incivility was much higher in Navalny’s community, and this holds for all three types of comments. On average, top-level comments with threads are more uncivil than messages posted in threads and than top-level comments that do not open a discussion. I also conducted a formal test of statistical significance using the non-parametric Kruskal-Wallis analysis of variance and Dunn’s test of multiple comparisons, which confirm the statistical significance of the results presented in Figure 7. The Kruskal-Wallis test statistic was 101,454 with 5 degrees of freedom, yielding a p-value below 2.2 × 10^-16. Pairwise comparisons using Dunn’s test indicated that the toxicity scores of all three types of comments posted in Navalny’s community were significantly higher than those posted on Urgant’s YouTube channel. Thus, political discussions in Navalny’s community carry a higher level of toxicity than those on an apolitical celebrity’s channel.

 

Toxicity in comments of Navalny’s YouTube channel and the apolitical Evening Urgant channel
 
Figure 7: Toxicity in comments of Navalny’s YouTube and apolitical — Evening Urgant — channel.

 

 

++++++++++

Discussion and conclusion

This descriptive analysis of discussions in the YouTube community of Alexei Navalny is driven by (a) his specific role as a stressor of the political regime, who was the only political actor capable of organizing a nationwide protest movement in 2013–2021, and (b) the role of the platform itself in the Russian media system, which continues to be a haven for political outsiders to broadcast alternative information to a broader audience.

As the results of structural topic modeling showed, commenters regard promoting Navalny’s content among other YouTube users and wider audiences as an important effort to win over more like-minded people and eventually undermine public support for the regime. Based on the comparison of toxicity in comments posted on Navalny’s channel and on the popular entertainment channel Evening Urgant, I demonstrate that discussions in Navalny’s community are more uncivil than on an apolitical celebrity channel.

One-off commenters (those who write a comment only once and do not contribute to a discussion) outnumbered those who commented more often, but the latter category produced eight times more comments. Whereas one-off commenters joined discussions more often during periods of high interest in Navalny’s activities, prolific commenters joined predominantly when public interest in the Russian opposition leader was below average.

Retention analysis of the two cohorts (those who started commenting in Navalny’s community when interest in him was high and those who started when it was not) showed that affective attunement around socially important events was less likely to keep newcomers in the community in the long run. The cohort of commenters who first engaged with Navalny’s content during a period of high public interest in the politician’s activities was less likely to stay in the community over the following 15 months. Conversely, those who started commenting when there was no heightened interest in Navalny lingered longer in the community of Russia’s most vocal opposition politician. Opposition critics and supporters show similar trends: users who joined during a period of high interest in Navalny stayed in his community less often than those who joined during a period of relatively low interest from the general public.

Even though I do not study the causes and mechanisms of such behavior, I can speculate on the basis of theoretical premises proposed by earlier researchers. The hype around Navalny and the associated views of his main media outlet, his YouTube channel, characterize the process of affective attunement, which, by definition, cannot last long but can be intense. This intensity is expressed primarily in the practice of commenting and can be framed as connective effervescence (Ventura, et al., 2021): following socially important events, people express support for one side or the other in streaming chats and act as a kind of fan crowd. Their comments tend to be short, repetitive, and of limited deliberative potential. Although my focus is not on live debates but on high-quality investigative content intended to have far-reaching political implications, connective effervescence can still be seen. As the text analysis of comments showed, appeals for coordinated action to promote a video into YouTube’s trending tab are among the most common topics. Users are thus ready to mobilize to promote a given video and break through the informational barrier built by the authorities. This observation may be interpreted through the prism of the phenomenon called slacktivism (Morozov, 2011), whereby social media create an illusion of involvement through low-effort collective acts (pressing the “like” button or writing a comment to promote a video in YouTube’s recommendation system). In this regard, one may doubt the effectiveness of Navalny’s active information campaign: even though people watch Navalny’s content, this does not necessarily have far-reaching consequences, because the level of political apathy in society remains high.

However, this interpretation is not fully satisfactory in light of subsequent research on slacktivism, which casts doubt on the negative connotations of the phenomenon (Christensen, 2011; Jones, 2015; Madison and Klang, 2020). Not all forms of online activity should be treated the same (Lane and Cin, 2017), because they can be relevant to a specific context without any connotation of deficiency. Moreover, there is heterogeneity in how different individuals perceive social media’s influence on politics and in their subsequent participation in off-line political action (Kwak, et al., 2018). Navalny’s team understands that the information hype caused by the release of anti-corruption investigations does not instantly create a community ready for collective action to change the political regime. The head of Navalny’s regional campaign headquarters for the 2018 presidential elections, Leonid Volkov, clearly articulated that they regard their organizational and informational activities as a gradient of opportunities to participate in politics: “I have visualized it many times at meetings for volunteers as such a pyramid: a million supporters; one hundred thousand of them are people who can donate; ten thousand of them are those who can agitate; one thousand will come to our regional campaign headquarters; one hundred are ready to receive an administrative arrest; ten are ready to go to prison, and only one is Alexei Navalny” [24].

Affective attunement does not necessarily create a community of active supporters, and it does not significantly reduce the costs of collective action. But it does allow activists to break through the propaganda barrier and information restrictions faced by independent activists and politicians in autocracies. As a retrospective overview of the evolution of the movement created by Alexei Navalny shows, this information work and hype eventually led thousands of activists to become involved in the political activity of a nationwide movement in a geographically sparse country such as Russia. At the same time, such affective attunement eventually led to greater polarization in Russian politics. Subsequently, this was manifested in the way the authorities dealt with both Alexei Navalny himself and his movement: its main activists were forced to leave the country, some of them were jailed, and the movement itself was declared extremist.

This descriptive analysis opens new perspectives for future research. Despite efforts to collect as many keywords as possible and to validate the results (iterated computer-assisted keyword selection, bootstrapping), I acknowledge the naivety of a keyword-based approach focused on derogatory words directed at politicians. One possible direction is to employ more sophisticated machine-learning techniques and a network analysis toolkit to study the pro-government and opposition-minded camps of commenters. Another promising direction for future research is to examine the association between off-line events and online behavior through causal inference identification strategies. Finally, this study addresses the role of emotions from the actors’ perspective only indirectly, even though emotions are the basis of affective polarization in authoritarian regimes, where the confrontation takes place between the government and the opposition. A necessary next step is a more granular analysis focused on the positive and negative emotions that users express about the government and the opposition. End of article

 

About the author

Aidar Zinnatullin is a Ph.D. candidate in the Department of Political and Social Sciences at the University of Bologna (Italy).
E-mail: aidar [dot] zinnatullin2 [at] unibo [dot] it

 

Notes

1. Dean, 2010, p. 22.

2. Основные источники информации россиян (in Russian) — Main information sources of Russians at https://www.levada.ru/2022/11/03/osnovnye-istochniki-informatsii-rossiyan/, accessed 4 November 2022.

3. Belinskaya, 2021, p. 678.

4. Bastos, et al., 2013, p. 269.

5. Новый худший год. Главное из доклада "Агоры" о свободе интернета в России (in Russian) — New bad year. Key takeaways from Agora’s report on Internet freedom in Russia, at https://www.bbc.com/russian/features-47123831, accessed 1 November 2022; Свобода интернета 2019: план "Крепость". Совместный доклад Агоры и Роскомсвободы (in Russian) — Internet Freedom 2019 in Russia: Fortress Plan. A joint report by Agora and Roskomsvoboda, at https://2019.runet.report, accessed 1 November 2022.

6. Социальные сети в России: цифры и тренды, осень 2019 (in Russian) — Social media in Russia: data for autumn 2019, at https://br-analytics.ru/blog/social-media-russia-2019/, accessed 1 November 2022.

7. ТОП-20 микроблогов в Twitter — август 2020 (in Russian) — Top 20 bloggers on Twitter, August 2020, at https://www.mlg.ru/ratings/socmedia/twitter/7718/, accessed 1 November 2022.

8. Кто устроил охоту на Навального (in Russian) — Who initiated the hunt for Navalny, at https://istories.media/investigations/2020/09/17/kto-ustroil-okhotu-na-navalnogo/, accessed 31 October 2022.

9. Gel’man, 2015, p. 181.

10. Протесты и Навальный (in Russian) — Protests and Navalny, at https://www.levada.ru/2017/07/17/protesty-i-navalnyj/, accessed 1 November 2022.

11. Most used social media platforms in Russia as of October 2021, by monthly publications, at https://www.statista.com/statistics/284447/russia-social-network-penetration/, accessed 1 November 2022.

12. Рейтинг Медиалогии (in Russian) — Medialogia ratings, at https://www.mlg.ru/ratings/socmedia/youtube/10332/, accessed 1 November 2022.

13. Alizada, et al., 2022, p. 31.

14. Comments for the video about Putin’s palace (https://www.youtube.com/watch?v=ipAnwilMncI) were excluded from several steps of the analysis because of an improper comment thread structure: the platform’s API did not distinguish between top-level messages and threaded replies. This exclusion does not severely affect the representation of the discussions.

15. CaoJuan2009 is named after the article in which its authors proposed choosing the number of topics by minimizing the average cosine distance between topics (Cao, et al., 2009). Deveaud2014 computes the information divergence between all pairs of LDA topics and maximizes it (Deveaud, et al., 2014). Arun2010 seeks the minimum of the symmetric Kullback–Leibler divergence between distributions derived from the topic-word and document-topic matrices produced by LDA (Arun, et al., 2010). Griffiths2004 uses a Markov chain Monte Carlo algorithm and selects the number of topics k that maximizes the log-likelihood of the observed words (Griffiths and Steyvers, 2004).
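The intuition behind the CaoJuan2009-style metric can be sketched in a few lines. The following is an illustrative Python fragment (the study itself used R and the ldatuning package; the function name and the toy matrices here are my own): it computes the average pairwise cosine measure between the rows of a topic-word matrix, which the metric minimizes over candidate numbers of topics — well-separated topics yield a lower value than near-duplicate topics.

```python
import numpy as np

def cao_juan_metric(topic_word: np.ndarray) -> float:
    """Average pairwise cosine measure between topic-word rows.

    topic_word: (k, V) matrix; each row is one topic's distribution
    over the vocabulary. Lower values mean more distinct topics.
    """
    k = topic_word.shape[0]
    norms = np.linalg.norm(topic_word, axis=1)
    total, pairs = 0.0, 0
    for i in range(k):
        for j in range(i + 1, k):
            total += topic_word[i] @ topic_word[j] / (norms[i] * norms[j])
            pairs += 1
    return total / pairs

# Two well-separated topics score lower than two near-duplicate topics:
distinct = np.array([[0.9, 0.05, 0.05], [0.05, 0.05, 0.9]])
similar = np.array([[0.8, 0.1, 0.1], [0.75, 0.15, 0.1]])
assert cao_juan_metric(distinct) < cao_juan_metric(similar)
```

In practice one would fit LDA for a grid of k values, compute the metric on each fitted topic-word matrix, and pick the k at the minimum.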

16. I use an R wrapper provided by Votta (2019).

17. “About the API,” at https://developers.perspectiveapi.com/s/about-the-api, accessed 2 November 2022.

18. “FAQs,” at https://developers.perspectiveapi.com/s/about-the-api-faqs, accessed 2 November 2022.
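The toxicity scoring described in notes 17–18 amounts to one JSON request per comment against the Perspective API’s comments:analyze endpoint. The following Python fragment is illustrative only (the study itself used the peRspective R wrapper; the helper name here is my own, and the request shape follows the public API documentation):

```python
import json

PERSPECTIVE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/"
    "comments:analyze?key=API_KEY"  # API_KEY is a placeholder
)

def build_perspective_request(text: str, lang: str = "ru") -> dict:
    """Build the JSON body for a Perspective API toxicity request."""
    return {
        "comment": {"text": text},
        "languages": [lang],
        "requestedAttributes": {"TOXICITY": {}},
    }

payload = build_perspective_request("пример комментария")
print(json.dumps(payload, ensure_ascii=False))
```

The response contains a TOXICITY summary score between 0 and 1 for the submitted comment; POSTing the payload to the endpoint above (with a valid key) is all the wrapper does under the hood.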

19. Topics 5, 41, 50: ‘свободу Алексею Навальному!’ (‘Freedom for Alexei Navalny!’).

20. Topics 9, 16: ‘я думал в россии больше нечего воровать, а оказывается есть. и надо всех сажать кто в думе сидит. сколько они могут там сидеть’ (‘I thought there was nothing more to steal in Russia, but it turns out there is. And it is necessary to jail everyone who is sitting in Parliament. How long can they sit there?’).

21. Topic 11: ‘люди, долго вы будете это все терпеть?! восстаньте! захватите власть, свергните всех власте-пренадлежащих людей, всех коррупционеров, лжецов, прохиндеев. пусть в россии будет хорошо без них и даже лучше! свергнуть власть!’ (‘People, how long will you endure this?! Rise! Seize power, overthrow all people in power, all corrupt officials, liars, and swindlers. Let it be good in Russia without them, and even better! Overthrow the government!’).

22. The trending tab displays the most relevant videos for the audience in a country; it is the same for all users from a particular country.

23. In Appendix C (Tables C1 and C2), information on the distribution of comments and commenters based on a video — without taking into account the month when a comment was posted — is also presented. In general, the trend described in Tables 5 and 6 is also observed in this case.

24. “Базовая инфраструктура протеста никуда не денется”. Константин Гаазе говорит с Леонидом Волковым о разгроме штабов Навального. Это подкаст, но мы его для вас расшифровали (да!) (in Russian) — “The basic infrastructure of the protest is not going anywhere”: Konstantin Gaaze speaks with Leonid Volkov about the defeat of Navalny’s headquarters. This is a podcast, but we transcribed it for you (yes!), at https://meduza.io/feature/2021/04/29/bazovaya-infrastruktura-protesta-nikuda-ne-denetsya, accessed 29 October 2022.

 

References

Nazifa Alizada, Vanessa Alexandra Boese, Martin Lundstedt, Kelly Morrison, Natalia Natsika, Yuko Sato, Hugo Tai, and Staffan I. Lindberg, 2022. “Democracy report 2022: Autocratization changing nature?” V-Dem Institute, University of Gothenburg, at https://v-dem.net/media/publications/dr_2022.pdf, accessed 17 November 2022.

Ala’ Alrababa’h and Lisa Blaydes, 2020. “Authoritarian media and diversionary threats: Lessons from 30 years of Syrian state discourse,” Political Science Research and Methods, volume 9, number 4, pp. 693–708.
doi: https://doi.org/10.1017/psrm.2020.28, accessed 17 November 2022.

R. Arun, V. Suresh, C.E. Veni Madhavan, and M.N. Narasimha Murthy, 2010. “On finding the natural number of topics with latent dirichlet allocation: Some observations,” In: Mohammed J. Zaki, Jeffrey Xu Yu, B. Ravindran, and Vikram Pudi (editors). Advances in knowledge discovery and data mining, part 1. Lecture Notes in Computer Science, volume 6118. Berlin: Springer, pp. 391–402.
doi: https://doi.org/10.1007/978-3-642-13657-3_43, accessed 18 November 2022.

Selim Erdem Aytaç, 2021. “Effectiveness of incumbent’s strategic communication during economic crisis under electoral authoritarianism: Evidence from Turkey,” American Political Science Review, volume 115, number 4, pp. 1,517–1,523.
doi: https://doi.org/10.1017/s0003055421000587, accessed 18 November 2022.

Christopher A. Bail, Lisa P. Argyle, Taylor W. Brown, John P. Bumpus, Haohan Chen, M. B. Fallin Hunzaker, Jaemin Lee, Marcus Mann, Friedolin Merhout, and Alexander Volfovsky, 2018. “Exposure to opposing views on social media can increase political polarization,” Proceedings of the National Academy of Sciences, volume 115, number 37 (28 August), pp. 9,216–9,221.
doi: https://doi.org/10.1073/pnas.1804840115, accessed 18 November 2022.

Eytan Bakshy, Solomon Messing, and Lada A. Adamic, 2015. “Exposure to ideologically diverse news and opinion on Facebook,” Science, volume 348, number 6239 (7 May), pp. 1,130–1,132.
doi: https://doi.org/10.1126/science.aaa1160, accessed 18 November 2022.

Pablo Barbera, John T. Jost, Jonathan Nagler, Joshua A. Tucker, and Richard Bonneau, 2015. “Tweeting from left to right: Is online political communication more than an echo chamber?” Psychological Science, volume 26, number 10, pp. 1,531–1,542.
doi: https://doi.org/10.1177/0956797615594620, accessed 18 November 2022.

Marco Bastos, Rafael Luis Galdini Raimundo, and Rodrigo Travitzki, 2013. “Gatekeeping Twitter: Message diffusion in political hashtags,” Media, Culture & Society, volume 35, number 2, pp. 260–270.
doi: https://doi.org/10.1177/0163443712467594, accessed 18 November 2022.

Yulia Belinskaya, 2021. “The ghosts Navalny met: Russian YouTube-sphere in check,” Journalism and Media, volume 2, number 4, pp. 674–696.
doi: https://doi.org/10.3390/journalmedia2040040, accessed 18 November 2022.

Lance W. Bennett and Alexandra Segerberg, 2012. “The logic of connective action: Digital media and the personalization of contentious politics,” Information, Communication & Society, volume 15, number 5, pp. 739–768.
doi: https://doi.org/10.1080/1369118x.2012.670661, accessed 18 November 2022.

Darya Bogoradnikova, Olesia Makhnytkina, Anton Matveev, Anastasia Zakharova, and Artem Akulov, 2021. “Multilingual sentiment analysis and toxicity detection for text messages in Russian,” 2021 29th Conference of Open Innovations Association (FRUCT).
doi: https://doi.org/10.23919/fruct52173.2021.9435584, accessed 18 November 2022.

Michael Bratton and Nicolas van de Walle, 1994. “Neopatrimonial regimes and political transitions in Africa,” World Politics, volume 46, number 4, pp. 453–489.
doi: https://doi.org/10.2307/2950715, accessed 18 November 2022.

Juan Cao, Tian Xia, Jintao Li, Yongdong Zhang, and Sheng Tang, 2009. “A density-based method for adaptive LDA model selection,” Neurocomputing, volume 72, numbers 7–9, pp. 1,775–1,781.
doi: https://doi.org/10.1016/j.neucom.2008.06.011, accessed 18 November 2022.

Henrik Serup Christensen, 2011. “Political activities on the Internet: Slacktivism or political participation by other means?” First Monday, volume 16, number 2.
doi: https://doi.org/10.5210/fm.v16i2.3336, accessed 18 November 2022.

Matteo Cinelli, Gianmarco De Francisci Morales, Alessandro Galeazzi, Walter Quattrociocchi, and Michele Starnini, 2021. “The echo chamber effect on social media,” Proceedings of the National Academy of Sciences, volume 118, number 9 (23 February), e2023301118.
doi: https://doi.org/10.1073/pnas.2023301118, accessed 18 November 2022.

Mark Davis, 2021. “The online anti-public sphere,” European Journal of Cultural Studies, volume 24, number 1, pp. 143–159.
doi: https://doi.org/10.1177/1367549420902799, accessed 8 August 2023.

Jodi Dean, 2010. “Affective networks,” Media Tropes eJournal, volume 2, number 2 (25 February), at https://mediatropes.com/index.php/Mediatropes/article/view/11932, accessed 18 November 2022.

Romain Deveaud, Éric SanJuan, and Patrice Bellot, 2014. “Accurate and effective latent concept modeling for ad hoc information retrieval,” Document Numérique, volume 17, number 1, pp. 61–84.
doi: https://doi.org/10.3166/dn.17.1.61-84, accessed 18 November 2022.

Jayson L. Dibble and Sarah F. Rosaen, 2011. “Parasocial interaction as more than friendship,” Journal of Media Psychology, volume 23, number 3, pp. 122–132.
doi: https://doi.org/10.1027/1864-1105/a000044, accessed 18 November 2022.

Jan Matti Dollbaum, Andrey Semenov, and Elena Sirotkina, 2018. “A top-down movement with grass-roots effects? Alexei Navalny’s electoral campaign,” Social Movement Studies, volume 17, number 5, pp. 618–625.
doi: https://doi.org/10.1080/14742837.2018.1483228, accessed 18 November 2022.

Ruben Enikolopov, Alexey Makarin, and Maria Petrova, 2020. “Social media and protest participation: Evidence from Russia,” Econometrica, volume 88, number 4, pp. 1,479–1,514.
doi: https://doi.org/10.3982/ecta14281, accessed 18 November 2022.

Ruben Enikolopov, Alexey Makarin, and Maria Petrova, 2018. “Social image, networks, and protest participation,” SSRN Electronic Journal (24 May).
doi: https://doi.org/10.2139/ssrn.2940171, accessed 18 November 2022.

Vladimir Gel’man, 2015. “Political opposition in Russia: A troubled transformation,” Europe-Asia Studies, volume 67, number 2, pp. 177–191.
doi: https://doi.org/10.1080/09668136.2014.1001577, accessed 18 November 2022.

Thomas L. Griffiths and Mark Steyvers, 2004. “Finding scientific topics,” Proceedings of the National Academy of Sciences, volume 101, supplement 1 (6 April), pp. 5,228–5,235.
doi: https://doi.org/10.1073/pnas.0307752101, accessed 18 November 2022.

Justin Grimmer and Brandon M. Stewart, 2013. “Text as data: The promise and pitfalls of automatic content analysis methods for political texts,” Political Analysis, volume 21, number 3, pp. 267–297.
doi: https://doi.org/10.1093/pan/mps028, accessed 18 November 2022.

Justin Grimmer, Brandon M. Stewart, and Margaret E. Roberts, 2022. Text as data: A new framework for machine learning and the social sciences. Princeton, N.J.: Princeton University Press.

Seva Gunitsky, 2015. “Corrupting the cyber-commons: Social media as a tool of autocratic stability,” Perspectives on Politics, volume 13, number 1, pp. 42–54.
doi: https://doi.org/10.1017/s1537592714003120, accessed 18 November 2022.

Lei Guo, Jacob A. Rohde, and H. Denis Wu, 2018. “Who is responsible for Twitter’s echo chamber problem? Evidence from 2016 U.S. election networks,” Information, Communication & Society, volume 23, number 2, pp. 234–251.
doi: https://doi.org/10.1080/1369118x.2018.1499793, accessed 18 November 2022.

Sergei Guriev and Daniel Treisman, 2022. Spin dictators: The changing face of tyranny in the 21st century. Princeton, N.J.: Princeton University Press.

Daniel Halpern and Jennifer Gibbs, 2013. “Social media as a catalyst for online deliberation? Exploring the affordances of Facebook and YouTube for political expression,” Computers in Human Behavior, volume 29, number 3, pp. 1,159–1,168.
doi: https://doi.org/10.1016/j.chb.2012.10.008, accessed 18 November 2022.

Tal Orian Harel, Jessica Katz Jameson, and Ifat Maoz, 2020. “The normalization of hatred: Identity, affective polarization, and dehumanization on Facebook in the context of intractable political conflict,” Social Media + Society (12 May).
doi: https://doi.org/10.1177/2056305120913983, accessed 18 November 2022.

William R. Hobbs and Margaret E. Roberts, 2018. “How sudden censorship can increase access to information,” American Political Science Review, volume 112, number 3, pp. 621–636.
doi: https://doi.org/10.1017/s0003055418000084, accessed 19 November 2022.

Daniel J. Hopkins and Gary King, 2010. “A method of automated nonparametric content analysis for social science,” American Journal of Political Science, volume 54, number 1, pp. 229–247.
doi: https://doi.org/10.1111/j.1540-5907.2009.00428.x, accessed 19 November 2022.

Hyunseo Hwang, Youngju Kim, and Catherine U. Huh, 2014. “Seeing is believing: Effects of uncivil online debate on political polarization and expectations of deliberation,” Journal of Broadcasting & Electronic Media, volume 58, number 4, pp. 621–633.
doi: https://doi.org/10.1080/08838151.2014.966365, accessed 19 November 2022.

Shanto Iyengar and Sean J. Westwood, 2015. “Fear and loathing across party lines: New evidence on group polarization,” American Journal of Political Science, volume 59, number 3, pp. 690–707.
doi: https://doi.org/10.1111/ajps.12152, accessed 19 November 2022.

Shanto Iyengar, Gaurav Sood, and Yphtach Lelkes, 2012. “Affect, not ideology: A social identity perspective on polarization,” Public Opinion Quarterly, volume 76, number 3, pp. 405–431.
doi: https://doi.org/10.1093/poq/nfs038, accessed 19 November 2022.

Eric Jardine, 2016. “Tor, what is it good for? Political repression and the use of online anonymity-granting technologies,” New Media & Society, volume 20, number 2, pp. 435–452.
doi: https://doi.org/10.1177/1461444816639976, accessed 19 November 2022.

Cat Jones, 2015. “Slacktivism and the social benefits of social video: Sharing a video to ‘help’ a cause,” First Monday, volume 20, number 5.
doi: https://doi.org/10.5210/fm.v20i5.5855, accessed 19 November 2022.

Rune Karlsen, Kari Steen-Johnsen, Dag Wollebæk, and Bernard Enjolras, 2017. “Echo chamber and trench warfare dynamics in online debates,” European Journal of Communication, volume 32, number 3, pp. 257–273.
doi: https://doi.org/10.1177/0267323117695734, accessed 19 November 2022.

Gary King, Patrick Lam, and Margaret E. Roberts, 2017. “Computer-assisted keyword and document set discovery from unstructured text,” American Journal of Political Science, volume 61, number 4, pp. 971–988.
doi: https://doi.org/10.1111/ajps.12291, accessed 19 November 2022.

Gary King, Jennifer Pan, and Margaret E. Roberts, 2014. “Reverse-engineering censorship in China: Randomized experimentation and participant observation,” Science, volume 345, number 6199 (22 August), 1251722.
doi: https://doi.org/10.1126/science.1251722, accessed 19 November 2022.

Gary King, Jennifer Pan, and Margaret E. Roberts, 2013. “How censorship in China allows government criticism but silences collective expression,” American Political Science Review, volume 107, number 2, pp. 326–343.
doi: https://doi.org/10.1017/s0003055413000014, accessed 19 November 2022.

Nojin Kwak, Daniel S. Lane, Brian E. Weeks, Dam Hee Kim, Slgi S. Lee, and Sarah Bachleda, 2018. “Perceptions of social media for politics: Testing the slacktivism hypothesis,” Human Communication Research, volume 44, number 2, pp. 197–221.
doi: https://doi.org/10.1093/hcr/hqx008, accessed 19 November 2022.

Melis G. Laebens and Aykut Öztürk, 2020. “Partisanship and autocratization: Polarization, power asymmetry, and partisan social identities in Turkey,” Comparative Political Studies, volume 54, number 2, pp. 245–279.
doi: https://doi.org/10.1177/0010414020926199, accessed 19 November 2022.

Daniel S. Lane and Sonya Dal Cin, 2017. “Sharing beyond Slacktivism: The effect of socially observable prosocial media sharing on subsequent offline helping behavior,” Information, Communication & Society, volume 21, number 11, pp. 1,523–1,540.
doi: https://doi.org/10.1080/1369118x.2017.1340496, accessed 18 November 2022.

Marlene Laruelle, 2014. “Alexei Navalny and challenges in reconciling ‘nationalism’ and ‘liberalism’,” Post-Soviet Affairs, volume 30, number 4, pp. 276–297.
doi: https://doi.org/10.1080/1060586x.2013.872453, accessed 19 November 2022.

Matthew Levendusky, 2013. “Partisan media exposure and attitudes toward the opposition,” Political Communication, volume 30, number 4, pp. 565–581.
doi: https://doi.org/10.1080/10584609.2012.737435, accessed 19 November 2022.

Anna Litvinenko, 2021. “YouTube as alternative television in Russia: Political videos during the presidential election campaign 2018,” Social Media + Society (16 March).
doi: https://doi.org/10.1177/2056305120984455, accessed 19 November 2022.

Nora Madison and Mathias Klang, 2020. “The case for digital activism,” Journal of Digital Social Research, volume 2, number 2, pp. 28–47.
doi: https://doi.org/10.33621/jdsr.v2i2.25, accessed 19 November 2022.

Daniel Maier, Andreas Niekler, Gregor Wiedemann, and Daniela Stoltenberg, 2020. “How document sampling and vocabulary pruning affect the results of topic models,” Computational Communication Research, volume 2, number 2, pp. 139–152, and at https://computationalcommunication.org/ccr/article/view/32, accessed 19 November 2022.

Mykola Makhortykh, Aleksandra Urman, and Mariëlle Wijermars, 2022. “A story of (non)compliance, bias, and conspiracies: How Google and Yandex represented Smart Voting during the 2021 parliamentary elections in Russia,” Harvard Kennedy School Misinformation Review (7 March).
doi: https://doi.org/10.37016/mr-2020-94, accessed 19 November 2022.

Andrew Cesare Miller, 2022. “#DictatorErdogan: How social media bans trigger backlash,” Political Communication, volume 39, number 6, pp. 801–825.
doi: https://doi.org/10.1080/10584609.2022.2109084, accessed 19 November 2022.

David Mimno and Moontae Lee, 2014. “Low-dimensional embeddings for interpretable anchor-based topic inference,” Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1,319–1,328.
doi: https://doi.org/10.3115/v1/d14-1138, accessed 19 November 2022.

Natalia Moen-Larsen, 2014. “‘Normal nationalism’: Alexei Navalny, LiveJournal and ‘the other’,” East European Politics, volume 30, number 4, pp. 548–567.
doi: https://doi.org/10.1080/21599165.2014.959662, accessed 19 November 2022.

Juan S. Morales, 2019. “Perceived popularity and online political dissent: Evidence from Twitter in Venezuela,” International Journal of Press/Politics, volume 25, number 1, pp. 5–27.
doi: https://doi.org/10.1177/1940161219872942, accessed 19 November 2022.

Evgeny Morozov, 2011. Net delusion: The dark side of Internet freedom. New York: PublicAffairs.

Ashley Muddiman and Natalie Jomini Stroud, 2017. “News values, cognitive biases, and partisan incivility in comment sections,” Journal of Communication, volume 67, number 4, pp. 586–609.
doi: https://doi.org/10.1111/jcom.12312, accessed 19 November 2022.

Nikita Murzintcev and Nathan Chaney, 2020. “Ldatuning: Tuning of the Latent Dirichlet Allocation models parameters,” at https://cran.r-project.org/package=ldatuning, accessed 19 November 2022.

Elizabeth R. Nugent, 2020. “The psychology of repression and polarization,” World Politics, volume 72, number 2, pp. 291–334.
doi: https://doi.org/10.1017/s0043887120000015, accessed 19 November 2022.

Mancur Olson, 1971. The logic of collective action: Public goods and the theory of groups. Cambridge, Mass.: Harvard University Press.
doi: https://doi.org/10.2307/j.ctvjsf3ts, accessed 19 November 2022.

Tim O’Reilly, 2005. “What is Web 2.0. Design patterns and business models for the next generation of software” (30 September), at https://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html, accessed 19 November 2022.

Robert W. Orttung and Elizabeth Nelson, 2018. “Russia Today’s strategy and effectiveness on YouTube,” Post-Soviet Affairs, volume 35, number 2, pp. 77–92.
doi: https://doi.org/10.1080/1060586x.2018.1531650, accessed 19 November 2022.

Jennifer Pan and Alexandra A. Siegel, 2019. “How Saudi crackdowns fail to silence online dissent,” American Political Science Review, volume 114, number 1, pp. 109–125.
doi: https://doi.org/10.1017/s0003055419000650, accessed 19 November 2022.

Zizi Papacharissi, 2015. “Affective publics and structures of storytelling: Sentiment, events and mediality,” Information, Communication & Society, volume 19, number 3, pp. 307–324.
doi: https://doi.org/10.1080/1369118x.2015.1109697, accessed 19 November 2022.

Zizi Papacharissi, 2014. Affective publics: Sentiment, technology, and politics. New York: Oxford University Press.
doi: https://doi.org/10.1093/acprof:oso/9780199999736.001.0001, accessed 19 November 2022.

Eli Pariser, 2011. The filter bubble: What the Internet is hiding from you. New York: Penguin.

Perrine Poupin, 2021. “Social media and state repression: The case of VKontakte and the anti-garbage protest in Shies, in far northern Russia,” First Monday, volume 26, number 5.
doi: https://doi.org/10.5210/fm.v26i5.11711, accessed 19 November 2022.

Sarah Repucci and Amy Slipowitz, 2022. “Freedom in the world 2022: The global expansion of authoritarian rule,” Freedom House, at https://freedomhouse.org/report/freedom-world/2022/global-expansion-authoritarian-rule, accessed 19 November 2022.

Bernhard Rieder, 2015. “Youtube data tools (version 1.22),” at https://tools.digitalmethods.net/netvizz/youtube/, accessed 19 November 2022.

Alexander Rihl and Claudia Wegener, 2017. “YouTube celebrities and parasocial interaction: Using feedback channels in mediatized relationships,” Convergence, volume 25, number 3, pp. 554–566.
doi: https://doi.org/10.1177/1354856517736976, accessed 19 November 2022.

Margaret Roberts, 2018. Censored: Distraction and diversion inside China’s great firewall. Princeton, N.J.: Princeton University Press.
doi: https://doi.org/10.23943/9781400890057, accessed 19 November 2022.

Arturas Rozenas and Denis Stukal, 2019. “How autocrats manipulate economic news: Evidence from Russia’s state-controlled television,” Journal of Politics, volume 81, number 3, pp. 982–996.
doi: https://doi.org/10.1086/703208, accessed 19 November 2022.

Alan M. Rubin, Elizabeth M. Perse, and Robert A. Powell, 1985. “Loneliness, parasocial interaction, and local television news viewing,” Human Communication Research, volume 12, number 2, pp. 155–180.
doi: https://doi.org/10.1111/j.1468-2958.1985.tb00071.x, accessed 19 November 2022.

Sergey Sanovich, 2017. “Computational propaganda in Russia: The origins of digital misinformation,” University of Oxford, Computational Propaganda Project, Working Paper, number 2017.3, at https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/12/2017/06/Comprop-Russia.pdf, accessed 19 November 2022.

Sergey Sanovich, Denis Stukal, and Joshua A. Tucker, 2018. “Turning the virtual tables: Government strategies for addressing online opposition with an application to Russia,” Comparative Politics, volume 50, number 3, pp. 435–482.
doi: https://doi.org/10.5129/001041518822704890, accessed 19 November 2022.

Regina Smyth and Irina V. Soboleva, 2016. “Navalny’s gamesters: Protest, opposition innovation, and authoritarian stability in Russia,” Russian Politics, volume 1, number 4, pp. 347–371.
doi: https://doi.org/10.1163/2451-8921-00104002, accessed 19 November 2022.

Anton Sobolev, 2019. “How pro-government ‘trolls’ influence online conversations in Russia,” at http://www.wpsanet.org/papers/docs/2019W-Feb-Anton-Sobolev-Trolls-VA.pdf, accessed 19 November 2022.

Viktoria Spaiser, Thomas Chadefaux, Karsten Donnay, Fabian Russmann, and Dirk Helbing, 2017. “Communication power struggles on social media: A case study of the 2011–12 Russian protests,” Journal of Information Technology & Politics, volume 14, number 2, pp. 132–153.
doi: https://doi.org/10.1080/19331681.2017.1308288, accessed 19 November 2022.

Natalie Jomini Stroud, 2007. “Media use and political predispositions: Revisiting the concept of selective exposure,” Political Behavior, volume 30, number 3, pp. 341–366.
doi: https://doi.org/10.1007/s11109-007-9050-9, accessed 19 November 2022.

Denis Stukal, Sergey Sanovich, Richard Bonneau, and Joshua A. Tucker, 2022. “Why botter: How pro-government bots fight opposition in Russia,” American Political Science Review, volume 116, number 3, pp. 843–857.
doi: https://doi.org/10.1017/s0003055421001507, accessed 19 November 2022.

Florian Toepfl, 2020. “Comparing authoritarian publics: The benefits and risks of three types of publics for autocrats,” Communication Theory, volume 30, number 2, pp. 105–125.
doi: https://doi.org/10.1093/ct/qtz015, accessed 19 November 2022.

Florian Toepfl, 2016. “Innovating consultative authoritarianism: Internet votes as a novel digital tool to stabilize non-democratic rule in Russia,” New Media & Society, volume 20, number 3, pp. 956–972.
doi: https://doi.org/10.1177/1461444816675444, accessed 19 November 2022.

Petter Törnberg, 2022. “How digital media drive affective polarization through partisan sorting,” Proceedings of the National Academy of Sciences, volume 119, number 42 (10 October), e2207159119.
doi: https://doi.org/10.1073/pnas.2207159119, accessed 19 November 2022.

Petter Törnberg and Justus Uitermark, 2021. “Tweeting ourselves to death: The cultural logic of digital capitalism,” Media, Culture & Society, volume 44, number 3, pp. 574–590.
doi: https://doi.org/10.1177/01634437211053766, accessed 19 November 2022.

Rodoula H. Tsiotsou, 2015. “The role of social and parasocial relationships on social networking sites loyalty,” Computers in Human Behavior, volume 48, pp. 401–414.
doi: https://doi.org/10.1016/j.chb.2015.01.064, accessed 19 November 2022.

Mikhail Turchenko and Grigorii V. Golosov, 2020. “Smart enough to make a difference? An empirical test of the efficacy of strategic voting in Russia’s authoritarian elections,” Post-Soviet Affairs, volume 37, number 1, pp. 65–79.
doi: https://doi.org/10.1080/1060586x.2020.1796386, accessed 19 November 2022.

Cristian Vaccari and Augusto Valeriani, 2021. Outside the bubble: Social media and political participation in Western democracies. Oxford: Oxford University Press.
doi: https://doi.org/10.1093/oso/9780190858476.001.0001, accessed 19 November 2022.

Tiago Ventura, Kevin Munger, Katherine McCabe, and Keng-Chi Chang, 2021. “Connective effervescence and streaming chat during political debates,” Journal of Quantitative Description: Digital Media, volume 1 (26 April).
doi: https://doi.org/10.51685/jqd.2021.001, accessed 19 November 2022.

Fabio Votta, 2019. “peRspective: A wrapper for the perspective API,” at https://github.com/favstats, accessed 17 November 2022.

Michael Warner, 2002. “Publics and counterpublics,” Public Culture, volume 14, number 1, pp. 49–90.
doi: https://doi.org/10.1215/08992363-14-1-49, accessed 8 August 2023.

Asta Zelenkauskaite, 2022. Creating chaos online: Disinformation and subverted post-publics. Ann Arbor: University of Michigan Press.
doi: https://doi.org/10.3998/mpub.12237294, accessed 8 August 2023.

Asta Zelenkauskaite and Marcello Balduccini, 2017. “‘Information warfare’ and online news commenting: Analyzing forces of social influence through location-based user-commenting typology framework,” Social Media + Society (17 July).
doi: https://doi.org/10.1177/2056305117718468, accessed 4 August 2023.

 

Appendix A: Metrics of interest in Navalny.

 

Figure A1: Interest in Navalny according to visits to his Wikipedia page in Russian, only users.

 

 

Figure A2: Interest in Navalny according to Google Trends.

 

 

Appendix B: Text analysis of comments.

 

Figure B1: Metrics from LDAtuning.

 

 

Table B1: Summary of STM, 61 topics.
Topic | Summary of topic | Seven FREX words | Share
5 | Freedom for Navalny | Navalny, freedom, free, release, political, reality, estimate | 4.8%
3 | Promotion of video | top, keep, hooray, fire, go up, let’s go, maintaining | 4%
16 | Corruption among elites | Putin, stay (in jail), thieve, official, jail, regular, member of parliament | 2.8%
12 | Trending the video | trend, comment, show, raise, bring (to the top), send, promote | 2.7%
41 | Save Navalny | help, fear, hope, God, family, open, idiot | 2.5%
11 | Revolution | people, revolution, endure, come, enemy, organize, change | 2.4%
51 | Convince others | tell, film, win, politics, together, strongly, opposition | 2.3%
9 | State propagandists | watch, begin, real, Internet, listen, Soloviev, shock | 2.3%
2 | Discussing a video | video, video clip, write, read, conclusion, delete, destiny | 2.3%
50 | Praising Navalny’s activity | investigation, wait, shoot (video), bravo, close, super, awesome | 2.3%
40 | Corruption | money, pay, earn, buy, take, collect, steal | 2.2%
47 | Navalny’s attack on Putin | president, support, give, personally, lie (mortality), die, palace | 2.2%
22 | Rule of law in Russia | law, court, citizen, Russian, right, accept, constitution | 2.1%
7 | Arguments | believe, (to put in) jail, history, want, nonsense, proof, homeland | 2%
24 | Regime’s evolution | become, full, occur, (of) Putin, agreeable, severely, horror | 2%
27 | Election | elections, vote, decide, visit (an election commission), honest, candidate, pass | 2%
33 | Corruption | corruption, state, level, fight, create, fight, high | 1.9%
36 | Potential pro-government astroturfing | like, put, bot, deliver, forget, remove, check | 1.9%
53 | Criminals and authorities | thief, crook, regime, united, party, murderer, Vladimir | 1.8%
45 | Money, taxes | ruble, tax, bill, price, cost, billion, business | 1.8%
14 | Poisoning | FSB, panties, phone, turn out, have to, employee, Novichok | 1.8%
43 | Salaries, inequality | work, receive, salary, month, site, medical doctor, work | 1.8%
1 | Positive reaction to a video and praising Navalny | well done, work, hold on, team, continue, forward, enormous | 1.7%
13 | Promotion of video #2 | comment, future, equal, leave, indifferent, youth, important | 1.7%
19 | Support of Navalny | understand, channel, answer, answer, support, similar, keep silent | 1.7%
58 | Attempt to kill Navalny | kill, clown, poison, agent, sorry, Germany, normal | 1.7%
26 | Protest, collective actions | go out, rally, call, police, protest, seek, action | 1.7%
4 | Praising Navalny #2 | health, (good) luck, right, handsome, save, hero, wish | 1.6%
20 | Elections, voting options | vote, leave, bad, choose, Grudinin, ask, moreover | 1.6%
42 | Corruption in Moscow | Moscow, city, apartment, rich, region, expensive, build | 1.6%
17 | Non-discernible | fact, problem, system, Russian, try, population, USSR | 1.6%
8 | Sergei Furgal case (in Khabarovsk region) | power, example, return, lead, begin, Khabarovsk, demand | 1.5%
34 | Pensions, social policy | pension, war, poor, pensioner, age, grandfather, pension | 1.5%
49 | Non-discernible #2 | go, ass, topic, speech, school, plan, norm | 1.5%
10 | Views, YouTube’s statistics | view, million, freeze, start, recruit, log in, subscriber | 1.4%
31 | Geopolitics | Ukraine, bad, kremlin, turn out, Europe, mister, Crimea | 1.4%
46 | Views, YouTube statistics #2 | thousand, quantity, number, get up, minimum, grow, seriously | 1.4%
21 | Promotion of video #3 | YouTube, promotion, hit, for the sake of, necessarily, drive, bluntly | 1.3%
60 | Non-discernible #3 | many, most importantly, offer, most, program, ready, like | 1.3%
59 | Non-discernible #4 | stay, normal, opinion, learn, couple, anyone, quickly | 1.3%
55 | Replies to other commenters | Alexander, Sergey, Dmitry, Andrey, mouth, clear, Ivanov | 1.3%
30 | About criminal cases related to Navalny | business, engage, general, scary, forest, personal, correct | 1.2%
6 | Attacks on government | government, Medvedev, represent, head, rat, post, relative | 1.2%
56 | Non-discernible #5 | information, situation, interesting, idea, search, late, data | 1.2%
28 | Navalny and the US | USA, name, thing, language, surname, American, anavalny (playing with Navalny’s surname) | 1.2%
35 | Non-discernible #6 | blunt, direct, by the way, attention, remember, blood, straight | 1.2%
39 | Young generation | brains, grab, instead, young, brain, generation, main | 1.1%
23 | Chance to change | come, carry, opportunity, thought, order, group, error | 1.1%
44 | Non-discernible #7 | real, king, excellent, conduct, great, sense, century | 1.1%
61 | Non-discernible #8 | any, simple, rest, ordinary, perhaps, be, reason | 1%
25 | Attacks on Navalny | find, West, call, favor, low, specifically, prostitute | 0.9%
29 | Navalny’s team | guys, right, love, guy, man, brave, Sobol | 0.9%
37 | Collective actions | enough, action, act, crowd, active, body, role | 0.9%
32 | Luxury lifestyle of elite members | slave, plane, yacht, desire, be called, touch, method | 0.8%
38 | Smart voting | smart, vote, participate, global, territory, leader, item | 0.8%
52 | Non-discernible #9 | sell, apparently, yeah, run, budget, run, score | 0.8%
18 | Shaming someone | shame, conscience, lose, honor, fear, shame, human | 0.8%
54 | Non-discernible #10 | measure, relate, edge, mask, reach, serve, local | 0.8%
57 | Discussions about corruption | Ivan, relax, smoke, villa, Italy, nature, express | 0.7%
15 | Heroes of Navalny’s investigations | house, before, Kudryavtsev, neighbor, (Nastya) Rybka, maximum, known | 0.7%
48 | Wealthy elites | honestly, want, argument, wealth, right, absolutely, ruler | 0.6%

 

 

 
Figure B2: STM analysis results of comments on Navalny’s YouTube channel, highest probability words.

 

 

Appendix C: Identity of commenters.

 

Table C1: Number of comments left by one-off and prolific commenters in periods of high and low interest in Navalny, without the restriction of being published within seven days after a video release.
 | By one-off commenters | By prolific commenters
High interest in Navalny | 581,337 | 3,679,408
Low interest in Navalny | 385,243 | 4,334,325

 

 

Table C2: Number of prolific commenters and the comments they posted, by period of interest in Navalny, without the restriction of being published within seven days after a video release.
 | High interest in Navalny | Low interest in Navalny
Commenters | 422,047 | 469,917
Comments | 3,679,408 | 4,334,325

 

 

Appendix D: Retention analysis for Hypotheses 1A, 1B, and 1C.
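The tables in this appendix report percentile-bootstrap 95 percent confidence intervals for monthly retention rates. As a minimal illustrative sketch (not the paper's replication code), retention for a cohort can be bootstrapped by resampling commenters with replacement; the function name and the data layout below are my assumptions:

```python
import random

def bootstrap_retention_ci(active_months, month, n_boot=1000, alpha=0.05, seed=42):
    """Percentile-bootstrap CI for the share of a cohort still active in
    a given month. `active_months` maps each commenter in the cohort to
    the set of months (1 = cohort month) in which they commented.
    This is an illustrative sketch, not the study's replication code."""
    rng = random.Random(seed)
    users = list(active_months)
    stats = []
    for _ in range(n_boot):
        # resample commenters with replacement and recompute retention
        sample = [users[rng.randrange(len(users))] for _ in users]
        stats.append(sum(1 for u in sample if month in active_months[u]) / len(sample))
    stats.sort()
    lower = stats[int((alpha / 2) * n_boot)]
    upper = stats[min(n_boot - 1, int((1 - alpha / 2) * n_boot))]
    point = sum(1 for u in users if month in active_months[u]) / len(users)
    return point, lower, upper
```

Here month 1 is the cohort's entry month, so its retention is 100 percent by construction, matching the first row of each table.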

 

Table D1: Retention of commenters engaged in discussion when interest in Navalny was high, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 1.6% | 1.4% | 1.8%
3 | 0.8% | 0.7% | 0.9%
4 | 0.9% | 0.8% | 1%
5 | 0.6% | 0.5% | 0.6%
6 | 0.5% | 0.4% | 0.6%
7 | 0.6% | 0.5% | 0.6%
8 | 0.5% | 0.4% | 0.6%
9 | 0.4% | 0.4% | 0.5%
10 | 0.5% | 0.4% | 0.5%
11 | 0.4% | 0.4% | 0.4%
12 | 0.3% | 0.3% | 0.4%
13 | 0.4% | 0.4% | 0.5%
14 | 0.3% | 0.2% | 0.3%
15 | 0.3% | 0.3% | 0.4%
More than 15 months | 2.2% | 2% | 2.4%

 

 

Table D2: Retention of commenters engaged in discussion when interest in Navalny was lower, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 2.7% | 2.3% | 3.1%
3 | 1.6% | 1.2% | 2%
4 | 1.3% | 1.1% | 1.6%
5 | 0.8% | 0.6% | 0.9%
6 | 1.2% | 0.7% | 2.2%
7 | 0.8% | 0.7% | 0.9%
8 | 1.2% | 0.7% | 2.2%
9 | 1.2% | 0.7% | 2.2%
10 | 0.8% | 0.7% | 0.9%
11 | 0.6% | 0.5% | 0.9%
12 | 0.6% | 0.5% | 0.7%
13 | 0.6% | 0.5% | 0.7%
14 | 0.4% | 0.4% | 0.5%
15 | 0.6% | 0.5% | 0.9%
More than 15 months | 4.9% | 4.2% | 5.9%

 

 

Table D3: Retention of pro-government commenters engaged in discussion when interest in Navalny was high, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 13.2% | 11.6% | 14.9%
3 | 11.3% | 9.8% | 12.9%
4 | 10.2% | 8.8% | 11.7%
5 | 8.5% | 7.3% | 9.7%
6 | 7.8% | 6.5% | 9.2%
7 | 6.8% | 5.8% | 7.8%
8 | 7% | 6.2% | 8%
9 | 7.4% | 6.1% | 8.7%
10 | 7.3% | 6.2% | 8.5%
11 | 7.2% | 6.2% | 8.2%
12 | 6.5% | 5.5% | 7.6%
13 | 5.7% | 4.8% | 6.8%
14 | 4% | 3.4% | 4.7%
15 | 3.3% | 2.7% | 4%
16 | 24.4% | 22.5% | 26.4%

 

 

Table D4: Retention of pro-government commenters engaged in discussion when interest in Navalny was lower, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 13.5% | 11.9% | 15.5%
3 | 12.6% | 10.7% | 14.8%
4 | 11.7% | 9.9% | 13.8%
5 | 11.6% | 9.7% | 14%
6 | 11.4% | 9.5% | 13.7%
7 | 12.3% | 10.4% | 14.4%
8 | 10.9% | 9.1% | 13.1%
9 | 10.5% | 8.7% | 12.8%
10 | 10.9% | 8.9% | 13.2%
11 | 10.2% | 8.4% | 12.4%
12 | 7.6% | 6.4% | 8.8%
13 | 10.4% | 8.3% | 13%
14 | 9.6% | 7.7% | 11.9%
15 | 10.4% | 7.7% | 13.5%
16 | 32.6% | 29.8% | 35.5%

 

 

Table D5: Retention of opposition-minded commenters engaged in discussion when interest in Navalny was high, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 17.5% | 16.1% | 18.9%
3 | 15.3% | 13.6% | 17.1%
4 | 12.7% | 11.4% | 13.9%
5 | 11% | 9.8% | 12.2%
6 | 10.2% | 9% | 11.5%
7 | 9.7% | 8.4% | 11.2%
8 | 10.2% | 8.7% | 12.1%
9 | 11% | 9.5% | 12.5%
10 | 10.1% | 8.8% | 11.5%
11 | 10.7% | 9.3% | 12.1%
12 | 8.4% | 7.3% | 9.5%
13 | 7% | 6% | 8.1%
14 | 6.4% | 5.3% | 7.7%
15 | 5.6% | 4.8% | 6.6%
16 | 33.8% | 32% | 36.1%

 

 

Table D6: Retention of opposition-minded commenters engaged in discussion when interest in Navalny was lower, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 17.2% | 15.7% | 18.7%
3 | 14.8% | 13.4% | 16.4%
4 | 14.2% | 12.7% | 15.8%
5 | 14.3% | 12.6% | 16.1%
6 | 13% | 11.7% | 14.3%
7 | 13% | 11.7% | 14.4%
8 | 13.5% | 12% | 15.2%
9 | 12.8% | 11.2% | 14.5%
10 | 12.4% | 11.1% | 13.9%
11 | 12.2% | 10.7% | 13.9%
12 | 11.3% | 9.8% | 13.1%
13 | 11.3% | 9.8% | 12.9%
14 | 11.2% | 9.6% | 13.1%
15 | 10.8% | 9.5% | 12.2%
16 | 42.2% | 39.4% | 45%

 

 

 
Figure D1: Retention of different types of commenters, with bootstrapped 95 percent confidence intervals, months are defined by the act of commenting, not consecutive in calendar order.

 

 

Table D7: Retention of commenters engaged in discussion when interest in Navalny was high, with bootstrapped 95 percent confidence intervals; months are defined by the act of commenting, not consecutive in calendar order.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 16.2% | 15.1% | 17.3%
3 | 13.8% | 12.6% | 15.1%
4 | 11.8% | 10.9% | 12.8%
5 | 9.8% | 8.8% | 10.8%
6 | 9.1% | 8.3% | 9.9%
7 | 8.5% | 7.6% | 9.4%
8 | 7.9% | 7% | 8.8%
9 | 8.6% | 7.7% | 9.5%
10 | 8.3% | 7.3% | 9.4%
11 | 8.2% | 7.4% | 8.9%
12 | 7.4% | 6.7% | 8.2%
13 | 6.8% | 6% | 7.7%
14 | 5.3% | 4.6% | 6%
15 | 4.6% | 4.1% | 5.2%

 

 

Table D8: Retention of commenters engaged in discussion when interest in Navalny was lower, with bootstrapped 95 percent confidence intervals, months are defined by the act of commenting, not consecutive in calendar order.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 16% | 14.6% | 17.5%
3 | 11.6% | 10.4% | 13.1%
4 | 11.6% | 10.4% | 13%
5 | 12.1% | 10.8% | 13.5%
6 | 11.3% | 10% | 12.7%
7 | 10.9% | 9.8% | 12.2%
8 | 10.7% | 9.7% | 11.8%
9 | 10.2% | 9.3% | 11.3%
10 | 10% | 9.1% | 11%
11 | 9.6% | 8.7% | 10.7%
12 | 9.3% | 8.3% | 10.3%
13 | 9.4% | 8.5% | 10.5%
14 | 9.6% | 8.6% | 10.7%
15 | 9% | 8.1% | 10.1%

 

 

 
Figure D2: Retention of commenters using pro-government cues, with bootstrapped 95 percent confidence intervals, months are defined by the act of commenting, not consecutive in calendar order.

 

 

Table D9: Retention of pro-government commenters engaged in discussion when interest in Navalny was high, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 15% | 13.4% | 16.6%
3 | 12.5% | 10.9% | 14.2%
4 | 11.1% | 9.8% | 12.6%
5 | 8.9% | 7.7% | 10.2%
6 | 7.7% | 6.8% | 8.8%
7 | 7.7% | 6.7% | 8.9%
8 | 7.4% | 6.4% | 8.5%
9 | 7.5% | 6.4% | 8.8%
10 | 7.1% | 6% | 8.3%
11 | 6.2% | 5.4% | 7%
12 | 6.5% | 5.5% | 7.5%
13 | 5.6% | 4.8% | 6.4%
14 | 4.7% | 4% | 5.5%
15 | 3.9% | 3.3% | 4.6%

 

 

Table D10: Retention of pro-government commenters engaged in discussion when interest in Navalny was lower, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 20% | 17.5% | 22.9%
3 | 15.7% | 13.5% | 18.2%
4 | 14.1% | 12.1% | 16.3%
5 | 13.7% | 11.7% | 15.9%
6 | 13.9% | 11.7% | 16.3%
7 | 13.2% | 11.2% | 15.5%
8 | 12% | 10% | 14.3%
9 | 11.5% | 9.6% | 13.7%
10 | 11.5% | 9.5% | 13.7%
11 | 9.7% | 8.2% | 11.4%
12 | 9.6% | 8% | 11.5%
13 | 9.9% | 8.2% | 11.9%
14 | 9.3% | 7.6% | 11.2%
15 | 9.6% | 7.9% | 11.7%

 

 

 
Figure D3: Retention of commenters using pro-opposition cues, with bootstrapped 95 percent confidence intervals, months are defined by the act of commenting, not consecutive in calendar order.

 

 

Table D11: Retention of opposition-minded commenters engaged in discussion when interest in Navalny was high, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 19% | 17.4% | 20.5%
3 | 16.4% | 14.8% | 18.2%
4 | 14.4% | 12.9% | 16.1%
5 | 11.7% | 10.3% | 13.1%
6 | 10.7% | 9.6% | 11.8%
7 | 10.7% | 9.1% | 12.6%
8 | 9.7% | 8.4% | 11.1%
9 | 10.5% | 9.2% | 11.9%
10 | 10.8% | 9.1% | 12.9%
11 | 10.1% | 8.8% | 11.4%
12 | 9% | 7.8% | 10.2%
13 | 7.8% | 6.6% | 9.1%
14 | 6.6% | 5.6% | 7.8%
15 | 6% | 5.1% | 7%

 

 

Table D12: Retention of opposition-minded commenters engaged in discussion when interest in Navalny was lower, with bootstrapped 95 percent confidence intervals.
Month | Mean | Lower | Upper
1 | 100% | 100% | 100%
2 | 24.1% | 21.7% | 26.6%
3 | 19.3% | 17.2% | 21.7%
4 | 17.3% | 15.2% | 19.5%
5 | 17% | 14.9% | 19.3%
6 | 15.7% | 13.7% | 17.8%
7 | 15.9% | 14.2% | 17.9%
8 | 15.1% | 13.4% | 17%
9 | 14% | 12.5% | 15.8%
10 | 13.6% | 12.1% | 15.3%
11 | 13.8% | 11.9% | 15.9%
12 | 11.5% | 9.9% | 13.4%
13 | 12.1% | 10.6% | 13.8%
14 | 12% | 10.5% | 13.7%
15 | 11.4% | 9.9% | 13.1%

 

 

Appendix E: Identification of pro-government and pro-opposition discourses for Hypotheses 1B and 1C.

To identify cross-cutting disagreement between supporters of the authorities and of the opposition, I employed a dictionary-based approach focused on derogatory references to Putin and the authorities on one side and to Navalny and the opposition on the other. Since users are inventive in their insults and name-calling, and sometimes (un)intentionally misspell pejoratives derived from the surnames of Putin and Navalny, the major derogatory references were combined with forms containing such obfuscations. The dictionary also includes politically motivated hate speech derived from the names of Russian propagandists, the Parliament, and the ruling party (United Russia). For the camp of opposition-minded users, I included words alleging a link between dissent and the West, references to the Ukrainian revolution of 2013, and so on.

The period for which comments were collected covers a wide temporal range, during which the conflict between Russia and Ukraine intensified. Accordingly, I checked the words used in the Russian segment of the Internet to signal support for one side or the other. Some of these words were included in the final dictionary (for example, “укронацизм” (ukronazism) and “рабсия” (rabsiya)), while others were not (“хохол” (“khokhol”), “москаль” (“moskal”), “кацап” (“katsap”), and others). The latter are used equally in messages criticizing the Russian government and in messages disapproving of the actions of the opposition and of Navalny in particular.

In Internet discussions, the following words have weak discriminatory power for detecting pro-government and opposition discourses and were not included in the keyword list: “нацпредатель” (“national traitor”), “сталинист” (“Stalinist”), “враг народа” (“enemy of the people”), “мусора” (a derogatory reference to the police), “дед” (“granddad”, a reference to Putin), “несогласные” (“dissenters”), “чайка” (“Chaika”, a reference to the former Prosecutor General Yuri Chaika), “вождь” (“leader”), “поклонная (гора)” (“Poklonnaya Hill”, a site of pro-government rallies in 2012), “Леонтьев” (“Leontiev”, a reference to a Russian propagandist and press secretary of the Rosneft corporation), “Кургинян” (“Kurginyan”, a reference to Sergei Kurginyan, a pro-government journalist).

I kept two spellings of the surname of the RT editor-in-chief Margarita Simonyan (“Симоньян”, “Симонян”) in the keyword list, because commenters associate her surname with the authorities’ propaganda efforts and, unlike with Kiselev and Solovyov, there are no other commenters with the same surname to whom participants might refer (which would distort the classification). The same applies to the surname of Gabrelyanov (“Габрелянов”), the head of the media corporation Life, and to Olga Skabeeva (“Скабеева”), a TV presenter and pro-government political commentator.

Table E1 lists the major words used to discern pro-government from opposition stances; not all possible distortions are shown. The final version of the dictionary, available in the replication materials, contains approximately 530 string patterns that were used to search for pro-government and opposition comments.
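As a sketch of how such a string-pattern dictionary can be applied to lower-cased comments, the snippet below matches a handful of illustrative stems against comment text. The stems shown are a tiny subset of the roughly 530 patterns in the replication dictionary, and the function name and the "mixed"/"neutral" labels are my assumptions:

```python
import re

# Illustrative subsets of the Table E1 dictionary (the full replication
# dictionary contains ~530 string patterns). Stems catch inflected forms.
ANTI_OPPOSITION = ["нассальн", "навральн", "либераст", "госдеп"]   # insults toward Navalny/opposition
ANTI_GOVERNMENT = ["путлер", "пукин", "пыня", "запутинец"]         # insults toward Putin/authorities

anti_opp_re = re.compile("|".join(map(re.escape, ANTI_OPPOSITION)))
anti_gov_re = re.compile("|".join(map(re.escape, ANTI_GOVERNMENT)))

def classify_comment(text):
    """Label a comment by dictionary matching on its lower-cased form.
    Insulting Putin signals an opposition-minded commenter; insulting
    Navalny signals a pro-government commenter."""
    t = text.lower()
    opp_cue = bool(anti_gov_re.search(t))
    gov_cue = bool(anti_opp_re.search(t))
    if opp_cue and gov_cue:
        return "mixed"
    if opp_cue:
        return "pro-opposition"
    if gov_cue:
        return "pro-government"
    return "neutral"
```

Matching on stems rather than full word forms is one way to absorb the inflectional endings of Russian without enumerating every variant.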

 

Table E1: Dictionary for identifying pro-/anti-government comments, for lower-cased text.
Derogatory words towards Navalny and the opposition | Derogatory words towards Putin
“нассальный” (playing with the word “piss” and Navalny’s surname) | “путлер” (playing with Hitler and Putin’s surname)
“навальнята” (little Navalnys) | “плешивый” (baldheaded)
“карнавальный” (carnival-ny) | “путен” (playing with Putin’s surname)
“навральный” (playing with the word “liar” and Navalny’s surname) | “путэн” (playing with Putin’s surname)
“наебальный” (playing with a rude word for “lie” and Navalny’s surname) | “путька” (playing with Putin’s surname)
“навальнер” (Navalner, reference to Hitler) | “вован” (playing with Putin’s first name)
“анальный” (playing with the word “anal” and Navalny’s surname) | “путинизм” (Putin’s regime, Putinism)
“сисян” (playing with the word “boobs” — reference to Navalny’s poor physical condition) | “вовка” (playing with Putin’s first name)
“авальный” (playing with Navalny’s surname — Avalny) | “вовчик” (playing with Putin’s first name)
“овальный” (playing with the word “oval” and Navalny’s surname — Ovalny) | “пукин” (playing with Putin’s surname)
“алешка” (derogatory treatment by the name Alexei) | “пукинизм” (playing with Putin’s surname)
“алёшка” (derogatory treatment by the name Alexei) | “пыня” (meme reference to Putin)
“брехальный” (playing with the word “брехня” (“bullshit”) and Navalny’s surname) | “ботокс” (botox — reference to plastic surgery allegedly undergone by Putin)
“нававльный” (deliberate distortion of the surname Navalny to offend him) | “обнуленец” (a reference to “resetting to zero” the number of Putin’s presidential terms after the 2020 change of the Constitution of Russia)
“нава” (deliberate shortening of the surname Navalny to offend him) | “кабай” (a reference to the alleged marriage of Putin to Alina Kabaeva, a famous rhythmic gymnast)
“насральный” (playing with the word “shit” and Navalny’s surname) | “солнцеликий” (“sun-faced” — meme reference to Putin, a comparison with North Korean leaders)
“навальноид” (playing with Navalny’s surname) | “карлик” (midget)
“навальнабот” (bots of Navalny) | “хуйло” (dickhead)
“навальнобот” (bots of Navalny) | “пугабэ” (playing with Putin’s surname, reference to Robert Mugabe, former president of Zimbabwe)
“новальнобот” (bots of Navalny) | “вовван” (playing with Putin’s first name)
“новальнабот” (bots of Navalny) | “хуепутало” (playing with Putin’s surname)
“лехаим” (lechaim — from Hebrew “to life”, a drinking toast, but in the context of political discussions it refers to the antisemitic position of some Navalny critics) | “пуйло” (dickhead, playing with the surname and the slogan from the footnote)
“провальный” (playing with the word “провал” (“failure”) and Navalny’s surname) | “запутинец” (“for Putin” — a Putin supporter)
“альоша” (derogatory treatment by the name Alexei) | “бункерный” (bunker — reference to the situation during the coronavirus pandemic, when Vladimir Putin stopped communicating with the population directly and led the government online)
“обосральный” (playing with the word “shit” and Navalny’s surname) | “хутин” (playing with Putin’s surname and the word “fuck”)
“дерьмократ” (playing with the word “democrat” and “дерьмо” (“shit”)) | “пуй” (playing with Putin’s surname and the word “fuck”)
“либераст” (playing with the words “liberal” and “pederast”) | “путиносос” (playing with Putin’s surname and the word “sucker”)
“фбкашный” (derogatory treatment by the abbreviation of Navalny’s organization, the Anti-Corruption Foundation) | “лилипутин” (playing with Putin’s surname and the word “Lilliputian”)
“демшиза” (playing with the words “democrat” and “schizophrenia”) | “моль” (moth — Putin’s nickname during his service in the KGB)
“анальнобот” (bots of Navalny, playing with the word “anal”) | “лилипутан” (playing with Putin’s surname and the word “путана” (“hooker”))
“фекальный” (playing with the word “excrement” and Navalny’s surname) | “путя” (playing with Putin’s surname)
“хомяк” (hamster — derogatory term for those who criticize the government on the Web) | “путиноид” (playing with Putin’s surname)
“хомячий” (hamster — derogatory term for those who criticize the government on the Web) | “зомбоящик” (zombie box — reference to state TV propaganda)
“либерд” (playing with the word “liberal”) | “утин” (Utin — playing with Putin’s surname)
“навалькин” (playing with the surname Navalny) | “хубло” (Hublo — playing with Putin’s surname and the watch brand, with insulting references)
“нахальный” (playing with the surname Navalny and the word “impertinent”) | “колорад” (Colorado beetle — derogatory reference to regime supporters)
“майданутый” (playing with the word “Maidan” (reference to the revolution in Ukraine) and “fucked up”) | “димон” (Dimon — reference to the former president Dmitry Medvedev)
“дновальный” (playing with the surname Navalny and the word “дно” (“bottom”)) | “плутин” (Plutin — playing with Putin’s surname and the word “плут” (“rogue”))
“болтальный” (playing with the surname Navalny and the word “болтать” (“talk”)) | “рассея” (Rasseya — a distorted form of the country name Russia)
“горе-оппозиция” (pathetic opposition) | “ура-патриот” (hurray-patriot)
“навальнофил” (Navalnophile) | “путинофил” (Putinophile)
“Сорос” (Soros — mention of the Open Society Foundations of George Soros) | “15 рублей, 85 рублей” (15 rubles, 85 rubles — reference to the practice of paid commenting, when pro-government agents pay commenters 15 or 85 rubles per written message)
“госдеп” (Gosdep — reference to the U.S. Department of State) | “бобриха, боброедка” (beaver, beaver-eater — references to the RT editor-in-chief Margarita Simonyan)
“укрофашизм, укрофашист, укронацизм” (ukrofashism, ukrofashist, ukronazism — playing with the root Ukr (reference to Ukraine) and the names of ideologies) | “лахта, ольгино, пригожинские, савушкино, еркю” (references to organizations involved in pro-government astroturfing campaigns)
“гейропа” (Gayrope — playing with Europe and the word “gay”) | “киселевский, соловьевский, соловьиный, соловушек, скабеева, симонян, габрелянов” (references to the state propagandists Dmitry Kiselev, Vladimir Soloviev, Olga Skabeeva, Margarita Simonyan, Ashot Gabrelyanov)
“популист” (populist — blaming Navalny for populism) | “елбасы” (yelbasy — reference to the Kazakh official title “leader of the nation”, but commenters use it to refer to the Russian president because of its resemblance to curse words in Russian)
“ципсо” (a reference to an organization working for pro-Ukraine political astroturfing) | “срутин” (Srutin — playing with Putin’s surname and the word “shit”)
“белоленточник” (“white ribbon” — reference to the symbol of those demanding free and fair elections) | “Пу, ПТН, Пыт” (playing with Putin’s surname)

 

 

Appendix F: Toxicity of comments for RQ 2.

 

Table F1: Example of comments and their toxicity scores according to Google Perspective API.
Toxicity score | Original text in Russian | Translation
0.1 | беспредел откровенный под прикрытием государства! | evident lawlessness under the cover of the state!
0.2 | а что удивляться, что одних воров заменили на других таких же, у них там, в опг, честных людей просто нет. за 20 лет их или убрали, или отняли, или просто отсеяли. | and why be surprised that some of the thieves were replaced by others of the same kind; they simply do not have honest people there, in the organized crime group. in 20 years they were either removed, or taken away, or simply weeded out.
0.3 | сегодня утром открыл глаза и подумал, что это дерьмо происходит не на самом деле. такого не может быть в европейской стране 21 века. | I opened my eyes this morning and thought this shit wasn’t happening. this cannot be the case in a European country of the 21st century.
0.4 | записывайте всех этих негодяев , не забудьте что они делали | write down all these rascals, don’t forget what they did
0.5 | алексей, а ты не думал, что этот вирус как биологическое оружие изобрел наш гарант, ну не он лично и под его указом, начали с китая, а потом и наших, а то мы твари расплодились не так, как надо им | Alexey, did you think that this virus as a biological weapon was invented by our guarantor [President], well, not he personally, and under his decree, they started from China, and then ours, otherwise we creatures did not breed as they should
0.6 | чему удивляться, вся власть дебильная! им в дурдоме место в закрытой палате! голосуем дальше за них! | why be surprised, all power is moronic! they have a place in a madhouse in a closed ward! vote further for them!
0.7 | навальный красава собирает явку для кремля а бараны пойдут проголосуют и обеспечит легитимность этим сраным властям!!! | Navalny is the best, he is gathering a turnout for the Kremlin and the rams will vote and will provide legitimacy to these fucking authorities!!!
0.8 | навальный, что б ты загрёбся где....-хорошо бы было! достал дрянь брехливая!!!! лохи твои друзья | Navalny, what would you go to the graveyard where ....— it would be nice! I got the shitty stuff !!!! fuckers your friends
0.9 | овца тупорылая. блин, когда их всех поразит огонь людского гнева??? | blunt-nosed sheep. Damn, when will they all be hit by the fire of human anger???
1.0 | всех этих поповых да чаек, и всех их родственничков до седьмого колена не сажать надо, а вешать за ноги и пи@дить палками с гвоздями, пока не сдохнут | all these priests and Chaikas, and all their relatives up to the seventh knee should not be jailed, but hung by the legs and beat with sticks and nails until they die
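The toxicity scores above come from the Google Perspective API. A minimal sketch of the request body for its comments:analyze method and of score extraction follows; the helper names are my assumptions, and HTTP transport and API-key handling are omitted:

```python
# Endpoint as documented for the Perspective API (v1alpha1); an API key
# is passed as a query parameter when the request is actually sent.
PERSPECTIVE_URL = ("https://commentanalyzer.googleapis.com/"
                   "v1alpha1/comments:analyze")

def build_request(text, lang="ru"):
    """JSON body for a comments:analyze call requesting the TOXICITY
    attribute for a single comment. Helper name is illustrative."""
    return {
        "comment": {"text": text},
        "languages": [lang],
        "requestedAttributes": {"TOXICITY": {}},
    }

def toxicity_score(response):
    """Extract the summary toxicity score (0..1) from an API response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
```

In use, `build_request(...)` would be POSTed as JSON to `PERSPECTIVE_URL` with an API key, and `toxicity_score` applied to the parsed response.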

 

 

 
Figure F1: Toxicity in comments of Navalny’s YouTube and apolitical — Evening Urgant — channel.

 


Editorial history

Received 30 January 2023; revised 10 August 2023; accepted 18 September 2023.


Creative Commons License
This paper is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Navalny’s direct-casting: Affective attunement and polarization in the online community of the most vocal Russian opposition politician
by Aidar Zinnatullin.
First Monday, Volume 28, Number 10 - 2 October 2023
https://firstmonday.org/ojs/index.php/fm/article/download/12882/11342
doi: https://dx.doi.org/10.5210/fm.v28i10.12882