First Monday

Tracing normiefication: A cross-platform analysis of the QAnon conspiracy theory by Daniel de Zeeuw, Sal Hagen, Stijn Peeters, and Emilija Jokubauskaite



Abstract
This article presents a cross-platform analysis of the QAnon conspiracy theory that was popularized online from 2017 onward. It theorizes its diffusion as one of normiefication: a term drawing from Web vernacular indicating how ideas and objects travel from fringe online subcultures to large audiences on mainstream platforms and news outlets. It finds that QAnon had a clear incubation period on 4chan/pol/ after which it quickly migrated to larger platforms, notably YouTube and Reddit. News media started covering the online phenomenon only when it moved off-line, which in turn briefly amplified engagement on the other platforms. Through these data-driven insights, we aim to demonstrate how this cross-platform approach can be replicated and thus help make sense of the complexity of contemporary media ecologies and their role in the diffusion of conspiracy theories as well as other forms of mis- and disinformation.

Contents

Introduction
Normiefication
Methodology
Findings
Conclusion

 


 

Introduction

In the wake of Donald J. Trump’s election as U.S. president in November 2016, the influence of extremist and fringe online groups on the public debate has come under increased scrutiny. This includes the role of social media platforms and mainstream news outlets in facilitating and amplifying the viral spread of ideas from one digital sphere to another, at the same time as creating polarizing echo chambers that play into the existing confirmation biases of online groups (Benkler, et al., 2018). This combination of relatively insular group formation and the contrasting ability of ideas to easily travel beyond tightly-knit cohorts, as uniquely afforded by the Internet, has proven especially forceful. Some accounts even claimed fringe online subcultures related to the alt-right were the key to Trump’s victory (Beran, 2017) [1] by spamming “mainstream” online spaces with memetic imagery or gaming “trending” algorithms to whirl partisan content into a positive feedback loop (Marantz, 2019; Lagorio-Chafkin, 2018). Some of the most notorious tactics for “media manipulation” (Marwick and Lewis, 2017) in recent years arose from anonymous imageboards like 4chan and 8chan, whose blend of vernacular Internet humor and white supremacy proved effective enough to reach anywhere from mainstream news coverage (Phillips, 2018) to the manifestos of far-right terrorists (Cosentino, 2020). Even Trump himself circulated imagery born in the “chans”, for instance by retweeting an image of the infamous meme Pepe the Frog [2] and an edited picture of Hillary Clinton containing anti-Semitic iconography (Jacobson, 2016).

In addition to the mainstreaming of problematic memetic imagery is the amplification and normalization of conspiracy theories that first arose on fringe imageboards. One of the most notable conspiracy theories cooked up in the chan-sphere, and one that Trump indirectly yet frequently engaged with (Rupar, 2020), is Pizzagate. Surfacing in 2016, the theory claims that leading figures in the Democratic party are involved in a paedophilic child trafficking ring (Fisher, et al., 2016). Although it started as an obscure phenomenon on 4chan (Tuters, et al., 2018), the theory became well-known outside of the imageboard, spreading to other online venues like the conspiracy portal InfoWars, Breitbart News, Reddit, YouTube, Twitter, Facebook, and the news media.

While in isolation, Pizzagate may be considered merely another chapter within a longer history of the “paranoid style” in American political culture (Hofstadter, 1965; Muirhead and Rosenblum, 2019), the QAnon conspiracy theory it gave rise to testifies to long-term epistemic rifts and exposes “serious fault lines between different groups of citizens” (Rose, 2017). Framed as the “Big Budget Sequel to Pizzagate” (View, 2018), QAnon subsumes many conspiratorial elements already present in its predecessor, and came to act as an umbrella narrative for far-right conspiracy theorizing, fueling the anxieties that underpin them: from a fear of globalist, liberal, and Jewish elites to the erosion of traditionalist values [3].

The short version of QAnon goes as follows. In October 2017, an anonymous poster started posting erratic and cryptic messages called “drops” on /pol/, 4chan’s infamous politics subforum. While 4chan is mostly anonymous, the poster used the “tripcode” functionality to add a persistent identifier to their posts based on a password. He, she, or they — we stick to the latter here — were referred to as “Q”, derived from the author’s claim to be a White House insider with “Q level security clearance”. Although hardly clear from Q’s vague messages, what soon developed was an exhaustive theory on how Robert Mueller (at the time investigating Russian meddling in the 2016 elections) was in fact working with Donald Trump to uncover crimes committed by high-profile Democrats — amongst even more outrageous claims. With a subset of 4chan users (“anons”, as they call themselves) professing to be merely “LARPing” [4] and Q claiming 4chan to be unsafe, the conspiracy theory disappeared from 4chan soon after it gained traction (OILab, 2018; Tuters, 2020). Ten months later, however, middle-aged and senior Trump supporters at a rally in Tampa Bay, Florida told a CNN reporter how Mueller was about to uncover the vast conspiracy and finally prove Q right. As the New York Times headlined, QAnon has continued to “Seep from the Web to the Offline World” (McIntire and Roose, 2020), reaching a demographic of seemingly true believers far removed from the stereotype of the young, nihilistic, and always-online 4chan user, some of whom may well have engaged with the QAnon story merely for “the lulz” in adherence to the ironic early Web paradigm that “the Internet is serious business” — meaning that it is not (De Zeeuw and Tuters, 2020).

By 2020, the QAnon conspiracy theory is not only still alive, but actually flourishes, with dedicated Q-merchandise (Mak, 2018), a Super PAC (Gilbert, 2020), the election to Congress of several “Q-Republicans” (Argentino, 2020; Rosenberg and Steinhauer, 2020), and connections to Trump and his administration on multiple levels (McIntire and Roose, 2020) [5]. Moreover, the 2020 COVID-19 pandemic has created a real “conspiracy boom” across the world, pointing the finger at Bill Gates, China, 5G, the deep state, and QAnon (Uscinski and Enders, 2020). In response, several major platforms have purged thousands of conspiracy accounts, including many QAnon-related ones. Notably, on 21 July 2020 Twitter removed over 7,000 QAnon accounts (Collins and Zadrozny, 2020) and Facebook vowed to ban Q-dedicated groups altogether in October 2020 (Statt, 2020).

QAnon’s unlikely diffusion prompts a question this article tries to answer empirically: How did QAnon change from an obscure and partly satirical conspiracy theory on the fringes of the Web into a partisan trope in the Trump movement, attracting many genuine conspiracists and widely reported upon in the mainstream news? To study this, we construct a record of cross-platform mentions of QAnon, comparing data from 4chan, 8chan, Reddit, YouTube, Breitbart, and online news media. Additionally, by outlining a generalizable cross-platform approach to capture posts containing specific keywords, the article also offers a methodological template for similar comparative research.

In light of concerns over the growing influence of fringe platforms and polarizing actors on mainstream platforms like YouTube, as well as the news media’s role in amplifying both (Phillips, 2018), the larger purpose of this case study is to better understand this dynamic whereby initially subcultural objects travel from fringe platforms across different Web spheres to reach networked publics unfamiliar with their original (sub)cultural context. We describe this specific process as “normiefication”. In the next section, we first develop this term, relating it to existing “diffusion” and “refraction” frameworks. After that, we describe a cross-platform protocol to trace the mentions of the QAnon conspiracy theory. We then present the key findings of the case study and discuss several theoretical takeaways on the interrelated roles of various Web spheres in the diffusion of online conspiracy theories.

 

++++++++++

Normiefication

Since the early 2010s, having moved past utopian musings over the inherently democratising capacities of the Internet (Morozov, 2009), an increasing volume of work explores how the Internet can be maliciously manipulated by various groups and individuals (Benkler, et al., 2018; Phillips, 2018; Marantz, 2019). Supposedly, we have entered an era of “polarization, distrust, and self-segregation” (boyd, 2017) dominated by Internet trolls, hate groups, ideologues, influencers, hyperpartisan news outlets, populist politicians, and conspiracy theorists (Marwick and Lewis, 2017). Part of these studies on “media manipulation” deals with the growing influence of the “fringe” on the “mainstream”. The QAnon conspiracy theory seems a typical example of such influence.

Defined as “the outer, marginal, or extreme part of an area, group, or sphere of activity,” [6] the fringe is traditionally something that does not conform to societally dominant ways of speaking, knowing, and doing, and as such is removed from the center, or what is known in the context of mass media as “the mainstream” (Chomsky, 1997). However, initially fringe ideas can sometimes enter the mainstream and eventually become dominant. This process is nothing new; the question of how big social change can be traced to small beginnings has occupied scholars from the microsociology of Gabriel Tarde (1903) and research on the “diffusion of innovations” (Rogers, 1962) to epidemiological perspectives on how, enabled by digital networks, cultural objects can become “viral” (Sampson, 2012). However, a more timely question is to what extent decentralized networks come to redraw the boundaries, or even undermine the distinction, between fringe and mainstream, including their “anchoring” in a specific medium or ideological discourse. If it still makes sense to distinguish between fringe and mainstream in the context of digital networks, it must be in a way that acknowledges their highly volatile and hybrid character. From this angle, fringe and mainstream should not be considered as relatively stable antagonistic formations but as ephemeral constellations of “networked hypes” in constant flux.

Emblematic of this fringe-mainstream dynamic and its potential collapse in digital networks is the ubiquitous presence of initially niche, born-online conspiracy theories. In its pejorative meaning, a conspiracy theory is a fringe phenomenon by definition, in the sense that it projects theoretical models for explaining reality that are not accepted by the dominant scientific and political institutions (Harambam and Aupers, 2015). Yet conspiracy theories may pass through a process of public acceptance whereby they are no longer perceived as conspiratorial in the pejorative sense. While not long ago the “less legitimate media sources” drawing in new conspiracy thinkers were “tabloids, Internet blogs, and radio talk shows” [7], in the 2010s this intermediary role has been taken up by platforms like YouTube, where individual conspiracists grow large online followings. Similarly, imageboards and subreddits have proven fertile environments for false claims later covered by the mainstream, as in the case of the misidentification of the Boston Marathon Bomber in 2013 on Reddit (Potts and Harrison, 2013) and the aforementioned Pizzagate conspiracy theory on 4chan in 2016 (Tuters, et al., 2018).

We understand such forms of conspiracy propagation as part of a general process of “normiefication”. More than general concepts like “normalization”, “normification” (Preist, et al., 2014), or “mainstreaming”, normiefication specifically concerns the online diffusion of “born-digital” cultural artefacts that first come to fruition within fringe online subcultures and later find a larger and more dispersed mainstream audience, such as in large Facebook groups, in newspaper articles, on television, or in tweets by established politicians. The concept of normiefication as a specific form of online diffusion across culturally distinct Web spheres, we claim, may prove to be of critical heuristic value for understanding when and how subcultural and deeply vernacular tropes — including memes and obscure conspiracy theories — make their way to more mainstream online spaces and even become household names.

It is important to note that, unlike normalization, normiefication does not necessarily signify a wider acceptance of fringe ideas in the sense that more people actually take them to be true. Looking purely at the normiefication of QAnon across different platforms says little on the question of whether belief in the conspiracy also grew. For example, a brief qualitative exploration of our data shows that references to QAnon in the news media are of a categorically different kind than those on 8chan: rather than expressions of belief in Q, the news media report on Q believers, e.g., as somewhat gullible or motivated by populist resentment.

To account for such contextual differences in how a term like QAnon is used across spheres, we have to distinguish between the cultural logic of a “deep vernacular Web” typically found on sites like 4chan (De Zeeuw and Tuters, 2020), mainstream social media platforms like Facebook and Instagram, and (the online presence of) more traditional news outlets like the Washington Post and CNN. This tripartite distinction between different Web spheres is anchored in the different affordances and practices that characterize them. For example, whereas interaction on deeply vernacular platforms like 4chan is ephemeral and anonymous, and trolling is expected, on social media platforms like Facebook all posts are linked and stored in a personal profile, and behavior is expected to be authentic. And compared to these platforms, news outlets have a more traditional and unidirectional “broadcast” purpose.

The boundaries separating these different Web spheres are of course rarely rigid, and they cannot be strictly identified with particular platforms. For example, on Facebook there exist subcultural meme groups that abide by the axioms of anonymous online culture, just as on 4chan there exist more mainstream social media practices. Nevertheless, a historical affinity persists between anonymous imageboards like 4chan and the logic of the deep vernacular Web, just as, conversely, platforms like Facebook tend toward more “authentic” real-name practices. Such affinities have to do with how differences in technological infrastructures and their particular affordances (e.g., anonymity and ephemerality on 4chan compared to persistent personal identity on Facebook) yield different social practices and cultural imaginaries. Indeed, normiefication also touches on how the differences between Web spheres are collectively imagined by their users. Whereas anons see themselves as different from the mainstream — associated with platforms like Facebook — users of Facebook may conceive of subcultural spaces like 4chan as fringe and “dark”. Notably, on 4chan, “normie” specifically refers to these “regular” users who are not familiar with the latest subcultural online trends. In this vernacular sense, normiefication is often lamented by subcultural early adopters; Urban Dictionary for instance defines the related term “normification” as “the act of killing a meme by making it a normie meme” [8]. This antagonistic relationship to the mainstream is inherent to the coherence of subcultures (Hebdige, 1979), and likewise the (imaginary) presence of normies as mainstream outsiders can be critical in the formation of online subcultures (Literat and van den Berg, 2019). Normiefication is thereby consonant with subculture studies, which looks at “how members of a collective discursively cast themselves as antithetical to, apart from, or in opposition to a nebulous discursive ‘mainstream’” [9]. While we acknowledge “normie” has a derogatory meaning, the term’s reflection of antagonisms between distinct online spheres helps to account for cases of online diffusion where such imaginaries actually affect the way cultural objects spread.

What currently defines the subcultural milieu of 4chan/pol/ is a combination of, on the one hand, a commitment to irony, trolling, and other forms of transgressive play that characterizes the “mask culture” of the deep vernacular Web (De Zeeuw and Tuters, 2020) and, on the other hand, an increasingly extremist and violent community of far-right agitators. Whereas some anons still pledge allegiance to “Poe’s law”, an Internet adage stating nothing on the Web should be taken seriously, many anti-Semitic and racist users of 4chan/pol/ take their role as propagandists for their white supremacist cause very seriously (even when to journalists they may purport to be merely trolling). In fact, it is the elliptical “sliding” of one into the other that has proven seminal to the emergence of the alt-right, where, to paraphrase one of its main proponents, actual Nazism often masquerades as ironic Nazism (O’Brien, 2017). It is within this quaint subcultural milieu that QAnon originated, and its subcultural elements partly continued to resonate as the phenomenon spread to other platforms in recent years. Before the site-wide ban, Facebook and not 4chan was thought to provide the main platform for the QAnon conspiracy community, with groups reaching hundreds of thousands of subscribers (Argentino, 2020; Wong, 2020).

Normiefication, then, signals a process whereby a cultural object originating in an online space that adheres to the logic of the deep vernacular Web — understood as a collective understanding of, and commitment to, anonymity, irony, play, and ambivalence — reaches a larger audience outside of its native subcultural context, e.g., in spaces on mainstream platforms governed by other networked publics and rules of engagement. It is this distinction between different Web spheres that informs our cross-platform analysis of QAnon.

Focusing on the online spread of the conspiracy theory, our study aligns with “diffusionist” studies, which consider how information “travels through” individuals and media technologies as distinct “units”. As Rieder (2012) notes, this outlook is embedded in classical theories at the foundation of communications and media studies, like the two-step model of communication (Katz and Lazarsfeld, 1955), work on the “diffusion of innovations” (Rogers, 1962), as well as newer fields like memetics (Blackmore, 1999). The emphasis on diffusion fits well with studies attempting to quantify communication flows, including the prominence of certain narratives across platforms and the influence of those propagating them [10]. Web platforms have made diffusion studies easier to operationalize by presenting a deluge of data, as well as concretizing distinct “information units” and traceable objects like hashtags or @-mentions. Moreover, recent computational methods of “network diffusion” have presented advanced models for tracing the propagation of various phenomena online, from following the presence of misinformation (Del Vicario, et al., 2016; Shin, et al., 2018) and marketing messages (Li and Shiu, 2012; Rogers, et al., 2012) to measuring the effects of the mainstream media on opinion dynamics (Quattrociocchi, et al., 2014), among others. In light of recent concerns on media manipulation, such diffusionist outlooks have also informed policy-focused research, e.g., studies dealing with “network contagion” and “memetic warfare” (see Goldenberg and Finkelstein, 2020).

However, diffusionist approaches have been criticized for overlooking the non-linear and bi-directional ways information units travel (Chabot and Duyvendak, 2002). In line with this criticism, we indeed found that QAnon lacks a clear “pathway” from one sphere to another. Another criticism of diffusionist approaches is their failure to account for the way information units change as they move from one context to another. Studying cross-platform variations of how a cultural artefact discursively takes shape might thus be better served with what Rieder (2012) calls studying “refraction” [11] or with studies on circulation and rhetoric that are sensitive to contextual enunciations and materialities (Gries, 2015). In the case of QAnon, a refraction analysis would look at how the use and meaning of QAnon differs across spheres, e.g., from trolling on 4chan/pol/, to accepted conspiracy dogma on 8chan, to news reports on CNN. Our study acknowledges and redresses this criticism by analyzing the diffusion of QAnon across platforms within a qualitative framework of normiefication. This framework highlights the role of diverging cultural logics (and their historical affinity with the affordances of certain platforms) in shaping the discursive contexts that in turn shape the way terms are used, understood, and spread, emphasizing the fluctuation and material co-construction of cultural phenomena across digital environments.

That said, our empirical research does not closely analyse how the contextual meaning of QAnon refracted across Web spheres, e.g., seeing if QAnon is mentioned jokingly, in earnest, or neutrally as a newsworthy phenomenon. By favouring a more distant comparison of relative trends, we are nevertheless able to show how Q mentions of any kind may draw in new actors in the conspiracy community, independent of the context or intentions of journalists and users. Whitney Phillips (2020) argues against overstating the relevance of contextual meaning with regard to the influence of conspiracy theories online, claiming that “the information ecology doesn’t give a shit why anyone does what they do online. Sharing is sharing is sharing”. In the same vein, Phillips and Milner (2021) emphasize how numerous news articles were an important factor in “supercharging” QAnon’s presence, even when debunking the conspiracy theory. This suggests that even when QAnon is reported upon by the news media as an absurd idea, or jokingly mentioned by a non-believer on Reddit to harvest karma, it still accrues visibility and attention, which is what, following Phillips and Milner (2021), the current media landscape is all about. It is with this “memetic” or “spectacular” media logic (Mihailidis and Viotty, 2017) in mind that our quantitative cross-platform analysis empirically assesses how media coverage of QAnon aligns with engagement elsewhere on the Web.

 

++++++++++

Methodology

Methodologically, this study hopes to contribute to the relatively under-developed field of cross-platform research (Hall, et al., 2016). To cast a wide net over Q-related discussion online, we collected data from 4chan, 8chan, Reddit, YouTube, Breitbart, and online news media; the data collection steps for each are briefly detailed below. For all of these, we collected data by searching for Q-related keywords. Instead of using hyperlinks, the cross-platform analysis here thus follows Elmer and Langlois (2013) to connect heterogeneous spheres of online communication through their common use of keywords. To facilitate comparison, the types of data objects to be collected (comments, posts, videos, etc.) were chosen primarily on the basis of whether they could be seen as part of the “comment space” of the respective platform (see Table 1). However, as a well-known hurdle for cross-platform analysis, collecting and comparing “equal” units is often impossible because of technical limitations, inherently different platform ontologies, and differing “device cultures” (Weltevrede and Borra, 2016; Elmer and Langlois, 2013; Rogers, 2017). Considering this, we understand the chosen units as indicative of the particularities of, and asymmetries between, various platforms, but in a way that still allows us to compare relative trends.

 

Table 1: The selected platforms, subselections, their data objects, and the number of units collected.
Platform | Subselection | Data objects | Amount
4chan | /pol/ “Politically Incorrect” | Posts and comments | 74,015
8chan | /qresearch/ and other smaller boards | Posts and comments | 203,011
Reddit | Selected subreddits (Appendix II) | Comments | 177,449
YouTube | Videos in English | Videos | 5,352
Breitbart | - | Article comments | 16,254
Online news media | Articles in English mentioning QAnon | Articles | 458

 

Since we are interested in the process of normiefication from Q’s beginnings on 4chan in October 2017 to its CNN appearance in August 2018, we settled on a timeframe of roughly one year: from Q’s first “drop” on 28 October 2017 to 1 November 2018. To accommodate comparisons, a general “core” search query was written for all platforms. Simply querying “Q” or “QAnon” resulted in too many false positives, so a selection of universal Q-related keywords (e.g., “Q-clearance”) and exclusions (e.g., “Q & A”) was made (see Appendix I). The core query was used for all platforms except YouTube and online news media. The effectiveness of the core query was tested by taking a random sample of 200 matching comments. To filter out false positives, the sample comments were manually coded as either relevant or irrelevant to QAnon. If the accuracy of the sample fell below 90 percent, the core query was modified for the respective platform to exclude false positives and used again with a different 200-comment sample. This helped filter out phrases like “H I G H Q U A L I T Y” common on Reddit. After some iterations, all datasets had at least 94 percent true positives in the random samples.
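As an illustration of this querying and validation step, the sketch below pairs a hedged version of the core query with the 200-comment sampling routine. The inclusion and exclusion terms shown are illustrative stand-ins; the actual lists are given in Appendix I.

```python
# A minimal sketch of the core query and its validation. The inclusion and
# exclusion terms below are illustrative stand-ins for the Appendix I lists.
import random
import re

INCLUDE = re.compile(r"\bqanon\b|\bq[\s-]?clearance\b|\bq[\s-]?drops?\b", re.IGNORECASE)
EXCLUDE = re.compile(r"\bq\s*&\s*a\b|h i g h q u a l i t y", re.IGNORECASE)

def matches_core_query(text: str) -> bool:
    """True if a comment matches the core query and no exclusion pattern."""
    return bool(INCLUDE.search(text)) and not EXCLUDE.search(text)

def validation_sample(comments, n=200):
    """Draw a random sample of matching comments for manual coding.

    Sampled comments are hand-coded as relevant or irrelevant; if fewer than
    90 percent are relevant, the query is tightened and a new sample drawn.
    """
    hits = [c for c in comments if matches_core_query(c)]
    return random.sample(hits, min(n, len(hits)))
```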

An obvious place to start our data collection was the site of QAnon’s inception: 4chan/pol/. To collect Q-mentions, we used 4CAT, a tool that retrieves data from various platforms, including all posts on /pol/ from 2014 onwards (Peeters and Hagen, 2018). Using the adjusted core query, we collected 73,989 Q-related posts and comments within the timeframe. After 4chan/pol/, Q themselves posted on an even less regulated and more toxic imageboard: 8chan. In contrast to 4chan, we could not rely on a comprehensive 8chan archive for data collection. However, various grassroots archives of specific Q-related 8chan posts do exist, of which Qanon.news is the largest and seemingly most complete. For this reason, we queried for relevant posts in its collection of Q-authored and Q-related posts. These posts are mostly from /qresearch/, a board dedicated to QAnon discussion. Since Qanon.news does not provide a description of its data collection method, the completeness of the data cannot be fully verified. To the best of our knowledge, it contains most of the user discussion on the archived boards, but misses some data from when the Q phenomenon was still relatively new and grassroots archiving efforts were only just emerging. Nonetheless, the archive comprises 6,893,833 posts and comments within the relevant period, of which 203,011 made direct unambiguous references to Q.
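Since Qanon.news does not document a bulk export, filtering its archive comes down to streaming the posts and applying the core query within the timeframe. A generic sketch, assuming a local JSON-lines dump with Unix “timestamp” and “body” fields (the archive’s actual format may differ) and reusing matches_core_query from the sketch above:

```python
# Sketch: per-day counts of Q-related posts in a local archive dump.
# Assumes one JSON object per line with "timestamp" (Unix epoch) and
# "body" fields; the actual Qanon.news data format may differ.
import json
from collections import Counter
from datetime import datetime, timezone

START = datetime(2017, 10, 28, tzinfo=timezone.utc)
END = datetime(2018, 11, 1, tzinfo=timezone.utc)

def daily_counts(path):
    counts = Counter()
    with open(path, encoding="utf-8") as infile:
        for line in infile:
            post = json.loads(line)
            posted = datetime.fromtimestamp(post["timestamp"], tz=timezone.utc)
            if START <= posted < END and matches_core_query(post.get("body", "")):
                counts[posted.date().isoformat()] += 1
    return counts
```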

Since Reddit was posited to be a central hub for early Q-talk (Zadrozny and Collins, 2018), we collected relevant data from the platform. To enable comparisons with 4chan, 8chan, and Breitbart, we decided to collect comments instead of posts. These were retrieved from the Pushshift database, containing nearly all Reddit posts and comments since 2005 (Baumgartner, et al., 2020). Due to a large volume of false positives, we used the core query to select subreddits that had 50 or more matches in at least one month within the timeframe. The resulting list of 246 subreddits was then filtered manually, removing subreddits generating a large amount of false positives because of unrelated topics (e.g., r/math) or sheer size (e.g., r/funny). The remaining 54 subreddits (Appendix II) were then queried for all QAnon-related comments, using a slightly modified core query [12]. With this, we retrieved 177,449 comments. Of note is that the Pushshift dataset periodically misses comments from banned subreddits [13], causing large gaps for the Q-specific subreddits r/CBTS_Stream and r/greatawakening [14]. These gaps are indicated in the relevant data visualizations.
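The subreddit pre-selection step can be sketched against Pushshift’s public API as it worked at the time; the endpoint, its “aggs” parameter, and the response shape below follow the API circa 2019 and have since been restricted, so this is an assumption-laden reconstruction rather than a guaranteed recipe.

```python
# Sketch of the subreddit pre-selection via Pushshift's (since restricted)
# public API; parameter names and response shape follow the API circa 2019.
import datetime as dt
import requests

PUSHSHIFT = "https://api.pushshift.io/reddit/search/comment/"

def month_epochs(start=dt.date(2017, 10, 1), end=dt.date(2018, 11, 1)):
    """Yield (after, before) Unix timestamps for each month in the timeframe."""
    current = start
    while current < end:
        following = (current.replace(day=28) + dt.timedelta(days=4)).replace(day=1)
        yield (int(dt.datetime.combine(current, dt.time()).timestamp()),
               int(dt.datetime.combine(following, dt.time()).timestamp()))
        current = following

def busy_subreddits(query, threshold=50):
    """Subreddits with >= threshold matching comments in at least one month."""
    selected = set()
    for after, before in month_epochs():
        response = requests.get(PUSHSHIFT, params={
            "q": query, "after": after, "before": before,
            "aggs": "subreddit", "size": 0,
        })
        for bucket in response.json()["aggs"]["subreddit"]:
            if bucket["doc_count"] >= threshold:
                selected.add(bucket["key"])
    return selected  # this shortlist was then filtered manually (Appendix II)
```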

YouTube is another place where QAnon discussion was present during the conspiracy theory’s early days (Zadrozny and Collins, 2018). Although YouTube comments formed a logical parallel to the other platforms, we opted to collect videos instead: they form the “authoritative” units of Q’s presence on YouTube, and there is no straightforward way to query a representative sample of comments. To collect data on Q-related videos, the search endpoint of the YouTube API was used on 2 December 2019 [15]. Although the search endpoint roughly corresponds to an unpersonalized search in YouTube’s Web GUI, its exact inner workings are quite black-boxed [16]. To minimize the dependency on YouTube’s opaque video selection, multiple queries were executed using different keyword combinations [17]. The results were then merged and manually filtered. English videos were prioritized in the query and non-English videos were deleted [18] to align with the Anglophone data from the other spheres. After merging the search results and deleting duplicate entries, false positives were removed [19]. This resulted in 5,352 videos from 868 channels.
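This collection step can be sketched with the YouTube Data API v3 search endpoint; the query terms and their combinations are simplified here, so the sketch stands in for, rather than reproduces, the procedure in notes [15]-[19].

```python
# Sketch of video collection via the YouTube Data API v3 search endpoint.
# The query terms are simplified; multiple keyword combinations were used.
from googleapiclient.discovery import build

def search_videos(api_key, query):
    """Collect metadata for one query, keyed by video ID for later merging."""
    youtube = build("youtube", "v3", developerKey=api_key)
    videos, page_token = {}, None
    while True:
        response = youtube.search().list(
            q=query, part="id,snippet", type="video",
            relevanceLanguage="en",  # prioritizes, but does not guarantee, English
            publishedAfter="2017-10-28T00:00:00Z",
            publishedBefore="2018-11-01T00:00:00Z",
            maxResults=50, pageToken=page_token,
        ).execute()
        videos.update((item["id"]["videoId"], item["snippet"])
                      for item in response["items"])
        page_token = response.get("nextPageToken")
        if page_token is None:
            return videos
```

Merging the dictionaries returned by several such queries deduplicates by video ID, after which non-English videos and false positives can be removed by hand.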

Because of its status as a major outlet within the right-wing news space, we also collected data from the Breitbart News comment sections. These sections hypothetically acted as an intermediary, drawing in new actors and pushing the QAnon narrative further into more mainstream channels. To retrieve a complete list of articles, we first ran an exhaustive crawl of Breitbart News. The article pages collected were then filtered on whether they were published within our timeframe [20]. For all collected articles, the Disqus API was used to retrieve their comments. Q-related comments were then filtered using the core query adjusted through iterative sampling, yielding 16,254 comments in total.
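A sketch of the two collection steps, assuming article URLs can be harvested from XML sitemap pages and that comments are fetched per article through the Disqus threads/listPosts endpoint with a thread:link lookup; the sitemap source, forum shortname, and lookup syntax are all assumptions rather than the documented pipeline.

```python
# Sketch of the Breitbart collection: harvest article URLs, then fetch
# comments per article via the Disqus API. The forum shortname and the
# "thread:link" lookup are assumptions. Requires lxml for the XML parser.
import requests
from bs4 import BeautifulSoup

def article_urls(sitemap_url):
    """Extract article URLs from one XML sitemap page."""
    soup = BeautifulSoup(requests.get(sitemap_url).text, "xml")
    return [loc.text for loc in soup.find_all("loc")]

def article_comments(api_key, article_url):
    """Fetch the Disqus comments attached to one article URL."""
    response = requests.get(
        "https://disqus.com/api/3.0/threads/listPosts.json",
        params={"api_key": api_key,
                "forum": "breitbartproduction",  # assumed forum shortname
                "thread:link": article_url})
    return [post["raw_message"] for post in response.json()["response"]]

# Comments would then be filtered with the platform-adjusted core query, e.g.:
# q_comments = [c for c in comments if matches_core_query(c)]
```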

Finally, we collected online news articles mentioning QAnon to study if, where, and when the conspiracy theory garnered journalistic attention. To gather these articles, we tested multiple APIs that offer access to journalistic data [21]. Ultimately, a combination of Nexis Uni and Contextual Web Search was used: the former offers access to many sources locked to other APIs (e.g., Washington Post), while the latter returns online-only sources not appearing in Nexis Uni (e.g., Vox). Both data sources were queried for articles within the timeframe whose body text or title contained “QAnon” or any derivations thereof (e.g., “Q-Anon”, “Q Anon”) [22]. An English language filter was additionally set for Nexis Uni. From the combined set of 1,294 articles, we manually removed sources that did not meet our definition of “online news media” [23]. The news sources we kept included large news Web sites, larger specialist Web sites, regional news, and non-Anglophone Web sites with content in English. Identical articles reposted by different outlets were kept, since these were seen to indicate the diffusion of the conspiracy theory [24]. The resulting dataset includes 458 news stories from 183 sources.
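The derivations filter can be expressed as a single case-insensitive pattern; a minimal sketch covering the variants named above:

```python
# Sketch of the pattern behind the news filter: "QAnon" plus derivations
# such as "Q-Anon" and "Q Anon", applied to titles and body text.
import re

QANON = re.compile(r"\bQ[\s-]?Anon\b", re.IGNORECASE)

assert QANON.search("The QAnon conspiracy")
assert QANON.search("Who is Q-Anon?")
assert QANON.search("the q anon theory")       # case-insensitive
assert not QANON.search("Quality assurance")   # no false positive
```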

With the six resulting datasets [25], we visualized the QAnon references per day to compare their relative platform presence. For Reddit, YouTube, and the online news media, we also made interactive beehive graphs in which the units were grouped by subreddit, video, and news outlet, and which were then used to analyze the diffusion of QAnon within these spheres. The platforms selected thus cover key parts of the “normiefication” spectrum, from fringe (4chan and 8chan) to mainstream (online news media). Unfortunately, while Facebook and Twitter were (and continue to be) key platforms for Q-related discussion, collecting a comparable, large enough sample from these platforms was untenable [26]. However, a report by Gallagher, et al. (2020) does track trends in Q-related activity on Facebook and Twitter for the same timeframe as ours, and we refer to its findings where relevant.
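The per-day comparison can be sketched as follows; scaling each platform to its own peak is one way to compare relative trends across very different absolute volumes (an assumption about presentation, not necessarily the exact design of Figure 1).

```python
# Sketch of the per-day trend comparison: each platform's mentions are
# resampled to daily counts and scaled to that platform's peak, since
# absolute volumes differ by orders of magnitude between platforms.
import pandas as pd
import matplotlib.pyplot as plt

def plot_trends(datasets):
    """datasets maps a platform name to a list of pandas Timestamps."""
    fig, ax = plt.subplots()
    for platform, timestamps in datasets.items():
        daily = pd.Series(1, index=pd.DatetimeIndex(timestamps)).resample("D").sum()
        (daily / daily.max()).plot(ax=ax, label=platform)
    ax.set_ylabel("Mentions per day (share of platform peak)")
    ax.legend()
    plt.show()
```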

 

++++++++++

Findings

Several general takeaways are immediately visible from the trends in Figure 1. Discussion of Q indeed initially concentrated on 4chan, before becoming prominent on a variety of other platforms. It only reached widespread journalistic coverage when the conspiracy theory “went offline” at Trump’s Tampa Bay rally on 31 July 2018, which itself seems to have briefly amplified activity on every other platform. Another key insight is that while its diffusion has a clear beginning on 4chan and a clear subsequent — but delayed — rise in mainstream recognition, the path of QAnon from one platform to the other is anything but linear. Rather, after an initial incubative flurry of discussion on 4chan, Q-related content quickly emerged on Reddit, YouTube, and Breitbart. Activity then waxed and waned following general and platform-specific events, which we detail below. These trends generally align with activity on Twitter and Facebook; Twitter especially was quick to engage in QAnon-talk after it first appeared on 4chan/pol/, and activity on both platforms spiked in the aftermath of the Tampa rally (Gallagher, et al., 2020).

 

QAnon-related activity across multiple Web spheres between 28 October 2017 and 1 November 2018
 
Figure 1: QAnon-related activity across multiple Web spheres between 28 October 2017 and 1 November 2018.

 

Notable here is that 8chan, generally characterized as a safe space for “abhorrent racism, violent misogyny, and rampant anti-Semitism” (McLaughlin, 2019), and that one would intuitively group with its sibling imageboard 4chan, does not follow the latter’s activity pattern. Rather, the archives used here show 8chan only became a nexus of Q-discussion after 4chan activity waned. YouTube and Reddit, in contrast, picked up on Q almost as soon as Q started posting “drops” in late October 2017. While we cannot make any direct claims on why users were incentivized to visit the platforms under scrutiny, and some early 8chan data might be missing, our available data does align with the characterization of 8chan’s Q-debaters as an audience that “started to head to 8chan to check out the original source and interact directly with the [Q] posts” (Zadrozny and Collins, 2018) after they had already encountered the theory on other platforms like Twitter, Facebook, and YouTube.

Also notable is that the agenda-setting capacity of news media is still significant, with articles in legacy media like the Washington Post and CNN correlating with pronounced spikes in activity on all platforms we study. These spikes are most prominent on the news-driven space Breitbart and the sensationalist 4chan/pol/. The latter can be read as a testament to Phillips’ (2018) claim that subcultural and toxic spaces like 4chan are energized by mainstream news coverage. However, in our case study, the journalistic “oxygen” provided to the conspiracy theory only lasted a short time; discussion volumes quickly returned to normal. This prompts the question whether journalistic amplification of conspiracy theories and other forms of toxic discourse has any long-term effects.

Moreover, while QAnon’s proliferation seems to be stimulated by mainstream news media attention, it did not rely on it. Rather, QAnon discussion flourished on all platforms long before the media spike in early August 2018, and even before public figures like the American television celebrity Roseanne Barr got involved in late March 2018. This is not to say public individuals did not amplify discussion in the first months (as we will for example see for YouTube below). However, in our data we rarely see surges in activity aligning with actions by major figureheads, celebrities, or media icons. Rather, prior to mainstream prominence, grassroots communities and conspiratorial micro-celebrities on YouTube (Lewis, 2019) already propagated and sustained discussions on QAnon. YouTube and Reddit are particularly important platforms in this regard. As alluded to above, this “independent” activity might be seen as indicative of an epistemic rift between the established mainstream news sources and the online fringe, operating largely under the radar of widespread public attention. For example, within their own epistemic culture, actors like the Q-believer and YouTuber PrayingMedic (375k subscribers as of August 2020) are probably more authoritative than the New York Times, forming an “alternative establishment” to their respective audiences [27]. Below we flesh out the most important, but more platform-specific, findings.

Q-native hotspots: 4chan and 8chan

Activity on 4chan was highest right after Q’s first appearance. This is mostly caused by discussion of “drops” in QAnon-specific “general threads” titled /CBTS/ (for “calm before the storm”) [28]. While these threads offered room for a dedicated and serious Q audience, other posts take the persona and its related theories less seriously, referring to it as a “LARP” [29], emphasizing the mix of genuine paranoia and nihilist satire 4chan is known for. 4chan activity decreased sharply after Q started posting on 8chan instead, whereas 8chan itself naturally became more prominent after this happened. Although 4chan’s moderators never made any public announcement concerning Q, platform governance may have had an impact: several “drops” were deleted by 4chan moderators around the time Q moved to 8chan [30] and QAnon was subsequently said to be “permabanned” by its adherents (Figure 2). Another possible reason for the sudden drop in 4chan activity is that /pol/ users uncovered the unique code with which Q could identify themselves (#Matlock), leading others to troll and clutter the communication. With its option to create custom boards, 8chan formed a suitable alternative for collective QAnon conspiracy theorizing.
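Since this episode hinges on how tripcodes work, a brief sketch helps: the public identifier is derived deterministically from the password, so recovering the password (here “Matlock”) is equivalent to recovering the identity. Below is a sketch of the classic imageboard tripcode derivation, assuming the widely documented DES crypt(3) variant; it is shown for illustration, not as a claim about 4chan’s exact server-side code.

```python
# Sketch of the classic imageboard tripcode algorithm: a DES crypt(3) hash
# with a salt derived from the password itself. Anyone who recovers the
# password can reproduce the public identifier. Uses the Unix-only crypt
# module (deprecated since Python 3.11, removed in 3.13).
import crypt
import re

def tripcode(password):
    salt = (password + "H.")[1:3]
    salt = re.sub(r"[^\.-z]", ".", salt)   # clamp characters to the valid salt range
    salt = salt.translate(str.maketrans(":;<=>?@[\\]^_`", "ABCDEFGabcdef"))
    return crypt.crypt(password, salt)[-10:]   # the last 10 characters are displayed
```

Posting with “#password” in the name field displays “!” plus this hash, which is why a cracked password let others post under Q’s identifier.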

 

An opening post to a Calm Before the Storm general thread on 4chan/pol/ announcing the move to 8chan
 
Figure 2: An opening post to a Calm Before the Storm general thread on 4chan/pol/ announcing the move to 8chan. Screenshot captured from 4plebs.org.

 

Although 8chan’s low initial activity might partly be caused by the data’s incomplete nature, it matches reports stating that 8chan was only later “discovered” by conspiracists after initial encounters with the QAnon theory on YouTube (Zadrozny and Collins, 2018). From February 2018 onwards, activity remains consistent, with between 500 and 1,000 posts explicitly mentioning Q per day. For a seemingly niche space like 8chan, these are high numbers, with the total number of posts (203,011) forming the largest of all collected datasets — even exceeding that of Reddit (177,449), a Web site with 430 million active users (Perez, 2019). Together, the findings thus underline 8chan as a platform with an extremely active “hard core” of QAnon supporters.

Bridge platforms? YouTube and Reddit

Activity on both YouTube and Reddit started shortly after the first posts by Q on 4chan. Figure 3 shows Reddit comments grouped by subreddits. From this view, we distinguish two categories of subreddits: those dedicated exclusively to QAnon discussion, and more broadly themed subreddits where QAnon only forms one topic. Notably, the latter category was where Q-related discussion initially popped up after spilling over from 4chan, particularly in r/conspiracy, a subreddit for discussion on conspiracy theories, and r/The_Donald, a pro-Trump subreddit. Roughly six weeks after Q’s first sign of life, the dedicated subreddit r/CBTS_Stream was created, which quickly became a large Q hub. It was claimed the creation of and move to r/CBTS_Stream was “key to Qanon’s eventual spread” since it allowed reaching “a larger audience of conspiracy theorists” (Zadrozny and Collins, 2018). After r/CBTS_Stream was banned for “incit[ing] violence and the posting of personal and confidential information” (Wyrich, 2018), r/greatawakening was created, which quickly turned into a space for high-traffic discussion. Both casual and hardcore users of the subreddit overlapped with r/conspiracy and r/The_Donald (Chang, 2018), emphasizing the networked nature of Reddit’s communities. r/greatawakening itself was also banned on 12 September 2018, after which Reddit-wide activity significantly decreased. However, shortly after the Tampa rally a wider range of large subreddits such as r/politics (having almost four million subscribers at the time [31]) started to engage with Q, indicating a process of normiefication from dedicated conspiracy subcultures to general and large audiences within Reddit.

 

Static beehive graph of QAnon comments on Reddit
Figure 3: Static beehive graph of QAnon comments on Reddit; for an interactive version, click here. Grouped per 25 comments and colored per subreddit. Data derived from Pushshift. Visualized with RAWGraphs (Mauri, et al., 2017).

 

Like Reddit, YouTube is a large platform for Q-believers and sceptics alike. The video service has been criticized for being a “Great Radicalizer” (Tufekci, 2018) and a prominent platform for conspiracy videos by amateur filmmakers (Marwick and Lewis, 2017). When it comes to QAnon, the video platform indeed forms a highly active space, with 5,352 videos garnering 122,107,633 views as of December 2019 [32]. One can distinguish a similar pattern to Reddit, where activity, especially initially, is dominated by highly conspiratorial, pro-Trump channels (Figure 4). The first entries also represent a clear spillover from 4chan, with videos directly covering Q’s activity on /pol/ by discussing screenshots of their posts. Notably, the second video in the dataset is a commentary on Q’s 4chan activity by Tracy Beanz, a small-time YouTuber who, as NBC uncovered, actively coordinated with two 4chan moderators to bring QAnon to more mainstream audiences — not least because of monetary incentives (Zadrozny and Collins, 2018). As the NBC report notes, it was the same trio that also created the subreddit r/CBTS_Stream, which as we saw grew to be an incredibly active Q-hotspot on Reddit. Whether following from or growing independently alongside this coordinated activity, the YouTube videos after this first period remain overwhelmingly pro-Q (e.g., Patriot Hour with 335 videos, JustInformedTalk with 234, and SpaceShot76 with 218), with videos extensively explaining the conspiracy theory, informing their “trust-seeking” viewers of the latest Q-updates, and advocating a dismissal of the “Fake News Media” in favour of “Trusting the Plan”. Despite their arcane messaging, they feature an informal, “personal” vlogging style, for instance by older religious Americans like PrayingMedic (375k subscribers as of August 2020), in stark contrast to the anonymous, collective, and often ironic engagement with QAnon on 4chan. From June 2018 onwards, however, more professionally produced Q-videos seem to function as collectivizing units, like the cinematic “Q: The Plan to Save the World” (1.3 million views) and “Q — We Are the Plan” (1.6 million views). The polished aesthetics, welcoming explainers, and massive view counts of such videos underline the normiefication of QAnon since its niche subcultural beginnings on 4chan/pol/.

 

Static beehive graph of QAnon videos on YouTube
Figure 4: Static beehive graph of QAnon videos on YouTube; for an interactive version, click here. In the interactive version, click to go to the video. Sized by video views and colored by availability on 22 October 2020. Visualized with RAWGraphs (Mauri, et al., 2017).

 

Finally, similar to Reddit, attention from large, mainstream YouTube channels, like those of established news outlets and celebrities, only starts appearing after the Tampa Bay rally. This was largely kickstarted by CNN’s video “CNN reporter talks to conspiracy theorists at Trump rally”, posted on 2 August 2018. Afterwards, videos by established media figures like Jim Jefferies and Bill Maher cover and ridicule the alleged conspiracy. However, the mainstream attention did not seem to affect the grassroots, pro-Q content production, which continues steadily into late 2018. Q’s proliferation on YouTube thus indicates a comfortable co-presence of believers and skeptical commentators.

It remains a question whether, following Phillips’ (2020) remarks on the irrelevance of how a topic is discussed vis-à-vis its diffusion rate and attention-garnering properties, even sceptical or ridiculing QAnon activity contributed to its normiefication. This question is especially pressing regarding the intermediary role of YouTube and Reddit. While 4chan and 8chan are well-known conspiracy incubators (Tuters, et al., 2018), they are still quite isolated from online information ecosystems because of their ephemerality, limited indexation by search engines, and resistance to the commercialization and growth that often leads to API retrieval by external sources (Helmond, 2015). In contrast, the large volumes of Q-discussion on Reddit and especially YouTube seem more centrally connected to other spheres. For instance, Gallagher, et al. (2020) found that 20.4 percent of QAnon-related Facebook posts contained a link to a YouTube video, while only a few of the YouTube videos directly link to 4chan or 8chan. Moreover, their content can be actively pushed to large audiences by the platform’s ranking systems and algorithms, notably through Reddit’s automatic inclusions in overview pages (r/popular and r/all) and YouTube’s search and recommendation system. We can only speculate that YouTube’s recommendation algorithm cannot always distinguish between different kinds of QAnon activity, leading to an algorithmic entwinement of their publics. While we do not investigate algorithmic amplification as a factor, some evidence suggests YouTube’s recommendations contributed to a “conspiracy boom” in 2018, including QAnon [33]. In any case, it is safe to state that in 2018 Reddit and YouTube were major enabling factors in the normiefication of QAnon long before its widespread appearance in news articles.

News spheres: Breitbart and online news media

While editorial content on Breitbart does not directly mention Q, except in quotes or derisively, the comments on the site show a different picture. Similar to Reddit and YouTube, activity in Breitbart’s comment section picks up soon after the initial flurry of Q-discussion on 4chan. Interestingly, this characterizes Breitbart’s comment section as acting somewhat independently from Breitbart’s editorial content. After the first months, Q-comments are sustained, until activity greatly increases with the coverage of the Tampa Bay rally. Here it is difficult to distinguish various publics, but notably the general activity trend follows that of YouTube and Reddit, though with a more prominent spike around the Tampa Bay rally. This suggests that the platform is affected by mainstream news coverage of the topic more strongly than others, which stands to reason as Breitbart is a news medium itself, albeit a (hyper)partisan one.

By far the most prominent spike in activity around the Tampa rally is seen with the online news media. Before, Q was mentioned intermittently and circumstantially, e.g., via an embedded tweet, or in the context of Roseanne Barr, who in March 2018 tweeted pro-Q statements, which subsequently led to the cancellation of her TV show. The small number of articles on Barr and the lack of engagement with her tweets in other spheres seem to invalidate Phillips and Milner’s claim that news articles about her tweets “supercharged” QAnon’s presence by “propell[ing] the theory well beyond Barr’s personal reach” [34]. Instead, our data indicates it is only when QAnon “moves offline” with the Tampa rally that the majority of the reporting starts and that this seems to propel engagement in other spheres. Starting on 1 August 2018, right after the CNN report on the rally, articles headline how a “deranged conspiracy cult leaps from the Internet to the crowd” (Stanley-Becker, 2018). Following this, from 1 to 3 August, 91 articles in the news dataset were published, most frequently by the Washington Post (Figure 5). Besides amplifying discussion across the platforms included in our study, the Tampa coverage also coincided with spikes on Twitter and Facebook in early August 2018 (Gallagher, et al., 2020) [35]. Reports after the Tampa event also seem primarily driven by the off-line manifestation of QAnon. Notably, several articles covered how Trump met the radio personality and leading QAnon supporter Michael William Lebron in the White House in late August (Feldscher, 2018) and how QAnon was related to two U.S. domestic attacks (Roose, 2018).

 

Static beehive graph of news articles mentioning QAnon
Figure 5: Static beehive graph of news articles mentioning QAnon, separated per article and colored by the most frequently occurring sources; for an interactive version, click here. In the interactive version, hover over the nodes to see the article headline and source.

 

Platform governance

With QAnon continuing to flourish in late 2020, platforms like Facebook are now proactively moderating and banning Q-related activity, following aggressive steps by Twitter (Edelman, 2020). However, our research shows how bans and deletions already affected the conspiracy theory’s presence in its nascence. Notably, on both 4chan and Reddit there is a clear connection between the deletion or banning of content and a shift or decrease in activity. On 4chan, deletion of Q’s own posts prompts a move to 8chan and other platforms. On Reddit, the early banning of r/CBTS_Stream and later r/greatawakening leads to a lower overall volume of Q-related posts. In an earlier study following the banning of a number of hate speech subreddits, Chandrasekharan, et al. (2017) found that Reddit users “drastically decreased their hate speech usage”. Our findings similarly support the effectiveness of subreddit bans in decreasing specific conspiratorial activity within Reddit. Moreover, after the banning of the two subreddits, there is no clear “waterbed effect” on a cross-platform level; none of the included platforms saw a long-term increase in activity proportional to the decrease on Reddit itself [36].

YouTube also seems to be a somewhat volatile environment regarding Q. As can be seen in Figure 4, in the 10 months between our first metadata request in December 2019 and another in October 2020, almost half (47 percent) of the 5,352 collected videos became unavailable. This is either because the videos were banned, contained copyrighted material, or were deleted or hidden by the uploaders themselves. The latter type includes Tracy Beanz, whose videos consistently accrued over 100,000 views, but who has since distanced herself from her Q-concerned past. However, while YouTube vowed to ban some QAnon videos at the time of writing (Paul, 2020), the majority of Q content remains online. Moreover, the disappearing videos have only recently become unavailable: when we checked the availability of the videos in August 2020, only 5.4 percent were unavailable, with popular pro-Q channels still being online in the summer. Considering the contrast with earlier and more aggressive measures, such as those taken by Reddit, the normiefication of QAnon is thus not merely a question of diffusion from a subcultural fringe to the mainstream, but also of platform governance — or a lack thereof.
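These repeated availability checks can be sketched against the YouTube Data API’s videos.list endpoint, which simply omits videos that are no longer publicly available; whether the study used exactly this endpoint is an assumption.

```python
# Sketch of the availability check: videos.list only returns videos that
# still exist publicly, so IDs missing from the response are counted as
# unavailable (deleted, made private, or removed by the platform).
from googleapiclient.discovery import build

def unavailable_videos(api_key, video_ids):
    youtube = build("youtube", "v3", developerKey=api_key)
    available = set()
    for i in range(0, len(video_ids), 50):   # the endpoint takes up to 50 IDs per call
        batch = video_ids[i:i + 50]
        response = youtube.videos().list(part="id", id=",".join(batch)).execute()
        available.update(item["id"] for item in response["items"])
    return set(video_ids) - available
```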

 

++++++++++

Conclusion

Drawing from a vernacular term common in chan culture, we used the concept of normiefication as a heuristic device to analyse the spread of fringe ideas and objects from online subcultures to mainstream spaces and the resulting increase in exposure afforded by the shareability of digital platforms. As a case study, we traced the dissemination of the QAnon conspiracy theory from the recesses of the deep vernacular Web to mainstream news outlets. We found that QAnon has clear subcultural origins on 4chan/pol/, from which it diffused to various other platforms removed from its subcultural context, before eventually reaching mainstream attention through online news media. While activity on 4chan/pol/ quickly dwindled, partly due to content moderation, 8chan formed a consistent alternative space for Q-discussion. Moreover, YouTube and Reddit quickly formed alternative habitats for Q-believers even before 8chan’s growth. YouTube and Reddit thus seem to have functioned as “bridge platforms”, fueling the conspiracy theory by facilitating communities dedicated to conspiracy thinking, potentially driven by algorithmic visibility, and adding commercial incentives for Q-concerned creators like Tracy Beanz [37]. It was only when the QAnon conspiracy theory moved off-line during the Tampa Bay rally that online news media started widely reporting on the issue. Our findings show this reporting did correlate with increased Q-related activity on the other platforms, but only for a short while.

The normiefication of a conspiracy theory like QAnon does not necessarily mean that it is accepted by more people, and/or disseminated as fact by major media platforms [38]. However, even when the wider exposure of a conspiracy theory beyond its original vernacular context does not necessarily imply an increase in true believers, by reporting on it news media may still provide oxygen to conspiratorial ideas (Phillips, 2018), even if only by inadvertently providing the conspiracist communities with a clear outgroup [39]. The persistent popularity of QAnon well into 2020 does seem to warrant the claim that normiefication is a necessary condition for, and thus part and parcel of, its potential normalization. Moreover, while normiefication might not directly recruit new believers, it does contribute to the overall “post-truth” (Fuller, 2018) climate, where everything might be equally true or false because there are no longer collectively agreed upon criteria to assess the veracity of information and evaluate knowledge claims. The problem of a media ecology polluted by conspiracy theories is not just that it might induce false belief, but that it contributes to an epistemological condition where the distinction between true and false itself is rendered increasingly moot.

Regarding this issue, our research shows platform governance to be an important contributing factor to QAnon’s continuing online presence: the data we analyzed shows clear indications of the effectiveness of Reddit’s subreddit bans, and shows that a lack of aggressive content policing, notably on 8chan and YouTube, helped facilitate consistent activity.

While QAnon presents a clear case of normiefication from 4chan/pol/ to the online news media, this process is not a straight unidirectional line but moves back and forth between different Web spheres in a highly volatile manner. The cross-platform method for tracing normiefication seems promising in highlighting these tides, as well as in mapping the growing influence of discourse emerging from toxic subcultures like 4chan/pol/. Indeed, many inflammatory memes, trolling campaigns, and coordinated disinformation campaigns have in recent years “spread outwards” from /pol/ (Zannettou, et al., 2017; Zannettou, et al., 2018), and many of these can be traced with the keyword-based cross-platform analysis used here. In a time of growing concerns over disinformation, hate speech, and conspiracy theories, we hope similar empirical studies will reuse our methods to aid in “[mapping] the repeated, fractured, reconfiguring mobilizations emerging from anonymous and pseudo-anonymous spaces online” (Phillips, et al., 2017) as we have done in the case of the QAnon conspiracy theory. End of article

 

About the authors

Daniël de Zeeuw is a lecturer in the Department of Media Studies at the University of Amsterdam and a member of OILab (oilab.eu). His current research focuses on the politics and aesthetics of online subcultures.
E-mail: d [dot] dezeeuw [at] uva [dot] nl

Sal Hagen is a Ph.D. candidate in the Department of Media Studies at the University of Amsterdam. He co-founded OILab during his rMA in media studies and has studied Internet subcultures since. Methodologically, his work combines media theory with computational methods.
E-mail: s [dot] h [dot] hagen [at] uva [dot] nl

Stijn Peeters is a post-doctoral researcher in the Department of Media Studies at the University of Amsterdam, and a member of OILab. His current research interests focus on the development of sound research tools and methods for analysis of fringe and historical online platforms.
E-mail: stijn [dot] peeters [at] uva [dot] nl

Emilija Jokubauskaitė is a lecturer in new media & digital culture at the University of Amsterdam and co-founder of the Open Intelligence Lab. Her main research interests include online political subcultures, lesser-known online spaces and platforms, and the scrutiny of research tools and techniques.
E-mail: e [dot] jokubauskaite [at] uva [dot] nl

 

Acknowledgements

The research presented in this article was originally conceived in the context of the Digital Methods Initiative Winter School 2019. We would like to express our gratitude to Ángeles Briones, as well as all the students who participated: Carmen Ferri, Birgitte Haanshuus, Rachel Blennerhassett, Esther Blokbergen, Flora Woudstra Hablé, Marlou Poncin, and Willem Hilhorst. We would also like to thank Marc Tuters for the blog post where the term “normiefication” was first used (see: OILab, 2018) and Marc-André Argentino for sharing his QAnon Twitter research.

 

Notes

1. Although probably not as influential on the overall outcome of the election as such accounts claim, it is nevertheless true that “offensive terms and imagery associated with the [alt-right] movement surfaced in the mainstream throughout the election cycle” (Heikkilä, 2017, p. 14). For a critique of overstating the alt-right’s influence on the election, see Phillips, et al. (2017).

2. See https://web.archive.org/web/20151115205740/https:/twitter.com/realDonaldTrump/status/653856168402681856, accessed 4 August 2020.

3. Marwick and Lewis, 2017, pp. 18–19.

4. LARP stands for “Live Action Roleplaying”, a term usually used for physical recreations of fantasy games but on 4chan also used to indicate the practice of “playing” at doing politics.

5. Like with Pizzagate, tweets by QAnon adherents were retweeted by Trump himself (Kaplan, 2019; Rupar, 2020).

6. See https://www.lexico.com/en/definition/fringe, accessed 20 April 2020.

7. Stempel, et al., 2016, p. 355.

8. See https://www.urbandictionary.com/define.php?term=normification, accessed 2 August 2020.

9. Milner, 2016, p. 61. Relatedly, when a mainstream outgroup is baited into replicating toxic discourse, normiefication can be used to refer to intentional efforts of media manipulation by the subculture. For example, initial QAnon-discussion on 4chan involved wishes to “bridge Q to normies”. See e.g., http://archive.4plebs.org/pol/thread/149915235/#149917647.

10. A focus on diffusion or viral spread has even been part of the vernacular of media manipulators themselves: actors like Mike Cernovich successfully focused their attention on testing and discussing what hashtags would trend and go viral (Marantz, 2019), while a prominent neo-Nazi Web site was uncovered to operate according to a meticulous propaganda playbook intended to “spread the message of nationalism and anti-Semitism to the masses” (Feinberg, 2017). In the run-up to the 2016 U.S. election, memetics sprang to public attention as the election was intertwined with grassroots manipulators engaging in “meme warfare” (Siegel, 2017). Such efforts were repeated by QAnon followers in the run-up to the 2020 U.S. election (Thomas, 2020).

11. Rieder proposes the term “refraction” as an alternative to diffusion, adopting the metaphor of a wave passing through a surface, like beams of light that refract on the surface of water. Refraction is as such intended to “further think about the space between identical reproduction and total heterogeneity” to “better take into account questions referring to meaning, rhetoric, and ideology” (Rieder, 2012).

12. The Reddit query filters out the phrase “H I G H Q U A L I T Y”.

13. The data gaps occur because Pushshift stores data at the end of every month, while some subreddits were banned before the end of the month. This caused a large omission of posts from the Q-dedicated subreddits.

14. CBTS, an acronym for the phrase “calm before the storm”, and the phrase “great awakening” are other ways to refer to the same or similar conspiracy theories (Landler, 2017).

15. We used “relevance” for the YouTube v3 API’s “order” parameter, which ranks videos on how closely they align with the keyword query. Videos deleted before 2 December 2019 could not be retrieved. This means many now-deleted videos are missing, like a Q-explainer by Lionel Nation which had accrued around 200,000 views.

16. For instance, it is unclear what the “weight” of units like video titles and descriptions is in its ranking of relevant videos (let alone channel titles, transcripts, comments, tags, etc.). Moreover, the exact functioning of some API parameters is unclear and volatile. Further, the API’s keyword parameter returns a different number of videos when the order of keywords separated by an OR sign (|) is changed.

17. The queries used with the q parameter (no pun intended) of the search endpoint of the YouTube’s V3 API were: “q anon|qanon”, “q 4chan”, “q 8chan”, “q clearance”, and “q drop”.

18. This concerned setting the “regionCode” parameter to “US” and the “relevanceLanguage” to “en”. Despite these settings, the API still returned some non-English videos, which were manually deleted.
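
For illustration, a minimal Python sketch of the kind of request described in notes 15–18, sent to the search endpoint of YouTube’s v3 API; the API key is a placeholder, and pagination over subsequent result pages is omitted:

import requests

API_KEY = "YOUR_API_KEY"  # placeholder: a valid YouTube Data API key is required
ENDPOINT = "https://www.googleapis.com/youtube/v3/search"

def search_videos(query, page_token=None):
    # Parameter values follow notes 15, 17, and 18.
    params = {
        "part": "snippet",
        "q": query,                   # e.g., "q anon|qanon" (note 17)
        "type": "video",
        "order": "relevance",         # note 15
        "regionCode": "US",           # note 18
        "relevanceLanguage": "en",    # note 18
        "maxResults": 50,
        "key": API_KEY,
    }
    if page_token:
        params["pageToken"] = page_token
    return requests.get(ENDPOINT, params=params).json()

response = search_videos("q anon|qanon")
for item in response.get("items", []):
    print(item["snippet"]["title"])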

19. False positives for YouTube videos were deleted in two ways. First, we automatically deleted the videos that did not use Q as a capitalized stand-alone noun or mentioned other relevant terms (“QAnon”, “Q-clearance”, “Q drop”, “The Great Awakening”, or “wwg1wga”) in the video title, video tags, or video description. Following that, we manually deleted any unrelated videos (e.g., one titled “Future and Schoolboy Q Drop New Song”) and non-English videos.
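
A minimal Python sketch of the automatic filtering step described above; the field names are illustrative, the stand-alone-Q pattern only approximates those in Appendix I, and the subsequent manual deletion is not shown:

import re

RELEVANT_TERMS = ["qanon", "q-clearance", "q drop", "the great awakening", "wwg1wga"]
STANDALONE_Q = re.compile(r"(^|\s)Q([\s,.!?]|$)")  # capitalized stand-alone Q

def is_relevant(video):
    # Keep a video if its title, tags, or description use Q as a capitalized
    # stand-alone noun, or mention one of the other relevant terms.
    text = " ".join([video.get("title", ""),
                     video.get("description", ""),
                     " ".join(video.get("tags", []))])
    if STANDALONE_Q.search(text):
        return True
    return any(term in text.lower() for term in RELEVANT_TERMS)

# A false positive like the one mentioned in note 19 still passes this
# automatic step ("Schoolboy Q" reads as a stand-alone Q, and the title
# contains "q drop"), which is why a manual pass remained necessary.
print(is_relevant({"title": "Future and Schoolboy Q Drop New Song", "tags": []}))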

20. We started the timeframe interval for Breitbart a week prior to the general study interval to avoid missing comments submitted after 28 October 2017 as replies to articles published before that date.

21. Most of the others did not allow querying by (year-old) date ranges (Google News API, Bing News API), returned too few results (GNews API), or required large payment plans for historical data (News API). Sufficient results and the ability to query the full date range were vital to avoid date biases, so these APIs were discarded.

22. We suspected querying “QAnon” would miss many articles that referred to the conspiracy with just “Q”, but after running separate queries containing “Q” and other relevant terms (e.g., “4chan”), no additional relevant articles were found.

23. Since we are concerned with online data, it was a requirement that the news articles be retrievable online at the time of research through Google Search. We additionally excluded a range of Web sites: niche or personal blogs, hyperpartisan or conspiratorial Web sites, and specialist, defunct, satire, or other non-news Web sites. Articles with transcripts (e.g., of television shows) and listicles with “top news” were also deleted.

24. As a minor correction, the first story in the dataset was mislabelled by Nexis Uni as originating from the Yerepouni Daily Press, but was originally published by Newsweek. We changed this to the latter to accurately pinpoint the start of the news media’s attention to Q.

25. The datasets can be downloaded at https://zenodo.org/record/3758479.

26. Facebook has gradually closed its API endpoints, both prior to and following the Cambridge Analytica scandal (Rieder, 2018). While Twitter is more accessible, retroactively acquiring data is financially untenable.

27. The proclamation “we are the new mainstream” is often expressed by alternative influencers like Mike Cernovich (Marantz, 2019), and also shows up in the channel descriptions of many of the YouTube channels in our dataset.

28. General threads are recurring, manually posted, thematically coherent threads that serve as a nexus for the discussion about a particular topic (Jokubauskaitė and Peeters, 2020).

29. Using the 4chan/pol/ posts to extract word pairs, the word “larp” appeared 1,636 times close to the word “Q” in the posts from 28 October 2017 to 1 January 2018.
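
A minimal Python sketch of this kind of word pair extraction; the co-occurrence window of five tokens is an assumption, as the exact window used is not specified here:

from collections import Counter

def word_pairs(posts, window=5):
    # Count how often two (lowercased) words occur within `window` tokens
    # of one another across a list of posts.
    counts = Counter()
    for post in posts:
        tokens = post.lower().split()
        for i, token in enumerate(tokens):
            for other in tokens[i + 1:i + 1 + window]:
                counts[tuple(sorted((token, other)))] += 1
    return counts

posts = ["Q is just a larp by some anon", "this larp by Q continues"]
print(word_pairs(posts)[("larp", "q")])  # prints 2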

30. Archived versions of these deleted posts — identified by Q’s “tripcode”, a 4chan mechanism of user authentication — can be retrieved via 4plebs, a public 4chan archive: http://archive.4plebs.org/pol/search/tripcode/%21ITPb.qbhqo/deleted/deleted/page/3/.

31. See the Wayback Machine’s page for r/politics on 2 August 2018: https://web.archive.org/web/20180802014454/https://www.reddit.com/r/politics/.

32. While the 122 million QAnon views on YouTube is a high number, it is admittedly a far cry from the largest players on the platform. For instance, PewDiePie’s channel alone garnered 20 times as many views in December 2019 (see https://web.archive.org/web/20191210163947/https://www.youtube.com/user/PewDiePie/about).

33. In recent years, YouTube has changed its algorithms to prevent the recommendation of conspiracy theory videos. However, during late 2018, the last months of our study’s timeframe, YouTube’s recommender algorithm is said to have contributed to a “conspiracy boom”, including QAnon (Faddoul, et al., 2020).

34. Phillips and Milner, 2021, p. 13.

35. Gallagher, et al. (2020) note: “numbers of individuals discussing QAnon increased significantly between 31 July and 2 August 2018, when Q followers were first spotted at a Trump rally in Tampa, Florida. In this instance, unique Twitter users rose above 100,000 for the first time. A similar trend was observed on Facebook, where user numbers more than doubled during those dates”.

36. Since activity did not substantially increase on Facebook, and Twitter only saw a brief increase in mid-2018 (Gallagher, et al., 2020), our findings suggest subreddit bans resulted in decreased Q-presence within online information ecologies more broadly.

37. Based on our findings, it is not yet clear who the key intermediary actors between the digital “underground” and the social network of Trump supporters were, nor how influential their role was. Pinpointing this could be informative in detailing the role of individuals in normiefying conspiracy theories.

38. An answer to the question of how references to QAnon are used in different contexts and on different Web spheres would require more qualitative cultural and semiotic analysis or quantitative text analysis. In other words, in addition to tracing its diffusion one needs to trace its refraction (Rieder, 2012). A discourse analysis of the different semantic environments in which the term QAnon operates could more precisely identify and differentiate between issue publics engaging with QAnon as “true believers”, those that are merely in it for a laugh, and those that engage with QAnon as a conspiracy theory in the pejorative sense, i.e., as a wildly improbable idea adopted by a political fringe group.

39. There is a curious paradox here that seems to be quite unique to conspiracy theories, namely that the credibility of a conspiracy theory is often inversely proportional to its acceptance by the mainstream public. Following the conspiratorial mode of reasoning, if the conspiracy needs to stay hidden in order to be effective (and is actively suppressed by the powers that be, including the media), then its public acceptance must be a sign of the conspiracy’s dwindling influence.

 

References

Marc-André Argentino, 2020. “QAnon conspiracy theory followers step out of the shadows and may be headed to Congress,” The Conversation (8 July), at https://theconversation.com/qanon-conspiracy-theory-followers-step-out-of-the-shadows-and-may-be-headed-to-congress-141581, accessed 4 August 2020.

Jason Baumgartner, Savvas Zannettou, Brian Keegan, Megan Squire, and Jeremy Blackburn, 2020. “The Pushshift Reddit dataset,” arXiv:2001.08435 (23 January), at http://arxiv.org/abs/2001.08435, accessed 20 April 2020.

Yochai Benkler, Robert Faris, and Hal Roberts, 2018. Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford: Oxford University Press.
doi: https://doi.org/10.1093/oso/9780190923624.001.0001, accessed 24 October 2020.

Dale Beran, 2017. “4chan: The skeleton key to the rise of Trump,” Huffington Post (20 February), at https://www.huffingtonpost.com/entry/4chan-the-skeleton-key-to-the-rise-of-trump_us_58ab6156e4b0a855d1d8dfe4, accessed 20 April 2020.

Susan Blackmore, 1999. The meme machine. Oxford: Oxford University Press.

danah boyd, 2017. “Did media literacy backfire?” Data & Society (5 January), at https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d, accessed 20 April 2020.

Sean Chabot and Jan Willem Duyvendak, 2002. “Globalization and transnational diffusion between social movements: Reconceptualizing the dissemination of the Gandhian repertoire and the ‘coming out’ routine,” Theory and Society, volume 31, pp. 697–740.
doi: https://doi.org/10.1023/A:1021315215642, accessed 24 October 2020.

Eshwar Chandrasekharan, Umashanthi Pavalanathan, Anirudh Srinivasan, Adam Glynn, Jacob Eisenstein, and Eric Gilbert, 2017. “You can’t stay here: The efficacy of Reddit’s 2015 ban examined through hate speech,” Proceedings of the ACM on Human-Computer Interaction, article number 31.
doi: https://doi.org/10.1145/3134666, accessed 4 August 2020.

Alvin Chang, 2018. “We analyzed every QAnon post on Reddit. Here’s who QAnon supporters actually are,” Vox (8 August), at https://www.vox.com/2018/8/8/17657800/qanon-reddit-conspiracy-data, accessed 20 April 2020.

Noam Chomsky, 1997. “What makes mainstream media mainstream,” Z Magazine, at https://chomsky.info/199710__/, accessed 20 April 2020.

Ben Collins and Brandy Zadrozny, 2020. “Twitter bans 7,000 QAnon accounts, limits 150,000 others as part of broad crackdown,” NBC News (21 July), at https://www.nbcnews.com/tech/tech-news/twitter-bans-7-000-qanon-accounts-limits-150-000-others-n1234541, accessed 4 August 2020.

Gabriele Cosentino, 2020. “From Pizzagate to the great replacement: The globalization of conspiracy theories,” In: Gabriele Cosentino. Social media and the post-truth world order: The global dynamics of disinformation. Cham, Switzerland: Palgrave Pivot, pp. 59–86.
doi: https://doi.org/10.1007/978-3-030-43005-4_3, accessed 24 October 2020.

Gilad Edelman, 2020. “Twitter cracks down on QAnon. Your move, Facebook,” Wired (22 July), at https://www.wired.com/story/twitter-cracks-down-qanon-policy/, accessed 4 August 2020.

Greg Elmer and Ganaele Langlois, 2013. “Networked campaigns: Traffic tags and cross platform analysis on the Web,” Information Polity, volume 18, number 1, pp. 43–56.
doi: https://doi.org/10.3233/IP-2011-0244, accessed 24 October 2020.

Marc Faddoul, Guillaume Chaslot, and Hany Farid, 2020. “A longitudinal analysis of YouTube’s promotion of conspiracy videos,” arXiv:2003.03318v1 (6 March), at https://arxiv.org/abs/2003.03318, accessed 4 August 2020.

Ashley Feinberg, 2017. “This is the Daily Stormer’s playbook,” Huffington Post (13 December), at https://www.huffpost.com/entry/daily-stormer-nazi-style-guide_n_5a2ece19e4b0ce3b344492f2, accessed 20 April 2020.

Kyle Feldscher, 2018. “QAnon-believing ‘conspiracy analyst’ meets Trump in the White House,” CNN (25 August), at https://www.cnn.com/2018/08/25/politics/donald-trump-qanon-white-house/index.html, accessed 20 April 2020.

Marc Fisher, John Woodrow Cox, and Peter Hermann, 2016. “Pizzagate: From rumor, to hashtag, to gunfire in D.C.,” Washington Post, at https://www.washingtonpost.com/local/pizzagate-from-rumor-to-hashtag-to-gunfire-in-dc/2016/12/06/4c7def50-bbd4-11e6-94ac-3d324840106c_story.html, accessed 20 April 2020.

Steve Fuller, 2018. Post-truth: Knowledge as a power game. London: Anthem Press.

Aoife Gallagher, Jacob Davey, and Mackenzie Hart, 2020. “The genesis of a conspiracy theory: Key trends in QAnon activity since 2017,” at https://www.isdglobal.org/wp-content/uploads/2020/07/The-Genesis-of-a-Conspiracy-Theory.pdf, accessed 4 August 2020.

Alex Goldenberg and Joel Finkelstein, 2020. “Cyber swarming, memetic warfare, and viral insurgency: How domestic militants organize on memes to incite violent insurrection and terror against government and law enforcement,” Network Contagion Research Institute, at https://ncri.io/reports/cyber-swarming-memetic-warfare-and-viral-insurgency-how-domestic-militants-organize-on-memes-to-incite-violent-insurrection-and-terror-against-government-and-law-enforcement/, accessed 4 August 2020.

David Gilbert, 2020. “QAnon now has its very own super PAC,” Vice (2 March), at https://www.vice.com/amp/en_us/article/k7eppz/qanon-now-has-its-very-own-super-pac, accessed 20 April 2020.

Laurie E. Gries, 2015. Still life with rhetoric: A new materialist approach for visual rhetorics. Logan: Utah State University Press.

Margeret Hall, Athanasios Mazarakis, Isabella Peters, Martin Chorley, Simon Caton, Jens-Erik Mai, and Markus Strohmaier, 2016. “Following user pathways: Cross platform and mixed methods analysis in social media studies,” CHI EA ’16: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 3,400–3,407.
doi: https://doi.org/10.1145/2851581.2856500, accessed 24 October 2020.

Jason Harambam and Stef Aupers, 2015. “Contesting epistemic authority: Conspiracy theories on the boundaries of science,” Public Understanding of Science, volume 24, number 4, pp. 466–480.
doi: https://doi.org/10.1177/0963662514559891, accessed 24 October 2020.

Dick Hebdige, 1979. Subculture, the meaning of style. London: Methuen.

Niko Heikkilä, 2017. “Online antagonism of the alt-right in the 2016 election,” European Journal of American Studies, volume 12, number 2, document 5.
doi: https://doi.org/10.4000/ejas.12140, accessed 20 April 2020.

Anne Helmond, 2015. “The platformization of the Web: Making Web data platform ready,” Social Media + Society (30 September).
doi: https://doi.org/10.1177/2056305115603080, accessed 4 August 2020.

Richard Hofstadter, 1965. The paranoid style in American politics, and other essays. New York: Knopf.

Louis Jacobson, 2016. “Donald Trump’s ‘Star of David’ tweet: A recap” (5 July), at https://www.politifact.com/article/2016/jul/05/donald-trumps-star-david-tweet-recap/, accessed 4 August 2020.

Emilija Jokubauskaitė and Stijn Peeters, 2020. “Generally curious: Thematically distinct datasets of general threads on 4chan/pol/,” Proceedings of the Fourteenth International AAAI Conference on Web and Social Media, volume 14, number 1, pp. 863–867, and at https://www.aaai.org/ojs/index.php/ICWSM/article/view/7351, accessed 24 October 2020.

Alex Kaplan, 2019. “Trump has repeatedly amplified QAnon Twitter accounts. The FBI has linked the conspiracy theory to domestic terror” (1 August), at https://www.mediamatters.org/twitter/fbi-calls-qanon-domestic-terror-threat-trump-has-amplified-qanon-supporters-twitter-more-20, accessed 20 April 2020.

Elihu Katz and Paul F. Lazarsfeld, 1955. Personal influence: The part played by people in the flow of mass communications. Glencoe, Ill.: Free Press.

Christine Lagorio-Chafkin, 2018. We are the nerds: The birth and tumultuous life of Reddit, the Internet’s culture laboratory. London: Piatkus.

Mark Landler, 2017. “What did President Trump mean by ‘calm before the storm’?” New York Times (6 October), at https://www.nytimes.com/2017/10/06/us/politics/trump-calls-meeting-with-military-leaders-the-calm-before-the-storm.html, accessed 20 April 2020.

Rebecca Lewis, 2019. “‘This is what the news won’t show you’: YouTube creators and the reactionary politics of micro-celebrity,” Television & New Media, volume 21, number 2, pp. 201–217.
doi: https://doi.org/10.1177/1527476419879919, accessed 24 October 2020.

Yung-Ming Li and Ya-Lin Shiu, 2012. “A diffusion mechanism for social advertising over microblogs,” Decision Support Systems, volume 54, number 1, pp. 9–22.
doi: https://doi.org/10.1016/j.dss.2012.02.012, accessed 24 October 2020.

Ioana Literat and Sarah van den Berg, 2019. “Buy memes low, sell memes high: Vernacular criticism and collective negotiations of value on Reddit’s MemeEconomy,” Information, Communication & Society, volume 22, number 2, pp. 232–249.
doi: https://doi.org/10.1080/1369118X.2017.1366540, accessed 24 October 2020.

Aaron Mak, 2018. “How the QAnon conspiracy theory went from the fringes of the Internet to t-shirts for sale on Amazon,” Slate (10 August), at https://slate.com/technology/2018/08/qanon-conspiracy-theory-merchandise-fringes-shirts-for-sale-amazon.html, accessed 20 April 2020.

Andrew Marantz, 2019. Antisocial: Online extremists, techno-utopians, and the hijacking of the American conversation. New York: Viking.

Alice Marwick and Rebecca Lewis, 2017. “Media manipulation and disinformation online,” Data & Society (15 May), at https://datasociety.net/library/media-manipulation-and-disinfo-online/, accessed 20 April 2020.

Michele Mauri, Tommaso Elli, Giorgio Caviglia, Giorgio Uboldi, and Matteo Azzi, 2017. “RAWGraphs: A visualisation platform to create open outputs,” CHItaly ’17: Proceedings of the 12th Biannual Conference on Italian SIGCHI Chapter, article number 28, pp. 1–5.
doi: https://doi.org/10.1145/3125571.3125585, accessed 20 April 2020.

Mike McIntire and Kevin Roose, 2020. “What happens when QAnon seeps from the Web to the offline world,” New York Times (9 February), at https://www.nytimes.com/2020/02/09/us/politics/qanon-trump-conspiracy-theory.html, accessed 20 April 2020.

Timothy McLaughlin, 2019. “The weird, dark history of 8chan,” Wired (6 August), at https://www.wired.com/story/the-weird-dark-history-8chan/, accessed 4 August 2020.

Paul Mihailidis and Samantha Viotty, 2017. “Spreadable spectacle in digital culture: Civic expression, fake news, and the role of media literacies in ‘post-fact’ society,” American Behavioral Scientist, volume 61, number 4, pp. 441–454.
doi: https://doi.org/10.1177/0002764217701217, accessed 24 October 2020.

Ryan M. Milner, 2016. The world made meme: Public conversations and participatory media. Cambridge, Mass.: MIT Press.

Evgeny Morozov, 2009. “Iran: Downside to the ‘Twitter revolution’,” Dissent (Fall), pp. 10–14, and at https://www.evgenymorozov.com/morozov_twitter_dissent.pdf, accessed 24 October 2020.

Russell Muirhead and Nancy L. Rosenblum, 2019. A lot of people are saying: The new conspiracism and the assault on democracy. Princeton, N.J.: Princeton University Press.

Luke O’Brien, 2017. “The making of an American Nazi,” Atlantic (December), at https://www.theatlantic.com/magazine/archive/2017/12/the-making-of-an-american-nazi/544119/, accessed 4 August 2020.

OILab, 2018. “QAnon: On protest LARPing and the normiefication of 4chan’s bullshit” (10 August), at https://oilab.eu/qanon-on-protest-larping-and-the-normiefication-of-4chans-bullshit/, accessed 20 April 2020.

Kari Paul, 2020. “YouTube announces plans to ban content related to QAnon,” Guardian (15 October), at https://www.theguardian.com/technology/2020/oct/15/youtube-ban-qanon-content-technology, accessed 16 October 2020.

Stijn Peeters and Sal Hagen, 2018. “4CAT: Capturing and analysis toolkit,” at https://github.com/digitalmethodsinitiative/4cat, accessed 20 April 2020.

Sarah Perez, 2019. “Reddit’s monthly active user base grew 30% to reach 430M in 2019” (4 December), at http://social.techcrunch.com/2019/12/04/reddits-monthly-active-user-base-grew-30-to-reach-430m-in-2019/, accessed 4 August 2020.

Whitney Phillips, 2020. “Please, please, please don’t mock conspiracy theories,” Wired (27 February), at https://www.wired.com/story/please-please-please-dont-mock-conspiracy-theories/, accessed 20 April 2020.

Whitney Phillips, 2018. “The oxygen of amplification: Better practices for reporting on extremists, antagonists, and manipulators online,” Data & Society (22 May), at https://datasociety.net/library/oxygen-of-amplification/, accessed 20 April 2020.

Whitney Phillips and Ryan Milner, 2021. You are here: A field guide for navigating polarized speech, conspiracy theories, and our polluted media landscape. Cambridge, Mass.: MIT Press.

Whitney Phillips, Jessica Beyer, and Gabriella Coleman, 2017. “Trolling scholars debunk the idea that the alt-right’s shitposters have magic powers,” Vice (22 March), at https://motherboard.vice.com/en_us/article/z4k549/trolling-scholars-debunk-the-idea-that-the-alt-rights-trolls-have-magic-powers, accessed 20 April 2020.

Liza Potts and Angela Harrison, 2013. “Interfaces as rhetorical constructions: Reddit and 4chan during the Boston Marathon bombings,” SIGDOC ’13: Proceedings of the 31st ACM International Conference on Design of Communication, pp. 143–150.
doi: https://doi.org/10.1145/2507065.2507079, accessed 2 August 2020.

Chris Preist, Elaine Massung, and David Coyle, 2014. “Competing or aiming to be average? Normification as a means of engaging digital volunteers,” CSCW ’14: Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, pp. 1,222–1,233.
doi: https://doi.org/10.1145/2531602.2531615, accessed 24 October 2020.

Walter Quattrociocchi, Guido Caldarelli, and Antonio Scala, 2014. “Opinion dynamics on interacting networks: Media competition and social influence,” Scientific Reports, volume 4, article number 4938.
doi: https://doi.org/10.1038/srep04938, accessed 4 August 2020.

Bernhard Rieder, 2018. “Facebook’s app review and how independent research just got a lot harder,” Politics of Systems (11 August), at http://thepoliticsofsystems.net/2018/08/facebooks-app-review-and-how-independent-research-just-got-a-lot-harder/, accessed 20 April 2020.

Bernhard Rieder, 2012. “The refraction chamber: Twitter as sphere and network,” First Monday, volume 17, number 11, at https://firstmonday.org/article/view/4199/3359, accessed 20 April 2020.
doi: https://doi.org/10.5210/fm.v17i11.4199, accessed 24 October 2020.

Everett M. Rogers, 1962. Diffusion of innovations. New York: Free Press of Glencoe.

Mark Rogers, Clovis Chapman, and Vasileios Giotsas, 2012. “Measuring the diffusion of marketing messages across a social network,” Journal of Direct, Data and Digital Marketing Practice, volume 14, pp. 97–130.
doi: https://doi.org/10.1057/dddmp.2012.25, accessed 24 October 2020.

Richard Rogers, 2017. “Digital methods for cross-platform analysis,” In: Jean Burgess, Alice Marwick, and Thomas Poell (editors). Sage handbook of social media. Los Angeles, Calif.: Sage, pp. 91–110.
doi: http://dx.doi.org/10.4135/9781473984066.n6, accessed 24 October 2020.

Kevin Roose, 2018. “‘False flag’ theory on pipe bombs zooms from right-wing fringe to mainstream,” New York Times (25 October), at https://www.nytimes.com/2018/10/25/business/false-flag-theory-bombs-conservative-media.html, accessed 20 April 2020.

Jonathan Rose, 2017. “Brexit, Trump, and post-truth politics,” Public Integrity, volume 19, number 6, pp. 555–558.
doi: https://doi.org/10.1080/10999922.2017.1285540, accessed 24 October 2020.

Matthew Rosenberg and Jennifer Steinhauer, 2020. “The QAnon candidates are here. Trump has paved their way,” New York Times (14 July), at https://www.nytimes.com/2020/07/14/us/politics/qanon-politicians-candidates.html, accessed 2 August 2020.

Aaron Rupar, 2020. “Trump spent his holidays retweeting QAnon and Pizzagate accounts,” Vox (2 January), at https://www.vox.com/policy-and-politics/2020/1/2/21046707/trump-qanon-pizzagate-retweets, accessed 20 April 2020.

Tony D. Sampson, 2012. Virality: Contagion theory in the age of networks. Minneapolis: University of Minnesota Press.

Jieun Shin, Lian Jian, Kevin Driscoll, and François Bar, 2018. “The diffusion of misinformation on social media: Temporal pattern, message, and source,” Computers in Human Behavior, volume 83, pp. 278–287.
doi: https://doi.org/10.1016/j.chb.2018.02.008, accessed 24 October 2020.

Jacob Siegel, 2017. “Is America prepared for meme warfare?” Vice (31 January), at https://www.vice.com/en_us/article/xyvwdk/meme-warfare, accessed 20 April 2020.

Isaac Stanley-Becker, 2018. “‘We are Q’: A deranged conspiracy cult leaps from the Internet to the crowd at Trump’s ‘MAGA’ tour,” Washington Post (1 August), at https://www.washingtonpost.com/news/morning-mix/wp/2018/08/01/we-are-q-a-deranged-conspiracy-cult-leaps-from-the-internet-to-the-crowd-at-trumps-maga-tour/, accessed 20 April 2020.

Nick Statt, 2020. “Facebook completely bans QAnon and labels it a ‘militarized social movement’,” The Verge (6 October), at https://www.theverge.com/2020/10/6/21504887/facebook-qanon-ban-all-apps-pages-groups-instagram-accounts, accessed 16 October 2020.

Carl Stempel, Thomas Hargrove, and Guido H. Stempel, III, 2016. “Media use, social structure, and belief in 9/11 conspiracy theories,” Journalism & Mass Communication Quarterly, volume 84, number 2, pp. 353–372.
doi: https://doi.org/10.1177/107769900708400210, accessed 20 April 2020.

Gabriel Tarde, 1903. The laws of imitation. Translated by Elsie Clews Parsons. New York: Henry Holt.

Elise Thomas, 2020. “Qanon deploys ‘information warfare’ to influence the 2020 election,” Wired (17 February), at https://www.wired.com/story/qanon-deploys-information-warfare-influence-2020-election/, accessed 20 April 2020.

Zeynep Tufekci, 2018. “YouTube, the great radicalizer,” New York Times (10 March), at https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html, accessed 20 April 2020.

Marc Tuters, 2020. “The birth of QAnon: On how 4chan invents a conspiracy theory” (9 July), at https://oilab.eu/the-birth-of-qanon-on-how-4chan-invents-a-conspiracy-theory/, accessed 4 August 2020.

Marc Tuters, Emilija Jokubauskaitė, and Daniel Bach, 2018. “Post-truth protest: How 4chan cooked up the Pizzagate bullshit,” M/C Journal, volume 21, number 3, at http://journal.media-culture.org.au/index.php/mcjournal/article/view/1422, accessed 20 April 2020.
doi: https://doi.org/10.5204/mcj.1422, accessed 24 October 2020.

Joseph E. Uscinski and Adam M. Enders, 2020. “The coronavirus conspiracy boom,” Atlantic (30 April), at https://www.theatlantic.com/health/archive/2020/04/what-can-coronavirus-tell-us-about-conspiracy-theories/610894/, accessed 11 July 2020.

Travis View, 2018. “The insane QAnon conspiracy theory is a big budget sequel to Pizzagate” (17 August), at https://arcdigital.media/the-insane-qanon-conspiracy-theory-is-a-big-budget-sequel-to-pizzagate-145abb34bdc2, accessed 20 April 2020.

Esther Weltevrede and Erik Borra, 2016. “Platform affordances and data practices: The value of dispute on Wikipedia,” Big Data & Society.
doi: https://doi.org/10.1177/2053951716653418, accessed 4 August 2020.

Julia Carrie Wong, 2020. “Down the rabbit hole: How QAnon conspiracies thrive on Facebook,” Guardian (25 June), at https://www.theguardian.com/technology/2020/jun/25/qanon-facebook-conspiracy-theories-algorithm, accessed 4 August 2020.

Andrew Wyrich, 2018. “Reddit bans popular deep state conspiracy forum for ‘inciting violence’,” Daily Dot (15 March), at https://www.dailydot.com/debug/reddit-bans-r-cbts_stream/, accessed 20 April 2020.

Brandy Zadrozny and Ben Collins, 2018. “How three conspiracy theorists took ‘Q’ and sparked Qanon,” NBC News (14 August), at https://www.nbcnews.com/tech/tech-news/how-three-conspiracy-theorists-took-q-sparked-qanon-n900531, accessed 20 April 2020.

Savvas Zannettou, Tristan Caulfield, Jeremy Blackburn, Emiliano De Cristofaro, Michael Sirivianos, Gianluca Stringhini, and Guillermo Suarez-Tangil, 2018. “On the origins of memes by means of fringe Web communities,” arXiv:1805.12512 (31 May), at http://arxiv.org/abs/1805.12512, accessed 20 April 2020.

Savvas Zannettou, Tristan Caulfield, Emiliano De Cristofaro, Nicolas Kourtelris, Ilias Leontiadis, Michael Sirivianos, Gianluca Stringhini, and Jeremy Blackburn, 2017. “The Web centipede: Understanding how Web communities influence each other through the lens of mainstream and alternative news sources,” IMC ’17: Proceedings of the 2017 Internet Measurement Conference, pp. 405–417, at https://conferences.sigcomm.org/imc/2017/papers/imc17-final145.pdf, accessed 20 April 2020.

Daniël de Zeeuw and Marc Tuters, 2020. “The Internet is serious business: On the deep vernacular Web and its discontents,” Cultural Politics, volume 16, number 2, pp. 214–232.
doi: https://doi.org/10.1215/17432197-8233406, accessed 4 August 2020.

 

Appendix I: Core SQL query

-- Match posts that use Q as a stand-alone capitalized noun or mention
-- QAnon-related terms, while excluding "Q and A"/"Q & A" phrases.
WHERE (body LIKE 'Q %'
       OR body LIKE '% Q %'
       OR body LIKE '% Q'
       OR body LIKE '% Q,%'
       OR body LIKE '% Q.%'
       OR body LIKE '% Q!%'
       OR body LIKE '% Q?%'
       OR body LIKE '% Q\n%'
       OR lower(body) LIKE '%qanon%'
       OR lower(body) LIKE '%q-anon%'
       OR lower(body) LIKE '%q clearance%'
       OR lower(body) LIKE '%q-clearance%')
  AND lower(body) NOT LIKE '%q and a %'
  AND lower(body) NOT LIKE '%q & a%'

 

Appendix II: Selected subreddits

 


Editorial history

Received 20 April 2020; revised 7 August 2020; accepted 15 September 2020.


Copyright © 2020, Daniël de Zeeuw, Sal Hagen, Stijn Peeters, and Emilija Jokubauskaitė. All Rights Reserved.

Tracing normiefication: A cross-platform analysis of the QAnon conspiracy theory
by Daniël de Zeeuw, Sal Hagen, Stijn Peeters, and Emilija Jokubauskaitė.
First Monday, Volume 25, Number 11 - 2 November 2020
https://firstmonday.org/ojs/index.php/fm/article/download/10643/9998
doi: https://dx.doi.org/10.5210/fm.v25i11.10643