First Monday

The Hydroxychloroquine Twitter War: A case study examining polarization in science communication by Alessandro R. Marcon and Timothy Caulfield

The COVID-19 pandemic has created communication challenges exacerbated by the circulation of misinformation and the politicization of science. The case of hydroxychloroquine is an illustrative example, with the drug being aggressively promoted as a cure even while emerging evidence demonstrated the contrary. This research analyzed how hydroxychloroquine discussions took place on Twitter from 21 to 28 April 2020, a key period in developments around the drug. We collected, in real time, tweets containing “hydroxychloroquine” over this period, which resulted in a dataset of nearly one million tweets from over 350,000 Twitter accounts. Our content analysis provides specific details of how hydroxychloroquine was promoted and critiqued, and which accounts were tweeting. Findings showed a highly polarized environment with active bots and conspiracy propagators, where political perspectives dominated the Twittersphere in place of science-focused discussions.


The hydroxychloroquine context
Polarization and science communication
Twitter user accounts
Discussion and conclusion




The COVID-19 pandemic has created widespread communication challenges. In addition to the immensely complex task of communicating policies amidst evolving science and policy clashes, misinformation has been circulating in the media, including outlandish conspiracy theories and baseless remedies (Van Bavel, et al., 2020; Bridgman, et al., 2020; Henley and McIntyre, 2020). Research has shown misinformation to be especially prevalent on social media (Bridgman, et al., 2020; Chen, et al., 2020; Quinn, et al., 2020) and some research has identified former president Donald Trump as playing a key role in the spreading of inaccurate COVID-19 information (Evanega, et al., 2020; Ugarte, et al., 2021). In late March 2020, Trump began promoting hydroxychloroquine as a coronavirus treatment and continued to do so into April when research began raising doubts about the drug’s efficacy (Solender, 2020). Our study examined how hydroxychloroquine was portrayed on Twitter during that time.



The hydroxychloroquine context

In March 2020, non-peer-reviewed research out of France led by corresponding author Dr. Didier Raoult suggested that hydroxychloroquine could be used to treat COVID-19 (Gautret, et al., 2020). The study received significant media attention and a high number of citations but was also widely criticized for severe methodological flaws (Voss, et al., 2020; Rosendaal, 2020). Public interest in hydroxychloroquine rose in relation to the endorsements of hydroxychloroquine by prominent public figures — most notably former president Trump — who began to speak of the drug’s potential, often referencing Raoult’s study (Liu, et al., 2020). Off-label prescriptions of the drug increased, and hydroxychloroquine became a common topic in COVID-related public discourse (Vaduganathan, et al., 2020; Liu, et al., 2020).

From 21 to 24 April 2020, evidence of hydroxychloroquine’s lack of effectiveness began circulating in the media, including clinical results and recommendations from health authorities (Palca, 2020; Magagnoli, et al., 2020; U.S. Food & Drug Administration, 2020). At the same time, numerous individuals emerged in the hydroxychloroquine media story, including Dr. Raoult; the Director of the National Institute of Allergy and Infectious Diseases, Dr. Anthony Fauci; American immunologist and whistleblower Dr. Rick Bright, who questioned the funding of hydroxychloroquine; and Democratic member of the Michigan House of Representatives Karen Whitsett, who claimed hydroxychloroquine cured her case of COVID-19 and then publicly praised Trump for promoting the drug.

By mid-April, some COVID-related media analysis on hydroxychloroquine had taken place, showing, for example, that Fox News had promoted hydroxychloroquine as a potential treatment considerably more often than CNN and MSNBC during a week-long period in March (Savillo, 2020). Over the past decade, there has been extensive research on the spreading of health-related (mis)information (Wang, et al., 2019; Broniatowski, et al., 2018), which now often includes analyses of automated social media accounts (bots) as well as conspiratorial elements of media discourse (Shao, et al., 2018; Broniatowski, et al., 2018; Starbird, 2017; Quinn, et al., 2020; Ferrara, et al., 2020; Wojcik, 2018; Allem, et al., 2020). In this respect, the once-fringe QAnon conspiracy theory had gained significant attention during Trump’s presidency, as its subscribers and promoters almost uniformly aligned themselves politically with Trump, a phenomenon which many scholars and journalists have discussed (Zuckerman, 2019; Amarasingam and Argentino, 2020; Mac and Lytvynenko, 2020; Ferrara, et al., 2020).



Polarization and science communication

A wealth of research has been produced illustrating the polarization — and often argued increased polarization — of the American public (Lees and Cikara, 2021; Rekker, 2021; Gidron, et al., 2019; Iyengar, et al., 2019; Rogowski and Sutherland, 2016; Nai and Maier, 2021; Abramowitz and McCoy, 2019; Heltzel and Laurin, 2020; Cornelson and Miloucheva, 2020), a phenomenon also observed in other national contexts (Gidron, et al., 2019; Bessi, et al., 2016; Lee, et al., 2018; Carothers and O’Donahue, 2019). Although numerous definitions and understandings of polarization exist, polarization broadly refers to cases where individuals or groups hold starkly opposing perspectives and where achieving consensus appears extremely difficult if not impossible (O’Connor and Weatherall, 2018). Polarization is often discussed in terms of political, partisan, or “affective” polarization, defined as cases where political party identification and allegiance generate significant mistrust, preferential treatment of party allies, and animosity towards party opposites (Gidron, et al., 2019). Other distinctions include, for example, “ideological” polarization (Gidron, et al., 2019; Iyengar, et al., 2019), which can be summarized as being issue-focused (e.g., gun control, abortion, health care regulation, social activist movements [e.g., Black Lives Matter], immigration, etc.) and not necessarily contingent on political party affiliation. An example of this distinction in the American context is abortion, where survey research demonstrated clashes between the public’s beliefs and their political party affiliation (Diamant, 2020), and where social media research has found high polarization in online discussions based on ideology, and not necessarily political allegiances (Yardi and boyd, 2010).

Not all discussions of polarization, however, focus on the split between political (“affective”) and “ideological” polarization; some focus instead on polarization dynamics in relation to science (Rekker, 2021). Here, “science polarization” describes how political or ideological positions can override the authority of science (procedures, authority figures, etc.) and scientific consensus (Rekker, 2021). The topic of climate change, for instance, has been studied extensively through the lens of scientific polarization (Boudet, et al., 2020; Rekker, 2021; Hart and Nisbet, 2012). So too has vaccine and vaccine-hesitancy discourse (Schmidt, et al., 2018), especially during the COVID pandemic (Hornsey, et al., 2020; Latkin, et al., 2021), where many have explored the topic of polarization (Pennycook, et al., 2021; Cornelson and Miloucheva, 2020). Indeed, many scholars during the pandemic have reflected upon the “politicization” of science, specifically regarding the case of hydroxychloroquine (Saag, 2020; O’Connor and Weatherall, 2020; Somberg, 2020; Stone, 2020).

Social media use is increasing worldwide (Datareportal, 2021), influencing health-related perspectives and related decisions, including during the pandemic (e.g., Bridgman, et al., 2020; Puri, et al., 2020). It is, therefore, crucial to better understand how polarization affects the development of discussions about science and health issues on social media. An abundance of research examines polarization in relation to media and social media discourse (Martin and McCrain, 2019; Tucker, et al., 2018; Boxell, et al., 2017; Bessi, et al., 2016; Hart and Nisbet, 2012; McCright and Dunlap, 2011). Some studies show how online exposure to contrary perspectives can increase polarization (Bail, et al., 2018; Iyengar, et al., 2019), although other studies suggest that different social factors might foster polarization more than social media (Barberá, 2015; Boxell, et al., 2017; Iyengar, et al., 2019).

Our study examined the portrayal of hydroxychloroquine on Twitter during one week in late April 2020 with the perspective that productive scientific discussions would feature tweets focused on the drug’s merits and limitations, and mention the specific studies substantiating those perspectives. If the tweeting population were following the research, we expected to see more tweets criticizing the drug’s effectiveness than arguing for its use. We hypothesized, however, that we would see a polarized Twittersphere with comparable numbers criticizing and supporting hydroxychloroquine. Because we expected to see highly politicized and polarized discussions — a finding since confirmed by similar research (Blevins, et al., 2021) — our primary objective was to detail how the drug was portrayed and how the polarization was unfolding. This analysis included the types of storylines and URL links shared, the arguments made, and information on the accounts contributing to the discussions.




Using Twitter’s application programming interface, we collected, in real time, all tweets and relevant metadata (user account, date published, etc.) containing the word “hydroxychloroquine” from 21 April (19:00 EST) to 28 April (19:00 EST) 2020. Collecting in real time enables the tracking of tweets that get retweeted by other users, but it cannot capture the accumulation of likes or replies. Data collected from Twitter is considered public and therefore does not require ethics review for analysis (see, for example, Kim, et al., 2019). Following our methodological approach using inductive and deductive coding procedures as described in previous research (Marcon, et al., 2018; Marcon, et al., 2016; Du, et al., 2016), we performed directed content analysis (Assarroudi, et al., 2018) to see what kinds of user accounts were tweeting and how hydroxychloroquine was portrayed in the tweets. The inductive aspects of the analysis consisted of analyzing a sample of the data and building a coding frame based on what was observed — in other words, the specific data helped shape our analytical approach. We made other coding-related decisions deductively, based on previous research — for example, the decision to include an analysis of the presence of bots. We analyzed the most influential tweets and accounts, and compared the most influential content and users with random samples of each.

From the seven-day sample, we collected a total of 955,056 unique tweets, which included 188,026 (19.7 percent) original non-retweeted tweets and 767,030 (80.3 percent) retweets (including quoted and non-quoted retweets). The total number of unique user accounts was 366,520. Using a 95 percent confidence level with a ±5 percent margin of error, we first built two random samples, each of n=385: one for user accounts and one for tweets. We then calculated the most retweeted tweets and the most retweeted users, building a dataset of n=385 for each. We created a separate dataset of the most retweeted user accounts, rather than analyzing the accounts attached to the already collected most retweeted tweets, because some accounts had multiple entries in the top 385 retweeted tweets. In sum, four datasets were produced, each with n=385 items: 1) most retweeted tweets; 2) random tweets; 3) most retweeted user accounts; and, 4) random user accounts. To provide some analysis of how bots and conspiracy-linked accounts tweeted about hydroxychloroquine, we applied the findings obtained from analyzing user accounts to the most retweeted tweets dataset. This analysis was possible because 99.5 percent of the user accounts in the most retweeted tweets (dataset 1) were present in the most retweeted user accounts (dataset 3). We also compiled a list of the most shared URLs across all tweets. See Figure 1.


Figure 1: Flowchart of methods for building datasets and conducting analysis.
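The dataset construction summarized above and diagrammed in Figure 1 can be sketched in a few lines of Python. The n=385 figure follows from Cochran’s sample-size formula at a 95 percent confidence level (z ≈ 1.96) with a ±5 percent margin of error and maximum variance (p = 0.5); for a population this large, the uncorrected formula is effectively exact. The record fields (`id`, `user`, `retweet_of`) are hypothetical stand-ins for the collected metadata, not the study’s actual schema:

```python
import math
import random
from collections import Counter

def required_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    """Cochran's formula: sample size for confidence coefficient z,
    assumed proportion p, and margin of error e."""
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

def build_datasets(tweets, n):
    """Build the four datasets described in the methods: top-n retweeted
    tweets, a random tweet sample, top-n retweeted users (by total
    retweets received), and a random user sample."""
    # Count how often each original tweet was retweeted.
    retweet_counts = Counter(t["retweet_of"] for t in tweets if t.get("retweet_of"))
    # Attribute those retweets to the original tweet's author.
    by_id = {t["id"]: t for t in tweets}
    user_retweets = Counter()
    for tweet_id, count in retweet_counts.items():
        if tweet_id in by_id:
            user_retweets[by_id[tweet_id]["user"]] += count
    users = {t["user"] for t in tweets}
    return {
        "top_tweets": [tid for tid, _ in retweet_counts.most_common(n)],
        "random_tweets": random.sample(tweets, min(n, len(tweets))),
        "top_users": [u for u, _ in user_retweets.most_common(n)],
        "random_users": random.sample(sorted(users), min(n, len(users))),
    }

print(required_sample_size())  # 385 at 95 percent confidence, ±5 percent
```

Running `required_sample_size()` gives ceil(384.16) = 385, matching the size of each of the study’s four datasets.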


Our coding process involved first creating two separate coding frames, one for user accounts and one for tweets, which were modified based on a sample analysis of the data and discussions between the two coders, who then finalized the frames. See the supplementary materials (SM1, SM2) for both complete coding frames. Analysis focused on the downloaded Twitter data, but unclear tweets and each account were manually searched on Twitter to access information not included in the download (e.g., images in accounts, emojis in bios, etc.). We did not perform Google searches or carry out additional inquiries on user accounts to further contextualize their media presence, including whether they supported a particular political party. All non-English language tweets were translated using Google Translate. If the portrayal of hydroxychloroquine or the general context was too ambiguous for clear interpretation, we coded the tweet as “Unclear”. This categorization also applied to ambiguous information in English-language tweets and on user accounts, such as unfamiliar hashtags or emojis. During coding, the two coders met regularly to discuss and clarify any flagged coding issues.

To assess levels of bot-like activity on user accounts, we used the Botometer program, in keeping with methodologically similar research (Ferrara, et al., 2020). We are aware of the limitations that come with this program (Rauchfleisch and Kaiser, 2020), understanding that the tool can only provide an approximation of the number of bots. We calculated bot scores on all users in the user account datasets as well as on all user accounts present in the tweet datasets. Determining the presence of “conspiracy” or “click-bait” discourse was acknowledged to be a more subjective process, whereby content was flagged for including rhetoric around government secrets, hidden truths, known COVID conspiracies (e.g., Bill Gates, 5G, plandemic, covid hoax, etc.) and phrases designed to incite clicks (e.g., “This article will SHOCK you,” “Read here what the government doesn’t want you to know,” etc.). See supplementary materials for examples of conspiratorial and click-bait tweets (SM3).
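Scoring accounts with Botometer itself requires authenticated API access, but the downstream thresholding used in this study (scores of 4.0 or higher treated as likely bot-like, with a lower band under 2.1 reported in the results) is simple to sketch over already-retrieved scores. The account handles and scores below are illustrative; only the cutoff values come from the study:

```python
def bot_summary(scores: dict, high: float = 4.0, low: float = 2.1) -> dict:
    """Bucket precomputed Botometer scores (0-5 scale) into the bands
    used in the analysis: likely bot (>= high), likely human (< low),
    indeterminate in between. Accounts with no score (None) are counted
    separately, which is why bot counts are reported as lower bounds."""
    scored = {u: s for u, s in scores.items() if s is not None}
    likely_bot = sorted(u for u, s in scored.items() if s >= high)
    likely_human = sorted(u for u, s in scored.items() if s < low)
    return {
        "likely_bot": likely_bot,
        "likely_human": likely_human,
        "indeterminate": sorted(set(scored) - set(likely_bot) - set(likely_human)),
        "unscored": sorted(u for u, s in scores.items() if s is None),
    }

# Hypothetical scores, e.g. for accounts that could not be found ("None").
example = {"@newsbot4729": 4.6, "@jane_doe": 0.8, "@midrange": 3.2, "@suspended": None}
print(bot_summary(example)["likely_bot"])  # ['@newsbot4729']
```

Keeping unscored accounts in a separate bucket mirrors the paper’s “at least” phrasing: the true bot proportion can only be equal to or higher than the flagged count.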

For the user account datasets, we noted user accounts with particular conspiracy traits potentially related to QAnon-linked conspiracy collectives. These included the presence of explicit QAnon references (QAnon images, #Q, #QAnon, #WWG1WGA), references to former General Michael Flynn (Flynn images, three stars included in Twitter handles, etc.), and/or links to Parler accounts or Parler identification markers (e.g., identification tags). We have grouped these characteristics together because links have been made between the QAnon conspiracy, Michael Flynn, far-right activists, and the social media platform Parler (Amarasingam and Argentino, 2020; Greenspan, 2020; Newhouse, 2020). We acknowledge the possibility that not all references to Michael Flynn or Parler are necessarily associated with QAnon. Having coded all the top user accounts with these traits, we searched for these accounts in the most retweeted tweets dataset to observe how the QAnon-linked accounts had been tweeting about hydroxychloroquine.
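The marker-based flagging just described can be sketched as a simple text scan over an account’s visible profile fields. The marker list (QAnon hashtags, Flynn references including the three-star handle convention, Parler mentions) comes from the text above; the matching rules, profile shape, and example bios are illustrative assumptions, since the actual coding was done manually:

```python
import re

# Marker groups drawn from the coding description; matching is illustrative.
QANON_MARKERS = re.compile(r"#?\b(?:qanon|wwg1wga)\b", re.IGNORECASE)
FLYNN_MARKERS = re.compile(r"\bflynn\b|\u2b50{3}", re.IGNORECASE)  # name or 3 stars
PARLER_MARKERS = re.compile(r"\bparler\b", re.IGNORECASE)

def flag_account(bio: str, handle: str = "") -> set:
    """Return which QAnon-linked trait groups an account's visible text
    matches: explicit QAnon references, Michael Flynn references, or
    Parler links/identifiers."""
    text = f"{handle} {bio}"
    flags = set()
    if QANON_MARKERS.search(text):
        flags.add("qanon")
    if FLYNN_MARKERS.search(text):
        flags.add("flynn")
    if PARLER_MARKERS.search(text):
        flags.add("parler")
    return flags

print(sorted(flag_account("Patriot. #WWG1WGA. Find me on parler.com/example")))
# ['parler', 'qanon']
```

Because the trait groups are kept separate, an analysis can report, as the paper does, accounts with at least one marker while acknowledging that Flynn or Parler references alone are not proof of a QAnon association.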




Data summary

There were 955,056 unique tweets captured between 21–28 April 2020, which included 188,026 (19.7 percent) original non-retweeted entries and 767,030 (80.3 percent) retweets (including quoted and non-quoted retweets). The total number of unique user accounts was 366,520. The compiled random samples of 385 tweets and 385 user accounts corresponded to a confidence level of 95 percent (±5 percent margin of error). The top 385 most retweeted tweets totaled 412,786 retweets, accounting for 43.2 percent of all tweets. A total of 19 (4.9 percent) of the most retweeted tweets and 34 (8.8 percent) of the random tweets came from suspended accounts or were tweets made unavailable either by Twitter (for violating rules) or by users (deleting their own tweets).




Overview summary of tweets

The tweets containing “hydroxychloroquine” from 21–28 April 2020 showed a polarized body of tweets, which often made explicit reference to political figures and parties. Then-president Trump had a very large presence in the tweets, and a clear trend was observed whereby tweets critical of Trump were critical of hydroxychloroquine and tweets supportive of Trump promoted hydroxychloroquine. Tweets sometimes made explicit reference to scientific trials, both in the top retweets (96, 24.9 percent) and in the random sample (82, 21.6 percent), but they seldom debated the scientific merit or characteristics of the studies. Instead, commentary typically focused on how others (media companies or individuals) portrayed the drug. Strictly hydroxychloroquine-focused debates centered on the studies’ rigor or on how hydroxychloroquine was tested without zinc. Further debates centered on the demographics of the patients being studied, namely the severity or stage of their illness. Some tweets had apparent conspiratorial elements. Evidence of considerable bot-like activity was also present. See Figure 2 for screenshots of the top 10 most retweeted tweets.


Figure 2: Top 10 most retweeted tweets with “hydroxychloroquine” from 21–28 April 2020.


Tweet specifics

In the most retweeted tweets, which focused most often on the American context, hydroxychloroquine was promoted and critiqued in almost equal proportions (36.4 percent and 42.3 percent, respectively). In both cases, however, tweets could do so explicitly, implicitly, or both at once. By explicitly, we mean tweets stating that the drug could or could not be used as a potentially effective treatment for COVID. By implicitly, we mean tweets critiquing others’ discourse that promoted or critiqued the drug as a potentially efficacious treatment. Of the 163 tweets (42.3 percent) critiquing hydroxychloroquine, 136 (83.4 percent) contained the implicit critique of countering those promoting the drug, while 87 (53.4 percent) contained an explicit critique. Similarly, of the 140 (36.4 percent) top retweets promoting hydroxychloroquine, 107 (76.4 percent) demonstrated implicit promotion by countering those critiquing the drug, while 69 (49.3 percent) contained explicit promotion. See Figure 3. These results demonstrate that it was more common to counter how others talked about the drug than to tweet about the drug directly. The same pattern was evident in the random sample of tweets, although the random sample exhibited more neutrality in its portrayal of the drug (Table 1).


Figure 3: The portrayal of hydroxychloroquine in the 385 most retweeted tweets with “hydroxychloroquine” during 21–28 April 2020.



Table 1: Complete coding of top retweets (n=385) and random tweets (n=385).


President Trump featured often in the top retweets (168, 43.1 percent) and the random sample (99, 25.7 percent), and in both cases the tweets critiqued the president (71.8 percent and 64.6 percent, respectively) considerably more often than they supported him (12.7 percent and 8.1 percent, respectively). In contrast, more tweets critiqued the “left” (37, 9.6 percent in the top retweets and 19, 4.9 percent in the random sample) than the “right” (4, 1.0 percent, in each dataset) (Table 1). There was an observable connection between the portrayal of the president and the portrayal of hydroxychloroquine. Of the 163 most retweeted tweets critiquing hydroxychloroquine, 67.5 percent (110) mentioned Donald Trump. All tweets supportive of the former president portrayed hydroxychloroquine positively, and approximately 94 percent of the tweets in each dataset that critiqued Trump also critiqued the drug. Where the president was portrayed neutrally, the trend in both datasets was to promote hydroxychloroquine (Table 2). Mentioning and critiquing Trump was, therefore, a key component of critiquing hydroxychloroquine, and clear lines of polarization were drawn between perspectives on Trump and tweeting about hydroxychloroquine. See Figure 3 and Table 2.


Table 2: Portrayal of hydroxychloroquine when Donald Trump is mentioned in top retweets (n=385) and random sample (n=385).


Analysis of the top five URL sources shared revealed that CNN, Fox, The Gateway Pundit, and “Memes/Other” featured in both datasets, whereas the New York Times featured in the top retweets and the Associated Press featured in the random sample (Table 1). The presence of The Gateway Pundit on both lists is a notable finding, as it has been described by the fact-checking source NewsGuard as a “far-right political Web site that published false and misleading content ... regularly distorts information and occasionally spreads conspiracy theories” (NewsGuard, 2020). Of the most shared individual URLs in the entire dataset, the most shared (in 5,070 tweets) came from a source called Uncover DC and linked to an article titled “Doctors in Nevada Sue Sisolak Over Ban on Hydroxychloroquine” (Beanz, 2020). This article details how hydroxychloroquine can save lives and should be available for prescription. The second most shared URL (in 2,798 tweets) came from the Washington Post and linked to an article titled “Fox News hosts go mum on the Covid-19 drug they spent weeks promoting.” As the title suggests, this article critiqued the promotion of hydroxychloroquine. The remaining entries among the ten most shared URLs included four sources with conspiratorial elements as well as more established mainstream media sources such as The Hill and CNN. See table in supplementary materials (SM4).

Tweets featuring anecdotes of individuals’ treatments more commonly featured anecdotes of efficacy than of harm (Table 1). The most shared story was that of Dr. Rick Bright, whose case was used to critique the Trump administration’s handling of the pandemic and to implicitly critique the effectiveness of hydroxychloroquine. Another commonly shared story was that of Karen Whitsett (see, for example, the top retweet in Figure 2), which was used to promote hydroxychloroquine’s effectiveness and to critique Democrats. Dr. Raoult’s study and Dr. Zev Zelenko’s treatment “success” story were used to promote the benefits of hydroxychloroquine explicitly. Governor Steve Sisolak was commonly critiqued in tweets for banning hydroxychloroquine, thus promoting hydroxychloroquine implicitly.

An assessment of the presence of conspiratorial and click-bait tweets (see methods and supplementary materials for details) found that approximately seven percent of the tweets in each dataset were flagged for showing conspiracy characteristics (Table 1). Tweets with click-bait characteristics were almost non-existent in the random tweets dataset but present in approximately four percent of the most retweeted tweets. See Table 1 and supplementary materials for examples (SM3).

Nearly 15 percent of the user accounts in the top retweet dataset (39, 14.8 percent) and nearly 12 percent of the user accounts in the random sample dataset (51, 11.7 percent) had a Botometer score of 4.0 or higher. As detailed on Botometer’s Web site, only 19 percent of accounts with a bot score above 3.9 are labeled as humans, meaning 81 percent are likely bots. These proportions are lower bounds because some of the accounts could not be found using Botometer (Table 1).

Accounts with bot scores of 4.0 or greater typically promoted hydroxychloroquine, but the findings differed in each tweet dataset. In the most retweeted tweets, 31 (79.5 percent) of the 39 accounts with bot scores of 4.0 or greater promoted hydroxychloroquine, 6 (15.4 percent) portrayed the drug either neutrally or ambiguously, and 2 (5.1 percent) critiqued the drug. See Figure 3. In the dataset of random tweets, 24 (47.1 percent) of the 51 accounts with bot scores of 4.0 or greater promoted hydroxychloroquine, 16 (31.4 percent) portrayed the drug either neutrally or ambiguously, and 11 (21.6 percent) critiqued the drug.



Twitter user accounts

Overview summary of user accounts

Analysis of the two datasets of top retweeted user accounts (top users) and the random sample of users (random users) revealed many American-based users with a political affiliation on their accounts. Top users exhibited a high concentration in the geographic areas of Washington, D.C. and New York, and typically had professions linked with media/journalism. The presence of physician/MD accounts in either dataset was relatively low. Analysis also revealed a seemingly high number of accounts potentially linked with the QAnon conspiracy or the far right. A fairly large number of accounts demonstrated high levels of bot-like activity. There was a greater presence of accounts with bot-like activity and with QAnon traits among the top users than the random users.

User account specifics

Similar to what was observed in the analysis of tweets (Table 1), approximately five percent of the most frequently retweeted user accounts and eight percent of the random user accounts could not be accessed. Approximately four percent of accounts in both datasets had been suspended by the time analysis was conducted (Table 3). As expected, top users had a significantly higher average number of followers and a higher following-to-followers ratio than the random users (Table 3). While the random users were considerably less likely to list a geographical location (56.9 percent versus 70.1 percent), both datasets show that users were primarily based in the U.S. (Table 3). Of note, of the 218 American locations listed by top users (representing 80.7 percent of all top users whose profiles listed locations), 44.0 percent were in either Washington, D.C. or New York. Relatedly, of the 253 (65.7 percent) top users listing professions, 60.5 percent described their professions as based in media or journalism. Professions were listed on only 93 (24.2 percent) of the random user accounts, and here media/journalism was also the most listed profession, albeit in only 17.2 percent of cases (Table 3).


Table 3: Complete coding of top user accounts (n=385) and random sample of users (n=385).


Nearly 40 percent of top users and 30 percent of random users included a political affiliation on their account, evident in biographical information, account images or pinned tweets (Table 3). A range of political positions were observed including pro-Trump, pro-Biden, anti-Trump, and anti-Biden sentiments as well as specific labels such as “Conservative,” “Progressive,” “Republican,” “Democrat,” etc. This spectrum was also evident when analyzing hashtags included on user accounts, present on 87 (22.6 percent) of top users and 72 (18.7 percent) of the random users. Hashtags supporting and promoting Trump were most prevalent in both datasets, including the two most popular, #MAGA and #KAG, and hashtags containing commonly stated Trump phrases such as #Draintheswamp and #Buildthewall. Other user hashtags contained anti-Trump and/or pro-Biden sentiment such as #Biden2020, #BidenHarris2020, #ImpeachTrump, #resist etc. Some hashtags included reference to the QAnon conspiracy theory such as #QANON and #WWG1WGA. See supplementary materials for a table of the top hashtags (SM5).

Accounts with a Botometer score over 4.0 existed in both datasets, even though the majority of top users (207, 53.8 percent) and random users (266, 69.1 percent) had a Botometer score under 2.1. Sixty (15.6 percent) of the top users’ accounts and 27 (7.0 percent) of the random users’ accounts had a Botometer score over 4.0. The total number of top users with Botometer scores over 3.0 was 115 (29.9 percent); for random users, it was 65 (16.9 percent).

Approximately four percent of top users (15, 3.9 percent) and random users (14, 3.6 percent) showed explicit links to QAnon, including the use of “Q,” “QAnon,” or “#WWG1WGA.” An additional 32 (8.3 percent) of top users made reference to Michael Flynn, and 21 (5.5 percent) had a link to a personal Parler account. The presence of Michael Flynn or Parler-linked accounts was very low among random users (Table 3). For the top users, 48 (12.5 percent) had at least one of an explicit QAnon mention, a Michael Flynn reference, or a link to or mention of a Parler account. For random users, there were a total of 18 (4.7 percent) accounts with at least one of these characteristics.

In total, 33 (68.8 percent) of the 48 QAnon-linked top user accounts had tweets accessible for analysis (many accounts had been suspended and/or had their tweets removed by Twitter). Of these 33 accounts, 25 (75.8 percent) promoted hydroxychloroquine, seven (21.2 percent) portrayed the drug neutrally, and in one case the portrayal was unclear. None of the tweets from these accounts critiqued the drug.



Discussion and conclusion

The Twitter discourse around hydroxychloroquine from 21–28 April 2020 was highly polarized and politicized. It also included bot-like activity and conspiratorial content. In a week when studies were published casting doubt on the effectiveness of hydroxychloroquine, there were similar numbers of tweets promoting the drug’s therapeutic potential as there were critiquing it. The large percentage of tweets that either promoted the drug or countered the message of those critiquing the drug was alarming. Similarly alarming was how often those critiquing the drug also politicized the emerging findings, often making explicit references to then-president Trump. Findings showed collaborative efforts to promote hydroxychloroquine as a way to defend and support Trump and his earlier promotion of the drug. Findings also showed collaborative efforts to critique hydroxychloroquine as a way to attack Trump and media companies such as Fox, which had promoted its use. Accounts with bot-like characteristics and far-right/conspiratorial group affiliations almost exclusively promoted hydroxychloroquine. At such an early stage in the pandemic, the discussions around whether a drug like hydroxychloroquine could help cure COVID-19 thus became highly polarized and infused with political and ideological positions. Very few of the discussions were (solely) focused on scientific developments and clinical data, even among those critiquing hydroxychloroquine. In short, Twitter became a battleground in which participants consistently wasted opportunities for clear and accurate science communication.

The findings from this study show almost equal numbers of those promoting and critiquing the drug. Here, the concept of “discourse coalitions,” which describes how disparate groups of individuals act in similar discursive patterns to drive particular agendas (Hajer, 1993), is highly applicable. Those continuing to promote hydroxychloroquine used all potential storylines to advance their cause. Despite the mounting evidence to the contrary, they deployed the controversial clinical findings from Dr. Raoult and Dr. “Zev” Zelenko, and aggressively sought to discredit Dr. Rick Bright for questioning hydroxychloroquine’s funding. These actors promoted Karen Whitsett’s anecdotal hydroxychloroquine success story and lambasted Governor Steve Sisolak, who placed a partial restriction on hydroxychloroquine in the state of Nevada. When critiquing the findings from the Veterans Affairs study (Magagnoli, et al., 2020), a common argument was that the study was flawed because researchers (intentionally) left out zinc in their administration of the drug. This same critique appeared on discredited, pseudoscience health sites such as NaturalNews (Huff, 2020). While identifying bots with precision remains a challenge (Rauchfleisch and Kaiser, 2020), it is notable that nearly all the accounts with bot-like activity and QAnon/far-right characteristics had tweeted in promotion and defense of hydroxychloroquine using many of the discursive strategies listed above. An illustrative example is the case of the most shared URL in the dataset, from the fringe Web site Uncover DC, which detailed the lawsuit against Governor Sisolak for restricting access “to a drug that has shown the potential to save lives” (Beanz, 2020). Tracy Beanz, the author of this story and the editor of Uncover DC, has been linked to promoting the QAnon conspiracy (Alexander, 2020; Zadrozny and Collins, 2018).
Conducting a comprehensive analysis of all URLs shared in the dataset was not within the scope of this project but would be a worthwhile endeavor for future analysis. The promotion of hydroxychloroquine through QAnon-linked accounts and conspiratorial content corroborates what was observed in research similar to ours (Blevins, et al., 2021). Also worth noting is the presence of accounts with bot-like activity in the random sample of tweets critiquing hydroxychloroquine. It is possible, therefore, that bots were deployed by both sides of the divided spectrum to promote specific agendas, but that those aligned with Trump, QAnon, and hydroxychloroquine promotion had a thicker network and wider retweet influence.
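As a starting point for the kind of URL analysis proposed above, the most-shared domains in a tweet collection can be tallied with a few lines of code. This is a minimal sketch, not the study’s actual pipeline, and the sample tweets below are invented for illustration:

```python
# Hypothetical sketch: tallying the most-shared Web domains in a set of tweets.
# The sample tweets are invented; the study's dataset is not reproduced here.
import re
from collections import Counter
from urllib.parse import urlparse

sample_tweets = [
    "Doctors sue over hydroxychloroquine ban https://uncoverdc.com/example",
    "New VA study results https://www.medrxiv.org/example",
    "Read this thread https://uncoverdc.com/another-example",
]

def shared_domains(tweets):
    """Extract each URL from the tweet text and count its host domain."""
    url_pattern = re.compile(r"https?://\S+")
    domains = []
    for text in tweets:
        for url in url_pattern.findall(text):
            host = urlparse(url).netloc.lower()
            domains.append(host.removeprefix("www."))
    return Counter(domains)

counts = shared_domains(sample_tweets)
print(counts.most_common(1))  # in this invented sample, uncoverdc.com leads
```

In practice, shortened links (t.co, bit.ly) would first need to be resolved to their destination URLs before domains are counted.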

Those critiquing the effectiveness of hydroxychloroquine commonly did so by making explicit reference to study results and recommendations from federal health agencies, but also — and more often — by critiquing those who had earlier promoted the drug, especially Trump and Fox News. Indeed, of the 163 most retweeted tweets critiquing hydroxychloroquine’s potential effectiveness, approximately 70 percent included a critique of Trump. References to the case of Dr. Rick Bright sometimes attempted to implicitly demonstrate hydroxychloroquine’s ineffectiveness while also explicitly critiquing the Trump administration’s handling of the pandemic. The critiquing of hydroxychloroquine on Twitter was therefore also a highly politicized act, further polarizing the potential for scientific discussion.

If sharp dividing lines already existed between individuals, their political parties, and their preferred news sources or Twitter accounts, it is plausible that portraying scientific findings in this manner deepened divisions of opinion over the emerging facts about hydroxychloroquine. Indeed, research has shown how exposure to polarized perspectives may amplify polarization (Bail, et al., 2018; Druckman, et al., 2018), but this conclusion requires further research (Iyengar, et al., 2019; Hart and Nisbet, 2012). Regardless of whether polarized content increases polarization, politicizing a health topic can have an impact on how people interpret and make decisions (Bridgman, et al., 2020; Puri, et al., 2020), especially in the context of COVID-19, where an abundance of misinformation has made for complex science communication (Mackey, et al., 2021).

It was unsurprising that the most active accounts and most retweeted tweets came from influential accounts with high numbers of followers. Interestingly, however, while only 70 percent of the top users listed a location, more than a third of those who did were based in just two locations: Washington, D.C. and New York. Furthermore, while only 65 percent of top users listed professions, more than 60 percent of these accounts listed professions in media or journalism. In contrast, only 13 MD/physician accounts (3.4 percent) featured in the top 385 user accounts. Because these 385 accounts generated more than half of all the retweets in the dataset, a relatively small group, composed mostly of media professionals, had an outsized retweet influence. Additionally, 16 percent of top user accounts had a Botometer score over 4.0 and nearly 30 percent had a score over 3.0, compared with only seven percent of random accounts over 4.0 and 17 percent over 3.0. It becomes apparent how, in Twitter conversations, accounts with a specific medical focus struggled to have widespread influence.
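The threshold comparison reported above can be sketched as a simple tally. The scores below are invented for illustration (Botometer scores range from 0 to 5, with higher values indicating more bot-like behavior); they are not the study’s data:

```python
# Illustrative sketch of comparing Botometer-style scores across two account
# samples. All score values here are hypothetical.

def share_above(scores, threshold):
    """Percentage of accounts whose score exceeds the threshold."""
    return 100 * sum(s > threshold for s in scores) / len(scores)

top_scores = [4.2, 1.1, 3.5, 4.6, 0.8, 2.9, 3.2, 4.1, 1.7, 2.4]     # hypothetical
random_scores = [0.5, 1.2, 0.9, 3.1, 2.2, 0.7, 1.8, 4.3, 1.0, 2.6]  # hypothetical

for label, scores in (("top", top_scores), ("random", random_scores)):
    print(label, share_above(scores, 4.0), share_above(scores, 3.0))
```

A higher share above each threshold in the “top” sample would mirror the pattern reported in the study, where the most retweeted accounts showed more bot-like activity than randomly sampled ones.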

Twitter was evidently taking a somewhat active role in controlling the spread of questionable information, as numerous accounts were suspended and tweets removed. It is also now well known that Twitter has been more actively seeking to improve its regulatory practices regarding misinformation, efforts which eventually resulted in Trump being banned from the platform. (Re-examining our data months after analysis, we noted that many more of the user accounts in our popular dataset had been suspended by Twitter.) However, could more have been done, and done sooner? For example, tweets from accounts with bot characteristics and QAnon connections almost uniformly promoted hydroxychloroquine and played a large role in shaping the hydroxychloroquine Twitter discourse. Research has shown how exposure to misinformation and conspiracy theories increases perceptions of accuracy (Pennycook, et al., 2018), influences behavior in a manner called “strategic sophistication” (Balafoutas, et al., 2021), and can hinder public trust in science. It is hoped that Twitter will be better equipped to deal with the spread of misinformation in the future. It has been suggested that the platform could also bolster these causes by facilitating independent research efforts (Pasquetto, et al., 2020).

Malicious disinformation propagators strive to create information chaos in which facts struggle for prominence and traction (Paul and Matthews, 2016; Pasquetto, et al., 2020; Illing, 2020). A growing body of research shows that a relatively small number of actors — including bots — can greatly shape the media landscape around a particular topic (Center for an Informed Public, et al., 2021; DiResta and Lotan, 2015; Satija and Sun, 2019; Allem, et al., 2020). But research also shows that the spread of misinformation is a complex phenomenon in which all actors, including social media platforms and the general public, play significant roles (Pasquetto, et al., 2020; Menczer and Hills, 2020). Researchers are exploring a range of tools that both media platforms and the public can use to contest and quell the spread of misinformation online (Lewandowsky, et al., 2012; Pasquetto, et al., 2020; Pennycook, et al., 2021; Caulfield, 2020). Social media platforms need to combat misinformation more assertively, but members of the public, especially those with substantial media influence, should also consider the best ways to transmit accurate science, foster productive discussions, and educate the public without exacerbating political tensions. End of article


About the authors

Alessandro Marcon is a Research Associate at the University of Alberta’s Health Law Institute (HLI), where Timothy Caulfield is the Research Director. The HLI is an internationally recognized centre for evidence-based health law and science policy research. In addition to being the HLI’s Research Director, Law Professor Timothy Caulfield is a Canada Research Chair in Health Law and Policy, and a Fellow of the Royal Society of Canada and the Canadian Academy of Health Sciences.
Direct comments to: marcon [at] ualberta [dot] ca



The authors would like to thank the invaluable contributions and support from Allison Jandura, Philipp Klostermann, Mark Bieber, Darren Wagner, and Robyn Hyde-Lay.



The authors would like to thank the Canadian Institutes for Health Research, Alberta Innovates, the Ministry of Economic Development, Trade and Tourism, the Government of Alberta and the Government of Canada for their generous support of the following projects: (1) Coronavirus Outbreak: Mapping and Countering Misinformation; and, (2) Critical Thinking in the Digital Age: Countering Coronavirus Misinformation.


Conflicts of interest

The authors have no conflicts of interest to declare.



Alan Abramowitz and Jennifer McCoy, 2019. “United States: Racial resentment, negative partisanship, and polarization in Trump’s America,” Annals of the American Academy of Political and Social Science, volume 681, number 1, pp. 137–156.
doi:, accessed 8 September 2021.

Jon-Patrick Allem, Patricia Escobedo, and Likhit Dharmapuri, 2020. “Cannabis surveillance with Twitter data: Emerging topics and social bots,” American Journal of Public Health, volume 110, number 3, pp. 357–362.
doi:, accessed 8 September 2021.

Harriet Alexander, 2020. “QAnon posts were promoted by Russia-linked accounts as early as November 2017,” Independent (2 November), at conspiracy-theory-trump-2020-election-b1536946.html, accessed 10 August 2021.

Amarnath Amarasingam and Marc-André Argentino, 2020. “The QAnon conspiracy theory: A security threat in the making,” CTC Sentinel, volume 13, number 7, pp. 37–44, and at, accessed 10 August 2021.

Abdolghader Assarroudi, Fatemeh Heshmati Nabavi, Mohammad Reza Armat, Abbas Ebadi, and Mojtaba Vaismoradi, 2018. “Directed qualitative content analysis: The description and elaboration of its underpinning methods and data analysis process,” Journal of Research in Nursing, volume 23, number 1, pp. 42–55.
doi:, accessed 8 September 2021.

Christopher A. Bail, Lisa P. Argyle, Taylor W. Brown, John P. Bumpus, Haohan Chen, M.B. Fallin Hunzaker, Jaemin Lee, Marcus Mann, Friedolin Merhout, and Alexander Volfovsky, 2018. “Exposure to opposing views on social media can increase political polarization,” Proceedings of the National Academy of Sciences volume 115, number 37 (11 September), pp. 9,216–9,221.
doi:, accessed 8 September 2021.

Loukas Balafoutas, Alexander Libman, Vasileios Selamis, and Björn Vollan, 2021. “Exposure to conspiracy theories in the lab,” Economic and Political Studies volume 9, number 1, pp. 90–112.
doi:, accessed 8 September 2021.

Pablo Barberá, 2015. “How social media reduces mass political polarization. Evidence from Germany, Spain, and the U.S.,” at, accessed 8 September 2021.

Tracy Beanz, 2020. “Doctors in Nevada sue Sisolak over ban on hydroxychloroquine,” Uncover DC (25 April), at, accessed 8 September 2021.

Alessandro Bessi, Fabio Petroni, Michela Del Vicario, Fabiana Zollo, Aris Anagnostopoulos, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi, 2016. “Homophily and polarization in the age of misinformation,” European Physical Journal Special Topics volume 225, number 10, pp. 2,047–2,059.
doi:, accessed 8 September 2021.

Jeffrey Layne Blevins, Ezra Edgerton, Don P. Jason, and James Jaehoon Lee, 2021. “Shouting into the wind: Medical science versus ‘BS’ in the Twitter maelstrom of politics and misinformation about hydroxychloroquine,” Social Media + Society (23 June).
doi:, accessed 8 September 2021.

Hilary Boudet, Leanne Giordono, Chad Zanocco, Hannah Satein, and Hannah Whitley, 2020. “Event attribution and partisanship shape local discussion of climate change after extreme weather,” Nature Climate Change, volume 10, number 1, pp. 69–76.
doi:, accessed 8 September 2021.

Levi Boxell, Matthew Gentzkow, and Jesse M. Shapiro, 2017. “Greater Internet use is not associated with faster growth in political polarization among US demographic groups,” Proceedings of the National Academy of Sciences, volume 114, number 40 (3 October), pp. 10,612–10,617.
doi:, accessed 8 September 2021.

Aengus Bridgman, Eric Merkley, Peter John Loewen, Taylor Owen, Derek Ruths, Lisa Teichmann, and Oleg Zhilin, 2020. “The causes and consequences of COVID-19 misperceptions: Understanding the role of news and social media,” Harvard Kennedy School Misinformation Review (18 June).
doi:, accessed 8 September 2021.

David A. Broniatowski, Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze, 2018. “Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate,” American Journal of Public Health, volume 108, number 10, pp. 1,378–1,384.
doi:, accessed 8 September 2021.

Thomas Carothers and Andrew O’Donahue, 2019. “How to understand the global spread of political polarization,” Carnegie Endowment for International Peace (1 October), at, accessed 8 September 2021.

Timothy Caulfield, 2020. “Does debunking work? Correcting COVID-19 misinformation on social media,” In: Colleen Flood, Vanessa MacDonnell, Jane Philpott, Sophie Thériault, and Sridhar Venkatapuram (editors). Vulnerable: The law, policy and ethics of COVID-19. Ottawa: University of Ottawa Press, pp. 183–200, and at, accessed 8 September 2021.

Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, 2021. “The long fuse: Misinformation and the 2020 election” (3 March), at, accessed 8 September 2021.

Kaiping Chen, Anfan Chen, Jingwen Zhang, Jingbo Meng, and Cuihua Shen, 2020. “Conspiracy and debunking narratives about COVID-19 origination on Chinese social media: How it started and who is to blame,” Harvard Kennedy School (HKS) Misinformation Review (10 December).
doi:, accessed 8 September 2021.

Kirsten Cornelson and Boriana Miloucheva, 2020. “Political polarization, social fragmentation, and cooperation during a pandemic,” University of Toronto, Department of Economics, Working Paper, number 663 (7 April), at, accessed 8 September 2021.

Datareportal, 2021. “Global social media stats,” at, accessed 10 August 2021.

Jeff Diamant, 2020. “Three-in-ten or more Democrats and Republicans don’t agree with their party on abortion,” Pew Research Center (18 June), at, accessed 10 August 2021.

Renee DiResta and Gilad Lotan, 2015. “Anti-vaxxers are using Twitter to manipulate a vaccine bill,” Wired (8 June), at, accessed 10 August 2021.

James N. Druckman, Matthew S. Levendusky, and Audrey McLain, 2018. “No need to watch: How the effects of partisan media can spread via interpersonal discussions,” American Journal of Political Science, volume 62, number 1, pp. 99–112.
doi:, accessed 8 September 2021.

Li Du, Christen Rachul, Zhaochen Guo, and Timothy Caulfield, 2016. “Gordie Howe’s ‘miraculous treatment’: Case study of Twitter users’ reactions to a sport celebrity’s stem cell treatment,” JMIR Public Health and Surveillance, volume 2, number 1. e8.
doi:, accessed 8 September 2021.

Sarah Evanega, Mark Lynas, Jordan Adams, and Karinne Smolenyak, 2020. “Coronavirus misinformation: Quantifying sources and themes in the COVID-19 ‘infodemic’,” at, accessed 10 August 2021.

Emilio Ferrara, Herbert Chang, Emily Chen, Goran Muric, and Jaimin Patel, 2020. “Characterizing social media manipulation in the 2020 U.S. presidential election,” First Monday, volume 25, number 11, at, accessed 10 August 2021.
doi:, accessed 8 September 2021.

Philippe Gautret, Jean-Christophe Lagier, Philippe Parola, Van Thuan Hoang, Line Meddeb, Morgane Mailhe, Barbara Doudier, Johan Courjon, Valérie Giordanengo, Vera Esteves Vieira, Hervé Tissot Dupont, Stéphane Honoré, Philippe Colson, Eric Chabrière, Bernard La Scola, Jean-Marc Rolain, Philippe Brouqui, and Didier Raoult, 2020. “Hydroxychloroquine and azithromycin as a treatment of COVID-19: Results of an open-label non-randomized clinical trial,” International Journal of Antimicrobial Agents, volume 56, number 1, 105949.
doi:, accessed 8 September 2021.

Noam Gidron, James Adams, and Will Horne, 2019. “Toward a comparative research agenda on affective polarization in mass publics,” APSA Comparative Politics Newsletter, volume 29, number 1, pp. 30–36, and at, accessed 8 September 2021.

Rachel E. Greenspan, 2020. “What happens to QAnon after Trump leaves the White House,” Insider (10 November), at defeat-plan-2020-11, accessed 10 August 2021.

Maarten A. Hajer, 1993. “Discourse coalitions and the institutionalization of practice: The case of acid rain in Great Britain,” In: Frank Fischer and John Forester (editors). The argumentative turn in policy analysis and planning. Durham, N.C.: Duke University Press, pp. 43–76.
doi:, accessed 8 September 2021.

Sol P. Hart and Erik C. Nisbet, 2012. “Boomerang effects in science communication: How motivated reasoning and identity cues amplify opinion polarization about climate mitigation policies,” Communication Research volume 39, number 6, pp. 701–723.
doi:, accessed 8 September 2021.

Gordon Heltzel and Kristin Laurin, 2020. “Polarization in America: Two possible futures,” Current Opinion in Behavioral Sciences, volume 34, pp. 179–184.
doi:, accessed 8 September 2021.

John Henley and Niamh McIntyre, 2020. “Survey uncovers widespread belief in ‘dangerous’ covid conspiracy theories,” Guardian (26 October), at, accessed 10 August 2021.

Matthew J. Hornsey, Matthew Finlayson, Gabrielle Chatwood, and Christopher T. Begeny, 2020. “Donald Trump and vaccination: The effect of political identity, conspiracist ideation and presidential tweets on vaccine hesitancy,” Journal of Experimental Social Psychology, volume 88, 103947.
doi:, accessed 8 September 2021.

Ethan Huff, 2020. “Study claims hydroxychloroquine doesn’t work against coronavirus but it intentionally left out zinc” (27 April), at, accessed 10 August 2021.

Sean Illing, 2020. “‘Flood the zone with shit’: How misinformation overwhelmed our democracy,” Vox (6 February), at, accessed 10 August 2021.

Shanto Iyengar, Yphtach Lelkes, Matthew Levendusky, Neil Malhotra, and Sean J. Westwood, 2019. “The origins and consequences of affective polarization in the United States,” Annual Review of Political Science, volume 22, pp. 129–146.
doi:, accessed 8 September 2021.

Annice E. Kim, Robert Chew, Michael Wenger, Margaret Cress, Thomas Bukowski, Matthew Farrelly, and Elizabeth Hair, 2019. “Estimated ages of JUUL Twitter followers,” JAMA Pediatrics, volume 173, number 7, pp. 690–692.
doi:, accessed 8 September 2021.

Carl A. Latkin, Lauren Dayton, Grace Yi, Arianna Konstantopoulos, and Basmattee Boodram, 2021. “Trust in a COVID-19 vaccine in the US: A social-ecological perspective,” Social Science & Medicine volume 270, 113684.
doi:, accessed 8 September 2021.

Changjun Lee, Jieun Shin, and Ahreum Hong, 2018. “Does social media use really make people politically polarized? Direct and indirect effects of social media use on political polarization in South Korea,” Telematics and Informatics, volume 35, number 1, pp. 245–254.
doi:, accessed 8 September 2021.

Jeffrey Lees and Mina Cikara, 2021. “Understanding and combating misperceived polarization,” Philosophical Transactions of the Royal Society B, volume 376, number 1822 (12 April), 20200143.
doi:, accessed 8 September 2021.

Stephan Lewandowsky, Ullrich K.H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook, 2012. “Misinformation and its correction: Continued influence and successful debiasing,” Psychological Science in the Public Interest, volume 13, number 3, pp. 106–131.
doi:, accessed 8 September 2021.

Michael Liu, Theodore L. Caputi, Mark Dredze, Aaron S. Kesselheim, and John W. Ayers, 2020. “Internet searches for unproven COVID-19 therapies in the United States,” JAMA Internal Medicine, volume 180, number 8, pp. 1,116–1,118.
doi:, accessed 8 September 2021.

Ryan Mac and Jane Lytvynenko, 2020. “Facebook has banned QANON” (6 October), at, accessed 10 August 2021.

Tim K. Mackey, Vidya Purushothaman, Michael Haupt, Matthew C. Nali, and Jiawei Li, 2021. “Application of unsupervised machine learning to identify and characterise hydroxychloroquine misinformation on Twitter,” Lancet Digital Health, volume 3, number 2, pp. e72–e75.
doi:, accessed 8 September 2021.

Joseph Magagnoli, Siddharth Narendran, Felipe Pereira, Tammy H. Cummings, James W. Hardin, S. Scott Sutton, and Jayakrishna Ambati, 2020. “Outcomes of hydroxychloroquine usage in United States veterans hospitalized with Covid-19,” Med, volume 1, number 1, pp. 114–127.
doi:, accessed 8 September 2021.

Alessandro R. Marcon, Mark Bieber, and Timothy Caulfield, 2018. “Representing a ‘revolution’: How the popular press has portrayed personalized medicine,” Genetics in Medicine, volume 20, number 9, pp. 950–956.
doi:, accessed 8 September 2021.

Alessandro R. Marcon, Philip Klostermann, and Timothy Caulfield, 2016. “Chiropractic and spinal manipulation therapy on Twitter: Case study examining the presence of critiques and debates,” JMIR Public Health and Surveillance, volume 2, number 2, e153.
doi:, accessed 8 September 2021.

Gregory J. Martin and Joshua McCrain, 2019. “Local news and national politics,” American Political Science Review, volume 113, number 2, pp. 372–384.
doi:, accessed 8 September 2021.

Aaron M. McCright and Riley E. Dunlap, 2011. “The politicization of climate change and polarization in the American public’s views of global warming, 2001–2010,” Sociological Quarterly, volume 52, number 2, pp. 155–194.
doi:, accessed 8 September 2021.

Filippo Menczer and Thomas Hills, 2020. “Information overload helps fake news spread and social media knows it,” Scientific American (1 December), at, accessed 10 August 2021.

Alessandro Nai and Jürgen Maier, 2021. “Can anyone be objective about Donald Trump? Assessing the personality of political figures,” Journal of Elections, Public Opinion and Parties, volume 31, number 3, pp. 283–308.
doi:, accessed 8 September 2021.

Alex Newhouse, 2020. “Parler is bringing together mainstream conservatives, anti-Semites and white supremacists as the social media platform attracts millions of Trump supporters” (27 November), at conservatives-anti-semites-and-white-supremacists-as-the-social-media-platform-attracts- millions-of-trump-supporters-150439, accessed 10 August 2021.

NewsGuard, 2020. “ profile,” at, accessed 10 August 2021.

Cailin O’Connor and James Owen Weatherall, 2020. “Hydroxychloroquine and the political polarization of science,” Boston Review (4 May), at, accessed 10 August 2021.

Cailin O’Connor and James Owen Weatherall, 2018. “Scientific polarization,” European Journal for Philosophy of Science, volume 8, number 3, pp. 855–875.
doi:, accessed 8 September 2021.

Joe Palca, 2020. “NIH panel recommends against drug combination promoted by Trump for COVID-19,” NPR (21 April), at, accessed 10 August 2021.

Irene V. Pasquetto, Briony Swire-Thompson, Michelle A. Amazeen, Fabrício Benevenuto, Nadia M. Brashier, Robert M. Bond, Lia C. Bozarth, Ceren Budak, Ullrich K.H. Ecker, Lisa K. Fazio, Emilio Ferrara, Andrew J. Flanagin, Alessandro Flammini, Deen Freelon, Nir Grinberg, Ralph Hertwig, Kathleen Hall Jamieson, Kenneth Joseph, Jason J. Jones, R. Kelly Garrett, Daniel Kreiss, Shannon McGregor, Jasmine McNealy, Drew Margolin, Alice Marwick, Filippo Menczer, Miriam J. Metzger, Seungahn Nah, Stephan Lewandowsky, Philipp Lorenz-Spreen, Pablo Ortellado, Gordon Pennycook, Ethan Porter, David G. Rand, Ronald E. Robertson, Francesca Tripodi, Soroush Vosoughi, Chris Vargo, Onur Varol, Brian E. Weeks, John Wihbey, Thomas J. Wood, and Kai-Cheng Yang, 2020. “Tackling misinformation: What researchers could do with social media data,” Harvard Kennedy School Misinformation Review.
doi:, accessed 8 September 2021.

Christopher Paul and Miriam Matthews, 2016. “The Russian ‘firehose of falsehood’ propaganda model,” RAND Corporation, PE-198-OSD.
doi:, accessed 8 September 2021.

Gordon Pennycook, Tyrone D. Cannon, and David G. Rand, 2018. “Prior exposure increases perceived accuracy of fake news,” Journal of Experimental Psychology: General, volume 147, number 12, pp. 1,865–1,880.
doi:, accessed 8 September 2021.

Gordon Pennycook, Ziv Epstein, Mohsen Mosleh, Antonio A. Arechar, Dean Eckles, and David G. Rand, 2021. “Shifting attention to accuracy can reduce misinformation online,” Nature, volume 592, number 7855 (22 April), pp. 590–595.
doi:, accessed 8 September 2021.

Neha Puri, Eric A. Coomes, Hourmazd Haghbayan, and Keith Gunaratne, 2020. “Social media and vaccine hesitancy: New updates for the era of COVID-19 and globalized infectious diseases,” Human Vaccines & Immunotherapeutics, volume 16, number 11, pp. 2,586–2,593.
doi:, accessed 8 September 2021.

Emma K. Quinn, Sajjad S. Fazel, and Cheryl E. Peters, 2020. “The Instagram infodemic: Cobranding of conspiracy theories, coronavirus disease 2019 and authority-questioning beliefs,” Cyberpsychology, Behavior, and Social Networking, volume 24, number 8, pp. 573–577.
doi:, accessed 8 September 2021.

Adrian Rauchfleisch and Jonas Kaiser, 2020. “The false positive problem of automatic bot detection in social science research,” PloS ONE, volume 15, number 10 (22 October), e0241045.
doi:, accessed 8 September 2021.

Roderik Rekker, 2021. “The nature and origins of political polarization over science,” Public Understanding of Science, volume 30, number 4, pp. 352–368.
doi:, accessed 8 September 2021.

Jon C. Rogowski and Joseph L. Sutherland, 2016. “How ideology fuels affective polarization,” Political Behavior, volume 38, number 2, pp. 485–508.
doi:, accessed 8 September 2021.

Frits R. Rosendaal, 2020. “Review of: ‘Hydroxychloroquine and azithromycin as a treatment of COVID-19: results of an open-label non-randomized clinical trial’ Gautret et al 2020, DOI:10.1016/j.ijantimicag.2020.105949,” International Journal of Antimicrobial Agents, volume 56, number 1, 106063.
doi:, accessed 8 September 2021.

Michael S. Saag, 2020. “Misguided use of hydroxychloroquine for COVID-19: The infusion of politics into science,” Journal of the American Medical Association, volume 324, number 21, pp. 2,161–2,162.
doi:, accessed 8 September 2021.

Neena Satija and Lena H. Sun, 2019. “A major funder of the anti-vaccine movement has made millions selling natural health products,” Washington Post (15 October), at, accessed 10 August 2021.

Rob Savillo, 2020. “Over three days this week, Fox News promoted an antimalarial drug treatment for coronavirus over 100 times” (27 March), at, accessed 10 August 2021.

Ana Lucía Schmidt, Fabiana Zollo, Antonio Scala, Cornelia Betsch, and Walter Quattrociocchi, 2018. “Polarization of the vaccination debate on Facebook,” Vaccine, volume 36, number 25, pp. 3,606–3,612.
doi:, accessed 8 September 2021.

Chengcheng Shao, Giovanni Luca Ciampaglia, Onur Varol, Kai-Cheng Yang, Alessandro Flammini, and Filippo Menczer, 2018. “The spread of low-credibility content by social bots,” Nature Communications, volume 9, number 1 (20 November), article number 4787.
doi:, accessed 8 September 2021.

Andrew Solender, 2020. “All the times Trump has promoted hydroxychloroquine,” Forbes (22 May), at, accessed 10 August 2021.

John Somberg, 2020. “Science, politics and hydroxychloroquine,” Cardiology Research, volume 11, number 5, pp. 267–268.
doi:, accessed 8 September 2021.

Kate Starbird, 2017. “Examining the alternative media ecosystem through the production of alternative narratives of mass shooting events on Twitter,” Proceedings of the International AAAI Conference on Web and Social Media, volume 11, number 1, pp. 230–239, and at, accessed 8 September 2021.

Will Stone, 2020. “Politics around hydroxychloroquine hamper science,” NPR (21 May), at, accessed 10 August 2021.

Joshua A. Tucker, Andrew Guess, Pablo Barberá, Cristian Vaccari, Alexandra Siegel, Sergey Sanovich, Denis Stukal, and Brendan Nyhan, 2018. “Social media, political polarization, and political disinformation: A review of the scientific literature,” Hewlett Foundation, at, accessed 8 September 2021.

Dominic Arjuna Ugarte, William G. Cumberland, Lidia Flores, and Sean D. Young, 2021. “Public attitudes about COVID-19 in response to president Trump’s social media posts,” JAMA Network Open volume 4, number 2, e210101.
doi:, accessed 8 September 2021.

U.S. Food & Drug Administration, 2020. “Coronavirus (COVID-19) update: FDA reiterates importance of close patient supervision for ‘off-label’ use of antimalarial drugs to mitigate known risks, including heart rhythm problems” (24 April), at, accessed 10 August 2021.

Muthiah Vaduganathan, Jeroen van Meijgaard, Mandeep R. Mehra, Jacob Joseph, Christopher J. O’Donnell, and Haider J. Warraich, 2020. “Prescription fill patterns for commonly used drugs during the COVID-19 pandemic in the United States” Journal of the American Medical Association, volume 323, number 24, pp. 2,524–2,526.
doi:, accessed 8 September 2021.

Jay J. Van Bavel, Katherine Baicker, Paulo S. Boggio, Valerio Capraro, Aleksandra Cichocka, Mina Cikara, Molly J. Crockett, Alia J. Crum, Karen M. Douglas, James N. Druckman, John Drury, Oeindrila Dube, Naomi Ellemers, Eli J. Finkel, James H. Fowler, Michele Gelfand, Shihui Han, S. Alexander Haslam, Jolanda Jetten, Shinobu Kitayama, Dean Mobbs, Lucy E. Napper, Dominic J. Packer, Gordon Pennycook, Ellen Peters, Richard E. Petty, David G. Rand, Stephen D. Reicher, Simone Schnall, Azim Shariff, Linda J. Skitka, Sandra Susan Smith, Cass R. Sunstein, Nassim Tabri, Joshua A. Tucker, Sander van der Linden, Paul van Lange, Kim A. Weeden, Michael J. A. Wohl, Jamil Zaki, Sean R. Zion, and Robb Willer, 2020. “Using social and behavioural science to support COVID-19 pandemic response,” Nature Human Behaviour, volume 4, number 5, pp. 460–471.
doi:, accessed 8 September 2021.

Andreas Voss, Geoffrey Coombs, Serhat Unal, Raphael Saginur, and Po-Ren Hsueh, 2020. “Publishing in face of the COVID-19 pandemic,” International Journal of Antimicrobial Agents, volume 56, number 1, 106081.
doi:, accessed 8 September 2021.

Yuxi Wang, Martin McKee, Aleksandra Torbica, and David Stuckler, 2019. “Systematic literature review on the spread of health-related misinformation on social media,” Social Science & Medicine, volume 240, 112552.
doi:, accessed 8 September 2021.

Stefan Wojcik, 2018. “5 things to know about bots on Twitter,” Pew Research Center (9 April), at, accessed 10 August 2021.

Sarita Yardi and danah boyd, 2010. “Dynamic debates: An analysis of group polarization over time on Twitter,” Bulletin of Science, Technology & Society volume 30, number 5, pp. 316–327.
doi:, accessed 8 September 2021.

Brandy Zadrozny and Ben Collins, 2018. “How three conspiracy theorists took ‘Q’ and sparked Qanon,” NBC News (14 August), at, accessed 10 August 2021.

Ethan Zuckerman, 2019. “QAnon and the emergence of the unreal,” Journal of Design and Science, number 6.
doi:, accessed 8 September 2021.


Supplementary materials

SM 1: One week of hydroxychloroquine tweets//Coding frame 1: Tweets


User name:* [filled in with metadata]
*The user name will be searched to see whether information on the user has already been obtained; if not, this analysis can be performed in a second round if deemed necessary.

Tweet description:

1. Language: (E = English, NE = Not English) *leave NE on first round
2. https presence (Y/N)
   2b. If Y, code. If N, leave.
3. Country context (fill in, or N/A)
4. Country comparison (Y/N)
   4b. If Y, [list separate countries with semicolon]
5. RT (Y/N)
   5b. IF Y, [list publication source]
6. Overall Promotional, Critique or Neutral portrayal of Hydroxychloroquine (P = PROMO, C = Critique, N = Neutral)
7. Reference to “research, studies, trials” (Y/N)
   7b. If Y, [list details]
8. Counter promo: critiquing an expression of hydroxychloroquine promo (Y/N)
9. Counter critique: critiquing an expression of hydroxychloroquine critique (Y/N)
10. Specific cases of efficacy or harm related to individuals (Y/N): “E” = efficacy or “H” = harm or “N” = No
   10b. If Y, [list name]
11. Mention of Trump (Y/N)
   11b. If Y, support for Trump (S), critique of Trump (C), or neutral (N)
12. Mention of other national leaders (Y/N)
   12b. If Y, [list]
13. Presence of secrecy-related or conspiracy rhetoric (Y/N) (i.e., truth being silenced, whistleblowing, cover-ups, etc.)*? If Y, “Y + details”
14. Click-bait rhetoric — not a description of content but enticing text to generate clicks* (i.e., “This will change your mind”; “Watch this!”, etc.) If Y, “Y + details”
15. Critiquing liberals, the left, democrats, etc.? (Y/N)
16. Critiquing conservatives, the right, republicans, etc.? (Y/N)
      *soft category just to summarize the presence
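For illustration, a single coded tweet record under Coding frame 1 could be represented as a structured object. This is a simplified sketch covering only a subset of the categories above; the field names are paraphrases and the example values are invented:

```python
# Simplified, hypothetical representation of one coded tweet record from
# Coding frame 1. Field names paraphrase a subset of the categories above;
# the example values are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CodedTweet:
    language: str                # "E" (English) or "NE" (not English)
    has_url: bool                # category 2: https presence
    portrayal: str               # category 6: "P" (promo), "C" (critique), "N" (neutral)
    cites_research: bool         # category 7: reference to research/studies/trials
    counter_promo: bool          # category 8
    counter_critique: bool       # category 9
    mentions_trump: bool         # category 11
    trump_stance: Optional[str]  # category 11b: "S", "C", or "N" if Trump is mentioned
    conspiracy_rhetoric: bool    # category 13
    clickbait_rhetoric: bool     # category 14

# An invented example record: an English-language promotional tweet that
# counters a critique, supports Trump, and uses conspiracy rhetoric.
example = CodedTweet(
    language="E", has_url=True, portrayal="P", cites_research=True,
    counter_promo=False, counter_critique=True, mentions_trump=True,
    trump_stance="S", conspiracy_rhetoric=True, clickbait_rhetoric=False,
)
```

Structuring records this way would make tallies such as the share of promotional versus critical tweets, or the overlap between promotion and conspiracy rhetoric, straightforward to compute.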

SM 2: One week of hydroxychloroquine tweets//Coding frame 2: Users (Twitter accounts)


Twitter name: (copy and paste)
Followers: (list #)
Following: (list #)
Location: (N/A if not listed)
Verified: (Y/N)
Date joined: (fill in)

Account description:

1. Individual, Institution or Unclear
2. Related to profession? (Y/N)
   2b. If Y, which (Journalism, Media, Professor, Entertainer, Scientist, Politician, Author, Musician etc.)
3. Company affiliations (Y/N)
   3b. If Y, [list]
4. Political affiliation/ leaning (Y/N)
   4b. If Y, which (i.e., Party, or left, right, pro-Trump, Anti-Trump etc.)?
   4c. Indication of 4b (text, image, Google search (i.e., for Congress person))
5. Accreditation presented (Y/N) (i.e., MD, Dr., etc.)
   5b. If Y, which? [list]
6. Hashtag presence (Y/N)
   6b. If Y, which? [list]
7. Additional URLs or social media links? (Y/N)
   7b. If Y, which? [list]
   7c. Short description if needed
8. Sex (Male, Female, Other)
9. Visual Ethnicity (White, POC)
10. Identity markers of sexuality or gender (Y/N)
   10b. If Y, which [list]
11. Additional identity markers (Y/N)
   11b. If Y, [describe which]
12. Top image description (brief)
13. Pinned Tweet (Y/N)
   13b. If Y, hydrox related? (Y/N)
14. Language related to conspiracies* (i.e., hidden truths, exposing truth, cover-up, etc.)
      *soft category to gauge what’s in there

SM 3: Five examples of tweets flagged as conspiracy related

“Why is the MSM suddenly in hardcore anti-hydroxychloroquine mode? HCQ destroys the narrative? Very obviously coordinated!”

“Fake News Attack Based on VA ‘Study’ Debunked: — They gave Hydroxychloroquine to elderly with severe diseases including cancer and HIV/AIDS — Co-author has been paid by competitor Gilead — Did not use ZINC!!” [Link shared: media-uses-va-study-to-launch-easily-debunked-attack-on-hydroxychloroquine/]

“The Dark Truth of Why They Are Attacking Hydroxychloroquine”

“TO ALL DECENT AND MORAL PEOPLE — all the hydroxychloroquine trials published are designed to fail.”

“So why did hydroxychloroquine turn into this witches brew of death all of a sudden when it’s been used successfully for so many years? Because that shut down any chance of massive worldwide vaccine dosing. Not to mention tagging of every human being on the face of the planet”

SM 3: Five examples of tweets flagged as “clickbait”

“DEVASTATING! Renowned French Dr. Didier Raoult DESTROYS Liberal Trump-Hating Media on VA Junk Report on Hydroxychloroquine!”

“TRUTH BOMB! Please take a moment to digest this! We are being manipulated and galvanized into fear by statistics!”

“BREAKING RETWEET! Read the below TRUMP TWEET again & again! Trump KNEW the standard of care! More importantly, Trump’s conduct has greatly exceeded the conduct to prove NEGLIGENT HOMICIDE! We will be supplementing for the ICJ as well! Bleach? Disinfectant? UV? Hydroxychloroquine?”

“Read MUCH more here:”

“The Dark Truth of Why They Are Attacking Hydroxychloroquine”

SM 4


Ten most shared URLs in the entire dataset of tweets


SM 5


Most used hashtags in Twitter user profiles



Editorial history

Received 10 May 2021; revised 18 August 2021; accepted 8 September 2021.

Creative Commons License
This paper is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

The Hydroxychloroquine Twitter War: A case study examining polarization in science communication
by Alessandro R. Marcon and Timothy Caulfield.
First Monday, Volume 26, Number 10 - 4 October 2021