First Monday

Editors, sources and the ‘go back’ button: Wikipedia’s framework for beating misinformation, by Bunty Avieson



Abstract
The COVID-19 pandemic highlighted the challenges and deadly consequences of misinformation circulating across digital platforms. Wikipedia is emerging from that communication crisis as both an effective information site and a source of wider lessons for the Internet age, derived from the editorial structure it has created. While Facebook, YouTube and Twitter struggled during the pandemic to contain the spread of misinformation, Wikipedia has shown itself to be a nimble, independent publisher, able to block erroneous content and provide rigorous health information, referenced to credible sources. The ‘anyone-can-edit’ site is powered by a global community of volunteers, who collectively determine its policies and content, and police the site. In the pandemic’s first year, 97,000 Wikipedia editors collaborated on 6,950 COVID-19 related articles in 188 languages, which were read more than 653 million times. This paper investigates the editorial framework developed by the Wikipedia community and identifies three key factors that have proved successful in the fight against medical misinformation in a global pandemic — the editors, their sources and the technological affordances of the platform.

Contents

Introduction
The platform — How Wikipedia works
Wikipedia’s specialist medical care
Wikipedia covers COVID-19
Why Wikipedia is succeeding against the infodemic
Conclusion

 


 

Introduction

On 30 January 2020, the World Health Organisation (WHO) declared COVID-19 to be a Public Health Emergency of International Concern. At that point, China had 7,711 confirmed cases, 12,167 suspected cases and 170 deaths, while a further 83 cases were reported in 18 other countries (WHO, 2020). Just two weeks later, on 15 February 2020, WHO Director-General Tedros Ghebreyesus referred to the emergence of a parallel “infodemic”. He explained: “Fake news spreads faster and more easily than this virus, and is just as dangerous” (Ghebreyesus, 2020).

The term ‘infodemiology’ was coined in 2002, in an American Journal of Medicine editorial, and defined as: “... areas where there is a knowledge translation gap between best evidence (what some experts know) and practice (what most people do or believe), as well as markers for ‘high-quality’ information” (Eysenbach, 2002). A year later a journalist at the Washington Post wrote that an “infodemic” was emerging alongside the epidemic of Severe Acute Respiratory Syndrome (SARS), whereby “a few facts, mixed with fear, speculation and rumor, amplified and relayed swiftly worldwide by modern information technologies” affect economies, politics and security (Rothkopf, 2003). The term largely disappeared from public discourse until WHO used it in this pandemic to emphasise the scale of the communications challenges appearing alongside the COVID-19 pandemic (Simon and Camargo, 2021).

The infodemic narrative, while compelling, should be treated with some caution. At its most dogmatic, it perpetuates an assumption that human behaviours, such as vaccine hesitancy, are primarily driven by exposure to false information and, further, that this can be countered by ‘inoculating’ the public with facts. While it is generally recognised that media content can affect audience attitudes and behaviour, what is less quantifiable is the degree of influence. Simon and Camargo (2021) identify two ways that the epidemiological comparison can be counterproductive in health communications. First, biological epidemics have a single, well-defined cause — a virus whose strains can be identified, sequenced and traced to their origins, whereas information spreads from varied sources, differs in quality, is interpreted in context, and often has unclear, untraceable origins. Second, they challenge the idea that information can be infectious, capable of being spread unwittingly by carriers. A further criticism of the term “infodemic” as a metaphoric framework is that it shifts the responsibilities of health communications away from the government and medical communities (Venkataramakrishnan, 2020; Simon and Camargo, 2021).

While the infodemic discourse has its limitations, it can be a useful metaphor to characterise the spread of information via human networks across digital media. As the Reuters Institute notes, the term helps to draw attention to the important role of false information during the pandemic, and for this reason it has been used in thousands of studies across different disciplines (Nielsen, et al., 2021).

Misinformation is recognised as false information that is shared without ill intent, while disinformation is false information that is deliberately created and shared to cause harm [1]. The effect of online misinformation in compounding a health crisis had been well documented before the appearance of COVID-19. During the 2014 Ebola outbreak in West Africa, research showed that online health misinformation added to the death toll (Allgaier and Svalastog, 2015), while misinformation about Ebola circulated via social media platforms created unfounded panic in the U.S. (Luckerson, 2014; Feuer, 2014). Local interactions can have global consequences when information intended for intimate networks becomes part of international structures of proven and unproven beliefs (Kleineberg and Boguñá, 2016). A study by Hansson, et al. (2021) identified that misinformation in a health crisis can make people more vulnerable in six ways: 1) by discouraging appropriate protective action; 2) by promoting the use of false remedies; 3) by misrepresenting how the virus is transmitted; 4) by downplaying risks related to the pandemic; 5) by tricking people into buying fake protection; and 6) by victimising alleged spreaders of the virus.

Early research into the spread of COVID-19 misinformation showed that the online media platforms Facebook, Twitter and YouTube played a central role (Allington, et al., 2020; Apuke and Omar, 2021; Zarocostas, 2020; Zeng and Chan, 2021; Bheekhun, et al., 2021). Research in March 2020 found that more than a quarter of the most watched YouTube videos on COVID-19 contained misleading information, reaching a global audience of millions (Li, et al., 2020). Where previously Facebook, Twitter and YouTube had resisted attempts to hold them accountable for the content they carried, arguing they were platforms not publishers, the global pandemic brought their public responsibilities to the fore. As the pandemic has progressed, pressure has increased from governments, scientists, doctors and the public (Wardle and Singerman, 2021). While Facebook, Twitter and YouTube each have made attempts to limit, block and counter misinformation on their platforms [2], their strategies have proved largely unsuccessful and, significantly, did not address advertising on their sites (Wardle and Singerman, 2021). In April 2021 a report by European advocacy group Avaaz compared Facebook’s efforts over a year and found that fact-checked misinformation had decreased by just one percentage point, from 56 percent in 2020 to 55 percent in 2021. The average time it took to label a post was 28 days (Avaaz, 2021). In 2019, Instagram, which was then owned by Facebook, blocked the hashtag #VaccinesKill on the Instagram platform, but Facebook itself only blocked the hashtag in July 2021 (Stelter and Pellico, 2021).

In July 2021 U.S. Surgeon General Vivek Murthy declared that the continued spread of misinformation across social media was prolonging the pandemic and putting further lives at risk (U.S. Department of Health & Human Services, 2021). Murthy called on the social media giants to do more [3] and recommended a suite of responses, including redesigning their algorithms to avoid amplifying misinformation; providing better transparency and access to data for researchers; increasing multilingual staff; creating machine learning algorithms in languages other than English; penalising repeat disinformation offenders; and both providing and amplifying credible content [4].

While the digital platforms operate independently, they are part of a deeply interconnected network. Groups or individuals seeking a wide audience will publish to more than one social media platform, and it is commonplace to cross-share links, allowing misinformation to flow unchecked through and across platforms. Internet giants are also part of a broader online media ecosystem, which poses further challenges for the removal of misinformation. The Internet Archive’s Wayback Machine provides access to ‘zombie content’, which is material that has been deleted, and links to that content are circulated via Facebook, YouTube and Twitter (Donovan, 2020; O’Connor and Weatherall, 2020). This has extended the life of content that has been deemed problematic and removed, allowing it to continue causing damage in perpetuity. An example is an article that appeared on 5 April 2020 on the news site Medium. Headlined “Covid-19 had us all fooled, but now we might have finally found its secret,” the author claimed WHO was wrong about the virus and recommended hydroxychloroquine, the treatment dangerously hyped by President Trump. Within 24 hours of its initial publication the article had been shared 1,200 times on Facebook and received 6,000 interactions. After complaints, Medium removed the article. However, in the three weeks that followed, a link to the deleted article on the Wayback Machine was shared 310,000 times on Facebook and gained 1.6 million interactions (O’Connor and Weatherall, 2020).

In summary, the limited measures taken by the major digital platforms — Facebook, YouTube and Twitter — have proved ineffective: they continue to publish and distribute COVID-19 misinformation, along with problematic advertisements and commercially-driven clickbait, which is further spread by their opaque algorithmic sorting and the networked nature of the Internet.

 

++++++++++

The platform — How Wikipedia works

Like Facebook, YouTube and Twitter, Wikipedia is an Internet platform that publishes and distributes user-generated content. However, unlike those digital behemoths, Wikipedia content is de-personalised and, crucially, carries no advertising. Where those platforms have experienced a shift in public perceptions over the past decade (most notably Facebook, particularly over the past few years), Wikipedia remains the poster child of Web 2.0. In an essay to celebrate Wikipedia’s twentieth anniversary in 2020, Yochai Benkler wrote: “As awareness of surveillance capitalism is becoming clearer and the risk that a handful of companies will use massive amounts of data they collect on each of us to shape both commercial demand and political outcomes, Wikipedia has more than justified the idea that having a significant source of knowledge that is free of markets, and marches to the beat of a different drum, having nothing to do with dollars, is of critical importance.” [5]

The online encyclopedia launched in 2001 as a collaborative platform seeking to amass all human knowledge and make it freely accessible. Founder Jimmy Wales (2004) said: “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge.” In two decades, it has flourished, becoming a feature of everyday life (Lovink and Tkacz, 2011), a site of information power (Ford, 2016) and the basic knowledge utility of contemporary society [6]. It is celebrated as a highly productive community of collaborative authorship (Forte and Bruckman, 2008), the best-developed attempt to gather all human knowledge in one place (Okoli, et al., 2014), and a living compendium of our knowledge [7].

It provides information on almost every conceivable topic, from biology to physics, medicine, history, the latest blockbuster film and breaking news. Visitors come to the site either directly or via other Internet giants: Google draws from Wikipedia for its infoboxes; YouTube links to Wikipedia as a form of fact checking; voice assistants such as Alexa and Siri draw their information from Wikipedia; and in June 2020, Facebook added Wikipedia boxes to its search results (Jarvey, 2018; Lyu and Fetahu, 2018; Hutchinson, 2020).

There are more than 310 different language Wikipedias, including 22 Indian language Wikipedias, but English is by far the largest and most visited. As of 2 August 2021, English Wikipedia had over 6.3 million articles [8] and had recorded 126 billion page views over the previous 12 months [9]. According to Katherine Maher, outgoing CEO of the Wikimedia Foundation, every month around one billion people spend a collective 60,000 years reading Wikipedia [10].
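Readership figures like these are drawn from the Wikimedia Foundation’s public statistics service, which is backed by an open API. As a minimal sketch of how such numbers can be retrieved (assuming Python with the requests library; the 12-month date range and User-Agent string below are illustrative, not the exact window behind note [9]), the following queries the public Wikimedia Pageviews REST API for English Wikipedia’s monthly totals:

import requests

# Monthly page views for English Wikipedia, counting human readers only
# (the 'user' agent type). Timestamps use the API's YYYYMMDDHH format;
# the range below is an illustrative 12 months.
URL = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate/"
       "en.wikipedia.org/all-access/user/monthly/2020080100/2021080100")

resp = requests.get(URL, headers={"User-Agent": "research-sketch/0.1"})
resp.raise_for_status()
months = resp.json()["items"]

for m in months:
    print(m["timestamp"], f"{m['views']:,}")
print(f"12-month total: {sum(m['views'] for m in months):,}")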

The site was established with three core content policies: Neutral Point of View, No Original Research and Verifiability [11]. Neutral Point of View requires articles to be written without bias, by fairly and proportionately representing all significant views. It resonates with academic and journalistic notions of objectivity as a demonstration of impartiality, balance and fairness. The No Original Research policy requires that Wikipedia not publish original material, including opinion, observation and individual expertise. The Verifiability policy dovetails with this, requiring that all material be attributed to a reliable source.

While these principles have proved effective in blocking political propaganda, self-promotion, commercial interests and misinformation, they have also shifted authority away from experts to the sources, and this is where Wikipedia most distinctly deviates from those traditional institutions of knowledge, encyclopedias, which claimed authority based on the expertise of their contributors. In Wikipedia, authority rests with the legitimacy and credibility of the sources, as defined by its community. In practice, this means information attributed to Fox News, a Facebook post or a doctor on YouTube will likely be removed, while peer-reviewed medical journals, WHO, UNESCO, and the same doctor being quoted in the New York Times, Guardian or BBC World, would be considered credible and allowed to stay. In 2017 the Wikipedia community voted to ban the Daily Mail as a source and in 2018 banned Breitbart News Network [12].

Since its inception Wikipedia has had to establish the boundaries of what constitutes knowledge on its platform. In 2014, alternative healers believed they were being unfairly denied representation on the site and started a petition for Wikipedia to recognise various practices, such as Energy Healing and Thought Field Therapy [13]. Founder Jimmy Wales, who has no power over the editing community but remains on the Board of the Wikimedia Foundation and is a powerful figurehead for the movement, responded publicly, refusing to recognise the legitimacy of these practices and instead declaring that the site was “biased” towards science. Wales wrote on the petition:

“Every single person who signed this petition needs to go back to check their premises and think harder about what it means to be honest, factual, truthful. Wikipedia’s policies around this kind of thing are exactly spot-on and correct,” he posted. “If you can get your work published in respectable scientific journals — that is to say, if you can produce evidence through replicable scientific experiments, then Wikipedia will cover it appropriately. What we won’t do is pretend that the work of lunatic charlatans is the equivalent of ‘true scientific discourse’. It isn’t.” (as cited in Newman, 2014)

Much of mainstream media’s coverage of Wikipedia is either reductive or superficial, treating Wikipedia as a unified voice and amplifying minor errors and vandalism [14]. Research has shown just seven percent of edits are acts of vandalism, performed by pranksters, lobbyists and spammers [15]. Misinformation finds its way onto the site, but the community has created a series of checks: bots designed to examine all new revisions, apply hand-crafted rule sets and detect vandalism [16]; personal watchlists that allow editors to monitor edits to sensitive pages; and oversight by administrators, who can block or ban users (registered editors) or IP addresses (anonymous editors) (Spezzano, et al., 2019).

The platform’s technological affordances enable corrections to be made instantly and efficiently. As part of Wikipedia’s commitment to transparency, every edit is recorded, time-stamped and publicly accessible at the top of each article under the tab ‘view history’. Poor or malicious edits can be reverted with just one click, which returns the article to its previous iteration, and bots designed to pick up specific words will remove them, often within seconds (Andrews, 2020). In August 2021, the site was vandalised and swastikas appeared on multiple pages, including those for Johnny Depp, Raj Kapoor and Edie Sedgwick. They were removed within five minutes right across the platform (Wille, 2021).
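Both the ‘view history’ record and the one-click revert are exposed through Wikipedia’s public Action API. The sketch below (assuming Python with the requests library; the undo call is shown for illustration only, since an actual revert requires a logged-in session with edit rights and a CSRF token) fetches an article’s latest time-stamped revisions and shows how a single revision maps onto the API’s undo parameter:

import requests

API = "https://en.wikipedia.org/w/api.php"
session = requests.Session()
session.headers["User-Agent"] = "research-sketch/0.1"  # illustrative UA

def recent_revisions(title, limit=5):
    """Fetch the latest revisions of an article: the same public,
    time-stamped record shown under its 'view history' tab."""
    params = {
        "action": "query", "format": "json",
        "prop": "revisions", "titles": title,
        "rvprop": "ids|timestamp|user|comment", "rvlimit": limit,
    }
    pages = session.get(API, params=params).json()["query"]["pages"]
    return next(iter(pages.values()))["revisions"]

def undo_revision(title, revid, csrf_token):
    """Revert one revision via the edit API's 'undo' parameter -- the
    'go back' button. Needs an authenticated session with edit rights."""
    return session.post(API, data={
        "action": "edit", "format": "json",
        "title": title, "undo": revid, "token": csrf_token,
    }).json()

for rev in recent_revisions("COVID-19"):
    print(rev["revid"], rev["timestamp"], rev["user"],
          rev.get("comment", "")[:60])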

Editors may add articles to their personal watchlists so they are notified when an edit is made and can leap in to correct anything poorly sourced or misleading. The community provides administrative oversight of pages which have been subject to repeated vandalism or “edit wars” among users. For example, since 2012 all pages about India, Pakistan and Afghanistan have been overseen by editors with administration rights who can apply discretionary sanctions [17]. The ‘Talk’ tab for each article leads to a chat forum where editors discuss the content and collectively negotiate the narrative.

In Wikipedia’s early years the collaborative approach was highly criticised. U.S. TV host Stephen Colbert coined the term ‘wikiality’, referring to ‘truth’ based on the will of the majority, not on facts (Colbert, 2006). Nicholas Carr described it as “the corrosive process of compromise” (cited in Mayfield, 2006). The New York Times characterised it as “the nitpicking of the masses versus the authority of the experts” (Johnson, 2006). Wikipedia was constantly compared, unfavourably, to Encyclopaedia Britannica. Judy Heim (2001) wrote in the MIT Technology Review that Wikipedia “will probably never dethrone Britannica, whose 232-year reputation is based upon hiring world-renowned experts and exhaustively reviewing their articles with a staff of more than a hundred editors.” Former Britannica editor Bob McHenry was equally dismissive, saying in 2005 that Wikipedia was merely a game. “It was always a doomed idea. It was bad from the start. But it’s got the public playing the encyclopedia game” (Orlowski, 2005).

In 2005, Nature put Encyclopaedia Britannica and Wikipedia to the test in a famous study of science articles, which concluded that the two were roughly equal in terms of accuracy (Giles, 2005). Numerous studies since have demonstrated that Wikipedia’s content is comparable to that of traditional encyclopedias (James, 2016; Greenstein and Zhu, 2018; Jemielniak, 2020).

Wikipedia is the most publicly visible node of an extensive open knowledge network, which also includes a range of other open collaborative projects under the Wikimedia umbrella, such as Wikimedia Commons (a digital library that allows free use of photos, videos and multimedia content), Wikibooks (free textbooks), Wikiversity (education materials for students from preschool to university) and Wiki Education (resources for using Wikipedia in the classroom). Perhaps the most significant project is Wikidata, a Web-scale platform for structured, curated data that can be read both by humans and machines. Wikidata acts as a multilingual, collaborative, open knowledge base of more than 90 million entities connected by more than a billion relationships (Turki, et al., 2022). It was developed under the umbrella of the Wikimedia Foundation and sits alongside the Wikipedia platform, independent but intended to both support and complement Wikipedia’s mission. In Wikidata, each ‘thing’ (e.g., a person, place, disease or drug) is assigned a Q-number as a unique identifier, along with a series of labels, statements and facts. This also makes it possible to sort through the vast trove of Wikipedia content and collate specific information; for example, a query run through Wikidata could identify the human genes that were updated this year. Where Facebook is secretive about its algorithms and data collection, all of Wikipedia’s metadata is publicly available.
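As a minimal sketch of that machine readability (assuming Python with the requests library, and that Q84263196 is still the Wikidata identifier for the COVID-19 item; Q-numbers should be verified before use), the snippet below pulls an entity’s multilingual labels, description and structured statements from Wikidata’s public wbgetentities endpoint:

import requests

API = "https://www.wikidata.org/w/api.php"
ITEM = "Q84263196"  # assumed identifier for the COVID-19 item; verify first

resp = requests.get(API, params={
    "action": "wbgetentities", "format": "json",
    "ids": ITEM, "props": "labels|descriptions|claims",
}, headers={"User-Agent": "research-sketch/0.1"}).json()

entity = resp["entities"][ITEM]
print(entity["labels"]["en"]["value"])        # human-readable label
print(entity["descriptions"]["en"]["value"])  # one-line description
# 'claims' maps property IDs (P-numbers) to machine-readable statements,
# the typed relationships that connect Wikidata's ~90 million entities.
print(len(entity["claims"]), "properties carry statements for this item")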

 

++++++++++

Wikipedia’s specialist medical care

Huisman, et al. (2021) identified six qualities and characteristics that make Wikipedia an attractive source for health and medical information: convenience, coverage, topicality, comprehensibility, conciseness, and familiarity. Research undertaken before the emergence of COVID-19 showed the English Wikipedia to be the most frequently accessed online resource for health and medical information (Heilman and West, 2015), with readership estimated to be in the hundreds of millions (Okoli, et al., 2014). Wikipedia’s medical content comprises more than 155,000 articles, supported by more than 950,000 references, most of them recently published in highly respected scientific journals, such as the New England Journal of Medicine, Cochrane Database of Systematic Reviews, Lancet, and Nature (Mendes, et al., 2021).

In 2017, research found that medical pages were mostly written by a core group of medical professionals. It noted that collaboration between health and science experts and non-experts, who were experienced in the ways of Wikipedia, enhanced both article quality and reach. It also showed that Wikipedia’s medical pages were a source of healthcare information for 50–70 percent of physicians and more than 90 percent of medical students (Shafee, et al., 2017; see also Mendes, et al., 2021). A pre-COVID study of the spread of anti-vaccine rhetoric showed that Wikipedia played a central role in countering the misinformation by providing accurate information about vaccine safety (Getman, et al., 2018).

Medical articles are among the most rigorously referenced and scrutinised on the site. WikiProject Medicine, started in 2004, is a global collective of doctors and scientists who write, edit and provide oversight of Wikipedia’s medical pages [18]. While pages without sanctions can be edited by anyone, most changes to health-related articles are assessed by this cohort within 24 hours (Mendes, et al., 2021). If the content and references don’t meet their scientific standards, the edits will be reverted. Where the New York Times or Wall Street Journal would be considered credible sources by the mainstream editing community, the medical editors insist on textbooks, peer-reviewed papers published in medical journals, or reports from prominent medical institutes. Preprints are not permitted: the information must have been peer reviewed and published (Ward, 2021). As of April 2019, the 34,324 English articles maintained by WikiProject Medicine were visited four times more often than other Wikipedia articles, averaging 5,875,470 daily visits across all articles (Maggio, et al., 2020).

Other projects specifically related to health include: WikiJournal of Medicine, which publishes peer-reviewed medical papers; Wiki Project Med Foundation, which seeks collaboration with health partners, such as the World Health Organisation, Cancer Research UK and the U.S. National Institutes of Health; and SWASTHA (Special Wikipedia Awareness Scheme for Healthcare Affiliates), which focuses on health information across 10 Indian language Wikipedias [19].

Wikidata also provides data for medical researchers. During the Zika epidemic of 2015–2016, the Wikidata community developed, documented and refined sets of queries about the epidemic, the underlying pathogen, the disease and diagnostic or therapeutic options. It piloted methods to integrate distributed knowledge from multiple databases to build a consistent semantic representation of a topic for which relevant concepts were often not yet readily available (Turki, et al., 2022).

Bots also play an integral role on the Wikipedia platform. They are software designed to ‘make automated edits without the necessity of human decision-making’ [20]. They are built by editors and, once approved by the rest of the community, are provided with their own user page and user group, with differing levels of access and administrative rights. Bots have been created to identify and undo vandalism, enforce bans, check spelling, create inter-language links, import content automatically, mine data, identify copyright violations and greet new editors (Tsvetkova, et al., 2017). Administrative bots perform policing tasks, such as blocking spam; combat bots seek out vandalism; spellchecking bots check language and make corrections; ban enforcement bots can block an editor from Wikipedia and, thus, take away his or her editing rights (Niederer and van Dijck, 2010).
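To illustrate the kind of hand-crafted rule set a combat bot applies, the sketch below (Python with the requests library) polls Wikipedia’s public recent-changes feed and flags edits matching two simple heuristics. The rules, threshold and blacklist here are invented for illustration; production bots such as ClueBot NG use far richer features, including machine learning over the edit content itself:

import re
import requests

API = "https://en.wikipedia.org/w/api.php"

BLACKLIST = re.compile(r"\b(hoax|miracle cure)\b", re.I)  # invented terms
MAX_REMOVAL = 2000  # bytes; large blanking is a classic vandalism signature

changes = requests.get(API, params={
    "action": "query", "format": "json",
    "list": "recentchanges", "rctype": "edit",
    "rcprop": "title|ids|sizes|user|comment", "rclimit": 25,
}, headers={"User-Agent": "research-sketch/0.1"}
).json()["query"]["recentchanges"]

for rc in changes:
    flags = []
    if rc["oldlen"] - rc["newlen"] > MAX_REMOVAL:
        flags.append("large removal")
    if BLACKLIST.search(rc.get("comment", "")):
        flags.append("blacklisted term in summary")
    if flags:
        # A real bot would fetch the full diff, score it, and potentially
        # revert it via the 'undo' call sketched earlier.
        print(rc["title"], rc["revid"], flags)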

These were the content policies, publishing framework and editorial culture of Wikipedia when WHO declared COVID-19 a Public Health Emergency of International Concern in January 2020 (World Health Organisation [WHO], 2020). The COVID-19 pandemic has broken Wikipedia’s own records for readership, editing and number of pages (Heilman cited in Ward, 2021; WikiMedia Foundation [WMF], 2021).

 

++++++++++

Wikipedia covers COVID-19

In the first full year of the COVID-19 pandemic, 97,000 Wikipedia editors collaborated on 6,950 articles in 188 languages, performing 983,000 edits, which were read more than 653 million times (WikiMedia Foundation [WMF], 2021; Heilman cited in Ward, 2021). Page topics ranged from medical information to mapping contagion spread, vaccination rates and deaths, at the hyperlocal level, identifying suburbs and towns, as well as documenting the unfolding crisis across cities, nations, regions and the globe. Other articles covered the impacts of the pandemic on travel, human rights, the world economy, pop culture, sport, science, music, oil prices and fashion (Ward, 2021). On 18 March 2020, a week after WHO declared the COVID-19 outbreak to be a global pandemic, the Wikipedia community moved all pages about COVID-19 to the ‘general sanctions’ level of protection [21], which was updated to ‘discretionary sanctions’ on 17 June 2021 [22].

The main article about the virus itself, ‘COVID-19’, continues to be watched over by 1,035 editors across the globe, providing round-the-clock oversight. By September 2021, they had reverted 330 edits [23]. Similarly, the article ‘COVID-19 pandemic’ is watched by 1,919 editors, who had reverted 697 edits [24]. Canadian emergency physician Dr. James Heilman, a member of Wiki Project Med Foundation, said: “As soon as anyone makes a change to that article, it will be rapidly reviewed by people who will dig into the sources for the changes. They’ll critique you, they’ll revert you, they’ll adjust the wording if they don’t think it reflects the source, or if it’s not supported by the source in question” (Heilman cited in Ward, 2021).

The Talk pages that sit behind each article function as a chat forum for editors, providing public transparency into robust discussions about the latest figures and new research. “Our goal is not to necessarily include the most breaking information on every topic; we take a slower approach. We give the scientific process a little more time to work before we present those conclusions as solidly grounded” (Heilman cited in Ward, 2021). Indian-Swedish neuroscientist Netha Hussain, another key medical contributor to COVID-19 pages, described it as “slow” information. “Unlike the news media we don’t have to push it out there. We wait longer,” she told a panel at the conference Wikimania 2021 [25]. And yet, alongside the considered medical information, Wikipedia also performs well as a news platform, constantly updating its figures for contagion, vaccinations and deaths at the global, local and hyperlocal level.

Wikipedia also acts as a counter to misinformation circulating online. It deals with conspiracy theories and dodgy cures by publishing and debunking them via references to credible sources. The page “COVID-19 misinformation” is extensive, including lists of origin theories (leaked from a Wuhan lab, stolen from a Canadian lab, embedded in a meteor, sold to China by an American scientist, carried in the polio vaccine); prevention techniques (cocaine, vibrations from clapping, vegetarian immunity); and treatments (cow dung, pouring mustard oil into the nose, touching the TV screen to receive healing from a U.S. preacher). Embedded on the page is a video of President Donald Trump recommending ultraviolet light and drinking bleach, along with the caption “There is no evidence that either could be a viable method” and a link to a New York Times story in which scientists refute the claims [26].

 

++++++++++

Why Wikipedia is succeeding against the infodemic

Wikipedia, with its transparency, collaborative practices and editorial framework, denotes a change in perceptions of authority. As Clay Shirky attested in 2009, before Britannica, most encyclopedias derived their authority from their authors. Britannica shifted that authority to its brand, establishing itself as a credible institution of knowledge. Readers could trust that Britannica would seek out experts in each field. Wikipedia represents a further shift, asking readers to trust its processes. “As long as you can see how Wikipedia’s working, and can see that the results are acceptable, you can come over time to trust that” [27]. Wikipedia’s processes are proving singularly successful in the very specific circumstances of this global pandemic and its accompanying misinformation, which is spreading via digital platforms. This can be attributed to three dominant factors: the volunteer labour performed by editors, the emphasis on rigorous referencing and credible sources, and the technological capabilities of the platform.

Editors

Wikipedia is fuelled by the altruistic commitment of a body of amateurs and medical experts, working collaboratively and asynchronously, under an operating model of self-governance. The precise number and locations of editors are difficult to quantify, as editing can be done by registered and unregistered users [28] and can vary from removing a comma to creating a new 2,000-word article. Nevertheless, for the past two years, English Wikipedia has averaged input from 415,000 editors per month who made one or more edits [29]. For comparison, the next largest cohorts are Spanish Wikipedia, which averaged 89,000 editors per month [30], and the French [31] and German [32] Wikipedias, which each averaged 57,000 editors per month.

Editors are not journalists, but the community has evolved an editorial framework that resembles professional newsroom practices. Some take an authorial role, adding original content, while others act as subeditors, correcting grammar and improving syntax. Experienced editors oversee newcomers, advising and correcting their work, and are ultimately the arbiters of what information is included and what is deleted. When disagreements, or edit wars, occur, there are avenues for the higher authority of “admins” to step in and make determinations. The Talk pages function like a virtual newsroom, enabling discussions that range from editorial matters, such as what should be in the lead or headings, to matters of content, such as questioning the credibility of sources. According to Wikipedia protocols, the writing style of each article should be “straightforward” and written in “just-the-facts style” so as to be understood by a general reader without advanced education [33]. This is achieved through such recognisable news writing conventions as the inverted-pyramid structure, with its highly efficient opening paragraph providing the Who, What, Where and When of the topic, and prominently displayed information boxes, which provide essentialised information at a glance (Avieson, 2019).

The collaboration of medical and scientific experts with non-experts ensures content is both scientifically rigorous and accessible to the layman. Their constant oversight also provides the “human in the loop”, a stark contrast to the algorithms that drive Facebook and YouTube [34]. The asynchronous and global nature of the site allows round-the-clock vigilance over such sensitive topics as COVID-19. Andrew Pattison, WHO Digital Media Content Manager, described collaborating with Wikipedia during the COVID-19 pandemic as “like having an army to work with” (cited in McNeil, 2020).

Sources

The two core content policies of No Original Research and Verifiability disallow other journalistic practices of observation, interviews and informed opinion. Content creation is solely discursive; that is, articles are constructed from what has already been published. For most Wikipedia pages, credible news media is considered acceptable, while for medical stories the bar is set higher, at scientifically rigorous, peer-reviewed publications. This emphasis on attribution to credible sources, that is, identifying the provenance of all information, serves to establish both responsibilities for editors and rights for readers [35], and is the first step in blocking misinformation. Scientific knowledge is constantly contested and updated, and the drive for the latest medical research during the pandemic has been especially urgent — such as updates about how the virus spreads, new variants that may be emerging, possible cures and the efficacy and safety of vaccines. The emphasis on peer-reviewed sources slows down the process to better reflect the rigour of scientific practice.

The policy of Verifiability also serves to equalise the amateur and the specialist. During a dispute, editors do not defer to an external authority, such as medical professionals, but to Wikipedia’s own protocols, which are underpinned by those three core policies of Verifiability and No Original Research, as well as Neutrality, with its implication of balance and fairness. The ethos is not to determine the truth, but to establish a community consensus within that editorial and philosophical framework. According to James Heilman, the sources are “key”. “Your background in that way is much less important, because the sources really reign supreme. We reference our articles more densely than one would see in a traditional journal or textbook. And the reason is because we have no way to verify that our editors are experts, and thus the material really needs to stand upon the sources entirely” (Heilman, cited in Ward, 2021).

Technological affordances

The open-source platform has the technological capabilities to collate and distribute both the latest health updates and scientifically proven health information, in ways not possible before Web 2.0 technologies. The site has an advantage over other online platforms in its ability to remove misinformation: poor or malicious edits can be reverted simply and efficiently. Heilman noted the structural advantages Wikipedia has that the other major digital platforms don’t: “It takes more time and effort to disrupt Wikipedia than it does to restore Wikipedia to a reliable level. It’s the exact opposite on Twitter and Facebook, where it takes a second to spread false news, while getting those lies removed will take a lot of time and effort.”

Automated bots are also part of Wikipedia’s misinformation armoury. They are designed specifically to detect and revert malicious edits, using keywords and red-flag material. Niederer and van Dijck (2010) describe Wikipedia’s sophisticated network of bots as a complementary part of a sociotechnical system, arguing that it is the intricate collaboration between large numbers of human users and these sophisticated automated systems that defines Wikipedia’s ultimate success.

As an open, collaborative wiki, Wikipedia’s articles never close; rather, they are constantly being added to, updated and revised through a contemporary lens or when new information on a topic is published. Each page is a process, not a product, and as a result it is never finished [36]; what traditionally would be termed ‘publication’ never materialises [37]. This is a function of the open, intellectual commons that underpins Wikipedia’s collaborative philosophy, as well as its Web 2.0 affordances. According to the processual approach, commons are always in a state of becoming, not a state of being (Broumas, 2017). This accommodates both the slow, careful, scientific approach of medical editors such as Heilman and Hussain to processing new research and data, and the latest news that is part of a fast-moving pandemic, as can be seen on other COVID-19-related pages. It also allows the site to sidestep the practice of repurposing zombie material. While in theory a screenshot of, or a link to, a particular iteration containing misinformation that stayed up long enough to be recorded could be circulated via other digital platforms, it would be clumsy and ineffective. Visitors to the site see the current iteration, and all previous iterations are readily available. The iterative process means Wikipedia pages are virtually impervious to the objectives of circulating zombie content.

 

++++++++++

Conclusion

In this global pandemic, health communicators face specific challenges in reaching their audiences with vital health information. Most public health experts have quality content but face high costs to reach wide audiences. This makes Wikipedia particularly attractive, as its readers come to the site actively seeking information. For example, the topic of ‘vaccine hesitancy’ appeared in Wikipedia’s top 10 most popular search topics for 2020, with an extraordinary 65,212,508 requests (Rasberry and Mietchen, 2021). Wikipedia is efficient at providing factual information, written in accessible layman’s language, right at the point where people are actively seeking it. Free of commercial imperatives, this is the platform’s only agenda and the reason readers come to its 300+ different language sites. It is also free to anyone with Internet access, which by December 2020 was 65.6 percent of the world’s population [38].

Perhaps most significant for the flow of misinformation is that, unlike the interconnected web of other online media platforms, Wikipedia is largely a one-way street. While Facebook, YouTube and Google refer their readers to the site for fact-checking, Wikipedia does not return the favour. Without a commercial agenda, its readers are not directed to other content by an algorithm, nor are they subjected to advertisements or clickbait hijacking their attention.

In an essay written prior to the COVID-19 pandemic to mark Wikipedia’s twentieth anniversary, the site was lauded as the “grown up” of the Web and the best medicine against the scourge of false information [39]. That somewhat grandiose claim has proved to be prescient. Wikipedia’s practices in this pandemic demonstrate the enduring possibilities of participatory culture.

The site is winning the battle against COVID-19 misinformation through the combination of an enthusiastic volunteer army (those nit-picking masses), working within the disciplined schema of rigorous referencing to credible sources, on a platform designed for transparency and efficient editing. This editorial framework, combined with sanctions, expert oversight and more stringent referencing rules, shows Wikipedia to be a significant platform for health information during the COVID-19 pandemic.

 

About the author

Dr. Bunty Avieson is Senior Lecturer in Journalism and Media at the University of Sydney (Australia).
E-mail: bunty [dot] avieson [at] sydney [dot] edu [dot] au

 

Notes

1. Wardle and Derakhshan, 2017, p. 5. In this paper, the term misinformation will be used to represent both types of false information, as the intent of the authors and distributors is neither clear nor relevant to the analysis, so for ease of communication it will be simpler to use misinformation as an umbrella term.

2. https://about.fb.com/news/2020/12/coronavirus/#misinformation-update; https://blog.twitter.com/en_us/topics/company/2020/covid-19#protecting; https://support.google.com/youtube/answer/9891785; https://www.mobihealthnews.com/news/google-takes-covid-19-vaccine-misinformation.

3. U.S. Department of Health & Human Services, 2021, p. 5.

4. U.S. Department of Health & Human Services, 2021, p. 12.

5. Benkler, 2020, p. 47.

6. Benkler, 2020, p. 46.

7. Maher, 2020, p. 326.

8. https://meta.wikimedia.org/wiki/List_of_Wikipedias#Grand_Total.

9. https://stats.wikimedia.org/#/en.wikipedia.org/reading/total-page-views/normal|bar|1-year|~total|monthly.

10. Maher, 2020, pp. 325–326.

11. https://en.wikipedia.org/wiki/Wikipedia:Core_content_policies.

12. Benjakob and Harrison, 2020, p. 32.

13. https://www.change.org/p/jimmy-wales-founder-of-wikipedia-create-and-enforce-new-policies-that-allow-for-true-scientific-discourse-about-holistic-approaches-to-healing.

14. Benjakob and Harrison, 2020, p. 35.

15. Adler, et al., 2011, p. 278.

16. Ibid.

17. https://en.wikipedia.org/wiki/Wikipedia:Arbitration_Committee/Discretionary_sanctions.

18. https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Medicine.

19. https://en.wikipedia.org/wiki/Wikipedia:SWASTHA.

http://en.wikipedia.org/wiki/Wikipedia:Bot_policy.

21. https://en.wikipedia.org/wiki/Wikipedia:General_sanctions/COVID-19.

22. https://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests/Case/COVID-19.

23. As of 19 September 2021. https://xtools.wmflabs.org/articleinfo/en.wikipedia.org/COVID-19.

24. As of 19 September 2021. https://xtools.wmflabs.org/articleinfo/en.wikipedia.org/COVID-19_pandemic.

25. https://www.youtube.com/watch?v=vsi4WHXEUtI.

26. https://en.wikipedia.org/wiki/COVID-19_misinformation.

27. Shirky, cited in Gauntlett, 2009, p. 42.

28. Unregistered users are visible by their IP addresses, which can be geolocated, while registered users have a user page where they choose how much personal information they reveal, including their geolocations.

29. https://stats.wikimedia.org/#/en.wikipedia.org/contributing/editors/normal%7Cline%7C2-year%7Ceditor_type~anonymous*user%7Cmonthly.

30. https://stats.wikimedia.org/#/es.wikipedia.org/contributing/editors/normal|line|2-year|editor_type~anonymous*user|monthly.

31. https://stats.wikimedia.org/#/fr.wikipedia.org/contributing/editors/normal|line|2-year|editor_type~anonymous*user|monthly.

32. https://stats.wikimedia.org/#/de.wikipedia.org/contributing/editors/normal|line|2-year|editor_type~anonymous*user|monthly.

33. https://en.wikipedia.org/wiki/Wikipedia:Make_technical_articles_understandable.

34. Keegan, 2020, p. 64.

35. Ford, 2020, p. 196.

36. Shirky, 2008, p. 119.

37. Jemielniak, 2014, p. 184.

38. https://www.internetworldstats.com/stats.htm.

39. Benjakob and Harrison, 2020, p. 34.

 

References

B. Thomas Adler, Luca de Alfaro, Santiago M. Mola-Velasco, Paolo Rosso and Andrew G. West, 2011. “Wikipedia vandalism detection: Combining natural language, metadata, and reputation features,” In: Alexander Gelbukh (editor). Computational linguistics and intelligent text processing. CICLing 2011. Lecture Notes in Computer Science, volume 6609. Berlin: Springer, pp. 277–288.
doi: https://doi.org/10.1007/978-3-642-19437-5_23, accessed 16 October 2022.

Joachim Allgaier and Anna Lydia Svalastog, 2015. “The communication aspects of the Ebola virus disease outbreak in Western Africa — do we need to counter one, two, or many epidemics?” Croatian Medical Journal, volume 56, number 5, pp. 496–499.
doi: https://doi.org/10.3325/cmj.2015.56.496, accessed 16 October 2022.

Daniel Allington, Bobby Duffy, Simon Wessely, Nayana Dhavan and James Rubin, 2020. “Health-protective behaviour, social media usage and conspiracy belief during the COVID-19 public health emergency,” Psychological Medicine, volume 51, number 10, pp. 1,763–1,769.
doi: https://doi.org/10.1017/S003329172000224X, accessed 16 October 2022.

Travis Andrews, 2020. “Covid-19 is one of Wikipedia’s biggest challenges ever. Here’s how the site is handling it,” Washington Post (7 August), at https://www.washingtonpost.com/technology/2020/08/07/wikipedia-covid-coronavirus/, accessed 16 October 2022.

Oberiri Destiny Apuke and Bahiyah Omar, 2021. “Fake news and COVID-19: Modelling the predictors of fake news sharing among social media users,” Telematics and Informatics, volume 56, 101475.
doi: https://doi.org/10.1016/j.tele.2020.101475, accessed 16 October 2022.

Avaaz, 2021. “Left behind: How Facebook is neglecting Europe’s infodemic” (20 April), at https://secure.avaaz.org/campaign/en/facebook_neglect_europe_infodemic/, accessed 16 October 2022.

Bunty Avieson, 2019. “Breaking news on Wikipedia: Collaborating, collating and competing,” First Monday, volume 24, number 5, at https://firstmonday.org/article/view/9530/7780, accessed 16 October 2022.
doi: https://doi.org/10.5210/fm.v24i5.9530, accessed 16 October 2022.

Omer Benjakob and Stephen Harrison, 2020. “From anarchy to wikiality, glaring bias to good cop: Press coverage of Wikipedia’s first two decades,” In: Joseph Reagle and Jackie Koerner (editors). Wikipedia @ 20: Stories of an incomplete revolution. Cambridge, Mass.: MIT Press, pp. 21–42.
doi: https://doi.org/10.7551/mitpress/12366.003.0005, accessed 16 October 2022.

Yochai Benkler, 2020. “From utopia to practice and back,” In: Joseph Reagle and Jackie Koerner (editors). Wikipedia @ 20: Stories of an incomplete revolution. Cambridge, Mass.: MIT Press, pp. 43–54.
doi: https://doi.org/10.7551/mitpress/12366.003.0006, accessed 16 October 2022.

Zareen Bheekhun, Geraldine Lee and Silvia Camporesi, 2021. “Challenges of an ‘infodemic’: Separating fact from fiction in a pandemic,” International Emergency Nursing, volume 57, 101029.
doi: https://doi.org/10.1016/j.ienj.2021.101029, accessed 16 October 2022.

Antonios Broumas, 2017. “The ontology of the intellectual commons,” International Journal of Communication, volume 11, at http://ijoc.org/index.php/ijoc/article/view/6347, accessed 16 October 2022.

Stephen Colbert, 2006. “The Colbert Report” (31 July); see https://en.wikipedia.org/wiki/Wikipedia:Wikiality_and_Other_Tripling_Elephants, accessed 16 October 2022.

Joan Donovan, 2020. “Covid hoaxes are using a loophole to stay alive — even after content is deleted,” MIT Technology Review (30 April), at https://www.technologyreview.com/2020/04/30/1000881/covid-hoaxes-zombie-content-wayback-machine-disinformation, accessed 16 October 2022.

Gunter Eysenbach, 2002. “Infodemiology: The epidemiology of (mis)information,” American Journal of Medicine, volume 113, number 9 (15 December), pp. 763–765.
doi: https://doi.org/10.1016/s0002-9343(02)01473-0, accessed 16 October 2022.

Alan Feuer, 2014. “The Ebola conspiracy theories,” New York Times (18 October), at http://www.nytimes.com/2014/10/19/sunday-review/the-ebola-conspiracy-theories.html, accessed 16 October 2022.

Heather Ford, 2020. “Rise of the underdog,” In: Joseph Reagle and Jackie Koerner (editors). Wikipedia @ 20: Stories of an incomplete revolution. Cambridge, Mass.: MIT Press, pp. 189–204.
doi: https://doi.org/10.7551/mitpress/12366.003.0017, accessed 16 October 2022.

Andrea Forte and Amy Bruckman, 2008. “Scaling consensus: Increasing decentralization in Wikipedia governance,” Proceedings of the 41st Annual Hawaii International Conference on System Sciences (HICSS 2008).
doi: https://doi.org/10.1109/HICSS.2008.383, accessed 16 October 2022.

David Gauntlett, 2009. “Case study: Wikipedia,” In: Glen Creeber and Royston Martin (editors). Digital cultures: Understanding new media. Maidenhead, Berkshire: Open University Press, pp. 39–45.

Reekah Getman, Mohammad Helmi, Hal Roberts, Alfa Yansane, David Cutler and Brittany Seymour, 2018. “Vaccine hesitancy and online information: The influence of digital networks,” Health Education & Behavior, volume 45, number 4, pp. 599–606.
doi: https://doi.org/10.1177/1090198117739673, accessed 16 October 2022.

Tedros A. Ghebreyesus, 2020. “Munich Security Conference” (15 February), at https://www.who.int/dg/speeches/detail/munich-security-conference, accessed 16 October 2022.

Jim Giles, 2005. “Internet encyclopaedias go head to head,” Nature, volume 438, number 7070 (14 December), pp. 900–901.
doi: https://doi.org/10.1038/438900a, accessed 16 October 2022.

Shane Greenstein and Feng Zhu, 2018. “Do experts or crowd-based models produce more bias? Evidence from Encyclopedia Britannica and Wikipedia,” MIS Quarterly, volume 42, number 3, pp. 945–959.
doi: https://doi.org/10.25300/MISQ/2018/14084, accessed 16 October 2022.

Sten Hansson, Kati Orru, Sten Torpan, Asta Bäck, Austeja Kazemekaityte, Sunniva Frislid Meyer, Johanna Ludvigsen, Lucia Savadori, Alessandro Galvagni and Ala Pigrée, 2021. “COVID-19 information disorder: Six types of harmful information during the pandemic in Europe,” Journal of Risk Research, volume 24, numbers 3–4, pp. 380–393.
doi: https://doi.org/10.1080/13669877.2020.1871058, accessed 16 October 2022.

Judy Heim, 2001. “Free the encyclopedias!” MIT Technology Review (4 September), at https://www.technologyreview.com/2001/09/04/235538/free-the-encyclopedias/, accessed 16 October 2022.

Andrew Hutchinson, 2020. “Facebook adds Wikipedia Knowledge boxes in search results,” Social Media Today (9 June), at https://www.socialmediatoday.com/news/facebook-adds-wikipedia-knowledge-boxes-in-search-results/579510/, accessed 16 October 2022.

Richard James, 2016. “WikiProject medicine: Creating credibility on consumer health,” Journal of Hospital Librarianship, volume 16, number 4, pp. 344–351.
doi: https://doi.org/10.1080/15323269.2016.1221284, accessed 16 October 2022.

Natalie Jarvey, 2018. “SXSW: YouTube CEO enlists Wikipedia to curb fake news videos,” Hollywood Reporter (13 March), at https://www.hollywoodreporter.com/news/sxsw-youtube-ceo-enlists-wikipedia-curb-fake-news-videos-1094314, accessed 16 October 2022.

Dariusz Jemielniak, 2020. “Wikipedia as a role-playing game, or Why some academics do not like Wikipedia,” In: Joseph Reagle and Jackie Koerner (editors). Wikipedia @ 20: Stories of an incomplete revolution. Cambridge, Mass.: MIT Press, pp. 151–157.
doi: https://doi.org/10.7551/mitpress/12366.003.0014, accessed 16 October 2022.

Dariusz Jemielniak, 2014. Common knowledge? An ethnography of Wikipedia. Stanford, Calif.: Stanford University Press.
doi: https://doi.org/10.11126/stanford/9780804789448.001.0001, accessed 16 October 2022.

George Johnson, 2006. “The nitpicking of the masses vs the authority of the experts,” New York Times (3 January), at https://www.nytimes.com/2006/01/03/science/the-nitpicking-of-the-masses-vs-the-authority-of-the-experts.html, accessed 16 October 2022.

Brian Keegan, 2020. “An encyclopedia with breaking news,” In: Joseph Reagle and Jackie Koerner (editors). Wikipedia @ 20: Stories of an incomplete revolution. Cambridge, Mass.: MIT Press, pp. 55–70.
doi: https://doi.org/10.7551/mitpress/12366.003.0007, accessed 16 October 2022.

Kaj-Kolja Kleineberg and Marián Boguñá, 2016. “Competition between global and local online social networks,” Scientific Reports, volume 6, article number 25116.
doi: https://doi.org/10.1038/srep25116, accessed 16 October 2022.

Heidi Oi-Yee Li, Adrian Bailey, David Huynh and James Chan, 2020. “YouTube as a source of information on COVID-19: A pandemic of misinformation?” BMJ Global Health, volume 5, e002604.
doi: http://dx.doi.org/10.1136/bmjgh-2020-002604, accessed 16 October 2022.

Geert Lovink and Nathaniel Tkacz (editors), 2011. “Critical point of view: A Wikipedia reader,” Institute of Network Cultures, at https://www.networkcultures.org/_uploads/%237reader_Wikipedia.pdf, accessed 16 October 2022.

Victor Luckerson, 2014. “Fear, misinformation, and social media complicate Ebola fight,” Time (8 October), at http://time.com/3479254/ebola-social-media/, accessed 16 October 2022.

Lijun Lyu and Besnik Fetahu, 2018. “Real time event-based news suggestions for Wikipedia Pages from News Stream,” WWW ’18: Companion Proceedings of the Web Conference 2018, pp. 1,793–1,799.
doi: https://doi.org/10.1145/3184558.3191642, accessed 16 October 2022.

Lauren A. Maggio, Ryan M. Steinberg, Tiziano Piccardi and John M. Willinsky, 2020. “Reader engagement with medical content on Wikipedia,” eLife, volume 9, e52426.
doi: https://doi.org/10.7554/eLife.52426, accessed 16 October 2022.

Katherine Maher, 2020. “Capstone: Making history, building the future together,” In: Joseph Reagle and Jackie Koerner (editors). Wikipedia @ 20: Stories of an incomplete revolution. Cambridge, Mass.: MIT Press, pp. 325–343.
doi: https://doi.org/10.7551/mitpress/12366.003.0028, accessed 16 October 2022.

Donald G. McNeil, Jr., 2020. “Wikipedia and W.H.O. join to combat Covid-19 misinformation,” New York Times (22 October), at https://www.nytimes.com/2020/10/22/health/wikipedia-who-coronavirus-health.html, accessed 16 October 2022.

Thiago Bosco Mendes, Jennifer Dawson, Shani Evenstein Sigalov, Nancy Kleiman, Kathryn Hird, Olle Terenius, Diptanshu Das, Nour Geres and Amin Azzam, 2021. “Wikipedia in health professional schools: From an opponent to an ally,” Medical Science Educator, volume 31, pp. 2,209–2,216.
doi: https://doi.org/10.1007/s40670-021-01408-6, accessed 16 October 2022.

Lily Hay Newman, 2014. “Jimmy Wales gets real, and sassy, about Wikipedia’s holistic healing coverage,” Slate (27 March), at https://slate.com/technology/2014/03/jimmy-wales-denies-petition-from-advocates-of-holistic-healing-about-wikipedia-s-coverage.html, accessed 16 October 2022.

Sabine Niederer and José van Dijck, 2010. “Wisdom of the crowd or technicity of content? Wikipedia as a sociotechnical system,” New Media & Society, volume 12, number 8, pp. 1,368–1,387.
doi: https://doi.org/10.1177/1461444810365297, accessed 16 October 2022.

Rasmus Kleis Nielsen, Anne Schulz and Richard Fletcher, 2021. “An ongoing infodemic: How people in eight countries access news and information about coronavirus a year into the pandemic,” Reuters Institute for the Study of Journalism, University of Oxford (27 May), at https://reutersinstitute.politics.ox.ac.uk/ongoing-infodemic-how-people-eight-countries-access-news-and-information-about-coronavirus-year, accessed 16 October 2022.

Cailin O’Connor and James Owen Weatherall, 2020. “Why false claims about COVID-19 refuse to die: Tracking the information zombie apocalypse,” Nautilus (15 April), at https://nautil.us/why-false-claims-about-covid_19-refuse-to-die-237777/, accessed 16 October 2022.

Chitu Okoli, Mohamad Mehdi, Mostafa Mesgari, Finn Årup Nielsen and Arto Lanamäki, 2014. “Wikipedia in the eyes of its beholders: A systematic review of scholarly research on Wikipedia readers and readership,” Journal of the Association for Information Science and Technology, volume 65, number 12, pp. 2,381–2,403.
doi: https://doi.org/10.1002/asi.23162, accessed 16 October 2022.

Andrew Orlowski, 2005. “Why Wikipedia isn’t like Linux: and why Britannica isn’t sweating,” The Register (27 October), at https://www.theregister.com/2005/10/27/wikipedia_britannica_and_linux/, accessed 16 October 2022.

Lane Rasberry and Daniel Mietchen, 2021. “Wikipedia for multilingual COVID-19 vaccine education at scale,” Research Ideas and Outcomes, volume 7, e70042.
doi: https://doi.org/10.3897/rio.7.e70042, accessed 16 October 2022.

David J. Rothkopf, 2003. “When the buzz bites back,” Washington Post (11 May), at https://www.washingtonpost.com/archive/opinions/2003/05/11/when-the-buzz-bites-back/bc8cd84f-cab6-4648-bf58-0277261af6cd/, accessed 16 October 2022.

Thomas Shafee, Gwinyai Masukume, Lisa Kipersztok, Diptanshu Das, Mikael Häggström and James Heilman, 2017. “The evolution of Wikipedia’s medical content: Past, present and future,” Journal of Epidemiology & Community Health, volume 71, number 10.
doi: https://doi.org/10.1136/jech-2016-208601, accessed 16 October 2022.

Clay Shirky, 2008. Here comes everybody: The power of organizing without organizations. New York: Penguin Press.

Felix M. Simon and Chico Q. Camargo, 2021. “Autopsy of a metaphor: The origins, use and blind spots of the ‘infodemic’,” New Media & Society (20 July).
doi: https://doi.org/10.1177/14614448211031908, accessed 16 October 2022.

Francesca Spezzano, Kelsey Suyehira and Laxmi Amulya Gundala, 2019. “Detecting pages to protect in Wikipedia across multiple languages,” Social Network Analysis and Mining, volume 9, article number 10.
doi: https://doi.org/10.1007/s13278-019-0555-0, accessed 16 October 2022.

Brian Stelter and Katie Pellico, 2021. “Instagram blocked the #VaccinesKill hashtag two years ago. Facebook only just now got around to doing it,” CNN (21 July), at https://edition.cnn.com/2021/07/21/tech/facebook-vaccineskill-hashtag/, accessed 16 October 2022.

Milena Tsvetkova, Ruth García-Gavilanes, Luciano Floridi and Taha Yasseri, 2017. “Even good bots fight: The case of Wikipedia,” PLoS One, volume 12, number 2, e0171774.
doi: https://doi.org/10.1371/journal.pone.0171774, accessed 16 October 2022.

Houcemeddine Turki, Mohamed Ali Hadj Taieb, Thomas Shafee, Tiago Lubiana, Dariusz Jemielniak, Mohamed Ben Aouicha, José Emilio Labra Gayo, Eric A. Youngstrom, Mus’ab Banat, Diptanshu Das and Daniel Mietchen, on behalf of WikiProject COVID-19, 2022. “Representing COVID-19 information in collaborative knowledge graphs: The case of Wikidata,” Semantic Web, volume 13, number 2, pp. 233–264.
doi: https://doi.org/10.3233/SW-210444, accessed 16 October 2022.

U.S. Department of Health & Human Services, 2021. “U.S. Surgeon General issues advisory during COVID-19 vaccination push warning American public about threat of health misinformation” (15 July), at https://www.hhs.gov/about/news/2021/07/15/us-surgeon-general-issues-advisory-during-covid-19-vaccination-push-warning-american.html, accessed 16 October 2022.

Siddarth Venkataramakrishnan, 2020. “The real fake news about Covid-19,” Financial Times (25 August), at https://www.ft.com/content/e5954181-220b-4de5-886c-ef02ee432260, accessed 16 October 2022.

Jimmy Wales, 2004. “Wikipedia founder Jimmy Wales responds,” Slashdot (28 July), at https://slashdot.org/story/04/07/28/1351230/wikipedia-founder-jimmy-wales-responds, accessed 16 October 2022.

Tricia Ward, 2021. “How Wikipedia fought COVID misinformation,” Medscape (10 March), at https://www.medscape.com/viewarticle/947175, accessed 16 October 2022.

Claire Wardle and Eric Singerman, 2021. “Too little, too late: Social media companies’ failure to tackle vaccine misinformation poses a real threat,” BMJ, volume 372, number 26 (21 January).
doi: https://doi.org/10.1136/bmj.n26, accessed 16 October 2022.

Claire Wardle and Hossein Derakhshan, 2017. “Information disorder: Towards an interdisciplinary framework for research and policy-making,” Council of Europe report, DGI(2017)09, at https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c, accessed 16 October 2022.

WikiMedia Foundation (WMF), 2021. “Wikipedia and COVID-19: Explore the data,” at https://wikimediafoundation.org/covid19/data/#section-1, accessed 16 October 2022.

Matt Wille, 2021. “Uh, Wikipedia was filled with enormous swastikas this morning,” Input (17 August), at https://www.inputmag.com/tech/wikipedia-was-filled-with-enormous-swastikas-this-morning, accessed 16 October 2022.

World Health Organisation (WHO), 2020. “Statement on the second meeting of the International Health Regulations (2005) Emergency Committee regarding the outbreak of novel coronavirus (2019-nCoV)” (30 January), at https://www.who.int/news/item/30-01-2020-statement-on-the-second-meeting-of-the-international-health-regulations-(2005)-emergency-committee-regarding-the-outbreak-of-novel-coronavirus-(2019-ncov), accessed 16 October 2022.

John Zarocostas, 2020. “How to fight an infodemic,” Lancet, volume 395, number 10225, p. 676 (29 February).
doi: https://doi.org/10.1016/S0140-6736(20)30461-X, accessed 16 October 2022.

Jing Zeng and Chung-hong Chan, 2021. “A cross-national diagnosis of infodemics: Comparing the topical and temporal features of misinformation around COVID-19 in China, India, the US, Germany and France,” Online Information Review, volume 45, number 4, pp. 709–728.
doi: https://doi.org/10.1108/OIR-09-2020-0417, accessed 16 October 2022.

 


Editorial history

Received 19 August 2022; revised 9 October 2022; revised 12 October 2022; accepted 16 October 2022.


Creative Commons License
This paper is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Editors, sources and the ‘go back’ button: Wikipedia’s framework for beating misinformation
by Bunty Avieson.
First Monday, Volume 27, Number 11 - 7 November 2022
https://firstmonday.org/ojs/index.php/fm/article/download/12754/10720
doi: https://dx.doi.org/10.5210/fm.v27i11.12754