Wikipedia has grown to be one of the most visited Web sites in the world. Despite its influence on popular culture and the way we think about knowledge production and consumption, the conversation about why and how it works — or whether it’s credible at all — is ongoing. This paper began as an examination of what the concept of “authority” means in Wikipedia and what role rhetoric might play in manufacturing this authority. But Wikipedia’s editors have functioned well as a community, having collaboratively developed a comprehensive set of social norms designed to place the project before any individual. Hence ideas like authority and rhetoric have only marginal roles in day–to–day activities. This paper takes an in–depth look at these norms and how they work, paying particular attention to a relatively new guideline that exemplifies the spirit of the Wikipedia community — “Gaming the system.”
One of the earliest descriptions of knowledge is Plato’s assertion in The Republic that knowledge is something eternal and elusive, only truly possessed by intellectual elite who can see through to the essence of things — and that essence is final (Gijsbers, 2004). Today, knowledge is more often thought of in terms of an infinite process of construction, destruction, and reconstruction, using building blocks created or recycled by members of a large community to create a tower that extends to the cosmos. Far from elitist, that community is a global, self–selected population with varying contributions and motivations for participation.
In The Wealth of Networks, Benkler (2006) points out a shift in production from that of physical products to that of information. This somewhat typical characterization of the “information age” is progressively magnified by the popularization of the Internet. Indeed, the economy of the Web involves relatively little physicality.
One of the most salient features of this movement is the changing role of the average citizen from passive consumer to active participant in creating, reporting, vetting, discussing, and reworking ideas and information. People could conduct these activities in the past, but forums in which to be heard were extremely limited and often completely inaccessible. With no way to be heard, contributions are disconnected from the rest of the process and thus are not built upon or acknowledged, discouraging future engagement and stagnating construction of the knowledge tower. The Internet gives a forum to anybody who wants one.
Millions of individuals have taken this opportunity to be heard. The question from a knowledge gathering point–of–view then becomes how to collect the best information that all of these people possess. Cass Sunstein takes this query head on in his book Infotopia (2006), analyzing statistical averages, deliberations, prediction markets, and Internet technologies in terms of how well they accomplish this in various situations. While nobody disputes the importance of experts in their respective fields, no one person can know the sum of what every person knows. This is a fundamental concept behind Wikipedia, the free online encyclopedia that anyone can edit.
In the celebrated spirit of the Internet giving anybody a voice, Wikipedia was founded in 2001 as an offshoot of Nupedia. The predecessor was set up as an encyclopedia designed to provide an alternative to Encyclopædia Britannica (http://info.eb.com/). Through a rigorous peer review process, experts were asked to submit content to create a free, accessible encyclopedia unbound by paper or money. Wikipedia’s initial role was as a more open companion encyclopedia to Nupedia where people could collaborate on articles before they were moved into the review pipeline of the primary site.
Wikipedia quickly developed a life of its own, flourishing entirely independent of the site it was intended to serve, which simply couldn’t keep up with the sheer volume of content. Nupedia is now defunct and Wikipedia is the seventh most visited Web site in the world according to Alexa (2008).
The technical foundation of Wikipedia that enables such efficient collaboration is the wiki, a type of software that makes creating, editing, linking, and formatting of information easy. The wiki is one of the seminal tools for what Benkler refers to as “commons–based peer production,” the creation of meaningful goods via the coordination and collaboration of large groups of people such that both inputs and outputs remain available for reuse. Similarly, Sunstein thinks of Wikipedia as “an exceptionally fast–moving tradition: Everyone who edits is standing on the shoulders of those who were there earlier.”
The reason Wikipedia works — or if it works — has been the topic of some analysis since the site’s inception seven years ago. There is an unavoidable conflict inherent in the very premise of a knowledge base anybody can edit. Can we trust text that could have been altered seconds ago by a troublemaker spreading disinformation?
Perhaps the most publicized study of Wikipedia’s credibility in terms of information quality is Nature’s 2005 comparative analysis. That study found Wikipedia’s articles, when reviewed by experts, to be close in accuracy to those on related topics in Encyclopædia Britannica. In the articles reviewed, the experts found eight “serious errors” in total, four in each encyclopedia, whereas Britannica suffered 123 total errors as compared to Wikipedia’s 162 (Giles, 2005). Since the time of this study, the number of Wikipedia articles has nearly tripled. According to the data in the “Modeling Wikipedia’s growth” entry on Wikipedia, the number of edits per page, which could be seen as a measure of effort, has steadily risen roughly in proportion to the total article count. This means the amount of effort put into an average page has also tripled.
Whether or not a similar study conducted today would yield favorable results for Wikipedia, such a positive comparison to what is widely regarded as the standard for summarized knowledge certainly lends itself to an argument in favor of Wikipedia’s credibility. This is especially impressive when you consider Wikipedia’s larger number of articles.
However, what was regarded as equal to Britannica one moment could in the next be replaced with something of questionable accuracy or taste. At the time of writing the most recent attack on the article about George W. Bush (one of the most vandalized pages) changed the then President’s name in the text from “George Walker Bush” to “George Wanker Bush.”
Such malicious edits — known as “vandalism” — are usually very apparent and easily ignored or removed. Occasionally, however, something happens like the hoax that took place in 2005 concerning the article for journalist John Seigenthaler Sr. On 26 May 2005, an anonymous editor added the following text: “John Seigenthaler Sr. was the assistant to Attorney General Robert Kennedy in the early 1960s. For a short time, he was thought to have been directly involved in the Kennedy assassinations of both John, and his brother, Bobby. Nothing was ever proven.”  The baseless piece of disinformation remained visible in the article for four months before being discovered and changed, triggering one of what would be several media frenzies over Wikipedia’s validity as a credible resource.
In a 2007 study, Priedhorsky, et al. analyzed millions of Wikipedia pages to assess vandalism. They found that if you visited 10,000 different pages in Wikipedia, 37 of them would have some kind of vandalism. They sorted this vandalism into seven categories: misinformation; mass deletion; partial deletion; offensive (obscenities, hate speech, etc.); spam (advertisements or non–useful links); nonsense (meaningless to the reader); and other.
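Taken at face value, that rate implies a 0.37 percent chance that any given page view lands on vandalism. The back–of–the–envelope sketch below extrapolates from that single figure, assuming (purely for illustration, and not a claim of the study itself) that each page view is an independent trial:

```python
# Back-of-the-envelope extrapolation from the reported figure of
# 37 vandalized pages per 10,000 viewed. Treating page views as
# independent trials is an assumption of this sketch, not of the study.

P_VANDALIZED = 37 / 10_000  # per-view probability of hitting vandalism

def chance_of_seeing_vandalism(n_views: int) -> float:
    """Probability of encountering at least one vandalized page in n_views."""
    return 1 - (1 - P_VANDALIZED) ** n_views

for n in (1, 100, 1000):
    print(f"{n:>5} views -> {chance_of_seeing_vandalism(n):.1%}")
```

Under this toy model, a reader making 100 page views has roughly a 31 percent chance of seeing at least one vandalized page, which squares with the studies’ point: the question is not whether vandalism occurs, but how quickly it is repaired.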
Similarly illustrating the relatively low level of damage vandalism causes, Viégas, et al. studied “history flow” visualizations of edits and found that “[d]espite some well–publicized problems, a statistical scan of the [sample] shows that the basic fast–repair characteristics of Wikipedia remain strong.” Specifically, they found the median times to fix mass deletions and obscenities were 2.9 and 2.0 minutes, respectively.
Like the Nature study, these two studies were based on 2005 data. Since then, Wikipedia has evolved to include, among other things, automated accounts run by users (“bots”) that further help to alleviate these problems by watching for and automatically fixing vandalism. The fact of the matter is even without bots, vandals are hopelessly outnumbered by invested editors who want to see Wikipedia succeed.
Benkler describes three phases in the evolution of successful communication of this sort. First, content is created and means are developed to facilitate this creation. Second, the content is organized and vetted. Third, the product is distributed . The Internet generally renders the third phase irrelevant; such is the case with Wikipedia, where content creation and distribution are simultaneous.
A study by Kittur, et al. (2007) analyzed work on Wikipedia, breaking it down into “direct” and “indirect” work. They define direct work as article edits and new article creations. Indirect work includes “meta” activities such as conflict resolution, discussion, community management, and undoing vandalism. They found that, increasingly, people spend a greater percentage of their time on the “indirect” activities. Between 2001 and 2006, for example, the “percentage of edits going toward policy, procedure, and other Wikipedia–specific pages has gone from roughly 2% to around 12%.” This indicates Wikipedia’s presence in Benkler’s second stage, in which a comprehensive set of policies, guidelines, and social norms have emerged.
The Wikipedia community makes the project work, not its collection of individual contributors, and not its technology — though it helps. Whereas the wiki technology allows for mass archiving of activity, it is the people who value infinite transparency and utilize the archives accordingly. Whereas the technology allows anybody to easily edit, it is the community that has produced a comprehensive set of processes through which quality articles can be formed. Whereas the Wikimedia Foundation allows individuals to access its content for free, it is the community that embraces the cause and labors toward a shared goal.
The functionality of well–developed social norms is the most important element in the success of Wikipedia. On the surface of Wikipedia’s policies are the “five pillars”: “Wikipedia is an encyclopedia,” “Wikipedia has a neutral point of view,” “Wikipedia is free content,” “Wikipedia has a code of conduct,” and “Wikipedia does not have firm rules.” From there, the pages that elaborate on and add nuance to these pillars number over one hundred.
Despite their breadth, these rules don’t feel overbearing to members of the community. Many follow common sense, reinforcing, for example, that Wikipedia is an encyclopedia and not “an indiscriminate collection of information” such as statistics, lists, or vacation photos. The most important reason for their acceptance, however, is that every single one of them was formed through extensive discussion, revision and consensus within the community and written by many — not put in place from the top of a hierarchy. Wikipedians, as editors are often called, have a sense of collective ownership and responsibility that applies to the structure as well as the content. Finally, built into that structure is the caveat to “ignore all rules” if they impede progress, reaffirming the ultimate goal of creating a high–quality encyclopedia.
A look at the small body of Wikipedia policy from 2001 suggests that editors needed to extrapolate logical everyday applications from it. Unfortunately, where it was not specific, people sometimes formed differing interpretations. One of the most frequently contested and revised policies is “notability,” which describes who or what is important enough to have an article in Wikipedia. This, of course, stems from the pillar of “Wikipedia is an encyclopedia.” An article about my cat, including his favorite food, how many mice he’s caught this year, and precise tail length is probably not worthy of an encyclopedia article because few people, if anybody, would find it useful. In addition to general notability guidelines, separate pages with specific criteria have been developed for academics, books, fiction, films, music, numbers, organizations and companies, people, and Web content. There is also a special page for proposed additional notability guidelines. At the time of writing, this includes a tool to determine whether or not the media outlet you’re using to judge notability is worthy of Wikipedia and criteria for schools.
Individuals have their own ideas of what is or is not important, which leads to thousands of new articles about people, places, or things that may not be meaningful to the average reader. While some, like my cat example, are intended as a joke or are due to having not read any of the rules, some are blatant advertisements, and still others are good–faith interpretations. Why, for example, shouldn’t one book my friends and I love that’s “notable” enough to be sold at Barnes & Noble be included when there are so many articles for books so few people have heard of or that are even out of print? The answer to that valid question is usually that while Wikipedia is not bound by the physicality of paper, it is not “a collection of indiscriminate information.” Books that are included are set apart from the rest via verifiable published reviews, literary awards, adaptation for film, inclusion in the instruction of multiple schools, or are written by an author so individually notable that all of their written works are included.
The book notability article was created by a user in reaction to a debate over the deletion of an article for a self–published political thriller. The argument in favor of notability consisted of pointing to its banned status on Amazon.com, its availability on Google Books, a review, the existence of an ISBN, and a claim that it satisfied the existing criteria that books can be included if they are “available in a couple dozen libraries” and are notable “above that of an average cookbook or programmers manual.” As dozens of editors discussed the proposal for deletion, it was discovered that the proof for it being banned from Amazon was simply an Amazon search that did not return any matches. Also, all books from the publish–on–demand service it was printed with have a Google Books presence, the specific “review” was a Craigslist ad, and it could not be found at any library, including the Library of Congress. This left the fact that the book exists with an ISBN and a claim of greater notability “than an average cookbook or programmers manual.” Since “Wikipedia is an encyclopedia” and operates under a neutral point of view, the community decided more detailed rules were needed to make sure articles weren’t easily available for any self–published propaganda, autobiography, or otherwise non–notable work.
Wikipedia works because of its dedicated community of editors. But why are they involved? Why do some editors spend as much time editing Wikipedia as a full–time job?
One possible reason is people like to see their words in print. While the idea that other individuals are reading what you’ve written can be pretty appealing, there’s a policy disallowing original research. According to this rule, everything you write has to be verifiably someone else’s idea.
There is a small element of credit that comes with good contributions, though. Each editor has a “user page,” with which she can basically do as she pleases as long as it’s loosely related to Wikipedia. Accompanying the user page is a “talk page,” as is the case with every article on Wikipedia. Whereas article talk pages serve as a place to deliberate about the content (what needs to be done, what should be removed, what should be moved, structure, organization, and so on), one of the most common uses for a user talk page is a forum in which members of the community reward hard work via words of praise and awards (usually in the form of a “barnstar” image). According to one editor, “[i]n some ways you get recognized, you get some respect, recognition from your [peers] ... here’s somebody who knows his stuff, who writes good articles and so on and so forth, and you feel happy when one of them puts a posting on your talk page.” (Forte and Bruckman, 2005)
In a world where original research is disallowed in the name of neutral point of view, people are recognized not for ideas, but for diligence, organization, and hard work performed to further a common goal. This is hardly the same type of credit that exists in, for example, scientific communities. Latour and Woolgar (1986) found that the “cycle of credit” is the primary incentive linked to publication among scientists due to its direct connection with power, efficacy, and resources (Forte and Bruckman, 2005). While credit may play a small role in explaining why people work so hard on Wikipedia, there is a far more salient motivation.
Wikipedia attracts a wide range of people who are passionate about knowledge and want to see the project succeed. Bryant (2005) looked at the way editors change with experience. New people tend to edit what they know. They may make small corrections to articles about subjects they’re interested in or address mistakes found while reading, gradually working up to more substantial revisions. Some continue this casual editing, but with experience usually comes a greater sense of the whole. The good of the Wikipedia mission becomes more important than the individual articles it contains, and the community of editors morphs into something more than just a large group of people who add information. People put so much work into it because of a shared feeling that Wikipedia has value. It thrives as a gift economy in which individuals continue to sacrifice their time because they know others will invest their respective time, all for a greater good.
We have seen that sometimes — often, in fact — disputes arise behind the scenes over contentious Wikipedia content or policy. While these arguments can sometimes degenerate into petty personal attacks (which are explicitly prohibited), they are a necessary component of success. To illustrate this, Sunstein points out that the highest performing companies in the U.S. tend to have argument–prone boards in which “dissent is an obligation.” Such deliberation is successful because it reduces certain social pressures and informational influences, which cause people to keep information to themselves if they think members of the group may think less of them or if contrary information has been stated confidently. When these disputes arise, however, what determines authority?
Let’s first take a look at the hierarchy of user levels. First, there are anonymous users — those who don’t log into an account before editing, automatically signing each revision with their IP address — who can edit, but don’t have a user page, can’t create new pages, can’t edit “semi–protected” pages, can’t rename pages, and can’t upload images.
The majority of users are simply “editors;” they create, revise, discuss, revert, and otherwise form Wikipedia.
Administrators are experienced editors with access to blocking (of users) and protection (of articles) tools.
Bureaucrats are administrators that, through a republic–like system based on seeking the consensus of the community, appoint new administrators.
A few administrators have “CheckUser” or “Oversight” access, enabling them to see IP addresses behind accounts or semi–permanently remove/hide content, respectively (content is only hidden in rare cases, most often libelous statements about living people).
Finally, the Arbitration Committee acts as the last resort for dispute resolutions, analyzing situations that were not solved through discussion, requests for comment, mediation, or any other means for dispute resolution that generally precede an application for arbitration. The job of “ArbCom” is to act as judge, applying their best interpretation of existing social norms to the matter at hand. Members are determined by public election and serve three–year terms.
Except when specifically needed to deal with administrative matters, all of these users are equal; members of ArbCom do not have any greater say in day–to–day editing than an anonymous first–timer.
Since Wikipedia is a collection of knowledge, one might assume that authority is instead established through subject–based expertise. If you want to know about the function of the human pancreas, you would probably defer to what a doctor or professor of biology has to say rather than a high school student. But in Wikipedia, there is no place for original research. A reader must be able to verify all facts, and this verifiability, not truth, is what matters. An expert may have the advantage of being more adept at finding sources in her field of study, but a high school student quite often will play a greater role in shaping Wikipedia simply because she has better researching and summarizing skills, having been brought up in a connected world.
Sometimes the credibility of a cited source may be called into question. During such deliberative processes, an expert may be valuable for her ability to steer the group toward more accurate information, but this function of expertise merely allows the expert to be of greater service to the community; it does not create authority.
Another problem with giving power to experts is identification — how can you tell “real” experts from amateurs? In my e–mail interview with long–time Wikipedian Theresa Knott, she responded to the idea of expertise–based authority by saying, “[p]eople spout all sorts of nonsense about who they are and why we ‘must’ listen to them. I mostly deal with it by ignoring that and only concentrating on what they are saying. Anyone can edit Wikipedia, and anyone can pretend to be anyone they like.”
Such was evident in a well–publicized controversy in early 2007 wherein one veteran editor — who had claimed to be a college professor with two doctorate degrees — was revealed as a 24–year–old unemployed college dropout. The prominence of the editor involved, in combination with an interview in The New Yorker, triggered a massive assault on Wikipedia’s credibility. In response, Wikipedia founder Jimmy Wales stated, “[i]t’s always inappropriate to try to win an argument by flashing your credentials, and even more so if those credentials are inaccurate.”
I asked an administrator — identified by the handle FT2 — about his concept of authority in Wikipedia. He described his concept of “respect,” which is generated through, most importantly, actions that “further the project in some way.” Respect, he says, only goes as far as getting people to take you seriously. It doesn’t win arguments, but it could be seen as being related to authority.
Some people attempt to manufacture authority through the art of persuasion. Given the extensive use of deliberation and the ability of anybody to edit, you might expect rhetoric to be prominent. After all, Wikipedia is visited by millions, many of whom would have a lot to gain by grooming articles about their companies, products, or competition. Beyond financial motivations, the Web is known for being the medium of choice for the extremely opinionated.
To some extent, rhetoric is present on Wikipedia, but it is rarely left standing without being pointed out as such. Since one of the behavioral guidelines is to “assume good–faith,” users are more likely to point offenders in the direction of appropriate policies (most often neutral point of view and verifiability) than to argue back or attack.
Most rhetorical devices are based on the crafting of words or on exploiting some aspect of the message recipient’s psychology. These types of arguments simply have no place on Wikipedia because they don’t make for a better encyclopedia. Beyond the lack of respect they earn, even if you could not extrapolate accurately from the neutral point of view principle, there is a policy that specifically states, “Wikipedia is NOT a soapbox.” This applies not just to political opinion but to all forms of advocacy.
Why would someone use rhetoric as a means to generate authority? Logic would point to a goal that is not supported by norms like a neutral point of view, notability, and verifiability. We are effectively looking at a strategy to manipulate the actions of others in a direction away from those norms — or towards a false belief that your intentions are supported by the norms. This kind of rhetoric, probably more accurately described as propaganda, causes a fair amount of grief in Wikipedia.
Fortunately, Sunstein notes, “those who know the truth, or something close to it, are usually more numerous and more committed than those who believe in a falsehood.”  This power of numbers significantly limits the potency of rhetoric because the users who are familiar with how Wikipedia works just overpower the small group who are willing to bend the rules for a particular cause.
One important characteristic of the norms as a whole is that they have been developed to apply to all articles while maintaining a certain amount of room for common sense, calling for individuals to “ignore all rules” if necessary. They function so comprehensively that, as one of the participants in Kittur, et al.’s study put it, “[t]he degree of success that one meets in dealing with conflicts ... often depends on the efficiency with which one can quote policy and precedent.” The downside to this structure is that, like the United States legal system, it can suffer from abuse of process. Appropriately, the term “wikilawyering” was coined as a “pejorative term that refers to certain ineffective quasi–legal practices.”
Wikilawyering is an example under the guideline entitled “Gaming the system,” which basically states “we know the rules are malleable (they wouldn’t work otherwise), but don’t do anything to hurt Wikipedia.” To be exact, “[g]aming the system means using Wikipedia policies and guidelines in bad faith, to deliberately thwart the aims of Wikipedia and the process of communal editorship.” 
Other than flagrant vandalism (due to its frequency and oft–offensive nature), gaming presents the most significant challenge to the Wikipedia community. Many times, gaming is hard to detect. Those who game the system tend to be well versed in policy and may be able to intimidate novice or even experienced editors. There are several types of gaming; they are:
- “Bad faith wikilawyering — arguing the word of policy to defeat the spirit of policy”;
- “Spuriously claiming protection, justification or support under the words of a policy, for a viewpoint or stance which knowingly actually contradicts policy”;
- “Playing policies against each other”;
- “Relying upon the letter of policy as a defense when breaking the spirit of policy”;
- “Mischaracterizing other editors’ actions in order to make them seem unreasonable, improper, or deserving of sanction”;
- “Selectively ‘cherry picking’ wording from a policy (or cherry picking one policy to apply but willfully ignoring others) to support a view which does not in fact match policy”;
- “Attempting to force an untoward interpretation of policy, or impose one’s own novel view of ‘standards to apply’ rather than those of the community”;
- “Stonewalling — actively filibustering discussion, or repeatedly returning to claims that a reasonable editor might have long since resolved or viewed as discredited, effectively tying up the debate or preventing a policy–based resolution being obtained”;
- “‘Borderlining’ — habitually treading the edge of policy breach or engaging in low–grade policy breach, in order to make it hard to actually prove misconduct”; and,
- “Reverting for minor errors — A simple form of gaming, although very common, is the tactic of completely reverting an entire revision due to minor errors, such as spelling or grammar, with a claim that the revision has errors”.
One user interviewed for this paper — who wished to remain anonymous and will be referred to as “Ed” (short for Editor) — recalled a series of events in which many of these tactics were employed over an extended period of time. According to his account, the controversy centered on a controversial type of behavioral therapy for children and a dozen or so pages on closely related topics.
The articles were tightly held by a group of six or seven editors who appeared to be supporters of these therapies. They systematically took control over the pages’ content, changing language and removing blocks of text describing any controversy. Even an article about a young girl killed while in treatment was manipulated to include factual inaccuracies, bogus citations, and a more positive attitude towards this pseudoscience.
The group of editors consistently united against anyone who altered what they had created. They were able to work around Wikipedia’s three–revert rule, which states that one person may not undo other editors’ changes to a particular page more than three times in one day (preventing “edit wars”), by devising a well–coordinated plan of revision. If anyone attempted to fight back with counter–reversions, they were promptly reported for abuse of the three–revert rule.
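The three–revert rule is mechanical enough to express in code, and doing so makes the evasion obvious. The sketch below is hypothetical (Wikipedia enforces the rule through reports and administrator review, not through such a function, and the data model is invented for illustration), but it shows why a coordinated group slips through: the count is kept per editor, so four reverts split across two accounts never trip the limit.

```python
from datetime import datetime, timedelta

def violates_3rr(revert_log, editor, page):
    """Check whether `editor` reverted `page` more than three times in any
    24-hour window. `revert_log` is a list of (editor, page, time) tuples;
    this data model is invented for illustration."""
    times = sorted(t for e, p, t in revert_log if e == editor and p == page)
    for start in times:
        in_window = [t for t in times if start <= t < start + timedelta(hours=24)]
        if len(in_window) > 3:
            return True
    return False

# Four reverts by one account inside 24 hours trips the rule...
base = datetime(2008, 1, 1, 12, 0)
log = [("Alice", "Therapy", base + timedelta(hours=h)) for h in (0, 5, 10, 20)]
print(violates_3rr(log, "Alice", "Therapy"))

# ...but the same four reverts split across two sock puppets do not.
split = [("Alice", "Therapy", base), ("Alice", "Therapy", base + timedelta(hours=5)),
         ("Bob", "Therapy", base + timedelta(hours=10)),
         ("Bob", "Therapy", base + timedelta(hours=20))]
print(violates_3rr(split, "Alice", "Therapy"), violates_3rr(split, "Bob", "Therapy"))
```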
When someone would point out that sources were inaccurate, they were provided with nonsensical arguments or factually incorrect refutations. Further questioning was met with statements repeated (and echoed) ad nauseam that the problem had already been addressed. Sometimes a “poll” was conducted to determine whether a section of text — blatantly unacceptable by Wikipedia standards — should be kept. The results of these “polls” were, of course, routinely in favor of “keep.” Those who contested these methods were reported for disruptive editing, vandalism, and violating Wikipedia’s policy of consensus–seeking.
Another tactic was to place warnings on editors’ user talk pages instructing them to stop what they were doing. Some new users were threatened with warnings simply for making an edit. The goal of this was, of course, to deter participation in general. Members of the offending group would even go so far as to maliciously alter small bits of text others had written in the article talk pages. This would elicit personal attacks, which were then, of course, reported. Since consensus is a priority in the Wikipedia community, and members of this group acted as one but looked to an observer as many, these actions persisted for the better part of a year.
Upon thorough investigation, the group that acted as one turned out to only be one. The user, allegedly a well–known practitioner and evangelist of a therapy complementary to the controversial one, was found to be operating at least a half dozen “sock puppet” accounts.
On several occasions, individuals had tried to fix problems on these pages by seeking help, just to give up when intimidated or feeling outnumbered. In the end, the analysis that led to the ban had three primary components: first, a thorough analysis of IP addresses revealed connections between some of the accounts; second, evidence in the form of “diffs” (linkable records of individual edits made); and, third, a behavioral analysis that discovered identical, repeated errors in spelling, wiki markup, sentence construction, and a pattern of all accounts changing such tendencies at the same time.
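The third component, behavioral analysis, can be sketched in miniature. Suppose each account is reduced to a fingerprint of its recurring idiosyncrasies (misspellings, wiki markup mistakes, punctuation habits); accounts whose fingerprints overlap heavily become suspects. The account names and error sets below are invented for illustration, and real investigations weigh such evidence alongside IP data and diffs rather than relying on any single score.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two accounts' error fingerprints (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical fingerprints: recurring misspellings and markup mistakes.
fingerprints = {
    "AccountA": {"recieve", "seperate", "[[link|", "word;word"},
    "AccountB": {"recieve", "seperate", "[[link|", "definately"},
    "AccountC": {"accomodate", "wierd"},
}

# Flag suspiciously similar pairs for human review.
names = sorted(fingerprints)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        score = jaccard(fingerprints[x], fingerprints[y])
        if score > 0.5:
            print(f"{x} ~ {y}: overlap {score:.2f}")
```

Notably, in the case Ed described, the clincher was temporal: all the accounts changed their shared habits at the same time, a pattern a static overlap score like this one would miss.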
In looking back at the ordeal, Ed expressed a lingering bad taste primarily due to the length of time this collection of articles was held hostage and the amount of work that had to be put in to get good information to stick. He primarily blames the obscurity of the subject matter for the substantial delay in action. Of course, an obscure topic will attract relatively few page views, and thus fewer potential editors.
When you combine scarce participation with a mob of abusive thugs, many users will simply stay away. While a handful of passionate people did stand firm opposite the rabble, reverting edits and arguing in discussion pages for varying amounts of time, they appeared to third parties as equally advocating the opposing view and thus the situation looked like a rather typical two–sided debate. It took a while for an experienced outsider to come into the niche community and realize that the consensus wasn’t quite pure.
In a follow–up e–mail, Ed told me the offender eventually migrated to Psychology Wiki (http://psychology.wikia.com/wiki/Main_Page), a specialized project run through Wikia (http://www.wikia.com/wiki/Wikia), a relative of Wikipedia. He appears to have resumed the same practices there. The repeatedly challenged inaccuracies that are now corrected on Wikipedia remain obvious in Psychology Wiki. Likewise, comments on the discussion page are nearly identical in content to those in Wikipedia’s archive. Unfortunately, Psychology Wiki lacks the quality control processes and manpower of Wikipedia. There, the only recourse, according to Ed, is to endlessly revert and undo.
The end result was that the perpetrator, who for the most part did a good job of logging on with discrete IP addresses, was caught on Wikipedia — even if it took a while. This is a testament to the strength of Wikipedia’s community, succeeding because of its well–developed set of norms and a user base consisting of far more people who want to see the project succeed than those with an agenda.
When I asked FT2 whether he saw the “Gaming the system” article, which he initially created, as another — perhaps final — layer of policy, effectively catching all of the problems that the rest missed, he declined. “It’s a simple and obvious expansion on core principles ... when we write a policy or guideline, it’s there to benefit the project not to harm it.” If you understand the basic ideas behind Wikipedia, or perhaps even if you don’t, the principles in “Gaming the system” make sense. It’s a reminder not to do bad things.
But there are at least two key things we derive from “Gaming the system”. The first is a keen sense of self–awareness and desire to evolve and progress on the part of the community. By spelling things out and giving examples, it provides clues for the identification of gaming activities. It also serves as a deterrent, giving clear specifics as to what is unacceptable.
The second important characteristic is its conveyance, more than any other policy, of a spirit behind the policies and guidelines that represents the community. While “ignore all rules” exists to describe a similar ideal, it’s too abstract and too often misinterpreted or misused, itself a constant subject of gaming. “Gaming the system” makes explicit the essence of “common good” that drives Wikipedia.
FT2 concludes that “Gaming the system” exists because:
“The real lesson isn’t in writing a page that says not to [misapply the rules]. It’s in the gradual learning within the community of those situations so that in a few years, things that would have been a problem game–wise, will be ... better known and the game thus less effective. That’s the real learning. A page of text is just one aspect of it ... If you deleted the page on ‘personal attacks’, the [belief] that personal attacks aren’t okay would still exist as widespread communal knowledge, and that’s what counts. That is the real policy and place it belongs.”
Wikipedia, through an ecology that places its betterment first and foremost, reduces the basis of authority to neutral editing skills and the ability to state clearly, citing consensus–based policies and guidelines, why something will or will not improve the quality of the project. This effectively quashes the concept of rhetoric not just in obvious terms of article content, but also as a component of discourse. There simply isn’t room for arguments that have a net negative effect on Wikipedia. This is why “Gaming the system” does a great job of conveying the spirit of the rules: it doesn’t matter what you read; if you’re not working towards the common good, you are in the wrong.
While some people use the Web to create a “Daily Me” of personalized communications and information, Sunstein notes the emergence of a “Daily Us, a situation in which people can obtain immediate access to information held by all or at least most, and in which each person can instantly add to that knowledge.” It is this us–ness, the consensus–based, continuously tested and revised set of shared social norms based not on the benefit of individual members, but on a greater good, that makes Wikipedia work. As “Gaming the system” both clarifies and exemplifies, these norms are best represented by an underlying spirit and a common goal. The collection of written policy documents is just intended to steer you towards it.
About the author
Ryan McGrady is a New Media Studies graduate student in the Department of Visual and Media Arts at Emerson College (Boston, Mass.).
Email: ryan_mcgrady [at] emerson [dot] edu
Notes
1. Benkler, 2006, p. 60.
2. Sunstein, 2006, p. 152.
6. Priedhorsky, et al., 2007, p. 7.
7. Priedhorsky, et al., 2007, p. 8.
8. Viégas, et al., 2007, p. 3.
9. Benkler, 2006, p. 68.
10. Kittur, et al., 2007, p. 3.
13. Op. cit.
14. Sunstein, 2006, p. 201.
17. Stacy Schiff, 2006. “Know it all: Can Wikipedia conquer expertise?” New Yorker (31 July), at http://www.newyorker.com/archive/2006/07/31/060731fa_fact.
20. Sunstein, 2006, p. 154.
21. A hundred or so policy– and guideline–based articles doesn’t seem so outrageous when you consider that they apply to several million pages.
22. Kittur, et al., 2007, p. 2.
25. Op. cit.
26. Sunstein, 2006, p. 219.
References
Alexa Internet, 2008. “wikipedia.org — Traffic details from Alexa,” at http://www.alexa.com/data/details/traffic_details/wikipedia.org, accessed 25 June 2008.
Yochai Benkler, 2006. The wealth of networks: How social production transforms markets and freedom. New Haven, Conn.: Yale University Press.
S.L. Bryant, A. Forte, and A. Bruckman, 2005. “Becoming Wikipedian: Transformation of participation in a collaborative online encyclopedia,” Proceedings of the 2005 international ACM SIGGROUP conference on Supporting group work (Sanibel Island, Fla.), pp. 1–10; version at http://www.cc.gatech.edu/~asb/papers/bryant-forte-bruckman-group05.pdf, accessed 27 January 2009.
A. Forte and A. Bruckman, 2005. “Why do people write for Wikipedia? Incentives to contribute to open–content publishing,” at http://www.cc.gatech.edu/~aforte/ForteBruckmanWhyPeopleWrite.pdf, accessed 27 January 2009.
V. Gijsbers, 2004. “Ideals of knowledge: Media from Plato to Wikipedia,” at http://lilith.gotdns.org/~victor/writings/0051ideals.pdf, accessed 8 December 2007.
Jim Giles, 2005. “Internet encyclopaedias go head to head,” Nature, volume 438, number 7070 (15 December), pp. 900–901, and at http://www.nature.com/nature/journal/v438/n7070/full/438900a.html, accessed 27 January 2009.
A. Kittur, B. Suh, B.A. Pendleton, and E.H. Chi, 2007. “He says, she says: Conflict and coordination in Wikipedia,” CHI 2007: Proceedings of the ACM conference on human factors in computing systems, pp. 453–462.
Bruno Latour and Steve Woolgar, 1986. Laboratory life: The construction of scientific facts. Princeton, N.J.: Princeton University Press.
R. Priedhorsky, J. Chen, S.K. Lam, K. Panciera, L. Terveen, and J. Riedl, 2007. “Creating, destroying, and restoring value in Wikipedia,” at http://www-users.cs.umn.edu/~reid/papers/group282-priedhorsky.pdf, accessed 27 January 2009.
C.R. Sunstein, 2006. Infotopia: How many minds produce knowledge. New York: Oxford University Press.
F. Viégas, M. Wattenberg, J. Kriss, and F. van Ham, 2007. “Talk before you type: Coordination in Wikipedia,” Proceedings of the 40th Annual Hawaii International Conference on System Sciences (HICSS ’07), p. 78a; version at http://www.research.ibm.com/visual/papers/wikipedia_coordination_final.pdf, accessed 27 January 2009.
Paper received 10 July 2008; accepted 20 January 2009.
“Gaming against the greater good” by Ryan McGrady
is licensed under a Creative Commons Attribution–Noncommercial–Share Alike 3.0 United States License.
First Monday, Volume 14, Number 2 - 2 February 2009