Digital Representation: Racism on the World Wide Web
First Monday

Digital Representation: Racism on the World Wide Web by Indhu Rajagopal with Nis Bojin

This paper argues that the various rhetorical modes in which hate is expressed on the Web are tailored to the types of messages offered. The unique technologies of the Web that differentiate it from earlier media of communication facilitate these rhetorical modes. The Web, as an unregulated medium, fosters the worldwide dissemination of both 'actionable' and 'non-actionable' hate messages. Actionable hate messages, regardless of their intensity and potential to incite violent action, are not legally restricted through any international censorship regulations; the power to restrict such messages is national, applying only where they counter national laws and conventions. The questions explored here are: Do the Internet and the Web facilitate the spreading of hate messages? Should Internet hate materials be regulated? If so, how might that be done? What criteria should be used to differentiate between hate and non-hate materials? Is it possible to draw and enforce a line between hate and non-hate messages? What impact would measures against hate messages have on the Internet culture itself?


A Conceptual Framework
Hate Messages and their Rhetorical Forms
Why the Web? Hate and its Technological Transformation
Regulate the Web?





Racism violates fundamental human rights and human dignity. Equality of rights and human dignity are enshrined in the Universal Declaration of Human Rights (2002). More importantly, the Declaration ensures protection against any incitement to racial discrimination (Article 7). When disseminated electronically, racism incites actions that would destroy racial equality. Civil libertarians argue that such equality cannot be protected at the expense of the Internet's role as a medium of free speech and expression. Racism is predicated on the assumption of the racial superiority of one group in society. Racial exclusivity and superiority are attributed to a group by ascribing physical and cultural differences to people and groups, while the excluded groups are regarded as inherently inferior to the racially privileged group. Racism is a belief that expresses itself in action or behaviour as discrimination against the excluded groups.

Rohit Barot and John Bird (2001) explain that sociologically the late nineteenth century terms 'race' and 'race relations' have evolved in discussions into racism as an ideology and 'racialization' as a process attributing ascriptive physical and cultural characteristics. When racism becomes an ideology and race is given ascriptive features, it excludes people or groups from general society, and makes them outcasts. In her examination of the definition of hate crimes, Katharine Gelber (2000) expands its compass, and argues that, if policies governing hate crimes were to be effective, definitions of hate must also include gender and sexual orientation.

We argue here that the various rhetorical modes in which hate is expressed are tailored to the types of messages on the Web, and that the unique technologies of the Web that differentiate it from earlier media of communication facilitate these rhetorical modes. The Web, as an unregulated medium, fosters the worldwide dissemination of both 'actionable' and 'non-actionable' hate messages. Actionable hate messages are those against which legal action can be taken and which can be censored; non-actionable ones are those that flourish on the Web, legally untrammelled. Actionable hate messages, regardless of their intensity and potential to incite violent action, are not legally restricted through any international censorship regulations; the power to restrict such messages remains national, applying only where they counter national laws and conventions.



A Conceptual Framework for Analyzing the Forms of Hate Rhetoric and their Technological Transformation

We frame our discussion [1] in a format that lends itself to an understanding of the role of Web technologies in the representations of racism on the Internet. There is ample evidence in the literature that denigration of visibility and race in the broader popular culture and traditional media have been carried over to the new technology of the World Wide Web (WWW or Web). The unique technologies of the Web that differentiate it from earlier media facilitate various rhetorical modes, and transform the Web into a disseminator of both actionable [2] and non-actionable [3] hate messages.

On the Web, we can broadly discern five types of hate messages [4]: civilized messages, humorous and light-hearted quips, simple and persuasive appeals, claims of self-preservation, and product advertisements. The comparable rhetorical forms [5] these messages take are: educational narrative, stealth images and dialogue, coded metaphor, survival discourse, and marketing rhetoric. The technological features [6] of the Web that facilitate and empower the rhetorical modes are: flexibility, mass customization, allure, digitalization, the unexpected pop-up messages, and the worldwide reach of unregulated freedom and access to information. A discussion of the concepts and themes of the framework, and figures illustrative of these themes, selected from the Web, follow.



Hate Messages and their Rhetorical Forms

Educational Narrative

Hate messages can appear as images or as narratives on very progressive, uplifting, and decorous sites that explain the nature of, and reasons for, racism. Racist images are often easy to access online on educational sites. We found some images (Figures 1, 2 and 3) at a Web site with progressive messages, entirely unrelated to any identifiable hate site (Holdt, 2002). The narrative at the site where the Nazi images are found explains the cause of racism as poverty, and expands on that thesis to educate the surfer. There are also other Web pages where organizations deliberately plant hate messages that can be transmitted deceptively to unwary surfers as 'civilized messages,' without being associated with the imagery of hate activity. They do not post racist images on their own sites. The structure of the Internet and its tools are honed to whet the interests of the young, and to entice the unwary to join an interest group without revealing any racist information. Thus they pave the way for recruitment in good time. It may take a long time, but eventually these groups may recruit members globally and online in one way or another. Further, organizations or individuals who sponsor sites that contain racist images and narratives may also contribute to the legitimacy of racist causes, and facilitate public consumption. Lynn Thiesmeyer, therefore, warns: "The Internet has appealed to the dual requirements of users of any medium: the desire to find somewhat knowledgeable or believable information, and at the same time to receive it in an easily digestible form" (1999: p. 121).

Stealth Images and Dialogue

Images and graphics that illustrate messages are very persuasive, and humour conveys information quite subtly and effectively. Racist humour on the Web looks innocuous, as it seems to project a sense of victimization of the racist groups disseminating hate. Racial misconceptions are convincingly portrayed in comic strips on hate Web sites. Some racist cartoon images serve a double purpose by adding a sexist aspect as well. A cartoon illustration and the racist narrative in its speech bubble (Figure 4) try to convey racists' paranoia. The cartoon is coarsely designed to depict that the deliverance and protection of white women are in the hands of heroic white males (Stormfront, 2001). Messages portrayed in such cartoons may persuade some who are already fearful of the rapes and sexual violations reported in traditional media. Since individual experiences vary, so does sensitivity to the images in advertisements: "What one person may see as extremist may appear to be quite innocuous to another, and this perhaps is the crux of the ethical argument" (Craven, 1998: p. 6).

Coded Metaphor

Lynn Thiesmeyer observes: "Depending on the targeted audience, some of [the hate sites'] Web transmissions are blatantly racialist, while some others are 'coded' or euphemistic. No matter how long you search the Web sites of the National Socialist Movement of Denmark, the German Workers' Party's Overseas Organization, the Zündelsite in Canada, or the Pan-Aryan Resources Center in the United States, you will seldom find direct hate speech exhorting you to attack members of certain ethnic groups. You will [however] find it in Usenet e-mail. That kind of legally actionable speech is reserved for the more ephemeral listserve transmissions, normally to in-groups" (1999: pp. 118-9). Female-managed hate Web sites operate in many ways as entities separate from, but parallel to, the generic racist sites. The Web can be harnessed to reinforce a group's message, and even to instigate violence on a local or an international scale through worldwide overt or coded messages. The image (Figure 5) at this site is digital art for a female Nazi organization's Web page (Goldman, 2002). The girl's uniform and the background picture serve as a metaphor for the resurgence of Aryan sensibilities and anger against the historical defeat of Nazism. In contrast to such metaphorical images, more overt portrayals of racist violence are also found on the Web. In a newsgroup discourse (Figure 6), female Skinheads are seen beating up a Black female; this was located in a forum posting on a message board for commentary by visitors. It has since been removed, perhaps because it attracted much attention and a threat of legal action. While the message board's purpose is in itself unrelated to promoting hate, clearly racist images are hidden but actively coded in its pages. Another explicitly derogatory picture, located at a former Web site entitled Hammerskins (Figure 7), is an uncouth example of what Web page visitors could contribute to a site's cause and to the like-minded Web community (2001). It is one of a series of home photos sent to a racist site to show support for the skinhead cause. Racist groups can easily communicate their messages metaphorically through Web technology that facilitates 'coding,' and transmit their cause equally effectively; however, they often do not have to resort to such overt messaging.

Survival Discourse

Figure 8, located at a now defunct Web site, represents the racist notion of self-preservation (14 Words, 2001). The Aryan supremacist group promotes preservationist movements, one of which is called "14 Words." The Aryan society declares on the Web: "We must secure the existence of our people and a future for White children." This illustrates once again how racism blends with sexism. The call here is not to women, but to men, to procreate and take the initiative to preserve whites as the dominant race. The message reads: "to ensure reproduction ... nature made the urge for sexual union the strongest male instinct" (emphasis added). These messages and images may convey personal experiences - of sexual insecurity, for example - locally and worldwide, which in turn may trigger similar responses in digital media.

Marketing Rhetoric

In the name of art and customized Web page creation, or simply as commercials, hate sites provide something unique to the Web: digital banners and sidebars. These are distributed for posting on other sites in order to attract greater traffic; however, the sites that post them may be unrelated to hate. The images are clearly made for use as digital 'bumper stickers' on personal Web sites. Hence, hate-related advertisement banners may appear on Web sites unrelated to hate messages. Many of these banners for racist Web design were located at Stormfront (2001), Micetrap Distribution (2001) (a retail seller of racist products) and the currently out-of-commission Candidus Productions (2001), which offered artwork design for the racist community (Figures 9, 10 and 11).

Many racist or hate Web sites offer commercial information for visitors interested in purchasing racist memorabilia - for instance, a "KKK white power flag" or "racist music" that is quite commonly presented on a variety of hate Web sites. The legitimacy of the Web medium and access to its commercial aspects advance the worldwide sale of goods that are otherwise not easy to distribute on such a scale. The type of music available on the virtual terrain of Web pages would not be available in real stores. Examples from the Web sites 88 Music (2001) and Micetrap Distribution (2001) show the flexibility that the Internet offers the hate scene (Figures 12 and 13). Shane Borrowman (1999) believes that "on the Internet, hatred can thrive. It is the [freest] press imaginable, and Holocaust deniers can publish their works as widely as they like - or as widely as their hardware and software allow. Although their message on television, when it is there at all, is mediated by a reporter, producers, or a talk-show host, the deniers' message can be constructed on the Web in any way they choose" (p. 45).



Why the Web? Hate and its Technological Transformation

The question 'why the Web?' can be answered only if we examine the technological tools of the Web and understand how these tools are being used to distribute hate messages. Unlike older media such as radio or television, many technological processes are unique to the Web. The Web is a digital medium that provides a pliable technology allowing the transfer of information - as well as pictures - with great ease. It allows for the manipulation of written text and images with minimal cyber skills. The viewer needs few technical skills, because any computer user with access to the Internet is able to browse the Web with a mouse click. In some instances, children or adults may come upon these Web pages accidentally in their browsing. "The Internet reaches into places where hate groups have never gone before: homes, offices, schools. Attractive Web sites can drape fringe ideas with a certain respectability. The medium is particularly well suited for reaching angry social outcasts, who find comfort in the Web's anonymity" (Economist, 1999a). We will discuss the following unique features of the Web, which facilitate the multiplication and spread of hate messages with ease: the Web's flexibility, the ease of mass customization and use of low-tech tools, the enticement of downloads, the appeal of clip-and-paste banners, the dynamics of unexpected pop-ups, the scope for merchandizing ploys, and, most importantly, unregulated and unrestricted access to information.

Web's Flexibility

The structure of the Internet and its tools make the dissemination of messages on the Web very different from other ways of communicating ideas. The medium's interactivity is tailored to the convenience of viewers, who have the flexibility to join or leave newsgroups at will. Examining Raymond Franklin's "Hate Directory" (2002), we note that Web sites set up by hate institutions such as the Ku Klux Klan (KKK) not only have a massive Web-based presence, but are also spread throughout File Transfer Protocol (FTP) sites, newsgroups, mailing lists, Internet relay chat (IRC) channels, and even Yahoo clubs. The Internet is a "dangerous new ingredient" in the propagation of hate, largely because it is inexpensive and flexible (Economist, 1999b). It carries everything from racist online radio shows to downloadable racist software. All these forums can be readily accessed by anyone who types "race" or "white kids" or "hate" into a search engine.

Ease of Mass Customization and Use of Low-tech Tools

Messages on the Internet can be mass-customized and multiplied at low cost. Mass customization allows various ways of customizing messages and images with minimal effort. With access to various digital tools, one can set up a directory of hate - presented as blatant images or as a hidden, enticing dimension of the Web - for the purposes of recruiting followers and mass-disseminating information. The message transmittal may be as innocuous or as loaded as the disseminator wants it to be. It may take the form of comic or cartoon strips, or of expressions of sex-related insecurity, or it may appear in the guise of artistic expression, or of recruitment into an exclusive club. The salient factors that contribute to Web publication, as well as to the spread of hate messages and content on the Internet, are the same as those that have made the Internet so popular and useful. One need no longer learn HTML to construct a Web page. HTML editors have made it quite easy to create Web pages and sites. Anyone with a message can easily access a functional demo of Coffee Cup, Homesite, Dreamweaver, or other software, and create a Web page. In addition, the purchase of server space to host a Web site is quite inexpensive, and only a small amount of money is needed to acquire domain names.

Enticement of Downloads

Web technologies for audio compression have advanced so rapidly that audio trading online has become quite popular. Prior to the advent of MP3 compression, distributing high-quality audio meant sending ".wav" audio files, with songs reaching sizes of 60-100 MB. The only alternative was Real Audio, which was of inferior quality, with smaller media files for downloading. For users with access only to low bandwidth, the sound quality of the streaming files was poor. However, with MP3 media technology (the average song being around five megabytes), cable/DSL connectivity, and tools that allow MP3 encoding, hate sites can now distribute songs and audio files (such as speeches and lectures) on the Internet. With the increase in users' average bandwidth, videos are becoming increasingly popular as well. These very improvements to Internet access and functionality have enhanced the potential for hate sites to post hate music, speeches and videos that visitors can download at their leisure and with very little difficulty. The tools for encoding MP3s are usually bundled with most computers or computer hardware; they are also available for downloading on the Internet. Graphic images are just as easily posted online without any need for digital art skills. Any real-life image can be scanned into a computer and posted on the Internet within minutes, without expensive equipment or intervention by a professional.
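The size gap described above is simple arithmetic. As a rough sketch (the parameters are assumed typical values - CD-quality 44.1 kHz stereo 16-bit audio versus a 128 kbps MP3 - not figures from the paper's sources), a four-minute song works out to roughly 40 MB uncompressed but under 4 MB as an MP3:

```python
def wav_size_mb(seconds, sample_rate=44100, channels=2, bytes_per_sample=2):
    """Uncompressed audio stores every sample: rate x channels x bytes, each second."""
    return seconds * sample_rate * channels * bytes_per_sample / (1024 * 1024)

def mp3_size_mb(seconds, bitrate_kbps=128):
    """MP3 at a fixed bitrate: kilobits per second, divided by 8 bits per byte."""
    return seconds * bitrate_kbps * 1000 / 8 / (1024 * 1024)

song = 4 * 60  # a four-minute song, in seconds
print(f"WAV: {wav_size_mb(song):.1f} MB")  # roughly 40 MB uncompressed
print(f"MP3: {mp3_size_mb(song):.1f} MB")  # under 4 MB at 128 kbps
```

At these assumed settings the compression ratio is about 11:1, which is why a song that once took hours to transfer over a modem became a practical download.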

Appeal of Clip-and-Paste Banners

There are a variety of promotional utilities on the Internet that facilitate hate dissemination. These are practical devices that were originally engineered to help the common surfer or Web publisher. Non-commercial, cross-promotional banner ads on Web sites have encouraged many with common interests to share traffic, and whetted visitors' interest in locating sites on related topics. Banners are strips of digital real estate used on Web sites to advertise another site, product, or service. Hate sites often have banners leading to similar sites, establishing a network of hate sites, each promoting the other's agenda. An even more elaborate method of linking sites with similar interests consists of Webrings: complicated, multi-linked banners located on pages participating in a Web-link program. Anyone who clicks on the banner is whisked away at random to another site on the Webring's list; viewers may also call up the list itself to be guided to their next destination. As handy promotional tools, Webrings offer racist and xenophobic hate sites traffic and access to broad audiences with little effort. Signing up to participate in, or host, a Webring is as effortless and cost-free as any other hosted service, such as a club with Yahoo! or a chat channel.

The Dynamics of Unexpected Pop-ups

What may be most disturbing, however, is the role that serendipity can play when young kids search for items on search engines and misspell words in ways that lead to hate sites. Search engines - the Internet's answer to an index or a directory - group and categorize sites on the Web. Only a few engines attempt to filter racist and sexist sites out of their listings. Submitting one's site to a search engine is quite easy, and accessing that site is even easier. For instance, typing in the two words "storm" and "watch" won't bring up meteorologically pertinent sites; instead, the Stormwatch white-pride site will appear immediately.

Usually a Web site intending to market a message or product will attempt to provide a common theme or approach in order to target and reach a specific market. When the target audience is the entire white population of the world, or the entire non-Jewish or non-visible-minority population of the world, mass customization is the key to success. The larger, more successful Stormwatch and KKK sites all have something in common: they attempt to appeal to every conceivable audience via entirely different sites or Web site sections. At sites such as Stormwatch's and the various white power sites, one can identify very separate and individual hate sites or site sections intended for very particular audiences, to aid in communicating their message to all groups, ages, and genders. Some of these are geared towards kids, some are made for teens with the use of new-age hate music and new media, and others are constructed and styled for women, or are aimed at parents or so-called "patriots". Of course, there is always a hate site or section catering to a general, and often male-dominated, audience. Far different from the older, more aggressive sites, the newer and more elaborate Web pages use tact and diplomacy in an attempt to pre-empt attacks from critics. Assuming the role of victims, they evoke sympathy and falsely communicate a keen desire to dispel the "myths" of violence associated with their groups.

The Scope for Merchandizing Ploys

Beyond the level of age and gender demographics, the racist groups mass customize their messages to lure unwary visitors. People who shop for racist items can find them at the KKK gift shop online. Others will find answers in Frequently Asked Questions (FAQ): Want to contact someone on a personal level? Do you have only intermittent e-mail access? Try the mailing address that is provided. All levels of visitors are catered to in the various Nazi and White Pride sites. Often, the mere variety of different groups' Web sites reveals how simple it is to mass customize messages, and to conduct other activities of the online hate movement. The medium offers choice and personal customization for a site viewer. If you're interested in music (as youths generally are), there are more than enough songs to choose from, along with complete texts of racist lyrics. Comics and drawings are also online, promoting racist jokes, and the sale of physical objects ranging from videos to T-shirts. Images of children and young teens disseminating racist messages are also increasingly frequent on the Web, and they are being shown as the voices of the future, ready to grasp their responsibilities when their turn comes.

Unregulated and Unrestricted Access to Information

The Internet is unregulated and completely unrestricted in regard to the information it disseminates. Hate messages are spread worldwide both overtly and covertly through this medium. In popular culture, hatred as a feeling is generally disliked, and often remains hidden to the naked eye. What difference does the new technology - the Internet - make for expressions of hate? HateWatch, a Web site that has been a significant resource for our research on hate, had earlier announced its exit from the Web, but has since returned to continue fighting hate sites and to disrupt their efforts to recruit members online. David Goldman, the executive director of HateWatch, explains the importance of establishing a Web page: "In 1995, when HateWatch first began examining the issue of online hate, popular thought suggested that the Internet would provide an enormous boost to hate groups seeking to recruit new followers. From the beginning, these organizations' self-proclaimed desire to create a digital 'white revolution' was carefully monitored and documented by civil rights organizations, HateWatch among them" (Goldman, 2001).



Regulate the Web?

Of all the unique features of the Web discussed above, the most significant in keeping hatred resilient online is its unregulated nature. Unrestricted freedom to place and spread hate messages worldwide may be used not only to express one's thoughts in the form of prejudice, but also to instigate racist actions and violence, targeting and victimizing groups labelled as 'inferior.' As Manuel Castells (2001) puts it, unrestricted networking is the essence of this new technology. Censorship is almost impossible, as it is easy for anyone to move to a different server to overcome restrictions in one part of the world. However, as a direct response to protecting intellectual property rights and commercial interests, new software architectures have been developed. Freedom in the Internet terrain is being contested at a variety of levels, including commercial, civil, intellectual, social, and individual.

Castells also notes that there are ways of monitoring, if not regulating, Web sites. Two sets of technologies are used for monitoring: 'technologies of identification' and 'technologies of investigation.' Technologies of identification rest on authentication procedures that use digital signatures; these signatures allow other computers to verify the origin of a communication and to retrieve other identifying characteristics of the communicator. Surveillance technologies are a further dimension: detectors that intercept messages and insert markers allowing the tracking of other computers' activities. 'Technologies of investigation,' which record and build up the information resulting from surveillance, are another way to monitor activities. The disadvantaged parties are those who do not control this technology and cannot know that they are being controlled. Although entry is global, control is not.
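The 'technologies of identification' described above rest on standard cryptographic machinery. The following is a minimal sketch of the underlying idea - authenticating a message's origin with a keyed signature, using Python's standard library - where the key and messages are hypothetical; real digital-signature systems use public-key cryptography rather than a shared secret.

```python
import hashlib
import hmac

# Hypothetical shared secret between signer and verifier.
SECRET_KEY = b"example-shared-secret"

def sign(message: bytes) -> str:
    """Bind a message to the key holder with an HMAC-SHA256 signature."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str) -> bool:
    """Accept the message only if the signature matches (constant-time compare)."""
    return hmac.compare_digest(sign(message), signature)

msg = b"posting from user@example.org"
sig = sign(msg)
print(verify(msg, sig))          # True: origin verified
print(verify(b"altered", sig))   # False: message or origin does not match
```

The asymmetry Castells describes follows directly: whoever holds the key (or its public-key equivalent) can verify origins; everyone else can neither verify nor detect that verification is occurring.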

Next, a legal distinction is made in many countries between advocating hate (hate advocacy) and manipulating hate expressions in a manner that would incite violence and crime. Brian Levin (2002) suggests that governments can regulate hate expressions that constitute crimes, rather than those communicated via the Internet as ideas and opinions. Although free speech is guaranteed, expressions involving criminality are not. Hate victims and anti-racists have invoked those unprotected areas that do not fall under free speech to convict hate criminals. One such unprotected area is a message that incites criminal action. The U.S. Supreme Court clearly drew the line demarcating freedom of advocacy from incitement to criminal action in Brandenburg v. Ohio (1969). If digital advocacy goes beyond legal limits and contains expressions that aim to create lawlessness and are likely to incite crimes and other illegal actions, then those postings can be successfully prosecuted. It is easier for private Internet Service Providers (ISPs) than for the government to ban hate in any form. Through the terms of service that users must accept, private ISPs can regulate, or prohibit entirely, hate, fraud and other offensive postings on the Internet. If users do not abide by these rules, service can be cancelled. Since 11 September 2001, at least in the U.S., cyberterrorism and the Patriot Act [7] have drawn some attention to the need to restrict the Internet from being used for hate and terrorism. Levin considers that the time is now ripe for regulating hate-mongering as well.

Internet portals can be sensitized to the need to eschew violence and hate-mongering against minorities and historically marginalized populations. Lisa Nakamura (2000: pp. 39-50) points out that Internet portals are currently owned by businesses interested solely in promoting commerce and monied interests. In this economy, the interests and protection of minorities and marginalized groups are not given priority. As a result, it is much harder for these groups to counter hate-mongering sites effectively by creating their own sites with equally powerful messages that would divert surfers to their counterarguments. The Anti-Defamation League (ADL) has successfully established accessible and attractive resources on its site. Information on 'Poisoning the Web' and the ADL HateFilter™ were created to help children and others surf the Net safely, and to distinguish good messages or information from bad ones.
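The HateFilter's internals are proprietary, but filters of its kind generally work from a maintained blocklist of known hate sites. A minimal sketch of that general approach (the domains below are placeholders, not entries from any real filter's database):

```python
from urllib.parse import urlparse

# Placeholder blocklist; a real filter ships a curated, regularly updated database.
BLOCKLIST = {"hate-example.org", "blocked.example.net"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host is a blocklisted domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

print(is_blocked("http://hate-example.org/music.html"))  # True: listed domain
print(is_blocked("http://www.hate-example.org/"))        # True: subdomain of a listed domain
print(is_blocked("http://news.example.com/"))            # False: not listed
```

A blocklist only catches sites that have already been identified and catalogued, which is why such filters require continual updating as sites move between providers.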

Lastly, it is essential to remember that all expressions and actions are based on, and reflective of, conditions and behaviour in the broader society. Therefore, any changes to, or regulation of, the Internet have to be initiated by society at large (Tyler, 2002). Terrorism worldwide, and particularly that directed against the U.S., has sensitized lawmakers to the need to bring in pre-emptive laws and regulations to fight hate and the resultant crimes. It is likely that hate sites will come under closer scrutiny in the near future.

Actionable Hate Material

How do we legitimately differentiate between good and bad uses of advocacy on the Web? The bad ones are libellous speech, infraction of copyright laws, or harassment accompanied by a 'course of conduct.' A single incident, or a set of messages that does not expose the identity of the offender, makes such crimes difficult to detect and prosecute. However, there are exceptions. In 2001, a right-to-life group was found legally liable and required to pay over $100 million in damages because of its use of the Web to transmit its message, in the so-called "Nuremberg Files" case (see Planned Parenthood of the Columbia/Willamette Inc. v. American Coalition of Life Activists (2001)). There are other legal precedents for decisions against hate threats via e-mail.

Indeed, many countries - Canada, Germany, France, Britain, and Denmark - have recognized the distinction between advocacy and action on hate. They have criminalized hate crimes and extremist, actionable hate speech on the Internet, and have brought charges against individuals and groups committing such crimes. However, if the offenders are American citizens engaged in constitutionally protected activity (freedom of speech), the U.S. government will not extradite its citizens to other countries, even when those countries demand extradition.

The demarcation line between actionable hate and non-hate content on the Web is easily drawn in the private sector. But if a private ISP firm prohibits users from engaging in hate messaging, and enforces its regulations by refusing to be a conduit for hate messages, the racist hate site can always move to a permissive service provider; it thus becomes more difficult to monitor such messages and sites and to enforce rules against them. Ivan Fong, Senior Counsel for E-Commerce and Information Technology at General Electric, anticipates that the U.S. Congress will legislate in this area, giving various incentives to the private sector so that the public and private sectors can collaborate in pooling cyber-security data. Professor Dan L. Burk [8] of the University of Minnesota Law School suggests that, at the international level, the United States government has to negotiate treaties to shape Internet laws on commercial, individual-rights, and terrorist matters. One such treaty that the U.S. business sector pushed for in 2000-2001 concerned the enforcement of legal judgments across international borders. U.S. businesses hoped that such a treaty would empower them to demand enforcement of their judgments in foreign countries. However, the case of Yahoo! - which was charged with selling Nazi memorabilia considered illegal in France - was a sobering experience for U.S. businesses utilizing the Internet. U.S. businesses have to understand the reciprocal implications of such a treaty and the effect it might have on their intellectual property rights and freedom of expression (Kaplan, 2001).

Civility and Censorship

Since the Web remains an uncensored and unregulated medium, and some significant countries such as the United States have not yet ratified international human rights provisions [9], it is not easy to curb effectively either actionable or non-actionable hate on the Web. In this context, we will assess the desirability of other tools that are often advocated for regulating the dissemination of hate on the Internet, such as censorship, presenting oppositional arguments exposing racist fallacies, technical monitoring of illegal messages, and setting up global criteria against defamation. In 1999, Raymond W. Smith, then Chairman of Bell Atlantic Corporation, in his address 'Civility without Censorship' (1999) noted that cyberhate pollutes the Web and hinders public discourse on race and ethnicity. However, he did not believe that legislated censorship is the right tool to exorcise such material from the Internet. He strongly affirmed the need for full freedom of expression on the Web, and for effectively turning the medium itself against bigotry and hatemongering. He suggested two measures for countering defamation: posting messages to forums of minority communities and cultures on the Web, and facilitating minority groups' access to the Web. This would involve drawing in groups, usually minorities and the poor, that have little access to computers and hence to the Internet, and empowering them to disseminate genuine and positive messages. As Lisa Nakamura (2000) points out, non-whites do not have the equality of access to the Internet, to Web messages, or to Web building that whites do. The Web is indeed a place where race is shaped by digitalization, search engines, ranking of accessible hypertexts, downloadable features, and other aspects peculiar to this medium.

Ulrich Sieber (2001) advocates three tools - technical, legal, and diplomatic/cooperative - to regulate hate on the Web. The most effective technical means is to make hate messages inaccessible through host providers. Since different jurisdictions internationally may not recognize the actionable form of defamation, it is not easy to criminalize such messages if they are protected under various national freedom provisions. For instance, marketing Nazi memorabilia is protected under the First Amendment in the U.S., whereas it is illegal in Germany. Curbing such hate expressions must therefore be pursued through international cooperation or other non-legalistic tools. Sieber also emphasizes an educational approach across the globe for differentiating between free speech and criminal defamation, in order to evolve a code of conduct for the Web that would set limits on what is acceptable and what should be screened. Elissa Lee and Laura Leets (2002) emphasize that it is crucial that means be available for unwary surfers, especially adolescents, to identify the forceful strategies and persuasive messages of hate groups, and to equip them with tools to discern the good from the pernicious messages. Web surfers will need astute critical and literacy skills to sift the grain from the chaff.

There are several barriers, however, to overcome while drawing the dividing line between freedom to advocate hate and freedom to incite hate violence. First, in the United States, it is difficult not to tread on the tenet of freedom of speech under the First Amendment of the Constitution when differentiating between hate speech with criminal implications, and advocacy of a cause. The U.S. Supreme Court has declared that the government cannot censor freedom of speech whether it is expression on the Internet or public speech or information on print or broadcast media. All Web sites in the U.S. are viewed within the traditional limits of the Constitution.

Second, to monitor and differentiate the offensive and criminal messages from others that may be unpalatable but legitimate, it is critical to set up global criteria. This means that all involved states have to come together. Manuel Castells (2001) notes that nation states may feel that their sovereignty is being eroded by the Internet's encroachment. Since the Internet is global in its reach, states have to share their responsibilities in order to emerge as joint controllers of a global network. As with the global terror network since 11 September 2001, global monitoring and sharing have become necessary for eliminating extreme elements that cut at the root of freedom itself.

It is, however, encouraging to find private Web sites, like the ADL's, that take upon themselves the burden of helping the public to understand the contrast between their position and that of the typical hate sites. "The Web of Hate," "101 Ways to Make Your Community a Prejudice-Free Zone," and "What to Tell Your Child about Prejudice and Discrimination" are examples of informative content on a Web site. ADL's HateFilter™ allows parents and other users to screen out hate sites. The Learning Company (TLC) and ADL have cooperated to produce another filtering program based on CyberPatrol®. These programs provide an option of looking for alternative explanations of extremist and violent views, giving users a tool to examine Web-based information that indiscriminate surfers might otherwise encounter unintentionally.

Vulnerabilities of the Internet Culture

What would be the impact of control or regulatory measures against hate on the Internet culture itself? Indeed, as Ulrich Sieber (2001) alerts us, the Internet's versatility and worldwide use make it vulnerable as a tool for hate. In response, strategies and mechanisms are emerging to provide some protection from these messages. These strategies will affect the Internet itself and the ways in which it is used.

It is evident that those who technically control the servers, networks, and Web pages that make up the Internet and the Web could have overarching power over what should and should not be allowed on this medium. This sort of responsibility will certainly affect the content of the Internet, if there are international legal rules that provide a framework for any action. In addition, the anonymity provided by the Web makes it difficult to locate dangerous individuals or groups, or to enforce legal strictures against hate and harassment. Greater controls over content, with more details on the sources of Internet content, could alter this situation.

Cass Sunstein (2001) argues that the radical fragmentation of political debates and discussions on the Internet will lead to the slow but steady disintegration of democratic exchanges and of the infusion of American democratic values. Denied this open and free forum, users will rely on filtering techniques that allow the selection of special interest sites, instead of dialogues on differing views and conflicting ideas. This may boost extremists' causes, because groups with similar interests, unexposed to other points of view, tend to become extremist. To avoid this dangerous course, opposing sites should follow voluntary democratic codes of conduct, thereby enabling the Internet to provide links between oppositional views. There should also be disclosure of the level of public interest that sites promote, as well as the establishment of public sites subsidized by the government to ensure the availability of information on, and protection against, hate sites. The Economist (2000) notes that these measures may turn out to be intrusive and controlling. The experience with the Internet so far has been that it has served more as a democratic forum than as a dictatorial enclave, since information on the Web has not yet overtly become any one group's monopoly.

How do the various regulatory measures reviewed above shape the ways in which race is depicted or viewed on the Internet? The Web reasserts, sometimes in extreme forms, the existing racial views and popular interpretations of race in the broader society. The colonialist gaze sets up the racial image of 'those people' or 'those savages.' White people are often portrayed as, and assumed to be, the 'default option' for participation in any interaction. To counter malicious racist labels and messages, Raymond Smith (1999) emphasizes that the Web is the ideal medium to disseminate information about the religious, ethnic, racial, and other pluralities of the nation, by deliberately garnering Web resources for political, educational, and social purposes.




The Web, like other media, can serve as the 'producer' and also as a 'product' of social inequality. Hate sites privilege certain groups, and believers in racism pursue a strategy of excluding others as lesser beings. The definition of "American people" does not seem to include Blacks (Figure 14); it is easy to note the demeaning portrayal of people in the cartoon. David Goldman (2002) observes that, despite the varied attempts at pushing hate on the Internet, these sites seem to have little impact in recruiting converts. But it seems obvious that hate sites can do things via the Internet that would be impossible in print or other "real-life" media. Susan Clayton warns: "Unlike years ago, [hate groups] now can target young people through online games and graphics, as well as through e-mail and chat groups. These are Aryan-washed changes in message-dissemination the Internet offers that should have been taken seriously" (1999: p. 8).

In conclusion, how does the impact of the Internet on racism differ from that of other media? First, the Internet and Web pages act as "pull" technologies; that is, information does not reach the Web user unless one specifically requests it by typing in related terms. To get around this pull mechanism, which stands between online marketers and Web surfers, marketers push their wares and information through catchy banners and pop-up advertisements. In the same way, purveyors of racism push their messages at unsuspecting surfers. In this respect, the Web is distinctly different from other media and communication technologies. Second, hate online has few barriers to overcome: all one needs to do is type "hate," and it appears both as HateWatch and as the "Hate Directory". Third, hate sites on the Internet express their messages quite blatantly. The anonymity of the Internet and the lack of censorship enable the purveyors of racism to present their messages in more revealing forms than they can offline; offline media must use subtle or implied means to present similar messages, and cannot be anonymous. Fourth, the Internet allows one to defy social norms in a manner different from expressing dissent through other media, because the Internet operates in a relatively consequence-free environment. It does not impose any moral or ethical rules on its users. So far, the Internet has remained outside the ambit of laws that govern other information media. Thus the Web has grown into an unregulated force with a powerful impact on its users.


About the Authors

Indhu Rajagopal is an Associate Professor in the Division of Social Science at York University in Toronto, Canada.

Nis Bojin is a research student in the Division of Social Science at York University in Toronto, Canada.



1. Although our discussion of rhetorical forms as influenced by the content and types of messages disseminated differs from Lynn Thiesmeyer's, her findings and evidence confirm and reinforce our arguments. Thiesmeyer (1999) discusses hate rhetoric and richly elaborates on its various forms; see Thiesmeyer for an in-depth analysis of the narratives and examples. She identifies seven types of rhetoric used in Neo-Nazi literature on the Web. 1. Pedantism: a preaching type of Neo-Nazi discourse used for recruitment, persuasion, and ideological indoctrination. 2. Urgency: the tone of Nazi discourse is characteristically alarmist and raises a specter of enemies to fight against. 3. The use of 'historicism', fake tradition, and folk etymology: Nazi tradition falsely claims as historical fact that certain groups belong to an original, pure, and unmixed Aryan race. 4. De-legitimisation of other discourses: this is needed to make the racist groups feel that their fabricated or distorted traditions and myths are secure, and is important for the Nazis to create a sense of belonging to a virtual community of the like-minded. 5. Overt and privileged use of a 'collective subjectivism' (a 'virtual community' is used to create a 'real' community). 6. Dual structure of the production side: "a. ideology-makers: politicians, strategists, commune leaders, militia leaders, Web site text and FAQ writers: more subtle, more value-neutral, less racially offensive or polarised discourse; their Web discourse is likely to be 'coded', that is, more neutral-sounding with 'in-group' meanings beneath the surface b. actualisers/actors/agents: skinheads, commune disciples, small-scale survivalist groups dependent on propaganda from outside to keep their enthusiasm and discipline: more overt, more evaluative, more racially offensive and polarised discourse; unlikely to be 'coded'." 7. Factualization: "the phrasing of a subjective perception, unknown, or non-agreed-on item in such a way that it seems to have been already established. Textual Web sites of every kind rely on this sort of rhetoric, as they are targeted towards persons who will browse quickly and presumably without time for, or interest in, any background debate that might have produced the so-called fact"; Thiesmeyer (1999), pp. 120-124.

2. Hate materials referred to here as 'actionable' are those artifacts or information that can be legally challenged, censored, and removed from the Internet. For instance, in 2000, a court in Paris ordered Yahoo! Inc. to remove certain auction items identified as Nazi memorabilia from its Web sites based in the United States, so that surfers of the site located in France would not be able to access items that are illegal in France. Although these items on Yahoo! were protected under the United States Constitution, the French court ordered the censorship of the items, and imposed a fine of 100,000 francs for each day past a deadline for complying with its judgment. However, on hearing Yahoo's case, U.S. judge Jeremy Fogel ruled against the enforcement of the French court's order and fine in the United States. He ruled that the sale of the artifacts was protected under the freedoms guaranteed by the First Amendment of the U.S. Constitution. Nevertheless, international law experts interpreted the situation as not a clear win for Yahoo. Michael Geist, an Internet law expert at the University of Ottawa, noted: "The fact is that people have got to recognize the multi-value self-interest that each country has to enforce its own laws against Internet content that has significant negative effects" within the country's borders. Quoted in Kaplan (2001a); also see B. Levin (2001-02).

3. Non-actionable hate messages are far more prevalent on the Web, as the Internet is unregulated by law and unrestricted in its uses. Ray and Marsh (2001) point out the import of hate messages and how the victims need to be protected: "The Internet plays a crucial communications role in the glorification of the philosophy of revolution. Promotion of the Lone Wolf perspective allows these groups to avoid legal responsibility for actions committed by visitors to their Web sites, who may be inspired and incited to action. Direct marketing via the Internet supplements traditional sales. It allows white extremists to privately purchase items that may not be found in their local areas. Online sales assist in creating brand name awareness for each of the cyberspace white extremist organizations ... Hammerskin Nation functions as a gateway to information on cyber-terrorism ... Preventing these groups from collecting personal information from children and others by enforcing the Children's Online Privacy Protection Act of 1998 (COPPA) should be the priority of all monitoring groups, agencies, and parents. Application of the law to these sites is appropriate because most of them engage in commercial activity and because they appeal to a broad audience that includes children ... Society and its elected representatives must decide whether existing laws, particularly those regulating communications, are in need of revision." For the unregulated nature of hate messages, their impact, and the need for regulation, see also Berkowitz (1999) and Kallen (1998).

4. Some of the following writers and speakers have identified the messages and their impact. See Howard P. Berkowitz (1999) and Ray and Marsh (2001), who elaborate on the various ingredients in hate messages that appear innocuous and educational. Ray and Marsh have a section in their paper that discusses the types of messages: Appeals to Fear, Jumping on the Bandwagon, Cartoon Characters, and Photographs Depicting Violence. Young surfers on the Net are persuaded to believe these messages are legitimate by certain Web ploys and message contents: Testimonials as Authority, Institutional Authority, Scientific Authority, Religious Authority, Deference to Documented Authority, References to Authority, and the various Web tools that are used to communicate them.

5. Holdt presents the pictures on his Web site informing and 'educating' surfers that poverty breeds racism, a totally unrelated topic (although he believes that poverty is a cause of racism). Lynn Thiesmeyer (1999) analyses the hate rhetoric: "The seven features of online neo-Nazi rhetoric I analysed above are, of course, common to many media and existed before the Internet came into being. Pedantism, urgency, fake historicism, de-legitimisation of others, collective or community subjectivity, the split into ideology-makers and agents, and the peremptory factualization of debatable points, all can be found in other media and other forms of textuality; we, academics, are not free of them ourselves. The Web, however, due to the construction of its function and its users, prioritises these forms of expression, while at the same time arousing little opposition among ordinary users." Goldman (2002) talks about coded messages. On humour, see the cartoon Web pages on hate sites, full of images depicting racist humour. Howard Berkowitz (1999) identifies how hate is clothed in academic garb as educational material: hate Web sites "portray their anti-Semitism in a facially attractive, academic-looking manner, all too persuasive to naïve and ill-informed young people." David Duke, a former Klansman and leader of the National Association for the Advancement of White People, "promotes his supremacy theories under the deceptive rubric of 'white rights'." These sites also use commercial rhetoric to sell their memorabilia. Ray and Marsh note some features of the hate rhetoric: information-sharing, selling goods (commercial), books for children (educational), simple and persuasive messages, pleas, etc.

6. Ray and Marsh (2001) explain the Web features that uniquely transmit hate messages and activities via Web tools: file sharing, co-branding, electronic commerce, children's books, e-mail and Usenet, infiltrating chat rooms, information and guidance for terrorists, the lone wolf mentality, encryption, and hacking. Howard P. Berkowitz (1999) speaks about the outcome of violence: Racists "are using the Internet to promote and recruit for their causes, to communicate easily, anonymously, and cheaply, to reach new audiences - particularly the young, to raise money for their activities, and to threaten and intimidate their enemies. As the tragic events in Littleton, Colorado recently showed, the Internet offers both propaganda and how-to manuals for those seeking to act out fantasies of intolerance and violence." Mass customization is a term from manufacturing that describes products individually customized to meet the needs of every customer, but produced in mass quantities; electronic ordering and production have advanced to the level of mass manufacturing of customized items. For a discussion of the Web's flexibility, see Economist (1999b). Non-regulation: for the reasons why the Web must be regulated, see Kallen (1998). Digitalization makes it easy to clip and paste, or to download different media from the Web. Unexpected: a major objective of these sites is to recruit unwary surfers who might be tempted to buy into their message and purchase their memorabilia. Creating Web designs and Web banners that pop up unexpectedly on a page unrelated to hate content is a strong objective of hate Web designers.

7. On 26 October 2001, President Bush signed the U.S. Patriot Act (USAPA) into law. This law gives both domestic law enforcement and international intelligence agencies extensive powers, while eliminating the judicial oversight that courts previously exercised to guard against abuse of these powers.

8. Dan L. Burk is quoted in Kaplan (2002).

9. The United States has not ratified the international human rights instruments that protect the rights of all racial and ethnic group members against hate, slander and discrimination. Article 20.2 of the International Covenant on Civil and Political Rights (ICCPR) ensures racial and ethnic groups' freedom from defamation: "Any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence shall be prohibited by law" (OHCHR, 1976). Article 2.3 of the United Nations Declaration on the Elimination of All Forms of Racial Discrimination (OHCHR, 1963) speaks against racial and ethnic discrimination and promotes affirmative action. Article 4 of the International Convention on the Elimination of All Forms of Racial Discrimination (OHCHR, 1969) that was passed in pursuance of the declaration, enjoins the ratifying states to take legal action against dissemination of ideas based on racial superiority or incitement of violence against races and to declare organizations promoting such ideas illegal and to ban them (see also Kallen, 1998).



14 Words, 2001. At, accessed 30 January 2001.

88 Music, 2001. At, accessed 30 January 2001.

Anti-Defamation League, 2002. At, accessed 30 January 2001.

Rohit Barot and John Bird, 2001. "Racialization: The Genealogy and critique of a concept," Ethnic and Racial Studies, volume 24, number 4, pp. 601-618.

Howard P. Berkowitz, 1999. "Statement of the Anti-Defamation League on hate on the Internet before the Senate Committee on Commerce, Science and Transportation," at, accessed 30 January 2002.

Shane Borrowman, 1999. "Critical surfing: Holocaust denial and credibility on the Web," College Teaching, volume 47, number 2 (Spring), pp. 44-47.

Brandenburg v. Ohio, 395 U.S. 444 (1969). At, accessed 15 August 2002.

Candidus Productions, 2001. At, accessed 30 January 2001.

Manuel Castells, 2001. The Internet galaxy: Reflections on the Internet, business, and society. Oxford: Oxford University Press.

Susan L. Clayton, 1999. "Learning not to hate," Corrections Today, volume 61, number 5 (August), pp. 8-11.

Children's Online Privacy Protection Act (COPPA), 1998. At, accessed 30 January 2002.

Jenny Craven, 1998. "Extremism and the Internet," Journal of Information, Law and Technology, Issue 1 (February), at, accessed 30 January 2002.

Economist, 2001. "Web phobia," volume 358, issue 8214 (22 March), p. 99, at, accessed 30 January 2002.

Economist, 1999a. "Racism: Hatred unexplained," volume 352, issue 8127 (10 July), p. 25, and at, accessed 30 January 2002.

Economist, 1999b. "The Internet: Downloading hate," volume 353, issue 8145 (13 November), p. 30, at, accessed 30 January 2002.

Raymond A. Franklin (compiler), 2002. "The Hate Directory," at, accessed 30 January 2002.

Katharine Gelber, 2000. "Hate crimes: Public policy implications of the inclusion of gender," Australian Journal of Political Science, volume 35, number 2, pp. 275-289.

David Goldman, 2002. "Hatewatch," at, accessed 30 January 2002.

Hammerskins, 2001. At, accessed 6 February 2001.

Jacob Holdt, 2002. American Pictures, at, accessed 30 January 2002.

Evelyn Kallen, 1998. "Hate on the Net: A Question of rights/A Question of power," Electronic Journal of Sociology, volume 3, number 2, at, accessed 30 January 2002.

Carl S. Kaplan, 2002. "Divining the future of law and technology," at, accessed 30 January 2002.

Carl S. Kaplan, 2001. "Was the French ruling such a victory after all?" at, accessed 30 January 2002.

Elissa Lee and Laura Leets, 2002. "Persuasive storytelling by hate groups online: Examining its effects on adolescents," American Behavioral Scientist, volume 45, number 6, pp. 927-957.

Brian Levin, 2002. "Cyberhate: A Legal and historical analysis of extremists' use of computer networks," American Behavioral Scientist, volume 45, number 6, pp. 958-986.

Micetrap Distribution, 2001. At, accessed 30 January 2001.

Office of the High Commissioner for Human Rights (OHCHR), 1963. United Nations Declaration on the Elimination of All Forms of Racial Discrimination, Proclaimed by General Assembly resolution 1904 (XVIII) of 20 November 1963, at, accessed 27 August 2002.

Office of the High Commissioner for Human Rights (OHCHR), 1969. International Convention on the Elimination of All Forms of Racial Discrimination, Adopted and opened for signature and ratification by General Assembly resolution 2106 (XX) of 21 December 1965, entry into force 4 January 1969, in accordance with Article 19, at, accessed 27 August 2002.

Office of the High Commissioner for Human Rights (OHCHR), 1976. International Covenant on Civil and Political Rights, Adopted and opened for signature, ratification and accession by General Assembly resolution 2200A (XXI) of 16 December 1966, entry into force 23 March 1976, in accordance with Article 49, at, accessed 27 August 2002.

Lisa Nakamura and Gilbert Rodman, 2000. Race in cyberspace. New York: Routledge.

President's Working Group on Unlawful Conduct on the Internet, 2000. The Electronic frontier: The Challenge of unlawful conduct involving the use of the Internet. Washington, D.C.: U.S. Department of Justice, and at, accessed 28 January 2002, verified 29 September 2002.

Beverly Ray and George E. Marsh II, 2001. "Recruitment by extremist groups on the Internet," First Monday, volume 6, number 2 (February), at, accessed 15 August 2002.

Ulrich Sieber, 2001. "Fighting hate on the Internet," OECD Observer, at, accessed 29 September 2002.

Raymond W. Smith, 1999. "Civility without censorship: The Ethics of the Internet - Cyberhate," Vital Speeches of the Day, volume 65, number 7 (15 January), pp. 196-198.

Stormfront, 2001. At, accessed 30 January 2001.

Cass Sunstein, 2001. Republic.com. Princeton, N.J.: Princeton University Press.

Lynn Thiesmeyer, 1999. "Racism on the Web: Its Rhetoric and marketing," Ethics and Information Technology, volume 1, number 2, pp. 117-125.

Tom Tyler, 2002. "Is the Internet changing social life? It seems the more things change, the more things stay the same," Journal of Social Issues, volume 58, number 1, pp. 195-205.

Universal Declaration of Human Rights, 2002. At, accessed 15 August 2002.

Editorial history

Paper received 1 September 2002; accepted 19 September 2002.


Copyright ©2002, First Monday

Copyright ©2002, Indhu Rajagopal

Copyright ©2002, Nis Bojin

Digital Representation: Racism on the World Wide Web by Indhu Rajagopal with Nis Bojin
First Monday, volume 7, number 10 (October 2002),
