First Monday

Trolls at the polls: What cyberharassment, online political activism, and baiting algorithms can show us about the rise and fall of Pakatan Harapan (May 2018-February 2020) by Clarissa Ai Ling Lee and Eric Kerr



Abstract
This article considers how politically motivated Internet trolling, within the context of Malaysia from May 2018 until February 2020, made use of the affordances of algorithms and platforms to achieve its goals, from targeted attacks on individuals to collective interventions for advancing social and informational justice. Centering on the importance of digital platforms and algorithms in framing and shaping online communication, this article explores the decisions, actions, and policies which, framed and shaped by these algorithms, produced a particular space in Malaysian political discourse that enabled Internet-based political trolls. Attention is given to the infrastructure of trolling, as well as to the platforms supporting and cultivating the practice of trolling, which are usually international in their ownership, development, and user base. By focusing on the trollish practices of a “minor” non-Western community in Asia, we attempt to theorize the effects of digital infrastructure at the periphery of multinational platforms, based on participant-observation research and media-textual analysis.

Contents

1. Introduction
2. Trolling and algorithms of social media platforms
3. The context of the Malaysian Internet
4. Method and scope
5. Theory and findings
6. Concluding remarks

 


 

1. Introduction

Over the past decade, national elections in southeast Asia have seen the co-opting of digital platforms into campaigning strategies, whereby political actors intervene in the circulation (potentially leading to the ‘viraling’) of news. Increased Internet penetration and digital platforms were game changers for the southeast Asian political scene as political allies and rivals discovered the immense possibilities and reach of the Internet, as well as new forms of manipulation. That was indeed the case in Malaysian politics surrounding the events of the 14th General Election (GE14), where decisions, actions, and policies driving online interactions were framed and shaped by algorithmic interventions.

The importance of using and accounting for algorithms has risen as algorithms have become increasingly embedded within the digital infrastructures undergirding present-day online interactions. As a result, the pattern recognition capability of algorithms can be channeled into creating the biases and amplification exploited by online trolls and political activists. At the same time, the rise of new patterns of online action requires new ethical frameworks and rules. Cyberspace studies of the late twentieth century anticipated some of what was to come: there was already recognition that the seemingly immaterial effects of pranking or trollish behaviour in cyberspace could quickly get out of hand and create psychological harm (Dibbell, 1996). The potential for harm is multiplied in a networked world when the ubiquity of algorithms greases access to personal user information: the same algorithms can perform pattern recognition, data analytics, recommendations, and predictions. Further, some of these algorithms come in the form of bots that can simulate, to a limited extent, some normative human behaviour online.

In this article, we explore the means by which politically motivated Internet trolls made use of the affordances of algorithms and platforms to achieve their goals, and how such trolling activities figured in the rise and fall of the Pakatan Harapan government, which governed Malaysia from May 2018 to February 2020. We analyze how decisions, actions, and policies, framed by algorithmic digital infrastructure, produced a particular space in Malaysian political discourse, thereby enabling Internet-based political trolls and spreading a culture of political trolling that makes it necessary to reconsider existing Internet ethics. While it appears that digital platforms have made political discourse more transparent by seemingly breaking down communication barriers between politicians and members of the public, the flipside has seen an increase in the politics of distraction and disinformation. We consider the infrastructure of trolling and the role of platforms that support and cultivate the practice of trolling and activism with significant national political impacts, even though the platforms are international in ownership, development, and user base, albeit with local instantiations. Such an approach could provide a starting point for locating convergence between trolling practices observed in the Malaysian political Internet and forms of political trolling elsewhere. In the Philippines, “troll armies” pushed for the election of Rodrigo Duterte in 2016 (Ong and Cabanes, 2018). In Thailand, trolls targeted civil society groups with international connections by accusing them of being Western agents and traitors (Sombatpoonsiri, 2018). We will consider how the study of Internet political trolling in Malaysia contributes to understanding the relationship between digital platform users, amplification, and algorithms in the context of a post-colonial Asian society when it comes to reproducing, and even mirroring, the injustices one might observe in Internet spaces in relation to issues such as racism.

We begin section two with an introduction to trolling and the interaction between this phenomenon and the algorithms of social media platforms in the Malaysian context. Section three provides the context for our research, which focuses on political trolling during Malaysia’s 14th General Election (GE14) and the life of the resulting government. We examine how minority Internet communities within the broader Internet community navigate particular and universal Internet cultures that include discourse on race and justice. The section also provides the background to the theoretical intervention that this article intends to make. Section four considers the methods and scope of the research, including the constraints faced by the authors. Section five presents our findings, including selected trollish examples which we discuss. Section six provides our conclusions.

 

++++++++++

2. Trolling and algorithms of social media platforms

2.1. The trolls

The troll can be considered an agent who elicits reactions from targets and bystanders through techniques of baiting. The consequence is often represented negatively (Kerr and Lee, 2019). Much of what we refer to as “Internet trolling” — baiting, pranking, hoaxing, harassment — pre-existed the digital era. While Internet trolls might use digital technology to expand the reach of their pranks, trolling worked quite effectively with analogue technologies [1]. Twenty-first century pranksters have learned to interpret current discourses on forecasted technologies, disruptive technologies, and technological utopia/dystopia through hoaxes centred on exploiting the expectations and credulity of their targets. Some examples are rickrolling, crank calling, and candid camera TV shows (or, in the present age, TikTok-style prank videos). Such actions are irreducibly public and emotionally manipulative. A troll, who may or may not be a prankster, relies on an assumed participating audience or, in many cases, an unknown number of ‘lurkers’ who observe, read, and internalize the troll’s message. Within the political context, however, the more common practice takes the form of satire or parody. We are not so interested here in arguing the definitive parameters of the troll, but rather in investigating how trolling, as an act of baiting, gaslighting, pranking, hoaxing, and related provocations, is performed in Internet-based political interactions [2].

Trolling culture involves not just the explicit actions of trolls but the creation and maintenance of forms of online discourse that the troll may not actively participate in, but which have been incited and promoted by trollish actions and can culminate in mob mentalities. A recent report points to the pernicious harm that algorithm-dependent artificial intelligence could have on the global south, where the voices of the latter are erased, past colonial structures are further reified digitally, and biases and discriminatory practices in the real world become increasingly automated (Crawford, et al., 2019). However, the report positions the notion of harm through the ethical standpoints found in the work of scholars based at Western institutions. Digital infrastructures that maintain the algorithms powering digital platforms are infused with a material history of the Cold War that produces new forms of informational and technological imperialism implicating and impacting the developing world, including Malaysia; Aouragh and Chakravartty (2016), for example, note that digital infrastructures are marked by colonial encounters full of “uneven and fractured capitalism” that reproduce cultural imperialism through monopoly over transboundary data flows [3]. We consider the form that harms and injustices take and how the hierarchy of dominance is dictated not merely by race, but by who has the most power to set the stakes within the community.

The forms of trolling we identified, covering the period from before the GE14, which brought about the instatement of the Pakatan Harapan government, until the coalition broke apart less than two years later, can be summarized as follows:

1. Personal attacks

Personal attacks involve leaving negative and insulting comments on the social media pages of targets, usually anonymously, or through pseudonymous handles gesturing to allusive meanings. While all political figures are considered fair game, it is usually the more prominent or unpopular among them that are bombarded. Even posters who saw themselves as calling out injustices are not above gender trolling, for example, as one could see from countless examples of memes that were meant to react against the actions of Malaysia’s former First Lady, Rosmah Mansor.

 

Figure 1: The final panel refers to Malaysia’s former First Lady, with all of her perceived excesses (usually represented as feminine excesses) and physique exaggerated to grotesque levels. This level of personal animosity was never directed at her husband, former Prime Minister Najib Razak, despite his relative position of power.

 

Queer sexual acts became terms of political insult, particularly among the Malay-speaking community. For instance, ‘main belakang’ is a double entendre that includes sodomy, which has also been used in its more explicit form, ‘semburit.’ The strong flavour of homophobia in Malaysia’s political community means that homophobic slurs (which usually reference sexual acts) are deployed as the highest form of insult. Such insults can be seen throughout social media and are used to troll the hated target.

2. The creation of trollish critique spaces

Setting up a social media space with the intent of facilitating trolling as a serious form of critique. This usually comes in the form of Facebook pages or Web-based news sites. In the case of Malaysia, two such examples are the more militant Tentera Troll Kebangsaan Malaysia (National Troll Army, Facebook page https://www.facebook.com/TTKMRELOADED/) and Malaysia’s version of The Onion, known as The Tapir Times (http://thetapirtimes.com/). Both are discussed later in the article.

3. Moralizing messages

Controversial, moralizing posts with the intent of shaping opinions around targets. The provocative nature of the post is further inflamed by anonymous and pseudonymous commenters participating in digital mob attacks. Ironically, one could see the most famous instance of such a performance on the social media pages of the aforementioned former premier, Najib Razak, who is now dubbed the King of Trolls (Ibrahim, 2019; Malaysiakini, 2019). Note 13 provides another example of a moralizing mob attack through the vicious gender-trolling of a female anchor, although this preceded the period of the Pakatan Harapan government.

4. Meta-trolling through videos

Meta-trolling in videos by content creators (who often perform the role of micro-influencers) to critique popular political issues. This was particularly popular during the lead-up to the general elections. In Malaysia, the particularly homophobic nature of its political culture means that politicians (usually male, Malay and Muslim) are targets for entrapment, as sex videos featuring them in homosexual acts are made viral. In this case, there is a gray area between doxing and slandering, since the targets often claim that such videos had been faked. The most recent example involves a sex tape allegedly depicting homosexual acts between a Cabinet Minister and another man.

5. Meme disruptions

Posting or reposting of memes in various forums with the intent of disrupting conversations or inflaming emotions, or making subtle yet evident changes in information/knowledge spaces with the intent of insulting or poking fun at a target. Many of these memes were circulated among private WhatsApp chat groups, although one could also find them on Facebook and Twitter (Sholihyn, 2020). Between 23 February and 24 February 2020, the fall of the then ruling coalition, Pakatan Harapan, led to anxieties that could only be ameliorated by the creation of new hashtags and memes [4]. However, memes were not used merely to express emotions; memes as a form of political activism involving Malaysians were also found in the aftermath of the anti-Muslim terrorist attack in New Zealand. The attack itself can be seen as a troll deciding to turn his online hate speech into reality through his deadly armed attacks on two mosques during Friday prayers. In the middle of a press conference, a right-wing Australian politician, who had been expressing anti-Muslim hate similar to the attacker’s, had a raw egg cracked open on his head by a teenager who was dubbed Eggboy on social media. Online discussion of the incident on the politician’s Facebook page, including efforts to criticize the boy, was disrupted by the Malaysian troll army, who called themselves the ‘Bawang army’ (Onion army; ‘bawang’ also means gossip in colloquial Malay) and who posted memes of eggs, disrupting the hostility and supporting Eggboy, to drive home the message that hate and fascism are not acceptable [5].

As mentioned, the acts of the trolls take advantage of the algorithms of social media platforms to enhance their effectiveness.

2.2. The Malaysian political context

In our study of Internet political trolling in Malaysia, we intend to demonstrate elements of algorithmic manipulation and oppression which fall through the cracks of Western perspectives on critical race discourse. However, the difference has less to do with Malaysia having its own set of values than with a failure to dismantle a problematic practice that entangled race and political survival, thereby giving politicians no incentive to disengage from a practice from which they had benefited for decades. For voting Malaysians, it became the choice of a lesser evil, albeit a restricted choice, since the parties they supported might contain individuals who are unpalatable, or who lack the political will to change the status quo for fear that it would jeopardize their political careers. That status quo includes alliances with groups perceived to be racist, or with allegedly racist members of their own political party and coalition. The failure of the Pakatan Harapan government to ratify the International Convention on the Elimination of All Forms of Racial Discrimination (ICERD) (Saat, 2018) continued the institutionalization of racism without any form of legislation against such behaviour. Harmful racist behaviors aimed at soliciting political capital were allowed to proliferate [6]. If there is anything we could learn from this, it is perhaps that, where institutions are inseparable from powerful or influential individuals, and where said individuals may use race as a political weapon, racism can become institutionalized. Moreover, mature democracies have institutional instruments for seeking justice, which is not always the case for the emerging democracies that make up most postcolonial societies. Endemic racism produces cultural and racial silos, which are reproduced in the cultural products developed and consumed by members of said silos.

It is crucial to remember that the infrastructure of trolling in the Malaysian political context reflects the transition still ongoing in Malaysia as it moves away from increased policing of the digital space by scrapping the Anti-Fake News Act (AFNA) (Smith, 2019), with the intent of expanding the space for freedom of expression. Even if Malaysians’ access to online platforms is unfettered, the platforms are not accessed in the same way, nor do they hold the same meaning for those accessing them, due to differences in Internet literacies. The differences in these literacies also determine how trolls are received and responded to, and how trolling is perpetuated. While parallels can be drawn with trends in other countries, such parallels often highlight the distinctiveness of the Malaysian context.

Understanding the racialized and racist preoccupations of Malaysian politics requires a short detour into its history. Malaysia attained the status of a sovereign nation (under the name Malaya) on 31 August 1957. It took the name Malaysia on 16 September 1963 (together with Singapore, which then left in 1965 to form its own state). The country runs on a bicameral parliamentary system and has a written constitution within which the protection of Malay rights is enshrined. Despite early attempts to establish ideologically based parties, the default mode of operation returned to racial identities, shaped by difficulties in cognitively separating the racial profiles of a party’s membership and leadership from their political and ideological beliefs or religious views (e.g., one party was established on the premise of a particular interpretation of Islam). This was further reified by the establishment of the first ruling coalition, represented by three major race-based parties set up under the pretext of ensuring the wellbeing of the major races of Peninsular Malaysia (the Malays, Chinese, and Indians). This coalition, which would become known as Barisan Nasional (National Front), was in power for over 60 years before it was toppled in the GE14 by Pakatan Harapan, which took over on 9 May 2018. Prior to that, the last time an opposition coalition had won the elections was on 10 May 1969. That event led to violence and the ensuing declaration of a National Emergency that allowed the defeated incumbent to return to power [7].

This background impacts the nature of politics in Malaysia, which in turn interacts with the general policies, rules, systems, and algorithms of large international social media platforms which are used as tools in the political process.

2.3. Algorithms of social media platforms

It is unsurprising that non-Western communities which consider digital platforms to be liberating spaces for advancing their own systems of belief could contribute to the advancement of a belief system that is incongruent with Western standards of justice, ethics, and equality. Moreover, as the cases we examine here will demonstrate, algorithms differentiate between various localized Internet communities, and the peculiar preoccupations of these different groups, indicating the non-neutrality of the platforms on which they depend. Algorithms, at their technical core, are purely mathematical constructs when decontextualized from their application. Within this context, algorithms could be viewed as socially neutral. However, the seemingly ‘neutral’ character of the algorithms means that they can be bent to serve the needs of their builders (capital, technocratic states, and technopreneurs), regardless of possible human collateral damage (Noble, 2018; Cheney-Lippold, 2017; O’Neil, 2016). This stems from a need to establish patterns of biases and preference clustering that allow spikes and trends to be detected, given that these data are necessary for attaining the economic goals of these platforms in terms of targeted advertising. The elasticity of these platforms also lends itself to security risks, in the form of both invasion of privacy (by revealing your online activities to your social media contacts, or to greater networks that you might not be directly connected to) and surveillance by third parties (which could also include the government). In the case of the latter, it was the ability of the incumbent government, prior to Pakatan Harapan, to marshal the media and social media to its advantage that enabled it to control the perceptions of Malaysian voters for a long time, until over-confidence led to its downfall in the GE14 (Nadzri, 2018; Lemière, 2018). Interestingly, even before the 2016 American presidential election, when fake news and algorithmic manipulation by troll farms became a news staple, this issue had been raised four years earlier by Oboler, et al. (2012), who stated that “Google could for instance significantly influence an election by predicting messages that would engage an individual voter (positively or negatively) and then filtering content to influence that user’s vote.”

In other words, the elastic character of the algorithmic neutrality that powers platforms makes them optimal for what is referred to as an “influence operation”, which utilizes ‘cracks’ found within the target populace due to certain sociopolitical, cultural, or economic crises to establish bots and propagandists all over social media platforms (Schneier, 2019). These trolls, whether bots or humans, would then use the amplification and trend-propagation affordances of the platforms to distract from the real issues that would break up the coalition, while drawing attention to more petty and inconsequential squabbles through the deployment of social media’s form of populism. Pakatan Harapan’s time in government saw the heavy deployment of social media to launch attacks and counterattacks, both inside and outside the coalition. In fact, this period saw the strong deployment of opinion influencing by those seeking to cultivate relationships with particular demographics of social media users, to gain their trust and confidence so that they could be mobilized to provide the necessary cross-platform amplification when the right ‘bomb’ is dropped. The use of such influencers, those who command a following and are adept at stirring emotions without revealing their hand, presents methods of amplification that do not rely on the attribution of sources. This enables the creation of alternative narratives that could, after considerable amplification through various platforms, be taken for the truth by a majority of social media users, who often lack the digital literacy to distinguish misinformation. In other words, neutrality in itself is not a positive feature, since it can easily be manipulated by those who are savvy and have the ability to hack the situation at a social and technical level, which could, in the end, bring about new forms of exclusivity and injustice.
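To make this amplification dynamic concrete, the following is a minimal, hypothetical sketch of an engagement-weighted feed. It is not the ranking logic of any actual platform; the weights, post data, and the amount of coordinated seeding are invented for illustration only.

```python
# Hypothetical sketch: how engagement-weighted ranking can amplify provocation.
# The weights and post data do not correspond to any real platform's algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Assumed weighting: shares and comments count more than likes,
    # because they propagate the post to new audiences.
    return post.likes + 3 * post.comments + 5 * post.shares

feed = [
    Post("Policy brief on palm oil tariffs", likes=40, shares=2, comments=5),
    Post("Provocative meme targeting a minister", likes=25, shares=30, comments=60),
    Post("Volunteer call for polling observers", likes=60, shares=8, comments=10),
]

# A small coordinated group ("cybertroopers") seeds extra reactions on the meme.
feed[1].shares += 50
feed[1].comments += 40

# Rank the feed by engagement, as an engagement-driven recommender might.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>5.0f}  {post.text}")
```

Under these assumed weights, a modest amount of coordinated seeding pushes the provocative item to the top of the ranking, which is the kind of distraction-through-amplification described above.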

 

++++++++++

3. The context of the Malaysian Internet

Our case studies on trolls attend to political trolling, deployed as forms of critique, disruption, protest, and intimidation among Malay- and English-speaking social media communities. We consider how events surrounding the GE14 made space for different forms of trolling, with some recognizable in contemporary theorization about trolls, while others represent distinctive and not yet well-theorized elements.

Lim (2018) suggests that the “political regimes of southeast Asia contain a rich mix of political structures, cultural systems, depth of political engagement and histories” that exist outside of the neat socio-political and technological structures for assessing parallel situations in Western Europe and America [8]. At the same time, the region has different levels of sophistication when it comes to understanding the implications of digital infrastructures for issues of privacy and degrees of connectivity, the result of uneven digital literacy and Internet access across the region. Of course, the socio-cultural heterogeneity of the region contributed in no small way to shaping how interactions are carried out through digital access. Attempts at navigating this new space gave rise to questions of identity that flare up in ways not so evident in offline contexts.

Identity-based activism also brings into sharp focus gender-directed activism, which responds to the perception that the new government has not taken steps to improve toxic gender stratification and the persistent gender equality gap, both digitally and in the real world. While we see gender-trolling through attacks on unpopular political women at a personal level, such as the venomous attacks against the person of the sixth Prime Minister’s wife, Rosmah Mansor, which also included insulting references to her appearance [9], we find that attacks or negative comments on the physical or gendered attributes of a non-heteronormative (Malay) male target are most prominent in Malay-language social media trolling in Malaysia, especially in relation to religion [10]. In other instances, there are trollish references to Teresa Kok, former Minister of Primary Industries under the Pakatan Harapan government, who was depicted as a palm oil/commodity hustler in various meme-like visuals that leveraged her gender. Nevertheless, gender-trolling is most explicitly performed in relation to Malaysia’s Women’s Day March, an annual event, as a means of discrediting the target rather than merely invoking hostility. The tone taken aims at provoking and humiliating the intended target through the use of sexualized language, in a manner reminiscent of incels raging over the lack of male supremacy.

A study of trolling can illuminate the sociotechnical dynamics in operation within a digital infrastructure that sees a rapid proliferation of social media applications, and the accompanying risks they could pose. The troll is both a symptom and a stimulant of information flows online: trolls challenge the boundaries of niceties while protecting the boundaries of access and information on the platforms they operate on (Kerr and Lee, 2019). Trolling is weaponized in digital activism in ways that blur the personal with the political/public in order to support a narrative of change.

Before unpacking the role of algorithms in trolling practices, we begin by considering how trolling has played a part in Malaysia’s cyber activism on both sides of the political divide. Malaysia’s netizens took to the less-regulated cyberspace at the turn of the twenty-first century to regain access to a freedom of expression constrained and restricted by institutional structures in the real world (Gainous, et al., 2015). But what is different about online self-organizing groups is that they were not developed to become deliberative units in the same way that organizations based in the real world are. This is because the emergence of such groups tends to be more spontaneous, the result of responding to issues or controversies. Therefore, online groups could move between two ends of a spectrum: becoming a mob or a strategic unit. The rise of technologies for collectivized visualization (such as memes) is conducive to the creation of loosely connected groups. The memes could also become group avatars that embody the performance of the “simulacrum of the self.”

Gazing at the world through the eyes of a singular avatar is how a collective of individuals can come together as one, as exemplified by the group Anonymous. Noveck (2005) discussed at length the development of online groups during the early part of the twenty-first century, when social media in the form we recognize today was not as ubiquitous, and when the idea of creating online groups for political activism was still in its infancy. She recognizes how technologies that emphasized visuality and seeing have enabled multivariate representations of information. Her focus on virtual worlds such as Second Life, which are no longer as popular, reveals the extent to which social media platforms have since propagated and thrived. This is particularly pertinent as the increasing visuality of these Web 2.0 digital spaces was meant to allow for new modes of production of a social order in which the physical world is no longer separated from cyberspace. Another difference is that activities in cyberspace have increasing implications in the physical world even when one hides behind the mask of an avatar; this is particularly the case with political trolling, which could translate into misinformation meant to sway voters or orchestrate a political coup. She also brings up some compelling arguments about building a legal space for the legitimacy and protection of the democratic rights of these groups, neither of which is in place today. What we have instead are laws that are antithetical to fostering the spirit of responsible participatory culture.

In theorizing trolling, and its supportive platforms, through the particularities of geographic and cultural spaces, we intend to depart from the Anglocentrism of contemporary Internet studies (Goggin and McLelland, 2009) and contribute to diversifying existing theories on the relationship between digital infrastructures (as powered by algorithms) and trolls. Changes within the global political economy resulted in more attention to emerging markets, including Malaysia, by technology firms whose profits are generated from mediating the experiences of new adopters. Increases in digital activity coincided with the expansion of trollish activities that could be read as pushback, protest, or rejection of the trolled targets within a shared space. Trolling becomes an act of reclaiming that space, ironically, at the expense of having a more diverse representation in cyberspace. In “A declaration of the independence of cyberspace,” Barlow (1996) painted the idea of a non-prejudicial world that is not subject to the tyrannies of “race, economic power, military force, or station of birth,” an ideal of the online world that does not hold in practice. A good example of the online world not living up to this promise was the #Gamergate controversy, in which certain white male gamers took to trolling Zoe Quinn, a female game developer, to such extremes that the misogyny and rampant white-male privilege persisting in the game industry were laid bare. As Graham (2019) argues, the increased diversity of Internet users does not preclude attempts by certain netizens to create exclusivity through methods aimed at alienating others. However, in Malaysia, this exclusivity is not necessarily typified by the configuration of race and power operational in Western democracies.

Readings of algorithm-driven racial prejudices operational in Western democracies cannot explain the situational differences (including the connotations of race, racism, and racialization) in spaces where multi-cultural and multi-racial non-Western communities deploy platforms to advance their own political and social agenda. Those concerns may include what they see as justified racialized rhetoric and support for systemic racism, accompanied by the promotion of racialized policies which would not be unlawful in Malaysia. Moreover, as multinational media companies practicing vertical and horizontal integration to keep pace with emerging media technologies, together with social media technology firms (Facebook, for instance), begin to establish offices in southeast Asia, one could surmise that the region is of growing significance to these tech firms, which are interested in knowing how societies in this region consume online content and virtual products. That, in turn, influences the production of content catering to a perceived market, and broadens the complexities of dealing with the value systems of these communities. The blackboxing of Facebook’s system means that our observations can only be extrapolated from the effects perceived through interactions observed above ground. The difficulties in ensuring that the terms of service that govern the production of Internet content adhere to universally shared values stem from different Internet cultures arising from the real-world practices of different societies, bringing about a concomitant subjective interpretation of what represents acceptable online behavior and content. Therefore, for ‘minority’ languages such as Malay, Facebook relies on moderators who can read the innuendo and nuances of the language. The same moderators then apply their own values to the moderation of content that passes through the platform.

As Malaysia contains a plurality of languages and cultural practices producing its own unique argot, the precarious identity that members of different cultural groups have about their social milieu could lead to instances of social discord and mutual offense; such examples are found in social media spaces whenever the ethos of race and religion subscribed to by some groups is perceived to be challenged or diminished. However, political trolling is not limited to sites dedicated to political or socio-economic discussions; cult personalities or brands in Malaysia also become targets for political commentary, since these are perceived as having a far-reaching impact on Malaysia’s national cultures and lifestyles. Although English-speaking Malaysian social media users are the most visible to international communities, they are only the tip of the iceberg, as there are non-English vernacular social media communities that overlap with the English-speaking groups. In Malaysia, the Malay-speaking group is the most visible for demographic reasons and is also the most active when it comes to engaging in trollish behaviours across multiple platforms such as Facebook, Twitter, and YouTube [11]. These communities create their own silos and echo chambers, although one could still find overlaps between the Malay- and English-speaking social media communities.

In Malaysia, it is not unusual for the personal to be embedded in the political, since personal identities have always been used as bait for stirring up dissatisfaction. At the same time, prior to the GE14, the then incumbent government, Barisan Nasional, appeared to be sufficiently threatened to impose a practice that Roberts (2018) referred to as friction, namely, using content filtering and bandwidth-throttling, as well as blocking access to certain platforms that refused to comply with demands to remove content; algorithms are used to produce these frictions even as they could also be used to reduce friction through the process of masking. One could consider friction a more passive approach to information filtering and indirect censorship, making information less convenient and more costly for most users to access. One must remember that the conditions through which the public obtains access to the Internet also shape the socio-technical environment of the Internet. In other words, those occupying hierarchies of dominance and the dominated (the ones in control and the ones being controlled, although such relationships are rarely linear) are what produce the recognizable structures of the Internet today. Frischmann (2012) characterizes the Internet as an “impure” public good since it combines public and private elements. Given the speed with which digital trends and information are developing, the gap between the digital haves and have-nots will only widen if space cannot be made for developing the technologies that would allow for alternative ways of experiencing the Internet, including the development of platforms that do not further privilege the already privileged.

The kinds of inequality represented here are socio-economic, physical, and technical hurdles, imposed, intentionally or not, by those with ‘privileged’ access on those who lack either direct access to the technical infrastructures or the technical competency to make that access available to themselves. Although the early years of digital activism were centered more directly on acquiring the right to community advocacy and political expressivity, in recent years digital activism has extended to sociotechnical rights within Internet infrastructures, as well as increased awareness of the structural inequities within digital spaces that become barriers to the democracy and justice sought [12].

That said, there is insufficient statistical data on digital activism in southeast Asia at present; however, whatever data are available will still be useful for mapping the state of the Internet and the social media regions of the world (Lim, 2018). Friction is used most effectively with flooding to distract less informed users from the matters at hand. For instance, cyber-trooping (the use of people from the government or a political party to manipulate public opinion online) and astroturfing (the creation of supposedly neutral sites by partisans) can be seen as acts of “flooding” (limiting people’s access to real information by making it harder to find amid the misinformation), although they could also be seen as well-designed disinformation efforts intended to influence perception (Linvill and Warren, 2019). Flooding is most effective on platforms that operate on timelines (such as Twitter and Facebook), since bots can insert themselves and blend into the flow of text produced by human users on social media.
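As a rough illustration of flooding, consider the hypothetical sketch below of a recency-sorted timeline in which bot accounts post at a higher rate than human users. The posting rates and window size are invented for illustration and do not describe any platform’s actual feed.

```python
# Hypothetical sketch of "flooding": bots posting at a higher rate than humans
# dilute genuine information in a recency-sorted timeline window.
import random

random.seed(0)

HUMAN_POSTS_PER_HOUR = 20     # assumed figure for illustration only
BOT_POSTS_PER_HOUR = 80       # assumed figure for illustration only
TIMELINE_WINDOW = 50          # how many recent posts a user actually scrolls through

posts = (["human: substantive report"] * HUMAN_POSTS_PER_HOUR +
         ["bot: recycled distraction"] * BOT_POSTS_PER_HOUR)
random.shuffle(posts)          # posts arrive interleaved over the hour

visible = posts[:TIMELINE_WINDOW]   # the recency-sorted window the user sees
human_share = sum(p.startswith("human") for p in visible) / TIMELINE_WINDOW
print(f"Share of substantive posts in the visible window: {human_share:.0%}")
```

With these assumed rates, roughly four out of five visible posts are bot content: flooding makes real information harder to find without ever deleting it.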

There are existing studies (Kasmani, 2019; Chinnasamy and Norain Abdul Manaf, 2018; Nadzri, 2018; Leong, 2019; Abdullah and Anuar, 2018) related to the recent GE14 that have provided analyses of the event and its development, while also providing background on the hows and whys of social media usage during the elections, including the social media circumstances preceding that election (Hopkins, 2014). Additionally, there are already a number of studies focusing on the role of the platform in amplifying negative emotions through its ability to accumulate and circulate negativity (Coles and West, 2016; Phillips, 2018), though not so much on the connection between personal affordances (Lo Presti, 2020) and the affordances of the space in enabling the flow of negative emotions.

While there has been consideration of how paid trolls are used in cyber-trooping/astroturfing to manipulate political discussions and sow discord, and of their psychological and affective impact upon the body politic (Johns and Cheong, 2019), there is not yet consideration of how the design of platforms is structured to allow for ‘intimate’ interactions even within a front-facing ‘public’ digital space. This form of intimacy arises from the convenience of sharing information, since the imagined virtual audience allows one to presume the possibility of fellowship and communication without the intimidation of face-to-face interaction. However, that intimacy, and even frequent online contact, does not mean that participants in these spaces know each other beyond the presentation of their respective online personas; even the appearance of vulnerability through what is shared is a performance. Therefore, in addressing all of the abovementioned gaps in relation to how trolling, critical algorithm studies, and platform studies converge to produce sociotechnical effects, we aspire to change the present dynamics in the existing body of literature.

 

++++++++++

4. Method and scope

The decision to focus on Malaysian Internet-based trolling was the result of one of the authors’ observations, back in 2018, of the interesting dynamics found in the digital interactions taking place among a new generation of politicians who were making increased use of social media, as were their supporters and foes. 2018 marked an exciting time for Malaysia, not only in terms of a political regime change, but also because it represented the peak of digital activism. The activism went from low-key to intensive political activity that included voter education, voting rights, and debates over whether or not to vote. The lead author was part of a private WhatsApp group that became momentarily political in the lead-up to Election Day, as a number of its members, who were not associated with any political party, became volunteer campaigners for their preferred candidates (mainly from parties aligned with the Pakatan Harapan coalition) and acted as observers at polling centres to ensure that no cheating would occur.

The lead author became a participant-observer in the social media space by observing and commenting in these spaces, amplifying some of the messages of interest, and discussing the various memes and critical postings that emerged in the aftermath of the elections. This article does not focus specifically on the author’s experience as a participant-observer during the period leading up to and after the GE14, but rather on the outcomes of that participation, which then led to the selection of cases for consideration here. The basis of case selection was the extent to which the cases illustrated how the political events taking place during that period produced a chain effect leading to outcomes of interest. The development of these outcomes within certain platforms demonstrated how the algorithms present in the platforms could amplify and make viral the interactions represented in the selected cases.

The limitation of this approach is that the authors were only able to work with the better-known cases; cases that illustrate the failure of algorithms to recommend and amplify are not as easily accessed, since they would not be as easily tracked within the platforms. Despite the strength of algorithmic amplification, the set of interactions the authors were confronted by was also determined by the immediate interactions they were exposed to within their own social media networks.

The case examples we consider can be found on Facebook, the most popular social media site in Malaysia with 81 percent of market share, and Twitter, which, while less popular than Facebook at less than eight percent of market share, has developed increasing traction as political figures use both platforms to reach out to their audiences (Statcounter, 2020). This study complements existing studies on how mobile devices, and mobile-centric social media apps such as WhatsApp, have facilitated Malaysia’s socio-political consciousness, particularly among mobile device users who may not be active on other, more public social media platforms. However, platforms such as WhatsApp create a preponderance of private communications that produces an algorithmic echo chamber, merely recirculating and reiterating arguments and information without validating the sources that pass through them (Lee, 2018; Tapsell, 2018; Johns, 2020). We also consider the examples of two news sites that take on the tenor of trolling through the use of different forms of provocation in their reportage.

 

++++++++++

5. Theory and findings

In the process of determining the parameters for assessing Internet political trolling, the authors found that ingratiating oneself into political discourse through trolling does not require a great deal of sophistication or critical reflection. Since some of those involved hide under the cloak of anonymity, it is not always possible to hold all parties accountable. Ad hominem responses are opportunistic replies. However, what should be considered is what happens after these responses: whether further actions are taken, other strategies of debate are deployed, or a similar modus operandi is merely recycled on the target of the week, with the outrage over the previous target quickly forgotten. Perhaps users could move in and out of platforms, as well as in and out of debates, to achieve their aim. The cases in this article illustrate the performance of authentic selves across the different silos of social media communities in Malaysia, especially the Malay social media community, which constitutes the most vociferous of the vernacular silos.

In 2018 Malaysia, there were plenty of recognizable political memes subject to creative interpretation and interpolation by those with wit, an appreciation for irony, and the ability to manipulate digital tools to produce these memes; memes were also the easiest content to amplify algorithmically. More than a decade ago, opposition politicians turned to social media due to difficulties in accessing Malaysian television and radio networks (Leong, 2019). In the GE14, all political parties with the necessary resources were invested in doing what was required to capture the attention economy. Due to the convenience of simultaneous sharing across platforms, an act that amplifies and complements, aggressive campaigning increased in intensity, turning an assumedly neutral space into one saturated by individuals of all political shades.

Within these widely circulated messages are trollish endeavors, notably by concern trolls (Petri, 2014) who produce and circulate certain memes, or create and contribute to postings, to express insincere worries and anxieties. As the authors have argued in the first section, there is not always a clear distinction between a troll and an opportunistic bystander mob. The same applies to the overly earnest poster who does not realize that s/he is circulating questionable material, especially during an emotional period when the circulation and sharing of materials become a form of shared remonstration and/or are aimed at amplifying existing emotions. A number maintain pseudonymous handles even as they discuss matters pertinent to their personal lives. While some might be “popular” for themselves (possibly considered ‘influencers’), there are those who are “popular” despite their facelessness because of what they can come to represent to their followers and those engaging with them.

Important in all of this is how fellowship, intimacy, and facelessness are features of underlying systems of pseudonymity and anonymity produced by the platform infrastructure. For instance, owners of a broadsheet newspaper, who allow comments to be posted below the main content, might decide that the Web site no longer requires users to sign up with their real names. This means that users may choose, if they wish, to be anonymous when posting to the site and, to an extent, protect their identity. The owners may have taken this decision to allow vulnerable groups the necessary protection to post their opinions freely without fear of retaliation. Users who do not hide their true identities might experience prejudice on the basis of their gender, race, ethnicity, class or country of birth; allowing anonymity removes the basis for such prejudices. As White (2013) notes, people do things they otherwise would not when they know, or think, that they are anonymous. The time-based media of Internet commenting further gives many the sense that they can “quickly dash off a negative comment” and escape.

Exploits in systems mean that anonymity is not foolproof. Markers of all sorts may fully or partially identify a user or characteristics of a user. Handles, avatars, and biographical information may reveal a user’s ethnicity, location, age, gender, religion, and so on. Posting histories, hyperlinks, and other markers enabled by the platform, as well as meta-data associated with a user’s profile, may give away more precise information while remaining more or less opaque to said user. At the same time, the origins of the shared content may be obscure. Automated bots copy-paste text from other sources and mask their identities through channels of obfuscation, garnishing their profiles with the appearance of an authentic user. Such relations highlight the interconnectedness of sites and platforms, which are at the same time so vast that it may be practically impossible to trace the source of any particular textual passage. Decisions taken by individual owners have repercussions for other sites, not just as examples to be copied but as nodes within these interconnected networks.
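The point about partial identification can be illustrated with a small, hypothetical sketch: each public marker (language of posts, stated location, posting habits) removes a fraction of the plausible candidate accounts, so an ‘anonymous’ handle may in practice sit inside a very small anonymity set. All figures below are invented for illustration.

```python
# Hypothetical sketch: how public markers shrink an "anonymous" user's anonymity set.
# The population figure and marker fractions are invented for illustration.
population = 25_000_000          # assumed number of active accounts in a market

markers = {
    "writes mostly in Malay":         0.55,
    "profile lists Kuala Lumpur":     0.20,
    "posts during office hours":      0.40,
    "follows a niche political page": 0.002,
}

candidates = float(population)
for marker, fraction in markers.items():
    candidates *= fraction       # assume, crudely, that markers are independent
    print(f"after '{marker}': ~{int(candidates):,} plausible accounts")
```

Under these invented figures, the anonymity set shrinks from tens of millions of accounts to a few thousand, which is why posting histories and profile metadata can give away more precise information than users expect.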

These nodes are controlled through algorithms that underpin the interactions between trolls, their targets, and bystanders on digital platforms by spreading the performance and effects of these interactions beyond their immediate networks; algorithms could also ‘promote’ the voice of an anonymous user. Algorithms bias and refract certain interactions using recommender systems, based on how users choose to interact with the systems (whether that interaction is passive, as seen through users’ patterns of action, or active, as a result of user interrogation).

There are two instances whereby algorithms allow a variety of trollish practices to flourish across different cultural groups. In the first instance, localized socio-cultural amplification is performed on ubiquitous social media platforms, whose universal features are appropriated and recontextualized within localized Internet culture. The second instance concerns the potential difficulties of enacting standards of privacy and ethical behavior on digital platforms already crowded by a heterogeneous community of users with various levels of sophistication, who come from various social classes but bring with them cultural practices that have their origins off-line.

In recent years, the increased sophistication of AI means that trolls can include non-humans, thereby disrupting conversations about the intentionality, liability, and complicity often applied to identifiable human actors, be they trolls or micro-celebrities (Marwick, 2019). There are assumptions regarding what constitutes a good match between ‘hits’ generation and popularity rankings. These assumptions inform the construction of algorithms for promoting viral content and for marking outlier and irregular behavior as irrelevant, with both possibly the result of actions with trollish motivations.

Big tech subscribes to the fallacy that algorithms are neutral through the argument that an algorithm merely follows data breadcrumbs and can only function with whatever data it has been fed. For instance, Facebook has claimed that its distribution of content is neutral because it is automated through algorithms (Osofsky, 2016). However, not only has research in the philosophical and social studies of technology shown that algorithmic processes in general are morally, politically, and epistemically laden, but more recent work that inspects algorithms and recommender systems (Facebook’s included) has also revealed their built-in, morally laden values (Cameron and Martinez, n.d.; Milano, et al., 2020).

While one might not know all of the mechanisms employed in the development of Facebook’s algorithms, the deployment of the recommender system provides observers with an understanding of the filtering bias used to identify and present the target user with choices that the algorithm considers relevant. The experience is further influenced by the insertion of “boosted” content, introducing real-world power dynamics. Moreover, one could acquire a broader understanding of the biases in the algorithms by analysing the socio-technical engineering represented within the pages of the social media platform (Mitchell, et al., 2013; Meyer, 2016).

In an arbitrary sense, the data-mining capability of algorithms on the platforms adapts to majority biases in the user population, and therefore reinforces and shapes those biases through a series of feedback loops (Murphy, 2016). For instance, algorithms such as those found in Facebook’s trending news and user newsfeeds have resulted either in the provision of information that reinforces the existing political views of the user (creating an ‘echo chamber’) or in the provocation of antagonistic reactions (producing ‘clickbait’). The algorithms are, to an extent, black boxes whose efficacy and desirability are judged by the number of clicks they draw to their targets or the number of reactions they cause, whether those reactions are support or outrage. However, this does not preclude increased visibility into how algorithms structure these platforms to steer user interactions in specific ways. While we might not know the details governing the operation of these black boxes, they produce iterative actions that direct the conditions of information flows, which in turn determine the next set of actions taken by the algorithms. In other words, the algorithms operate in response to data collected from user activities.
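The feedback loop described here can be sketched in a few lines. This is a toy model under stated assumptions, not Facebook’s newsfeed: each round, the system recommends the category the user has engaged with most so far, and each act of engagement slightly raises the probability of further engagement with that category.

```python
# Toy model of an engagement feedback loop producing an "echo chamber".
# The probabilities and update rule are illustrative assumptions only.
import random

random.seed(1)

categories = ["aligned", "opposing"]
engagement_prob = {"aligned": 0.6, "opposing": 0.4}   # initial user tendencies (assumed)
exposure_count = {"aligned": 0, "opposing": 0}

for _ in range(200):
    # Recommend whichever category has drawn more engagement so far
    # (ties broken at random), mimicking a popularity-driven recommender.
    recommended = max(categories, key=lambda c: (exposure_count[c], random.random()))
    if random.random() < engagement_prob[recommended]:
        exposure_count[recommended] += 1
        # Engagement feeds back into the user's tendency, capped at 0.95.
        engagement_prob[recommended] = min(0.95, engagement_prob[recommended] + 0.01)

print(exposure_count)     # exposure skews heavily toward the "aligned" category
print(engagement_prob)
```

A small initial bias, amplified by the loop, ends with almost all recommendations drawn from one category; this is the mechanism, in miniature, by which data-mining that adapts to majority biases also reinforces them.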

At the same time, algorithms are expected to allow for precise configurations, with the aim of enforcing social categories and liberties that may lead to their own set of exclusionary policies. Polack (2019) suggests that policies underlying algorithmic technologies make two assumptions: that the users the algorithms presumably target are either threatened by, or ignorant of, the technical capabilities of the algorithms involved. However, as we will see in the following sections, even as algorithms serve to amplify the loudest and most dominant trends that enable trollish behaviours, users savvy about cultural and social responsiveness to certain issues weaponize these responses to influence sentiments and provoke disenchantment. While some forms of trolling are in-your-face and obvious, more sophisticated forms of trolling are able to operate within the constraints of platform policies, retaining their legitimacy on these sites while still achieving their objectives; the latter group comprises primarily mercenaries, although there are those who serve no one and participate in such activities merely to prove a point.

5.1. Trolling as serious critique and for causing controversies

While there are many trolls who would not own up to trolling, there are some who explicitly identify themselves as trolls. One such instance is the aforementioned National Troll Army, known by its Malay nom de plume, Tentera Troll Kebangsaan Malaysia (TTKM); it owns both a Facebook page and a Twitter account. Members of this group have had a run-in with the notorious Ratu Naga, real name Syarul Ema Rena, who was reportedly a cybertrooper of the previously incumbent Barisan Nasional government before jumping ship, back in 2017, to the coalition that formed the Pakatan Harapan government (Jessy, 2018). The existence of such a page is not unique, as there are a number of other troll-like personas on Facebook, mainly in the Malay vernacular, that take aim at various prominent politicians (particularly Malay ones, although prominent politicians of other races are not spared). These pages depict an almost literal ownership of trolling as a form of organized performance that garners its own spectatorship and public participation [13].

Key considerations in platform studies include programmability, affordances and constraints, the connection of heterogeneous actors, and the accessibility of data and logic through application programming interfaces (APIs) [14]. This infrastructure of platforms shapes the kinds of relations that emerge online. Just as it allows users to post potentially incriminating or controversial opinions, it allows abuse to roam free. Many broadsheet newspapers enabled comments on their Web sites and, as those comments grew in number and reach, they became more influential in political debates. Politicians, parties, and vested interests seized this opportunity and began to pay people to post comments supporting their candidates and opinions, or to denigrate the opinions and political positions of others. As previously discussed, friction is a method by which less democratic governments attempt to control the information access of their citizens; in order to bypass restrictions that increase the degree of friction, Internet users may resort to masking their identities and their IP addresses in order to ‘fool’ the filtering system.

In a 2016 interview with Malaysiakini (the interviewer was not identified), the TTKM defended its existence as necessary to counter what it considers biased and defamatory trolling done by the cybertroopers (paid trolls) of various political parties. TTKM also pointed to the research poured into each trollish post, which functions, for them, as critique of Malaysia’s political environment, as the country’s performance of political satire, and as a continuation of a tradition they perceive as having existed in Malay popular culture for decades (references were made to the films of the popular Malaysian director and musician, the late P. Ramlee). The collective is proud of its media-making skills and its expertise at manipulating digital content, when needed, to fit its trollish narratives; this is also practiced by its followers, who create such manipulations as part of their responses. In the interview, they justified the adoption of the word “trolling” to describe their activities, a choice that is worthy of further reflection.

Outside of the English-speaking Internet, translations and synonyms of “troll” do not easily map onto the same set of activities (de Seta, 2013). In comparing itself favorably with Anonymous, another Internet collective, TTKM believes that it is needed to get politicians to toe the line while maintaining a balance of power among different political sides. Ideas for trolling are taken from trending events or crowd-sourced from followers. Politically focused Internet trolls such as TTKM thus facilitate online fellowship among the followers of the page, who then participate indirectly in abetting the act of trolling, creating a tribal complicity, or even mob-like reactions, that casts trolling as collective action.

The most prolific form of trolling involves making content that is controversial above all else, regardless of intent, and that moralizes in order to shape opinions around its targets. In the aftermath of GE14, there was a proliferation of posts across social media that focused on the more personal aspects of politics. The personal here comes in two forms: the first focuses on the character of the person, while the other concerns how a person locates their relationship to the political. In both cases, the personal is emotive rather than revealing, and the person of the troll is less explicit, even if the troll’s actions continue to be disruptive. The question, then, concerns how and what that disruption is supposed to look like. We will look at two examples.

The first is an exposé on the aforementioned Rosmah Mansor, wife of the sixth Prime Minister, Najib Tun Razak, by Azrene, her daughter from a previous marriage. The exposé was posted on Azrene’s Facebook page, with corroboration from her husband (Coconuts KL, 2018a). The post made allegations that would have been equivalent to slander under most circumstances unless evidence could be provided: they ranged from the personal abuse that Rosmah meted out to her daughter and to her husband, Najib, to the practice of black magic, illegal monetary transactions from the state coffers, and the intentional destruction of individuals seen as barriers to the power couple’s interests. The post went viral: not only was it reposted on other individuals’ Facebook pages, but it also made its way to WhatsApp (which was where one of the authors first encountered it) and to Web tabloids. Even if the post came in the form of a personal attack, there was no evidence of doxing, since no actual personal details were revealed. However, the emotional content of the post, written at a time when Malaysians were increasingly angered by the doings of Najib and his cohorts, elicited mostly supportive reactions from a public hungry for negative exposés of their former prime minister and his circle.

Whether set against the wordplay and graphics that drew on trending news and public opinion about Najib, his family, his allies, and their scandals, or against the professionally made videos of the Democratic Action Party (DAP), a party in the Pakatan Harapan coalition, the posting by Azrene could be read as a form of retributory flaming. It did, however, provide fodder for anti-Najib and anti-Rosmah trolls, who ran down any attempt to defend the couple, as could be seen in the sarcastic reactions to Azrene’s half-sister, Nooryana Najwa, considered by those already prejudiced against Rosmah to be the latter’s defender. As news-focused Web sites (which are not necessarily digital newspapers or news sites) had been set up to pull content off each other in their competition for the attention economy, original posts and their accompanying commentaries were recirculated across these Web sites, with no pretense of upholding any kind of journalistic practice or ethics [15].

The second concerns the controversial #UndiRosak campaign, whereby young voters were encouraged either to stay home or spoil the votes they intended to cast as a sign of protest over the perceived lack of political choices. What is interesting about this largely Twitter-driven campaign is the multiple layers of trolling observed:

  1. The #UndiRosak campaigners were effectively trolling, dissuading mostly young, urban voters (especially swing voters and the disenchanted) from casting their votes.
  2. The same purveyors were virulently attacked on social media and various online sites, cast as irresponsible and as pandering to the gerrymandering ways of Najib and his cohorts.
  3. One of the more active #UndiRosak campaigners declared on her personal Twitter account, almost a year after GE14, that she had no regrets about the campaign despite the hostile pushback she had received.

Not all the detractors of this campaign were hostile; some politicians tried to engage with the campaigners and their target voters diplomatically. Syed Saddiq, former Minister of Youth and Sports under the Pakatan Harapan government, used his own personal story to try to dissuade voters from spoiling their votes (KiniTV, 2018). Nonetheless, the campaigners became targets of trolling, with trolls going as far as to photoshop the campaigners’ images onto lewd pages. The prominent female campaigner mentioned above, Maryam Lee, bore the brunt of the type of attack that could be designated as gender-trolling [16]; unlike her male counterparts, she was subjected to gendered insults (“whore”, “slut”). This form of trolling is no different from the two earlier examples relating to an assemblywoman who was publicly sexually harassed (although the harasser did not try to hide his identity) and the aforementioned depiction of Teresa Kok. Gender-trolling is a concerted act of trolling whereby the gender of the target is used to overwhelm, intimidate, and silence. Regardless of the issue the target represents, the focus is on using the target’s gender against her, and women are the main targets of such attacks. Mantilla (2013) provides an extensive explanation of this very specific form of vicious trolling targeted at women speaking out on issues pertinent to gender, though gender-trolling is not confined to such issues. As mentioned, gendered and political identities can also be used by trolls in debates that have nothing to do with gender; this is perhaps the result of misconception, bigotry, and pushback born of the perception that these women must be kept in a subordinate position, or of the desire to silence claims of gendered injustice.

5.2. Meta-trolling in videos by content creators to critique popular political issues

The smartphone generation is also a video-centric generation, and nothing provokes more than a strategically produced video dropped at an opportune time. The cybertroopers of Barisan Nasional (representing the component parties that made up the government before Pakatan Harapan) had long produced such videos, given the resources at their disposal. During the elections, however, Pakatan Harapan (still the Opposition at GE14) began to put effort into crafting videos that appealed not only to their supporters, but also to the undecided who were unhappy with the status quo, as well as to former supporters of the incumbent increasingly dissatisfied with what they saw as the excesses of the ruling class.

These videos deployed everyday images representing familiar Malaysian tropes, and then used humor, tinged with irony (and a measure of sarcasm), to provoke. Although the issues discussed are, for the most part, obvious and already well known, it is the manner of telling that makes the repetition matter. As an example, we consider the three-episode 1MDB Chronicles produced by DAP’s social media team, starring Tony Pua Kiam Wee, the sitting Member of Parliament for an urban constituency, who performed well as an amateur actor in a meme-filled, loud (and almost garish) graphic retelling of all the actors involved in the 1MDB scandal, including how their personal lives had become part of the money-trail narrative; Pua performed the role of an archetypal character culturally familiar to Malaysians. There is nothing particularly cerebral about the video, given that the aim is to provoke voters with even the most basic understanding of the issues featured. In a sense, the purpose of the troll is less about the target than about performing the troll for the spectators. On its own, the video is entertaining, but when set against other equally, if not more, inflammatory materials, it takes on a whole new significance (Democratic Action Party, 2018).

5.3. The theatre of debate: The grey area separating trolling from informational reportage

Really, I could ask a journalist the same thing given the similarities between what trolls and journalists do online. The goals are very much alike: to draw public attention to something (often without a full knowledge of it because that’s not the point) and invoking an emotional response from the pub[lic] that misinterprets things, fails to get the full story, or is deliberately provocative, gets angry or otherwise impassioned responses, right? From where I sit, this is done deliberately to get more attention/page views. Same basic model as trolling. Trolls do it for the lulz. Journalists for a paycheck. (Paulie Socash)

What is seen as disinformation by one party is seen as the clever packaging of subliminal propaganda by another. Having formerly worked in Malaysia’s media industry, one of the authors was privy to how reporters would blindly copy and paste press releases without doing further due diligence; one could therefore consider this to be disinformation spread out of willful laziness. Trolls do not necessarily have to provoke to achieve their objective; rather, they could first sow seeds of discontent through the strategic use of ‘fake news’ and seemingly informative documents that are soon widely circulated through WhatsApp. In other words, the troll becomes a puppet master who uses amplification to create reactions.

The troll’s existence depends crucially on rules, norms, codes of conduct (both codified and implicit or assumed), and etiquette: a ‘theatre of debate’ or a game in which established rules are, by and large, obeyed, even if the outcome might still shock the sensibilities of non-trolls and of the trolls’ targets. The rules provide a kind of infrastructure that enables the troll to thrive, although the troll’s purpose is either to dismantle or disrupt that infrastructure, or to mimic and parody it into absurdity. We can see this in the Encyclopedia Dramatica entry on Malaysia, which took what are perceived to be the rules and norms governing Malaysia and twisted them around in order to shock and disgust: every form of racism, sexism, taboo, and the unspeakable is set forth in ways intended blatantly to disrespect. However, within the offensive rhetoric are attempts to demonstrate, through issues that have become their own memes, the dysfunctionality of a society stripped of its niceties. Therefore, even if the content may seem offensive to those unused to the linguistic extremities of the site, rule-following is still taking place.

Encyclopedia Dramatica is an example of a platform promoting trolling through popularity obtained via notoriety. In the encyclopedia’s entry on Malaysia, the authors appear to have carefully crafted words that may seem unintellectual and profanity-laced, but were probably chosen to maximize impact and draw attention; the choice of words was supplemented with examples from popular memes taken to their extremities with the intent of producing disgust and, therefore, provoking response (Encyclopedia Dramatica, 2015). Trolling loves company. Consider the following statement from the troll, Paulie Socash: “I personally concentrate on making spaces that troll people and where this basic trolling can happen: these usually relate to whatever is sensational in the media. To use a UK example, if you recall [mass shooter] Raoul Moat, I made Facebook pages memorializing him after his death and saying what a great man he was. These drew a lot of angry people who couldn’t believe someone would pay tribute to a murderer. Likewise, I’ve made sites that condemn people for things most are praising them for (I didn’t make it, but the Facebook page ‘Soldiers are not heroes’ is a good example).” One can find similar instances of memorial trolling in the case of Australian military veterans (Online Hate Prevention Institute, 2014), with further examples and analysis given by Whitney Phillips (2011), who argues that such cases are textbook studies in amplification: feedback loops form between trolls and the media coverage responding to the horrific events in each case, increasingly vicious trolling responses amplify one another, and the escalation eventually leads to stronger action from the host platforms.
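The feedback loop just described can be sketched as a toy model. The function, parameters, and intervention threshold below are our own illustrative assumptions, not Phillips’ formalism or a measured model of any actual incident.

```python
# A toy model (illustrative assumptions only) of the amplification feedback loop:
# media coverage boosts trolling, trolling attracts more coverage, until the
# combined attention crosses a hypothetical threshold at which the host platform
# intervenes (pages removed, accounts suspended).
def simulate_amplification(steps=10, coverage_gain=0.6, troll_gain=0.8, moderation_threshold=50.0):
    trolling, coverage = 1.0, 1.0
    for step in range(1, steps + 1):
        coverage += coverage_gain * trolling   # reporting responds to troll activity
        trolling += troll_gain * coverage      # coverage recruits and energizes more trolls
        print(f"step {step}: trolling={trolling:.1f}, coverage={coverage:.1f}")
        if trolling + coverage > moderation_threshold:
            print("platform intervenes: pages removed, accounts suspended")
            break

simulate_amplification()
```

With these assumed gains, activity and coverage reinforce each other for a few rounds before the intervention threshold is reached, which is the escalation-then-crackdown pattern described above.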

These examples illustrate the role that platforms and algorithms play in amplifying drama, the lifeblood of trolls. Drama, often apparent in so-called flame wars, is an underlying logic of trolling. The aim is to involve as many as possible in the drama, or to target specific individuals in as dramatic a manner as possible. Such dramatic tensions need not be explicit. Just as often, drama emerges out of the interpretation of supposed ulterior motives, where accusations of treachery, traitorousness, and passive-aggressiveness surface. Similarly, one finds that misfortunes announced through or within a culturally Malay social media community routinely elicit unsolicited religion-based advice from bystanders, even if the manner in which the advice is offered might be perceived as rude and insensitive by those outside the community. Such advice-givers are akin to the aforementioned concern trolls: they might seem sincere but could be feeding off negative emotions and soliciting social media wars. Entirely sincere users giving similar, if misguided, advice allow the concern troll to remain cleverly disguised. Finally, trolls are often quite self-reflexive. They display an awareness of the tropes or styles of a debate and often mimic them in parodic or satirical forms. They might create imaginary characters that reflect back stereotypes, or fashions and styles of speech. Motivations can vary but are often political in nature, even when that politics is explicitly anarchic or ‘for the lulz.’ One such example is the figure of Mia Jalipah, an animated avatar created by TTKM as a talking head who mimics the demeanor of a professional news anchor in order to deliver satirical versions of trending Malaysian political news.

However, trolls are not the only ones trying to make space for themselves on social media platforms. Social media now abound with ‘influencers’ who make their living by purportedly selling their personal brand in alignment with bigger commercial brands, making these influencers the brand ambassadors of choice for marketers seeking a recognizable face or personality to ‘personalise’ a brand or make it more ‘accessible’ to the ordinary people who are the influencers’ fans and followers (Limkangvanmongkol and Abidin, 2019; Bishop, 2019) [17]. Politically motivated trolling, likewise, can only be sustained if there are political influencers to direct attention and traffic. These social influencers are not averse to supporting a political candidate of their choice, as seen in the campaigns of the Philippines’ Duterte and Indonesia’s Jokowi (Lee, 2018).

There are parallels and contrasts between the evolution of influencers and that of trolls. The existence of these ‘micro-celebrities’ points to a serendipitous matching between trends, infrastructural timelines (when particular social media platforms first appeared, and what made some platforms more conducive than others to producing micro-celebrities), and those (namely, the social media influencers) able to exploit these through strategic self-exposure to gain followers, even if not to the point of attracting serious socio-political controversy. While influencers thrive on being well known, the troll usually remains anonymous unless unmasked. Although the influencer may derive financial benefits from their popularity, the troll is more interested in the fame that comes from accomplishing a feat; the troll sees anonymity as a sign of control and power over the online population they may be manipulating, while the inverse is true for the influencer.

Nevertheless, both influencers and trolls are heterogeneous, and their existence cannot be pinned down to a single motive. Occasionally, influencer and troll collide, as in the case of Singaporean social media celebrity Xiaxue (Wendy Cheng), who responded to mean comments compiled by her online host, Clicknetwork. Cheng is one of Singapore’s pioneer influencers and currently one of the most followed individuals in the country, with over one million followers across Instagram, Twitter, YouTube, and Facebook. She gained notoriety for her controversy-courting posts and public conflicts with the rival social media management network Gushcloud (Lim, 2015), the satirical Facebook page SMRT, and rival bloggers, for declaring support for U.S. President Donald Trump, and for engaging in what many perceived to be racist attacks on migrant workers. If these social media influencers gain a particular cult following owing to their positioning within a cultural (or socio-political) particularity that provides them with the desired cultural capital (Abidin and Brown, 2019), the choice of platforms works similarly to amplify the popularity (or notoriety) of influencers and trolls alike. Cheng has continued to shift her focus and sphere of influence, moving from talking primarily about her daily life and fashion preferences to becoming a commentator on motherhood after her child was born. More recently, she announced her intention to apply to be a Nominated Member of Parliament in Singapore (Chia, 2018). She has also extended her political commentary to Malaysia by dressing up as Rosmah Mansor during Halloween of 2018 (Coconuts KL, 2018b).

Trolling can also be seen in what purports to be journalism. In the racially charged politics of Malaysia, ‘mean’ comments and trolls operating in the comment sections of news sites, even paywalled ones, draw on assumptions about others’ identities, if not through the choice of handles then through the users’ words. Of course, it is not impossible that a majority of the commentators are trolls mimicking the stereotype associated with another community, as in the example in Figure 2, making insincere comments, or comments they know would provoke, alongside posting memes. The meme is representative of what often appears in the comment sections of Malaysian news sites. In the aforementioned case of Rosmah’s daughters, certain news sites that circulated their accusations and defense, respectively, injected their own editorial viewpoints into the case (whether the viewpoints were generated by humans or by bots is another matter), as a means of trolling those seen as supportive of Rosmah.

Another Web news site, the Sarawak Report, uses a provocative style of reportage that aims to stir up its readers and the targets of its critique rather than maintain a neutral tone; naturally, the comment sections of its articles tend to be lively. Given the site’s raison d’être, which is to raise awareness of corruption in Sarawak, the intent is to use any legitimate means possible to draw the public’s attention to corruption in the state, although its purview has since extended beyond it. Nevertheless, the site is not above using leaked information relating to political figures in order to provoke admirers of those figures and disenchanted readers alike, as in the recent case involving Malaysia’s Minister of Economic Affairs, Azmin Ali, who was accused of living a lavish lifestyle (Brown, 2019) and who had, at the time of writing, already staged a political coup with his co-conspirators that led to the unravelling of Pakatan Harapan (Brown, 2020).

The parody news site The Tapir Times (a Malaysian site similar to The Onion in the U.S.) specializes in parodying trending issues in Malaysia, political or otherwise. Unlike the Sarawak Report, however, The Tapir Times takes a light and humorous approach to the weighty issues it chooses to cover; even so, the import of the issues selected for coverage is not lost on its readers [18]. While the Sarawak Report and The Tapir Times each has its own way of bringing to bear content on socio-economic and political corruption, another site, World of Buzz, injects editorial commentary, in the manner of a concern troll, into especially curated sensational stories meant to elicit disgust, curiosity, and no small amount of voyeurism. These Web-based news sites are hosted on platforms deploying recommendation systems capable of amplifying emotional reactions and provoking responses instantly. The hermeneutical credibility of a troll derives from its ability to justify its trolling activities by articulating a point of view that seems convincing (Fricker, 2007). For example, the troll may have reason to believe that the target is hiding important information that concerns public safety and welfare; if the troll has either partial or full access to what this information may be, they could choose to troll the target into admitting the subterfuge and releasing the information. The trolled (which could be a government or an organization with public authority), however, may have kept the information classified with the rationale of protecting the public. The continued existence and growing number of trollish pages and accounts in the Malay vernacular demonstrate that they have succeeded in making a space for themselves on platforms where, in terms of absolute numbers, they would previously have been considered inconsequential players.

 

Figure 2: This meme contains references to Malaysia’s socio-political and religious climate, including terms such as Gerak Khas (Special Police Force), lokap (lockup), Kancil (the smallest national car ever produced in Malaysia), Semenanjung Malaysia (Peninsular Malaysia), PDRM (Royal Malaysia Police), lanciau (a vulgar word for penis), budak (kid), and silat (martial arts). The mixing of Malay and English in such trollish memes is common. The language is coarse and threatening and may lead an unsuspecting target to think he or she is at the receiving end of a death threat.

++++++++++

6. Concluding remarks

Understanding political trolling in Malaysia has implications beyond the national. It reveals the relationship between digital platform users, amplification, and algorithms in the context of a postcolonial Asian society, reproducing and mirroring the injustices one might observe in Internet spaces, but doing so through terms established within the Malaysian context. Malaysian Internet political trolls depend on platforms that support and cultivate the practice of trolling, which involves posting, meta-trolling, and the deployment of memes. Such platforms are adopted with local variations but share an underlying infrastructure that is consistent across borders. Platforms underpin the relations between trolls and their targets by spreading the performance and effects of these interactions beyond their immediate networks. Cyber-trooping and astroturfing are acts of flooding, disinformation campaigns designed to influence public perception. The design of platforms intentionally structured to allow for interactions that appear intimate opens windows to the exploitation of human fallibility and vulnerability to disinformation and misinformation. Fellowship, intimacy, and facelessness are not, or not only, relationships generated by the intersubjectivity of users, but features of underlying systems of pseudonymity and anonymity produced by the platform infrastructure. Attention to Internet political trolling must also attend to how all of the above relations occur, and are maintained, through the convoluted ways in which they influence one another. End of article

 

About the authors

Clarissa Ai Ling Lee, Jeffrey Cheah Institute of Southeast Asia, Sunway University.
Corresponding author: clarissal [at] sunway [dot] edu [dot] my

Eric Kerr, Asia Research Institute and Tembusu College, National University of Singapore.
E-mail: eric [dot] kerr [at] nus [dot] edu [dot] sg

 

Notes

1. Pre-digital examples are the outrageous April Fools’ pranks that either managed to convince their targets or co-opted their targets into being party to the prank (Little, 2019), or one that stirred the public imagination in such extreme ways that it became their reality (Schwartz, 2015). The latter story may be familiar to those who have experienced various forms of online gaslighting. Nevertheless, in both cases, the success of the incidents was premised on how successfully their creators built on existing sentiments, drew on the anxieties of a Zeitgeist, or created an alternative still psychologically connected to a familiar world.

2. The authors have explored the different configurations of trolling in a recently published article, where they suggested that trolls make use of exploits to provoke a desired outcome on the digital infrastructures they inhabit (see pp. 6–8; Kerr and Lee, 2019). In referring to parody or satire, we have in mind the work of certain artists who use cartoons as a form of political commentary, such as Fahmi Reza and Zunar. However, their form of baiting is sufficiently explicit that those immersed in the political culture of Malaysia can easily identify the target of their political baiting.

3. Aouragh and Chakravartty, 2016, pp. 562–564.

4. At the time of writing, the Pakatan Harapan government is no more, having been replaced by Perikatan Nasional. The political machinations behind this are not within the scope of this article.

5. We thank the reviewers for drawing our attention to this particular Malaysian group.

6. The same kind of racialized/racist political platform that the Perikatan Nasional runs on.

7. A number of publications discuss the special position of the Malay race, with its specific political and cultural significance within the modern context, such as Gomez (2007). The same volume also contains details on the coalition-based party system that has been practiced in Malaysia since its first general elections. Pakatan Harapan is a coalition that renounces race-based identities in its formation, with two of its biggest component members coming from parties that were meant to move away from race-based identification. However, a number of the coalition’s component parties were still set up on the basis of such identification, despite attempts to downplay the obvious. See Lemière (2018) for a fuller discussion of the fall of Barisan Nasional and the circumstances that allowed Pakatan Harapan to come to power. For an overview of the context of racialized violence that galvanized electoral behavior between 2013 and 2018, considered pivotal years of gradual reversals in national sentiments on racial politics, see Quay (2013).

8. Lim, 2018, p. 478.

9. Examples of such references could be found in the infamous Malaysian public forum, Lowyat.net (https://forum.lowyat.net/topic/4587257/all).

10. In March 2020, a Malay-identified male Twitter user with the handle @NieRezati1973 publicly sexually harassed a Malay state assemblywoman (of Pakatan Harapan, and of the party DAP) by insinuating that she was ‘fuckable from behind’ (the word used was “tonggeng”). The fiasco led to the culprit’s embarrassed daughter apologizing to the state assemblywoman on her father’s behalf. The entire drama played out on the assemblywoman’s Twitter timeline.

11. This is not to say that one will not find Chinese-speaking trolls, who tend to be more focused on sites relating to trading, be it the trade of information (Lowyat.net) or of stocks (the KLSE investor.com), as well as various Facebook pages. However, as our purpose in this article is to focus on how the study of Malaysian political Internet trolling can contribute to our understanding of digital infrastructures as underpinned by platforms and algorithms, the particularities of these different groups are less crucial to our consideration here, even if the content of the trolling reflects the political climate.

12. In recent years, new organizations have emerged to consider Internet policies and the reproduction of real-world injustices within the sphere of the Internet. Among such organizations are GenderIT and Digital Hub Asia, which provide different and intersectional perspectives on Internet communities.

13. https://www.facebook.com/pg/tenteratrollkebangsaanmalaysia/reviews/?referrer=page_recommendations_see_all.

14. Plantin, et al., 2016, p. 294.

15. An example of stories being reproduced across different Web sites is the aforementioned sob story posted by Rosmah’s daughter. A Google search reveals that the story posted on Coconuts KL is similar, save for some editorial injections, to the story that appeared in World of Buzz on 11 May 2018; see Thiagarajan (2018).

16. One could see the enactment of gender-trolling among Malay-speaking netizens in a vicious 2015 trolling incident affecting Aisyah Tajuddin, the female host of a then-popular show called BFM Kupas. In a segment she hosted, she performed a satirical skit on the flood disaster affecting Kelantan, a state on the east coast of Peninsular Malaysia, in relation to the state’s Islamist agenda. The segment took a satirical jab at the state government over its misguided priorities, criticizing it as more concerned with implementing a rigid Islamist regime (the Hudud law) governing crime and morality than with attending to the economic issues affecting the people of one of the poorest states in Malaysia. The video went viral and spiraled into mob lynching by certain segments of the Malay social media community; Aisyah Tajuddin received multiple death and rape threats. Clips of the video that started the backlash, and of the ensuing pandemonium, were featured in a 30-minute Malay-language documentary named Viral Sial. The focus of the documentary was on how mob rule, driven by the thrill of collective cyber ambush, turned a religious disagreement into a collective act of lynching, further fueled by the social media amplification of reactions. See Aliran Admin (2017) for details on the documentary, and Su-Lyn (2015) for a sampling of some of the threats received. To understand the psychological trauma suffered by the victim of such vicious gender-trolling, see the interview with her father (The Star, 2015). The aforementioned Maryam Lee (2017) of #UndiRosak also wrote an op-ed on the issue, finding commonality between her personal experiences and those of Aisyah Tajuddin.

17. Micro-celebrities or “influencers” are a form of opinion formers. Crystal Abidin, who has conducted extensive research on influencers in southeast Asia, refers to them as “everyday, ordinary Internet users who accumulate a relatively large following on blogs and social media through the textual and visual narration of their personal lives and lifestyles, engage with their following in digital and physical spaces, and monetize their following by integrating ‘advertorials’ into their blog or social media posts.” (Abidin, 2015).

18. The publication of that news on Azmin Ali became the starting point of a series of controversies that culminated in his defection from the Pakatan Harapan coalition. He became instrumental in the formation of the present Perikatan Nasional, which was, at the time of writing, balancing on a tightrope, having barely settled the controversial circumstances of its formation before having to deal with the COVID-19 crisis. By the time the Pakatan Harapan government fell, the machinery producing trollish memes was working overtime.

 

References

N. Abdullah and A. Anuar, 2018. “Old politics and new media: Social media and Malaysia’s 2018 elections,” The Diplomat (8 May), at https://thediplomat.com/2018/05/old-politics-and-new-media-social-media-and-malaysias-2018-elections/, accessed 21 May 2020.

C. Abidin, 2015. “Communicative ❤ intimacies: Influencers and perceived interconnectedness,” Ada, number 8.
doi: https://doi.org/10.7264/N3MW2FFG, accessed 21 May 2020.

C. Abidin and M.L. Brown, 2019. “Introduction,” In: C. Abidin and M.L. Brown (editors). Microcelebrity around the globe: Approaches to cultures of Internet fame. Wagon Lane, Bingley: Emerald Publishing, pp. 1–18.
doi: https://doi.org/10.1108/978-1-78756-749-820181001, accessed 21 May 2020.

Aliran Admin, 2017. “Viral, Sial! — Tayangan dokumentari/documentary screening” (14 March), at https://aliran.com/events/viral-sial-tayangan-dokumentaridocumentary-screening/, accessed 21 May 2020.

M. Aouragh and P. Chakravartty, 2016. “Infrastructures of empire: Towards a critical geopolitics of media and information studies,” Media, Culture & Society, volume 38, number 4, pp. 559–575.
doi: https://doi.org/10.1177/0163443716643007, accessed 21 May 2020.

J.P. Barlow, 1996. “A declaration of the independence of cyberspace,” Electronic Frontier Foundation (8 February), at https://www.eff.org/cyberspace-independence, accessed 21 May 2020.

S. Bishop, 2019. “Vlogging parlance: Strategic talking in beauty vlogs,” In: C. Abidin and M.L. Brown (editors). Microcelebrity around the globe: Approaches to cultures of Internet fame. Wagon Lane, Bingley: Emerald Publishing, pp. 21–32.
doi: https://doi.org/10.1108/978-1-78756-749-820181002, accessed 21 May 2020.

C.R. Brown, 2020. “Malaysia’s meltdown moment — The inside story,” Sarawak Report (24 February), at https://www.sarawakreport.org/2020/02/malaysias-meltdown-moment-inside-story/, accessed 21 May 2020.

C.R. Brown, 2019. “Azmin Ali taken to court over unpaid Sandakan travel bill — Exclusive,” Sarawak Report (5 November), at https://www.sarawakreport.org/2019/11/azmin-ali-taken-to-court-over-unpaid-sandakan-travel-bill-exclusive/, accessed 21 May 2020.

L. Cameron and M. Martinez, n.d. “Lurkers, extroverts, and ambassadors: As Facebook upends publishing world with new ranking algorithm, here’s what recent research says about the social giant and engagement,” IEEE Computer Society, at https://www.computer.org/publications/tech-news/trends/ieee-computer-society-research-into-facebook-news-feed-algorithm-social-interaction, accessed 21 May 2020.

J. Cheney-Lippold, 2017. We are data: Algorithms and the making of our digital selves. New York: New York University Press.

L. Chia, 2018. “NMP nominations: 48 proposal forms received by Parliament,” Channel News Asia (6 July), at https://www.channelnewsasia.com/news/singapore/nmp-nominations-48-proposal-forms-received-by-parliament-10506154, accessed 21 May 2020.

S. Chinnasamy and Norain Abdul Manaf, 2018. “Social media as political hatred mode in Malaysia’s 2018 general election,” SHS Web of Conferences, volume 53.
doi: https://doi.org/10.1051/shsconf/20185302005, accessed 21 May 2020.

Coconuts KL, 2018a. “Rosmah’s two daughters take to social media to offer differing words on mother’s arrest” (4 October), at https://coconuts.co/kl/news/rosmahs-two-daughter-take-social-media-offer-differing-words-mothers-arrest/, accessed 21 May 2020.

Coconuts KL, 2018b. “Xiaxue uploads her Rosmah Mansor Halloween makeup tutorial, receives approval from Rosmah’s son-in-law” (19 October), at https://coconuts.co/kl/news/xiaxue-uploads-rosmah-mansor-halloween-makeup-tutorial-receives-approval-rosmahs-son-law-2/, accessed 21 May 2020.

B.A. Coles and M. West, 2016. “Trolling the trolls: Online forum users constructions of the nature and properties of trolling,” Computers in Human Behavior, volume 60, pp. 233–244.
doi: https://doi.org/10.1016/j.chb.2016.02.070, accessed 21 May 2020.

K. Crawford, R. Dobbe, T. Dryer, G. Fried, B. Green, E. Kaziunas, A. Kak, V. Mathur, E. McElroy, A.N. Sanchez, D. Raji, J.L. Rankin, R. Richardson, J. Schultz, S.M. West, and M. Whittaker, 2019. “AI Now 2019 report,” at https://ainowinstitute.org/AI_Now_2019_Report.pdf, accessed 21 May 2020.

G. de Seta, 2013. “Spraying, fishing, looking for trouble: The Chinese Internet and a critical perspective on the concept of trolling,” Fibreculture, number 22, at http://twentytwo.fibreculturejournal.org/fcj-167-spraying-fishing-looking-for-trouble-the-chinese-internet-and-a-critical-perspective-on-the-concept-of-trolling/, accessed 21 May 2020.

J. Dibbell, 1996. “A rape in cyberspace: Or how an evil clown, a Haitian trickster spirit, two wizards, and a cast of dozens turned a database into a society,” In: P. Ludlow (editor). High noon on the electronic frontier: Conceptual issues in cyberspace. Cambridge, Mass.: MIT Press, pp. 375–395.

Democratic Action Party, 2018. “The 1MDB Chronicles. Episode 1: The Donation” (6 May), at https://www.facebook.com/DAPMalaysia/videos/10156260562594192?sfns=mo, accessed 21 May 2020.

Encyclopedia Dramatica, 2015. “Malaysia,” at https://encyclopediadramatica.fyi/index.php/Malaysia, accessed 10 May 2020.

M. Fricker, 2007. Epistemic injustice: Power and the ethics of knowing. Oxford: Oxford University Press.
doi: https://doi.org/10.1093/acprof:oso/9780198237907.001.0001, accessed 21 May 2020.

B.M. Frischmann, 2012. Infrastructure: The social value of shared resources. Oxford: Oxford University Press.
doi: https://doi.org/10.1093/acprof:oso/9780199895656.001.0001, accessed 21 May 2020.

J. Gainous, K.M. Wagner, and J.P. Abbott, 2015. “Civic disobedience: Does Internet use stimulate political unrest in east Asia?” Journal of Information Technology & Politics, volume 12, number 2, pp. 219–236.
doi: https://doi.org/10.1080/19331681.2015.1034909, accessed 21 May 2020.

G. Goggin and M. McLelland, 2009. “Internationalizing Internet studies: Beyond anglophone paradigms,” In: G. Goggin and M. McLelland (editors). Internationalizing Internet studies: Beyond anglophone paradigms. London: Routledge, pp. 3–17.

E.T. Gomez (editor), 2007. Politics in Malaysia: The Malay dimension. London: Routledge.

E. Graham, 2019. “Boundary maintenance and the origins of trolling,” New Media & Society, volume 21, number 9, pp. 2,029–2,047.
doi: https://doi.org/10.1177/1461444819837561, accessed 21 May 2020.

J. Hopkins, 2014. “Cybertroopers and tea parties: Government use of the Internet in Malaysia,” Asian Journal of Communication, volume 24, number 1, pp. 5–24.
doi: https://doi.org/10.1080/01292986.2013.851721, accessed 21 May 2020.

I.N. Ibrahim, 2019. “Najib has found true calling as ‘king of trolls’, Kit Siang suggests,” Malay Mail (20 January), at https://www.malaymail.com/news/malaysia/2019/01/20/najib-found-true-calling-as-king-of-trolls-kit-siang-suggests/1714583, accessed 21 May 2020.

R. Jessy, 2018. “Remember the time Mahathir said Kit Siang was racist?” (6 October), at https://www.thethirdforce.net/video-remember-the-time-mahathir-said-kit-siang-was-racist/, accessed 21 May 2020.

A. Johns, 2020. “‘This will be the WhatsApp election’: Crypto-publics and digital citizenship in Malaysia’s GE14 election,” First Monday, volume 25, number 1, at https://firstmonday.org/article/view/10381/8298, accessed 21 May 2020.
doi: http://dx.doi.org/10.5210/fm.v25i1.10381, accessed 21 May 2020.

A. Johns and N. Cheong, 2019. “Feeling the chill: Bersih 2.0, state censorship, and ‘networked affect’ on Malaysian social media 2012–2018,” Social Media + Society (19 May).
doi: https://doi.org/10.1177/2056305118821801, accessed 21 May 2020.

M.F. Kasmani, 2019. “A political discourse analysis of the twitter posts of @najibrazak prior to 2018 general elections,” SEARCH (Malaysia), volume 11, number 2, pp. 129–143, article number 8, at http://search.taylors.edu.my/arc-11-2.html, accessed 21 May 2020.

E. Kerr and C.A.L. Lee, 2019. “Trolls maintained: Baiting technological infrastructures of informational justice,” Information, Communication & Society (28 May).
doi: https://doi.org/10.1080/1369118X.2019.1623903, accessed 21 May 2020.

KiniTV, 2018. “I was once like you, Syed Saddiq tells #Undirosak backers” (26 January), at https://www.youtube.com/watch?v=k5KB8jQZLEs, accessed 21 May 2020.

C.A.L. Lee, 2018. “New media and politics in southeast Asia: Social media, citizens and the digital revolution” (25 January), at http://jci.edu.my/wp-content/uploads/2018/02/RossTapsell_NewMedia.pdf, accessed 21 May 2020.

M. Lee, 2017. “Gossip sites, I do not forgive you,” Malaysiakini (9 November), at https://www.malaysiakini.com/columns/401306, accessed 21 May 2020.

S. Lemière, 2018. “The downfall of Malaysia’s ruling party,” Journal of Democracy, volume 29, number 4, pp. 114–128.
doi: https://doi.org/10.1353/jod.2018.0067, accessed 21 May 2020.

P.P.Y. Leong, 2019. Malaysian politics in the new media age: Implications on the political communication process. Singapore: Springer Singapore.
doi: https://doi.org/10.1007/978-981-13-8783-8, accessed 21 May 2020.

M. Lim, 2018. “Disciplining dissent: Freedom, control, and digital activism in southeast Asia,” In: R. Padawangi (editor). Routledge handbook of urbanization in southeast Asia. London: Routledge, pp. 478–494.
doi: https://doi.org/10.4324/9781315562889, accessed 21 May 2020.

Y.H. Lim, 2015. “Singtel and Gushcloud say sorry for negative marketing campaign,” Straits Times (17 March), at https://www.straitstimes.com/singapore/singtel-and-gushcloud-say-sorry-for-negative-marketing-campaign, accessed 21 May 2020.

V. Limkangvanmongkol and C. Abidin, 2019. “Net idols and beauty bloggers’ negotiations of race, commerce, and cultural customs: Emergent microcelebrity genres in Thailand,” In: C. Abidin and M.L. Brown (editors). Microcelebrity around the globe: Approaches to cultures of Internet fame. Wagon Lane, Bingley: Emerald Publishing, pp. 95–106.
doi: https://doi.org/10.1108/978-1-78756-749-820181009, accessed 21 May 2020.

D. Linvill and P. Warren, 2019. “That uplifting tweet you just shared? A Russian troll sent it,” Rolling Stone (25 November), at https://www.rollingstone.com/politics/politics-features/russia-troll-2020-election-interference-twitter-916482/, accessed 21 May 2020.

B. Little, 2019. “9 outrageous pranks in history,” History Stories (1 April), at https://www.history.com/news/april-fools-day-greatest-pranks, accessed 21 May 2020.

P. Lo Presti, 2020. “Persons and affordances,” Ecological Psychology, volume 32, number 1, pp. 25–40.
doi: https://doi.org/10.1080/10407413.2019.1689821, accessed 21 May 2020.

Malaysiakini, 2019. “‘King of trolls’ Najib strikes again, seeks ‘permit’ to selfie in Langkawi” (3 February), at https://www.malaysiakini.com/news/462712, accessed 21 May 2020.

K. Mantilla, 2013. “Gendertrolling: Misogyny adapts to new media,” Feminist Studies, volume 39, number 2, pp. 563–570.


A.E. Marwick, 2019. “The algorithmic celebrity: The future of Internet fame and microcelebrity studies,” In: C. Abidin and M.L. Brown (editors). Microcelebrity around the globe: Approaches to cultures of Internet fame. Wagon Lane, Bingley: Emerald Publishing, pp. 161–159.
doi: https://doi.org/10.1108/978-1-78756-749-820181015, accessed 21 May 2020.

R. Meyer, 2016. “Facebook purges journalists, immediately promotes a fake story for 8 hours,” Atlantic (29 August), at http://www.theatlantic.com/technology/archive/2016/08/facebook-steps-in-it/497915/, accessed 13 September 2016.

S. Milano, M. Taddeo, and L. Floridi, 2020. “Recommender systems and their ethical challenges,” AI & Society (27 February).
doi: https://doi.org/10.1007/s00146-020-00950-y, accessed 21 May 2020.

A. Mitchell, J. Kiley, J. Gottfried, and E. Guskin, 2013. “The role of news on Facebook,” Pew Research Center (24 October), at http://www.journalism.org/2013/10/24/the-role-of-news-on-facebook/, accessed 21 May 2020.

M. Murphy, 2016. “After replacing human editors with an algorithm, Facebook ended up with a fake trending news story,” Quartz (29 August), at http://qz.com/769002/after-replacing-human-editors-with-an-algorithm-facebook-ended-up-with-a-fake-trending-news-story/, accessed 21 May 2020.

M.M.N. Nadzri, 2018. “The 14th general election, the fall of Barisan Nasional, and political development in Malaysia, 1957–2018,” Journal of Current Southeast Asian Affairs, volume 37, number 3, pp. 139–171.
doi: https://doi.org/10.1177/186810341803700307, accessed 21 May 2020.

S.U. Noble, 2018. Algorithms of oppression: How search engines reinforce racism. New York: New York University Press.

B. Noveck, 2005. “A democracy of groups,” First Monday, volume 10, number 11, at https://firstmonday.org/article/view/1289/1209, accessed 21 May 2020.
doi: https://doi.org/10.5210/fm.v10i11.1289, accessed 21 May 2020.

C. O’Neil, 2016. Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Crown.

A. Oboler, K. Welsh, and L. Cruz, 2012. “The danger of big data: Social media as computational social science,” First Monday, volume 17, number 7, at https://firstmonday.org/article/view/3993/3269, accessed 21 May 2020.
doi: https://doi.org/10.5210/fm.v17i7.3993, accessed 21 May 2020.

J.C. Ong and J.V.A. Cabanes, 2018. “Architects of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines,” Newton Tech4dev Network, University of Leeds, at https://newtontechfordev.com/wp-content/uploads/2018/02/ARCHITECTS-OF-NETWORKED-DISINFORMATION-FULL-REPORT.pdf, accessed 21 May 2020.

Online Hate Prevention Institute, 2014. “Hate against military servicemen,” (12 October), at https://ohpi.org.au/hate-against-military-servicemen/, accessed 21 May 2020.

J. Osofsky, 2016. “Information about trending topics,” Facebook Newsroom (12 May), at https://about.fb.com/news/2016/05/information-about-trending-topics/, accessed 21 May 2020.

A. Petri, 2014. “Enter the concern troll,” Washington Post (14 January), at https://www.washingtonpost.com/blogs/compost/wp/2014/01/13/enter-the-concern-troll/, accessed 21 May 2020.

W. Phillips, 2018. “The oxygen of amplification: Better practices for reporting on extremists, antagonists, and manipulators,” at https://datasociety.net/wp-content/uploads/2018/05/FULLREPORT_Oxygen_of_Amplification_DS.pdf, accessed 21 May 2020.

W. Phillips, 2011. “LOLing at tragedy: Facebook trolls, memorial pages and resistance to grief online,” First Monday, volume 16, number 12, at https://firstmonday.org/article/view/3168/3115, accessed 21 May 2020.
doi: https://doi.org/10.5210/fm.v16i12.3168, accessed 21 May 2020.

J.-C. Plantin, C. Lagoze, P.N. Edwards, and C. Sandvig, 2016. “Infrastructure studies meet platform studies in the age of Google and Facebook,” New Media & Society, volume 20, number 1, pp. 293–310.
doi: https://doi.org/10.1177/1461444816661553, accessed 21 May 2020.

P. Polack, 2019. “AI discourse in policing criticisms of algorithms,” at https://eventalaesthetics.net/wp-content/uploads/2019/11/EAV8_2019_Polack_Policing_Algorithms_57_92.pdf, accessed 21 May 2020.

C.H.L. Quay, 2013. “Ways of seeing Malaysia — Deconstructing demographic violence,” New Mandala (17 May), at https://www.newmandala.org/ways-of-seeing-malaysia-deconstructing-demographic-violence/, accessed 21 May 2020.

M.E. Roberts, 2018. Censored: Distraction and diversion inside China’s great firewall. Princeton, N.J.: Princeton University Press.

N. Saat, 2018. “Commentary: Malaysia’s anti-ICERD rally a reality check for Pakatan Harapan,” Channel News Asia (16 December), at https://www.channelnewsasia.com/news/commentary/icerd-rally-malaysia-kuala-lumpur-race-mahathir-bersatu-11031060, accessed 21 May 2020.

J. Sombatpoonsiri, 2018. “Rethinking civil resistance in the face of rightwing populism: A theoretical inquiry,” Journal of Peacebuilding & Development, volume 13, number 3, pp. 7–22.
doi: https://doi.org/10.1080/15423166.2018.1496028, accessed 21 May 2020.

B. Schneier, 2019. “8 ways to stay ahead of influence operations,” Foreign Policy (12 August), at https://foreignpolicy.com/2019/08/12/8-ways-to-stay-ahead-of-influence-operations/, accessed 21 May 2020.

A.B. Schwartz, 2015. “The infamous ‘War of the Worlds’ radio broadcast was a magnificent fluke,” Smithsonian (6 May), at https://www.smithsonianmag.com/history/infamous-war-worlds-radio-broadcast-was-magnificent-fluke-180955180/, accessed 21 May 2020.

I. Sholihyn, 2020. “Internet blooms with political memes in midst of PM Mahathir’s shock resignation,” Asia One (24 February), at https://www.asiaone.com/digital/internet-blooms-political-memes-midst-pm-mahathirs-shock-resignation, accessed 21 May 2020.

L.M. Smith, 2019. “Malaysia passes law to scrap anti-fake news act — After debate which saw bickering and ‘sexist’ remark made,” Business Insider (10 October), at https://www.businessinsider.my/malaysia-passes-law-to-scrap-anti-fake-news-act-after-debate-which-saw-bickering-and-sexist-remark-made/, accessed 21 May 2020.

The Star, 2015. “Dad of Hudud video presenter: Are we safe to raise our children?” (20 March), at https://www.thestar.com.my/news/nation/2015/03/20/father-of-hudud-video-presenter-writes-of-safety-of-children, accessed 21 May 2020.

Statcounter, 2020. “Social media stats Malaysia,” at https://gs.statcounter.com/social-media-stats/all/malaysia, accessed 21 May 2020.

B. Su-Lyn, 2015. “BFM journalist gets death, rape threats over video questioning hudud,” Malay Mail (20 March), at https://www.malaymail.com/news/malaysia/2015/03/20/bfm-journalist-gets-death-rape-threats-over-video-questioning-hudud/863097, accessed 21 May 2020.

R. Tapsell, 2018. “The smartphone as the ‘weapon of the weak’: Assessing the role of communication technologies in Malaysia’s regime change,” Journal of Current Southeast Asian Affairs, volume 37, number 3, pp. 9–29.
doi: https://doi.org/10.1177/186810341803700302, accessed 21 May 2020.

T. Thiagarajan, 2018. “Rosmah’s daughter exposes disturbing deets about allegedly abusive, selfish & greedy mother,” World of Buzz (11 May), at https://worldofbuzz.com/rosmahs-daughter-exposes-disturbing-deets-about-allegedly-abusive-selfish-greedy-mother/, accessed 21 May 2020.

C. White, 2013. “Why do Internet trolls exist?” Mashable (10 February), at https://mashable.com/2013/02/10/internet-trolls/, accessed 21 May 2020.

 


Editorial history

Received 28 March 2020; revised 11 May 2020; accepted 20 May 2020.


Copyright © 2020, Clarissa Ai Ling Lee and Eric Kerr. All Rights Reserved.

Trolls at the polls: What cyberharassment, online political activism, and baiting algorithms can show us about the rise and fall of Pakatan Harapan (May 2018–February 2020)
by Clarissa Ai Ling Lee and Eric Kerr.
First Monday, Volume 25, Number 6 - 1 June 2020
https://firstmonday.org/ojs/index.php/fm/article/download/10704/9551
doi: http://dx.doi.org/10.5210/fm.v25i6.10704