
When the machine hails you, do you turn? Media orientations and the constitution of digital space
by Nelanthi Hewa



Abstract
Machines have gone by many names, both in and outside of media theories. They have been called tools, prosthetics, auxiliary organs, and more. This paper explores what happens when we think of media as orientating devices. Sara Ahmed (2006) attends to the way orientations — sexual orientations, but also orientations as ways of being in the world more generally — come to be, and come to be felt on the body. Though Ahmed does not speak of media specifically, her queer phenomenology offers new ways of thinking about media. Media can be thought of as devices that orient, and that turn the body in one direction and away from another. Indeed, a media phenomenology is particularly useful in grounding both the body in media and the media’s felt effects on the body. As scholars increasingly stress, the language used to describe media often obfuscates their materiality, with words like “virtual” or even “Web” concealing the material realities of digital networks. Beyond the materiality of media themselves, however, a phenomenology of media attends to the relationship between media and the bodies that turn to — and are turned by — them.

Contents

I. The orientating device
II. Extending theory; theories of extension
III. Public spaces and exposure
IV. Private spaces and exclusion
V. Orientations and deliberate design
VI. Endings, openings

 


 

I. The orientating device

Instead, we hear the hail, and even feel its force on the surface of the skin, but we do not turn around, even when those words are directed at us. Having not turned around, who knows where we might turn.
— Sara Ahmed, Queer phenomenology

Machines have gone by many names, both in and outside of media theories. They have been called tools, prosthetics (McLuhan, 1964), infrastructure (Peters, 2015), and more. This paper explores what happens when we think of media as orientating devices. In Queer phenomenology, Sara Ahmed (2006) attends to the way orientations — sexual orientations, but also orientations as ways of being in the world more generally — come to be, and come to be felt on the body. Though Ahmed does not speak of media specifically, her queer phenomenology offers new ways of thinking about media. Media can be thought of as devices that orient, and that turn the body in one direction and away from another. Indeed, a media phenomenology is particularly useful in grounding both the body in media and the media’s felt effects on the body. As scholars increasingly stress, the language used to describe media often obfuscates their materiality, with words like “virtual” or even “Web” concealing the material realities of digital networks. Wendy Chun notes that the concept of a “network” itself has become increasingly malleable even as it extends a neoliberal imaginary of “nodes and edges.” [1] Under the ethereal blanket of the linguistic “cloud” are the electrical circuits that are moved, the resources that are mined, the human labourers who sweat, and the environmental degradation that follows in the wake of technological expansion (Hu, 2015; Hogan, 2018). Beyond the materiality of media themselves, however, a phenomenology of media attends to the ways that media, and their orientations, matter: contrary to the notion of “actors” and “acted upon,” bodies and media can hardly be so neatly extricated from one another. Bodies become oriented towards some objects through repetition — as a typist is turned towards a keyboard — and take on a certain shape through being oriented — as a typist develops a sore back [2]. Bodies are thus directly implicated in orientations in how they are turned and strained, but also, as I will explore, in which bodies come to be familiar and “at-hand.”

Since phenomenologists have often found tables to be generative places to begin (Husserl, 1982; Heidegger, 1962; Merleau-Ponty, 1962), we might follow in their stead (particularly since a table, like Twitter, is also a platform). As Ahmed (2006) argues, when we are oriented towards the table, we are also oriented away from what is behind the table, so that to queer one’s orientation would be to ask: who can sit at the head of the table? Who is under it? Who keeps it clean, and who is bereft of a seat? Ahmed’s phenomenology, as a materialist media theory, can prompt us to ask similar questions of the media and platforms at which we sit. When we use media, how do they orient us? Or, when media call to us, how are we expected to respond? A user who turns towards a machine’s hail might be its ideal user, one who fits comfortably into its contours. In being hailed by media, however, what comes to be the “background” or the private? Studying media by looking at their orientations allows us, I argue, to see which bodies are consigned to private and shadowy places from which we are turned away. Media in their orientations are thus central to how public spaces come to be manifest online. The bodies that fade into the background might fade in order to keep digital spaces “safe,” “clean,” and “public”; equally, those bodies may be the very ones that do the invisible work of cleaning. Indeed, this kind of “turning away” is hardly new: Saskia Sassen (1991) has explored the way low-paid, often immigrant women perform the work of cleaning and maintenance that enables the greater mobility of others in the so-called “new economy.” Like the middle-class houses that, Sassen notes, are hardly private spaces of leisure but places of work for women of colour, media can construct spaces that mirror the hierarchies of the respectable family: a public, masculine front that is fit for visitors, and a back that is the confines of dark feminized labour. Here, the feminized can mean not just women but the queer, racialized, working class, and otherwise non-normative bodies that make up the background of digital platforms. By examining how media turn our bodies both away from and towards certain bodies, both extend certain bodies and turn away others, I seek to bring different feminist conversations in technology together, and to explore the ghosts in the machines of digital media.

 

++++++++++

II. Extending theory; theories of extension

When media are called tools, it is typically to evoke that they are lifeless objects whose use and usefulness are limited only by the creativity of their users. If a tool is sharpened and cuts someone’s face, if someone is doxxed or their data is quietly harvested and auctioned off, that harm is “incidental” (Weigel, 2018). Of course, a number of media theorists have resisted the ontology of the tool, not least Marshall McLuhan (1964), whose work articulated the ways that media extend the senses of his titular Man. Subsequent feminist and anticolonial readings of McLuhan have proved fruitful in extending his thinking into spaces he would not have thought to go: in “Exit and the Extensions of Man,” Sarah Sharma (2017) reads against the grain of McLuhan’s media theory in order to produce a feminist materialist media analysis. Her re-reading calls attention to the specificity of the body that media extend. It is the white, male body that has been extended, an argument echoed by Armond Towns (2019) when he argues that McLuhan’s “man” is no universal “human” but the white Western Man, one who is always already racialized and gendered.

What a theory of media orientations can add to this existing McLuhanist scholarship on extensions is to help scholars think about how media not only extend the white male body but also orient all bodies in space. It allows us to think through how bodies respond to the straightening pressure of media to be oriented in a certain way, and what happens to those bodies that are not extended but are still very much present in media. Like the work done at a table, what we do with media is not incidental but essential to the nature of the media themselves (Heidegger, 1999; Ahmed, 2006). A table, by virtue of the activities done upon and because of it, is not simply a flat surface or dumb object but essentially a writing table or a dining table. A table is thus a “tool, ... something we do something with,” [3] yet it also allows bodies to do certain things and not others. A writing table extends a “writing body.” [4] Tools, like objects and spaces, extend certain bodies and not others. Seen this way, tools are not simply objects that we do things with but also objects that do things to us. To see media and digital networks as tools in this sense is to see them as not simply objects in a room but as orientating devices that turn us towards particular ways of being and doing. To recast the machine not as a tool or as a prosthetic is to rethink the machine’s ontology.

In their twinned use of “extension,” McLuhan and Ahmed find themselves on perhaps unexpected common ground. Ahmed’s phenomenology reminds us that “spaces are not exterior to bodies; instead, spaces are like a second skin that unfolds in the folds of the body.” [5] If media are prostheses, then they are not one-size-fits-all. Following Ahmed, I argue that digital spaces, like “physical” spaces, are not exterior to bodies. Instead, media direct us in some ways and not in others. For those bodies that media comfortably extend, the given path of use is smooth and often largely imperceptible — it may read not as a narrow path at all but as an open field. In Louis Althusser’s (2006) famous account of the policeman’s hail, the subject is hailed by the police and turns, recognizing themselves in the “you” of “hey you!” He comments, perhaps innocuously, that “nine times out of ten” the individual hailed is the correct one, and turns [6]. Yet what of the tenth, the one who fails to turn? In the sea of media tracking, extracting, and recording us, the people being hailed by the policeman are “now patterns of life extracted” from data [7]. As Hito Steyerl (2019) argues, the task of pattern finding, of sifting out the noise from the signal or “cleaning” data, is the new interpellation. For those bodies that resist orientation, that are deemed “dirty,” media leave a disorientating feeling in their wake; we encounter these machines differently and are slow to turn when hailed, if we turn at all. Indeed, such a resistance or failure to be oriented correctly is what allows us to be cognizant of the orienting pressure at all. In spatial terms, we might say that the size of a room becomes most apparent when we are pressed against the walls. Similarly, only by looking to how some bodies don’t fit the shape of the media prosthetic can we see the orientating force of media at all. If McLuhan’s Understanding media, as Jody Berland writes, “elaborates and illustrates the idea that each medium absorbs and extends our body and our attention in specific ways,” [8] Ahmed reminds us that this “our” is a pluralistic pronoun. For some bodies, the extension is a comfortable fit. For others, it is a straightening pressure, which brings with it a sense of fatigue. As Ahmed writes, privilege is an “energy saving device.” [9] Media might extend some bodies and, by extending forms of privilege, deplete others — though, importantly, this depletion is not a foreclosure. As I discuss further at the end, that technologies orient us is not to say that we cannot nevertheless engage in what Fritsch, et al., in their discussion of crip technoscience, call “a kind of critical world-building.” [10]

 

++++++++++

III. Public spaces and exposure

Though Ahmed’s phenomenology opens up space for thinking of media in a variety of ways, I have chosen to narrow my focus to the ways that notions of publicity and privacy come to be constituted online. In his seminal essay, “The public sphere,” Jürgen Habermas (1974) contrasts the public sphere, to which by definition all citizens have access, with both the private sphere and the state more generally. The public sphere can thus be understood as the generally acceptable or inoffensive. But as Black feminist writers like Brittney Cooper have explored, Black women routinely engage in a “culture of dissemblance” [11] by seeming to share while in fact hiding the personal and the private. Unlike the idealized bourgeois subject of Habermas’ public sphere, racialized subjects do not have the privilege of a neat separation of public and private lives, instead being subject to police violence at intersections as well as the intersectional clashes of personal identity and public politics more broadly. Feelings of depletion might occur when we feel our bodies are not welcome in ostensibly public and publicly available spaces. By looking at the public, it becomes more easily apparent which bodies media have extended, and which they have relegated to the background. Furthermore, because conversations of private and public spheres often become debates over representation and participation, an orientation-minded media theory is useful. Rather than considering which bodies have been represented and locating the teleology of struggle in more (and therefore better) representation, the question becomes instead who is invisible in the very orientation of the machine. The locus of power is found in the technology itself; media are not incidental. Making facial recognition software more capable of identifying Black faces will not change the technology’s current orientation towards policing and surveillance (Pasquale, 2018). Having more bodies of colour will only mean more bodies in shadow.

Looking specifically at how private and public spaces are formed online, we can see media extend certain bodies in the digital public sphere, while others (those that are queer, racialized, or otherwise non-normative) are coded as private. By and large, digital networks orient users towards going public, as they exalt notions of openness and inclusion. This attitude is typified by Facebook founder Mark Zuckerberg’s exhortations towards disclosure, particularly as he has repeatedly invoked transparency in order to defend Facebook’s privacy policies (Gregg, 2013). As Tarleton Gillespie has argued, algorithms function as a “technology of the self” [12] that assures us that we do indeed exist in public. Invisible yet profoundly perceptible, algorithms and digital networks more broadly seem to work unproblematically and normalize the information they present as “‘right,’ as right as [their] results appear to be.” [13] Algorithms thus not only affirm that we exist but bring the public sphere into being in the process. For some, however, to appear in a search engine’s results is to receive reassuring proof of being; for others, the results are chilling. As Safiya Noble (2018) discusses in Algorithms of oppression, Google’s results when she searched for Black women and girls were hardly affirming. Noble’s revelation that, before undertaking the project, she had learned to take these results as a given, that she felt she had entered a tacit bargain with media and had allowed herself to be turned by them too, is a powerful demonstration of how often media’s orientations remain invisible even to those most harmed by them. To be brought into existence in the public sphere can be painful, just as media’s orientations towards disclosure can become a demand. This orientation can be a straightening pressure that says to be online is to have nothing to hide. If transparency is an assurance of trustworthiness, then a failure to be visible, or a desire to be opaque, is a tacit admission of deception. Media thus orient us towards openness; they are, in Wendy Chun’s words, “leaky,” [14] and they demand that we be leaky too. The consequences of this are as material, and as gendered, as the technologies themselves.

A timeline, as an example: in 2000, an opinion article is published in a small newspaper in British Columbia, Canada, describing an incident of harassment between the anonymous writer and Justin Trudeau, at the time known only as the son of a past prime minister. In 2018, a tweet of the op-ed linking it to the #MeToo movement goes viral. The orientation of digital media towards openness and disclosure is immediately and starkly evident in the ensuing flurry. As days pass and the unnamed writer refuses to go on the record, their refusal to leak becomes by extension proof of their deceit. The writer’s silence, their opacity, is a sign of shame. Although they might seem to be at the centre of the controversy, media users are in many ways oriented away from them, and away from their desires. It is certainly not the writer’s body that is extended into this digital space, as they fail to turn to the machine’s hail or react appropriately by going public. Finally, they are tracked down. Their name is printed. They become public (Tasker and Laventure, 2018). This final disclosure, however, is nonetheless a failure to orient oneself appropriately because it is too slow. Once you have refused to go public, any future step into the spotlight is too reluctant to read as satisfactorily transparent.

In their exhortation towards transparency, digital media demand that we have nothing to hide. Such an orientation towards openness does not affect all bodies equally. To put online and to make public queer activities, queer gatherings, and/or queer participants carries certain, sometimes fatal, risks. Indeed, to think that there might be digital content that is “nominally public and not-for-public-use” [15] is to adopt a queer orientation towards digital spaces, and to refuse to fall in line with media’s demand for openness. Thinking again of the way that bodies are implicated in orientations, we might think of transparency as having a certain embodied quality to it, a thinness of being or lack of barrier between oneself and others. For some, such lack of barriers can be dangerous. To remain anonymous is to be a liar, but to be public is to risk harassment and threats of violence. For women, people of colour, LGBTQ+ people, and others, trolling can be a particularly effective straightening device, a way of exerting pressure on those who are unwelcome and erasing “their bodies from public dialogue and spaces.” [16] One becomes afraid to see oneself online.

 

++++++++++

IV. Private spaces and exclusion

Media’s orientation towards publicity, however, can equally be one of creating public spaces through exclusion. Like pushes towards exposure, such exclusions occur on intersectional lines. Wendy Chun’s (2016) investigation of the ways that an ideology of transparency and openness results in the enclosure and closing out of certain bodies helps inform a media phenomenology: the public and the private can be seen as mutually constituted “spheres” in which certain bodies are extended and others are straightened away. Chun observes that the logic of the public space is “perverse: in order to make the public space safe for middle-class women, we need to remove both these women and ‘dubious’ men from these spaces, so that public spaces become the reserve of privileged men.” [17] Similarly, calls for greater participation online that are in line with a belief that greater and more complete openness is democratic often paradoxically entail telling those who fear harassment to simply be quiet. Participatory logic thus falls in line with a larger orientation towards public spaces as essentially masculine, white, cis, and heterosexual (Vickery, 2018). By locating the issue of harassment as a technological one, moreover, a theory of media orientations highlights that more participation will do little to change how and where media turn the body. Though sexual orientation might seem a far cry from the labour of removing offensive content from digital platforms, a platform’s orientation functions in much the same way as compulsory heterosexuality in its orientation away from queerness.

Indeed, this is literalized in the removal of queer bodies from digital space. In conjunction with constructing masculine public spaces through the threat of violence (verbal or otherwise), the hetero-patriarchal logics at the heart of everyday digital technologies qualify certain bodies as private — or more specifically, as unsafe for public viewing — while others are acceptable. Instagram’s policies on appropriate content are prime examples of media’s orientation towards the public, as parts like “female nipples” are explicitly named as falling under the category of sexually suggestive content, while the nipples of other gendered bodies go unnamed (Instagram, 2020). Algorithmic content policing is insidious precisely because it is both invisible and deliberate; as “female” nipples are systematically removed from the digital public sphere, these illicit bodies and body parts do not even leave empty spaces online to which users could point. The digital policies of platforms like Instagram, Twitch, or YouTube that deem what is appropriate content serve as one kind of orientating force or “straightening device.” As they straighten certain kinds of bodies out of the public sphere, they orient users towards an understanding of what the public sphere means and who deserves to be there. Again, media orientations are felt as an embodied experience: a platform’s hail is a mere call to some users, but it is a painful blow to others.

Though much of the language around them mystifies the embodied and material spaces that they are, digital spaces are nevertheless ones in which certain bodies are called in, and others are turned away, marked as illicit or inappropriate. As feminist newsletter Salty (2019) found, queer users and people of colour are policed at a higher rate on Instagram for supposedly “inappropriate content.” Accusations against Instagram of “shadow banning,” or algorithmic policing that makes certain posts less likely to appear in followers’ feeds, equally demonstrate the ways that media platforms extend certain bodies into the public sphere (Joseph, 2019). Shadow banning provides a particularly evocative example of the ways that media can function as orientating devices in ways that are entirely opaque to the user. Like those of other platforms, Instagram’s algorithms are proprietary, leaving users, journalists, and academics to merely speculate as to their functioning, or to look at their effects in order to extrapolate the algorithms from there (Noble, 2018). Users who rely on platforms for monetary gain are particularly vulnerable to their underlying orientations, as Instagram influencers have attempted to game the system by trying to perceive its algorithms’ underlying logics and follow them accordingly (Cotter, 2019). Though the existence of shadow banning remains debated, even the fear of having one’s content quietly removed from followers’ feeds materially shapes how users interact with the platform, down to the hashtags used or content posted. In response to (perceived) algorithmic control, users may refrain from posting “sexually suggestive” content, diversify the hashtags they use so as not to appear as spam, or screen followers in an attempt to limit the possibility of being reported [18].

Within these public platforms on which users are shown the content they “care about the most,” the bodies that are near-at-hand are white, cis, and “appropriately clothed.” Viewing media as orientating devices, then, helps reveal how they structure not only how they are used but also how broader concepts — like public and private, appropriate and inappropriate — are materially shaped by media and our interactions with them. As tools, shadow banning and the discourse platforms deploy to denigrate users who “game the system” point to the “ideal of being seen” [19] that governs so much of digital media. The invisible yet profoundly perceptible logics of algorithms, and of media more broadly, function in much the same way as the heteronormative, patriarchal, and racist logics that Ahmed (2006) explores in Queer phenomenology, and I argue that media cannot be easily separated from these logics themselves, particularly as women and people of colour find themselves bearing the brunt of such orientating pressures. As a consequence that is felt but never openly acknowledged by platforms, shadow banning does indeed reduce such bodies to shadows.

 

++++++++++

V. Orientations and deliberate design

Scholars have pointed to the ways that content moderation invisibly fixes the bounds of acceptable speech (Gillespie, 2018; Myers West, 2018), orienting us towards particular ways of being and speaking. Platforms proudly highlight the content that they make available, but rarely mention the content that they remove, or how and when they intervene (Gillespie, 2018). Content moderation, however, is often enforced not solely by algorithms but by working human hands. As an embodied space, a digital forum — or a comment section on a video — functions much like a kitchen table. As Ahmed writes, “[b]eing orientated toward the writing table not only relegates other rooms in the house to the background, but also might depend on the work done to keep the desk clear [emphasis in original].” [20] Orientations can extend, but they can also obscure. Like the gendered labour of keeping the table clean, usable, and useful, the work of clearing the platforms of digital debris often goes unnoticed. Likewise, the one who does the work of ensuring that the writing-table remains a space of intellectual labour, and that no dishes are placed there instead, does necessary work that maintains the writing-table’s ontology. Sustaining an orientation “might depend on such work, while it erases the signs of that work, as signs of dependence.” [21] What orientation does a YouTube comment section, for example, sustain? Perhaps it orients us towards conversation and assures us that civil discourse and deliberation are possible, and that offensive comments can always be flagged and deleted. A comment section is an open space, perhaps the ultimate Habermasian public sphere. Data remains online for years, assuring us that such moments of speech will and can live on forever. YouTube describes itself as a space of “original content” that you can “share,” all of which participates in an orientation towards publicity and openness. And yet, like the writing table, shadowy labour is undertaken to keep the surface clean.

Commercial content moderators are a particularly evocative example of the poorly paid digital janitors who do the dirty work of keeping the Internet clean (Roberts, 2019). While algorithms and the Internet more broadly may seem to simply work, they work only because of the labourers who screen thousands of pieces of content a day. Like tables, digital spaces do not simply happen to be clean; instead, much necessary but invisible work is needed to make and keep them that way. Commercial content moderation, like feminized housework, must be invisible, for the orientation that digital media demand of us can be sustained only if digital spaces seem naturally public and essentially “safe.” Like compulsory heterosexuality, the orientation of the Internet towards openness seems entirely natural. Turning towards the shadowy bodies that are not extended but nevertheless undergird and make possible the conditions of digital media, then, is not “simply” an ethical project but one that contests the logics of digital media themselves. Otherwise stated, an orientation away from the marginalized and feminized work that sustains digital media is not a side effect of digital platforms but fundamentally necessary to the success of these platforms, and to the ideology that such platforms and spaces come to be all on their own. Similarly, media orientations towards automation and seemingly mechanized work orient us away from the invisible and unpaid or poorly paid work that in fact takes place (Taylor, 2018). The lie of automation obfuscates real human labour under the futuristic veneer of robotic work. Amazon’s slogan for its Mechanical Turk workers, “artificial artificial intelligence,” is almost surprising in how self-aware it is: the slogan glibly acknowledges that an orientation away from its workers is by design.

There are no visible lacunae in which flagged content once sat, and no credit for the artificial “tools” that make up a crowdsourced final product. Digital janitors leave behind no mops, no brooms, no streaks on the glass. These material and deliberate actions of shaping the Internet are instead disguised, so that the status of the Internet as a public forum, and of its content as acceptable, comes to seem natural and unquestioned.

 

++++++++++

VI. Endings, openings

While I have focused on digital media’s orientations, particularly in relation to the shaping of public and private spaces, Ahmed’s phenomenology remains widely applicable to media more generally, and offers new ways of thinking about how media orient the bodies of their users. A theory of orientation reminds us that to be oriented is a physical experience, not a prosthetic extension but a physical turning both to and away. Moreover, a phenomenology of media allows us to see whose bodies we are turned away from, and what potentials reside in not being seen by the machine. While the feeling of being oblique in a straight room — or brown in a white room — is uncomfortable, Ahmed and a long line of feminist and decolonial scholars have pointed to the new forms of activism and possibility that come into view when we notice how our body has been held up.

Having media oriented towards you and being seen and see-able on a platform bring with them a sense of power, but a platform is a limited space. Like the table, a platform is a space that will always have people beneath and behind it. If a platform is one that “demands exclusive and normative spatial arrangements in the name of resistance,” [22] that demands forms of refusal that are legible to the machine, then we must ask what we can do in the shadows. If media have orientations, then simply having more people of colour represented in media, or more women on Instagram, will do little to change the machine’s orientation. Some bodies will remain in the background. Machines are hardly tools that can be used by anyone; rather, in the very moment of use, media already matter. If media straighten, obscure, orient us away from those that supply the very conditions of possibility for them to function, then perhaps we must orient ourselves towards the shadows instead, engaging in what Steyerl evocatively calls “defiant apophenia” [23] in the face of machines’ orientations and pattern recognition. Adapting and adopting Ahmed’s call to allow queer orientations to “gather” at the end of Queer phenomenology, I resist calling for more women of colour, more queer bodies, more diversity in those digital gathering spaces. Instead, I ask: what might it mean to be hailed by the machine, and turn away?

 

About the author

Nelanthi Hewa is a Ph.D. student at the University of Toronto’s Faculty of Information.
E-mail: nelanthi [dot] hewa [at] mail [dot] utoronto [dot] ca

 

Notes

1. Chun, 2016, p. 26.

2. Ahmed, 2010, p. 247.

3. Ahmed, 2006, p. 45.

4. Ahmed, 2006, p. 46.

5. Ahmed, 2006, p. 9.

6. Althusser, 2006, p. 86.

7. Steyerl, 2019, p. 2.

8. Berland, 2019, p. 570.

9. Ahmed, 2013, paragraph 8.

10. Fritsch, et al., 2019, p. 4.

11. Cooper, 2018, p. 101.

12. Gillespie, 2014, p. 186.

13. Gillespie, 2014, p. 188.

14. Chun, 2016, p. 12.

15. Cowan and Rault, 2018, p. 131.

16. Vickery, 2018, p. 42.

17. Chun, 2016, p. 159.

18. Cotter, 2019, p. 904; Olszanowski, 2014.

19. Petre, et al., 2019, p. 2.

20. Ahmed, 2006, p. 30.

21. Ahmed, 2006, p. 31.

22. Singh and Sharma, 2019, p. 302.

23. Steyerl, 2019, p. 17.

 

References

Sara Ahmed, 2013. “Feeling depleted?” feministkilljoys (17 November), at https://feministkilljoys.com/2013/11/17/feeling-depleted/, accessed 26 January 2021.

Sara Ahmed, 2010. “Orientations matter,” In: Diana Coole and Samantha Frost (editors). New materialisms: Ontology, agency, and politics. Durham, N.C.: Duke University Press, pp. 234–257.

Sara Ahmed, 2006. Queer phenomenology: Orientations, objects, others. Durham, N.C.: Duke University Press.

Louis Althusser, 2006. “Ideology and ideological state apparatuses (notes towards an investigation),” In: Meenakshi Gigi Durham and Douglas M. Kellner (editors). Media and cultural studies: Keyworks. Revised edition. Malden, Mass.: Blackwell, pp. 79–87.

Jody Berland, 2019. “McLuhan and posthumanism: Extending the techno-animal embrace,” Canadian Journal of Communication, volume 44, number 4, pp. 567–584.
doi: https://doi.org/10.22230/cjc.2019v44n4a3725, accessed 26 January 2021.

Wendy Hui Kyong Chun, 2016. Updating to remain the same: Habitual new media. Cambridge, Mass.: MIT Press.

Brittney Cooper, 2018. “Bag lady,” In: Brittney Cooper. Eloquent rage: A Black feminist discovers her superpower. New York: St. Martin’s Press, pp. 99–124.

T.L. Cowan and Jasmine Rault, 2018. “Onlining queer acts: Digital research ethics and caring for risky archives,” Women & Performance, volume 28, number 2, pp. 121–142.
doi: https://doi.org/10.1080/0740770x.2018.1473985, accessed 26 January 2021.

Kelly Fritsch, Aimi Hamraie, Mara Mills, and David Serlin, 2019. “Introduction to special section: Crip technoscience,” Catalyst: Feminism, Theory, Technoscience, volume 5, number 1, at https://catalystjournal.org/index.php/catalyst/article/view/31998, accessed 26 January 2021.
doi: https://doi.org/10.28968/cftt.v5i1.31998, accessed 26 January 2021.

Tarleton Gillespie, 2018. Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. New Haven, Conn.: Yale University Press.

Tarleton Gillespie, 2014. “The relevance of algorithms,” In: Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot (editors). Media technologies: Essays on communication, materiality, and society. Cambridge, Mass.: MIT Press, pp. 167–193.
doi: https://doi.org/10.7551/mitpress/9780262525374.001.0001, accessed 26 January 2021.

Melissa Gregg, 2013. “Spouse-busting: Intimacy, adultery, and surveillance technology,” Surveillance & Society, volume 11, number 3, pp. 301–310.
doi: https://doi.org/10.24908/ss.v11i3.4514, accessed 26 January 2021.

Jürgen Habermas, 1974. “The public sphere: An encyclopedia article,” New German Critique, number 3, pp. 49–55.
doi: https://doi.org/10.2307/487737, accessed 26 January 2021.

Martin Heidegger, 1999. Ontology: The hermeneutics of facticity. Translated by John van Buren. Bloomington: Indiana University Press.

Martin Heidegger, 1962. Being and time. Translated by John Macquarrie and Edward Robinson. New York: Harper.

Mél Hogan, 2018. “Big data ecologies,” Ephemera: Theory & Politics in Organization, volume 18, number 3, pp. 217–243.

Tung-Hui Hu, 2015. A prehistory of the cloud. Cambridge, Mass.: MIT Press.

Edmund Husserl, 1982. Ideas pertaining to a pure phenomenology and to a phenomenological philosophy: First book: General introduction to a pure phenomenology. The Hague, Netherlands: Martinus Nijhoff.

Instagram, 2020. “Community guidelines,” at https://help.instagram.com/477434105621119, accessed 26 January 2021.

Chanté Joseph, 2019. “Instagram’s murky ‘shadow bans’ just serve to censor marginalised communities,” Guardian (8 November), at https://www.theguardian.com/commentisfree/2019/nov/08/instagram-shadow-bans-marginalised-communities-queer-plus-sized-bodies-sexually-suggestive, accessed 26 January 2021.

Marshall McLuhan, 1964. Understanding media: The extensions of man. New York: McGraw-Hill.

Maurice Merleau-Ponty, 1962. Phenomenology of perception. Translated by Colin Smith. New York: Routledge.

Sarah Myers West, 2018. “Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms,” New Media & Society, volume 20, number 11, pp. 4,366–4,383.
doi: https://doi.org/10.1177/1461444818773059, accessed 26 January 2021.

Magdalena Olszanowski, 2014. “Feminist self-imaging and Instagram: Tactics of circumventing sensorship,” Visual Communication Quarterly, volume 21, number 2, pp. 83–95.
doi: https://doi.org/10.1080/15551393.2014.928154, accessed 26 January 2021.

Frank Pasquale, 2018. “Odd numbers,” Real Life (20 August), at https://reallifemag.com/odd-numbers/, accessed 26 January 2021.

John Durham Peters, 2015. The marvelous clouds: Toward a philosophy of elemental media. Chicago: University of Chicago Press.

Caitlin Petre, Brooke Erin Duffy, and Emily Hund, 2019. “‘Gaming the system’: Platform paternalism and the politics of algorithmic visibility,” Social Media + Society (4 November).
doi: https://doi.org/10.1177/2056305119879995, accessed 26 January 2021.

Sarah T. Roberts, 2019. Behind the screen: Content moderation in the shadows of social media. New Haven, Conn.: Yale University Press.

Salty, 2019. “Exclusive: An investigation into algorithmic bias in content policing on Instagram,” at https://saltyworld.net/algorithmicbiasreport-2/, accessed 26 January 2021.

Saskia Sassen, 1991. The global city: New York, London, Tokyo. Princeton, N.J.: Princeton University Press.

Sarah Sharma, 2017. “Exit and the extensions of man” (8 May), at https://transmediale.de/content/exit-and-the-extensions-of-man, accessed 26 January 2021.

Rianka Singh and Sarah Sharma, 2019. “Platform uncommons,” Feminist Media Studies, volume 19, number 2, pp. 302–303.
doi: https://doi.org/10.1080/14680777.2019.1573547, accessed 26 January 2021.

Hito Steyerl, 2019. “A sea of data: Pattern recognition and corporate animism (forked version),” In: Clemens Apprich, Wendy Hui Kyong Chun, Florian Cramer, and Hito Steyerl (editors). Pattern discrimination. Minneapolis: University of Minnesota Press, pp. 1–20.
doi: http://dx.doi.org/10.25969/mediarep/12348, accessed 26 January 2021.

John Paul Tasker and Lisa Laventure, 2018. “Woman who accused Trudeau of groping breaks her silence,” CBC News (6 July), at https://www.cbc.ca/news/politics/woman-accused-trudeau-breaks-silence-1.4737511, accessed 26 January 2021.

Astra Taylor, 2018. “The automation charade,” Logic (1 August), at https://logicmag.io/failure/the-automation-charade/, accessed 26 January 2021.

Armond Towns, 2019. “The (black) elephant in the room: McLuhan and the racial,” Canadian Journal of Communication, volume 44, number 4, pp. 545–554.
doi: http://dx.doi.org/10.22230/cjc.2019v44n4a3721, accessed 26 January 2021.

Jacqueline Ryan Vickery, 2018. “This isn’t new: Gender, politics, and the Internet,” In: Jacqueline Ryan Vickery and Tracy Everbach (editors). Mediating misogyny: Gender, technology, and harassment. Cham, Switzerland: Palgrave Macmillan, pp. 31–49.
doi: https://doi.org/10.1007/978-3-319-72917-6_2, accessed 26 January 2021.

Moira Weigel, 2018. “Silicon Valley’s sixty-year love affair with the word ‘tool’,” New Yorker (12 April), at https://www.newyorker.com/tech/annals-of-technology/silicon-valleys-sixty-year-love-affair-with-the-word-tool, accessed 26 January 2021.

 


Editorial history

Received 1 August 2020; revised 28 September 2020; accepted 28 September 2020.


Copyright © 2021, Nelanthi Hewa. All Rights Reserved.

When the machine hails you, do you turn? Media orientations and the constitution of digital space
by Nelanthi Hewa.
First Monday, Volume 26, Number 2 - 1 February 2021
https://firstmonday.org/ojs/index.php/fm/article/download/10978/10075
doi: https://dx.doi.org/10.5210/fm.v26i2.10978