First Monday

Not robots; Cyborgs: Furthering anti-ableist research in human-computer interaction, by Josh Guberman and Oliver Haimson

This theoretical essay builds on existing literature to draw out the consequences of dehumanizing and dis-animating autism discourses within the field of human-computer interaction (HCI). Focusing mainly on narratives in HCI that frame autistic people as or like machines, we explore how dominant constructions of autism in HCI work to normalize the field’s complicity in violent autism intervention paradigms, despite HCI researchers’ well-meaning intentions. We work towards developing crip-cyborgs as an alternative framework for understanding autistic people (as opposed to computers or robots) and suggest crip technoscience as a framework for research based on this alternative understanding. In doing so, we hope to enroll misguided but well-intentioned researchers in dismantling anti-autistic ableism, both in and beyond HCI.


“Natural affinities”
Robot(ic) autism therapy
Not robots; Cyborgs




As Rottier, et al. (2022) and Williams (2021) explore, many human-computer interaction (HCI) researchers insist upon the likeness of autistic [1] people to computers, robots, and other machines (e.g., Weizenbaum, 1983). That is, autistics are cast as and compared to things and objects, rather than people. Research at the intersection of autism and technology, much of which focuses on behavioral interventionism (previously explored by Spiel, et al., 2019; Williams and Gilbert, 2020), often claims that autistics have a “natural affinity for technology” (e.g., Caria, et al., 2018; Lavian and Alshech, 2015; Navedo, et al., 2019; Valencia, et al., 2019). In this theoretical essay, building on related works (e.g., Rottier, et al., 2022; Williams, 2021), we explore the history of, motivation for, and effects of mechanistic, “computeristic” [2], and robotic descriptions of autism. Williams (2021) and Yergeau (2018) demonstrate that such characterizations of autism are not unique to or new within HCI (e.g., Weizenbaum, 1983). Similarly, these characterizations are not unique in drawing rhetorical divisions between autistic people and other humans. As Yergeau (2018, 2013, 2010) explains, common public and psychiatric autism narratives categorize autism as apart from humanity.

We argue that narratives about the similarities between autistic people and computers, like those popular within HCI, go a step further: in positioning autistics alongside, like, and less than machines, HCI constructs autism as a category apart from the realm of the living. We hold that this classification of autistic people as things is critical to understanding some forms of anti-autistic violence. Specifically, we relate these narratives to violence in certain strains of autism research in HCI (see Rottier, et al., 2022; Williams and Gilbert, 2020). These narratives contribute to political economies contingent upon an unceasing supply of autistic bodies, which support and justify the existence of a cottage industry of autism professionals (Broderick and Roscigno, 2021; Moore, 2021). These political economies, in turn, explain why these strains of autism research persist, despite calls for autism researchers to reevaluate their commitments (e.g., Kender and Spiel, 2022; Rottier, et al., 2022; Spiel, et al., 2018; Williams and Gilbert, 2019; Ymous, et al., 2020). We add our essay to this growing corpus calling for change. Drawing on crip-cyborg (cripborg; see Nelson, et al., 2019) politics (Earle, 2019; Hamraie and Fritsch, 2019; Kafer, 2013; Weise, 2016; Williams and Gilbert, 2019), we attempt to resituate autistics within the realm of the living. In arguing that autistics, despite differences from other people, are entitled to research for and with (rather than against or on) them, we highlight crip-technoscience (Hamraie and Fritsch, 2019) as a framework for valuable and liberatory HCI research.



“Natural affinities”

Yergeau writes, “In all things discursive, autism represents decided lack” [3]. Traditional autism discourses frame autism as a void once occupied by a “normal” child (see Sinclair, 2012b). Clinical discourses routinely identify autism as a deficit and departure from humanness (see Astle and Fletcher-Watson, 2020). Clinicians widely apply pre-existing psychological frameworks to autism. For example, frameworks like theory of mind (ToM; see, e.g., Baron-Cohen, 1995; Baron-Cohen, et al., 1985; Frith, 1989), borrowed from broader usage in clinical psychology and applied to autistic bodyminds, characterize autism as the absence of essential human faculties. ToM is a theoretical cognitive mechanism ostensibly accounting for empathy, recognizing the existence of others’ minds, knowledge of one’s own mind, and having a mind able to be known (Baron-Cohen, et al., 1997; Yergeau, 2018). As Yergeau (2013) explains, in defining ToM as essential to humanness and describing the lack of ToM as fundamentally characteristic of autism, ToM draws a line between autism and humanity. Deficit-oriented pathological discourses that explicitly position likeness with computers and other technological artifacts (e.g., typewriters) as fundamental to autism go a step further. Such discourses remove autistics from not just humanity but the category of living things altogether (Rottier, et al., 2022).

Discourse about a likeness between autistics and robots is not new (e.g., Weizenbaum, 1983) and not limited to HCI (see Williams, 2021; Yergeau, 2018). Robotic characterizations of autistic people long predate the existence of technologically mediated autism interventions, computers, and HCI as a field of study. Yergeau (2018) traces robotic descriptions of autism to Bettelheim’s 1959 article, “Joey: A ‘mechanical boy’”. An influential historical figure in the construction of autistic pathology, Bettelheim described his subject Joey as an unskilled infant but a successful machine (Bettelheim, 1959). Bettelheim recorded Joey as articulating a fondness for machines, and even claiming to be a machine himself as a means to cope with and deny the emotions associated with human experience (Bettelheim, 1959). Williams (2021) notes that, as an adult, Joey denied any recollection of these claims. Bettelheim indicates that comparisons between machinery and autistics precede his writing about Joey, noting that “many other autistic children are fascinated by rotation and things mechanical” [4]. Williams (2021) articulates that cultural tropes about robot-like autistics lack a single identifiable point of origin but likely emerge from descriptions used in some of the earliest clinical writings about autism.

Autistics-as-machines tropes have existed within modern computing for decades, going as far back as some of Weizenbaum’s writing in 1974 (see Weizenbaum, 1983). In the early days of interactive computing, computer scientists debated whether computers understood human users, or merely provided an illusion of understanding (Suchman, 2006; Weizenbaum, 1966). Arguing the computer’s capacity for understanding was illusory, Weizenbaum (1983) suggested against reading too deeply into computers’ programmed responses to stimuli. He explained that typewriters, too, responded to stimuli, typing letters in response to keypresses, but surely typewriters do not understand the words they imprint. He described interactive computer programs, typewriters, and “infantile” autistics as having equivalent capacities for genuine conversation (i.e., none whatsoever; see Weizenbaum, 1983).

As with broader cultural robot/autism tropes, mechanistic characterizations of autism in HCI have, since Weizenbaum’s days, become diffuse, lacking an identifiable origin. The prevalence and staying power of this trope in HCI are evident in numerous repetitions of the claim that autistics “have a natural affinity for technology” (Boitnott, 2012; Brandão, et al., 2015; Caria, et al., 2018; Carvalho, et al., 2015; Dees, 2021; Elliott, 2017; Frauenberger, et al., 2015, 2013, 2012; Guldberg, et al., 2017; Lavian and Alshech, 2015; Majeed, 2017; Menzies, 2011; Navedo, et al., 2019; Newbutt and Bradley, 2022; Shamsudin, et al., 2021; Valencia, et al., 2019) [5]. Few citations for this phrase (if any citation is given) provide supporting evidence. More often, the citations provided point to similar works. Those similar papers either themselves justify technological interventions on assumptions of affinity or find that a small-sample or preliminary study showed promising, if inconclusive, results about a specific technological intervention (e.g., Parsons, 2015; Williams, et al., 2002; Wojciechowski and Al-Musawi, 2017). Paradoxically, some citations point to papers suggesting technologically mediated autism interventions lack foundational evidence (e.g., Fletcher-Watson, 2014). These citational practices raise questions about standards within HCI for sourcing claims, particularly those purporting that an entire category of people possesses some essential characteristic.

Chasing these “natural affinity” claims in HCI through a deep citational rabbit hole [6], we eventually found some theoretical support for technological affinities as an essentialized characteristic of autism (Murray and Lesser, 1998b, 1998a). This theory, monotropism, refers to the autistic tendency to focus incredibly intently on a single interest (Murray, 2018). While not commonly cited directly, most “natural affinity” claims we were able to trace to a primary source pointed to this theory. Murray (2011) positions monotropism in contrast to polytropism, the non-autistic tendency to divide energy across many interests (Murray, 2012). Murray and Lesser (1998b, 1998a) build their theory of monotropism on evolutionary theory and mathematical modeling of atypical biological survival mechanisms. They argue that the theoretical distinction between autistics and non-autistics is natural — something biological and essential to autism. They say that, like autistic people, computers (at least in 1998) are “naturally monotropic,” singly-focused machines (Murray and Lesser, 1998a). Over two decades, Murray promoted monotropism as a mathematically and biologically grounded framework for autistic difference (Murray, 2018; Murray and Lesser, 1998b). In their work, Murray and Lesser (1998b, 1998a) clearly explain that, by positioning monotropism as biological, they intend to frame autism as a natural variation of humanity rather than something apart from it. We want to clarify that we support this intention of normalizing autistic existence as part of a broad spectrum of what it means to be human. Murray and Lesser (1998a) present monotropism as a way for non-autistics to better understand and build relationships with autistic people, which is highly laudable. The problem arises when this research is uncritically used to support claims about technological affinities as an essential characteristic inherent to all autistic people. 
While essentializing claims about autistics and technology remain popular, monotropism remains purely theoretical, having some explanatory benefits over alternative frameworks but lacking meaningful supporting evidence or empirical work (Milton, 2017; Wood, 2021). Through its commitment to this claim about autistic technological affinities, or techno-autism, HCI perpetuates cultural narratives that position autistics as things, not people.




Robot interventions for autism represent a flourishing niche within HCI (Spiel, et al., 2019), wherein research is often predicated upon claims that autistic people are more similar to machines than to other people and prefer machines over people. Suchman (2011) describes robots in human-robot interaction research as model (in)organisms. Model (in)organisms play on the concept of model organisms, which refers to using a particular creature as a proxy in scientific experiments trying to answer questions about other creatures or broader biological phenomena (see, e.g., Kohler, 1991). For Suchman (2011), robots provide a model that approximates humans — one that can pose as a substitute for people in experiments about human phenomena.

Suchman (2011) explores how, through robot model (in)organism research, our understanding of human behavior and cognition is informed by tests on and with robots. As we gain a deeper understanding of humans, our understanding of robots (as human proxies) likewise adapts. Roboticists design robots to behave like humans. Humanoid robot design requires complex computer modeling intended to mimic human cognition. When robots respond to stimuli in expected — humanlike — ways, researchers take it to indicate their models’ validity. Our understanding of human cognition, then, is reorganized around robotic algorithms. As robots behave more humanlike, our understanding of humans becomes more robot-like (Suchman, 2011). Understandings of humans and robots become recursive and co-constitutive (Suchman, 2011).

Robots and computers never quite match their human counterparts. There remain critical gaps between humans and robots (Mori, 1970), despite the ways our understanding of one is co-produced (see Jasanoff, 2004) with and by our understanding of the other (Suchman, 2011). These gaps reside in robots’ peculiar ways of moving (Mori, 1970) and limited capacity for mutual intelligibility with human users (Suchman, 2006). The disjunctures between humans and robots are the same as those observed or alleged between humans and autistics (Baron-Cohen, 1995; Weizenbaum, 1983), leading to Weizenbaum’s comparison of computers and autistics. More recently, Kaminka (2013) stated that nearly all robots are autistic. In understanding robots as autistic, it seems inevitable that notions of autistics (replete with natural affinities for technology) as robots follow. As we show next, such characterizations can harm autistic people in technologically mediated autism research.



Robot(ic) autism therapy

Robots are deployed in robot-intervention research for autism to assist in or augment formalized autism intervention paradigms, such as applied behavioral analysis (ABA) (Spiel, et al., 2019; Williams and Gilbert, 2020). Socially assistive robot therapy thus represents an extension of the vast, pre-existing surveillance and control apparatuses (Roscigno, 2019; Williams, 2021; Yergeau, 2018) endemic to these interventional frameworks. These interventions seek to rehabilitate autistic bodyminds, rendering them indistinguishable from their peers (Lovaas, 1987). As Yergeau (2018) explains, a key figure in ABA and autism, Ole Ivar Lovaas, is notorious for his involvement in the Feminine Boys Project, through which he created and refined gay conversion therapy using ABA. ABA for autism uses the same behaviorist interventions as conversion therapy.

More specifically, through continuous surveillance and highly regimented reward and punishment structures, ABA interventions for autism enforce cisgender, heterosexual, and ablenormative behavioral and communication norms on non-conforming bodyminds (Yergeau, 2018). ABA additionally targets verbalization over other forms of non-normative autistic communication (see, e.g., Alper, 2017; In my language, 2007; Zisk and Dalton, 2019), the reduction of autistic stereotypy (see, e.g., American Psychiatric Association, 2013), and, broadly, teaching the importance of compliance (see Sandoval-Norton, et al., 2019; Roscigno, 2019; Yergeau, 2018).

As Ne’eman (2021) explains, interventional paradigms that use “passing” as non-disabled as a primary outcome variable ignore the immediate stress that passing places on autistic people and the long-term health consequences of passing. Indeed, there appears to be a relationship between “camouflaging” [7] — the laborious process of hiding one’s autism and behaving and communicating in non-autistic ways — and the disproportionately high rates of autistic suicide (Beck, et al., 2020; Cassidy, et al., 2020, 2018; Rose, 2018). Autistic survivors of behaviorist intervention therapies note that these therapies do not cure them. Rather, the therapies scare them into masking subconsciously to avoid the punishment they have been conditioned to expect if they behave and communicate like themselves [8]. In essence, prevailing behavioral autism intervention paradigms are profoundly harmful and predicated upon curative (Kim, 2017) and normative (Butler, 2004) violence.

As an overarching field, HCI often ostensibly values progress and social good (Bowers, 2018; Irani, et al., 2010; Lin and Lindtner, 2021; Pal, 2017). Yet, as Spiel, et al. (2019) and Williams and Gilbert (2020) show, a considerable proportion of autism research in HCI is complicit in behaviorist autism intervention paradigms — projects of domination, control, and destruction (see Roscigno, 2019). Such research within a progress-driven field may appear paradoxical. However, several factors may help explain this vein of HCI research. First, discourses that objectify autistic people may obscure their needs and agency, allowing for interventional research framed in terms of progress for non-autistic research beneficiaries. Second, existing political economies within and outside of HCI may incentivize objectifying autism technology research and facilitating narratives thereof.

Robo-therapy, for the state

In a field where essentialist claims about the likeness of autistics to machines are naturalized, autistics are cast as inhuman. They are cast as unliving. They are framed as things. Raw materials to be acted upon (Broderick and Roscigno, 2021; Roscigno, 2019). If a typewriter breaks, you might repair it or throw it away. If a computer freezes or starts performing erratically, you might restart or ignore it and walk away. If autistics are essentially like computers and typewriters, if they are just things, then are their peculiar and problematized features similarly dismissible? Evidently so, at least some of the time, within HCI (see Williams and Gilbert, 2019; Ymous, et al., 2020). More often, though, as is evidenced by the proliferation of interventionist autism HCI research, autistics are subjects for repair.

As Lin and Lindtner (2021) explain, many HCI research activities are significantly motivated by ideals of progress, utility, and productivity. The field is driven by a sense of doing good — working towards (typically Western conceptions of) social progress (see Irani, et al., 2010; Lin and Lindtner, 2021; Pal, 2017). Yet, as Ymous and colleagues indicate, “what is allowed to be understood as ‘doing good’ is reliant upon entrenched sociocultural traditions of ableism” [9]. To render autistics less visibly autistic is an assumed good in HCI research supporting behavioral autism interventions. “Rehabilitation” of robo-autistics is an uninterrogated good, despite growing evidence of harm (Bascom, 2012; Beck, et al., 2020; Cassidy, et al., 2020, 2018; Rose, 2018; Unbound Books, n.d.). The unchallenged assumption about the goodness of ABA is not unique to HCI. Instead, it indicates an overlap between the ideals of HCI and ABA. There is a convenient fit between robo-autism discourses and ABA’s positioning of pre-intervention autistics as not-quite-people, not yet deserving of rights (see Roscigno, 2019).

ABA shares HCI’s commitments to the logics of progress and utility (Roscigno, 2019). As Roscigno (2019) explains, ABA and similar interventions can be thought of in terms of biophilanthropy. Biophilanthropy is a term coined by Schuller (2018) to explain the use of surveillance and control for the good of the state and economy but under the guise of charity. ABA interventions are nominally about helping autistic people move beyond the idiosyncrasies of autism and join the rest of society as productive and useful citizens (Roscigno, 2019). The notion that autistics must be rehabilitated — repaired — to have value within liberal capitalism perpetuates and justifies the violences of behavioral autism interventions (Roscigno, 2019). Further, various autism discourses (including those about technology) rhetorically distance autism from humanity, obfuscating the personhood of these biophilanthropic recipients while simultaneously normalizing the violences they endure as necessary to the well-being of the neoliberal state (McGuire, 2016; Puar, 2017; Roscigno, 2019; Rottier, et al., 2022). Within a neoliberal society, to be legible as human and deserving of citizenship and rights, autistic people must be made capable of productive economic participation (Mitchell and Snyder, 2015; Roscigno, 2019).

Research purporting to help the individual for the good of economic and state interests extends beyond behavioral autism interventions in HCI into many other contexts, too. DiSalvo (2018) notes that design research often seeks to support individual behavior change in the face of economic and social crises. For example, in response to the horrific realities of global climate change, an interaction designer may be likelier to make an app to help individuals track their carbon production than to acknowledge or address governmental policy and industrial practices. DiSalvo describes this trend toward behaviorism and technosolutionism as “the abdication of the issue in favor of the (purportedly) tractable problem” [10]. We do not intend to suggest that HCI researchers harbor malicious intent towards autistics through their involvement in behaviorist autism interventions. Instead, a flawed sense of for whom our research should do good and by whom research is led, paired with the naturalization and adaptation of objectifying autism discourses within broader society, creates the perfect storm for fallaciously thinking that supporting ABA constitutes progress and social good. Structural conditions may preclude researchers and designers from arriving at alternative ideas of “doing good,” such as working towards dismantling the structures that naturalize the maiming (see Puar, 2017) of autistic children in the name of progress. However, while HCI researchers in this space likely make well-meaning errors (see Mankoff, et al., 2010) [11] without intending to cause harm, they benefit from the political economies of autism medicalization and intervention, which may disincentivize them from adopting alternative research programs.

Robo-therapy, for HCI researchers

Mallett and Runswick-Cole (2016) argue that autism has become a commodity, bought and sold by myriad actors, from governments to companies, non-profit organizations, and academic researchers. Broderick and Roscigno (2021) frame this commodification of autism as the autism-industrial complex (AIC). Through its involvement in autism intervention, HCI is entangled in the diverse political economy of those who trade in autism. This marketplace is built upon the construction of autism as a distinct ontological category — a deficient category positioned as demanding remediation (Broderick and Roscigno, 2021; Mallett and Runswick-Cole, 2016). A diversity of diagnostic, educational, behavioral, and therapeutic professions exist in response to biomedical and dis-animating constructions of autism and comprise the AIC. Autism is their capital. Yet, paradoxically, their work often tries to convert the things they trade into people, able to participate in civic and economic life. Thus, as Broderick and Roscigno (2021) explain, the various arms of the AIC have an imperative both to perpetuate the classification of autism and autistic people as either broken people or inhuman things and to ensure a continual supply of autistic raw materials. We see this in massive lobbying efforts, which have succeeded in securing legislation deeming ABA the only intervention for autism coverable by insurance within the United States (Roscigno, 2019). When a child is diagnosed with autism, their parents are given pamphlets about ABA and told it is the only available option for their child. Thus, parents become enrolled in the AIC, creating additional advocates for the continuance and advancement of autism services. And, to complete the cycle, service providers can claim they are simply fulfilling market demand driven by parents of autistic children.
Even when evidence emerges that institutionalized conceptions of autism are incomplete or incorrect, these incidents, too, become opportunities for the AIC.

Classically, autism is constructed as an essentially male disorder. Baron-Cohen (1995) went so far as to theorize autism as arising from “extreme male brains” — that is, autism represents extreme versions of stereotypically male characteristics like rationality, mathematical and technical aptitude, and lower (relative to females) empathy. Such a theory rests on gender essentialism and harmful stereotypes. We suspect the male-coded construction of autism is likely implicated in the attachment of technological affinities as an essentialized characteristic of autism and the readiness with which techno-autism discourses were adopted. In recent years, various facets of the AIC have had to reckon with the emergence of so-called “female-autism” as a clinical construct. The construction of “female-autism” exemplifies the AIC’s co-optation of evidence incompatible with dominant autism narratives (Moore, 2021). As Moore (2021) explains, attempts to codify “female-autism” as a distinct autism phenotype expand clinical definitions of autism to include gender-essentialist constructions of girls and women, alongside those of the “extreme male brain.” Moore (2021) further articulates that by widening the clinical criteria for autism to include girls and women, including those who present as cisgender, rather than as particularly (extreme) “male” in their demeanor and interests, new markets are created. By potentially doubling the population from which clinicians can legitimately sample for autistic capital, the AIC grows.

Through the field’s adoption of essentializing claims about autism and technology, many autism technology researchers are swept up by and complicit in the AIC. In adopting mechanical and robotic constructions of autism as a thing, HCI autism researchers lend the appearance of scientific legitimacy to objectifying claims that sustain the helping professions. Too, in developing technological interventions for techno-autistics, they create and expand new markets for autism technologies. And through their participation in the AIC, researchers stand to gain status, publications, grants, patents, and personal income. The field’s entanglements with the AIC are not limited to a few autism researchers but extend to the ideologies of progress and utility underpinning much of the field itself. Because HCI is a field that has recognized curative violence as a social good (sometimes explicitly, through “social impact” awards), individual researchers have an incentive to conduct AIC-complicit research. While the AIC is pervasive, it is sustained by myriad, entangled political economies, such as that of HCI and its drives towards industrial production, innovation, usefulness, and progress (Dourish, 2018; Lin and Lindtner, 2021). Therefore, changing the nature of autism research in HCI must entail addressing not individual research programs but rather the systemic and intersecting structures, within and outside of the field, that incentivize certain kinds of research.



Not robots; Cyborgs

Creating enduring and systemic change is a monumental challenge and beyond the scope of what any single paper can hope to accomplish. Instead, we aim to challenge the robotic and computeristic autism discourses that underpin so much autism technology research. The neurodiversity paradigm (see Kapp, et al., 2013; Singer, 2017) frames autistic differences as variations in human experience and the presentations thereof. Within this framework, autism, disability, and impairment are normalized. Within this framework, it is not the humanity of autistic people that is suspect. Instead, what is suspect is the humanity of an institutionalized conception of personhood that casts autistic people as less human than non-autistics and machines alike. As a replacement for autistic-as-robot rhetorics, we invoke the figure of the (crip) cyborg. In providing alternative framings for understanding autistic relationships with technology, we hope to enroll additional researchers into the growing ranks of HCI scholars attempting to resituate autism research around the wants and needs of autistic people (e.g., Kender and Spiel, 2022; Keyes, 2020; Ringland, 2019; Spiel, et al., 2019; Spiel and Gerling, 2021; Williams and Gilbert, 2019; Ymous, et al., 2020).

Cyborg affinities along the cyborg/tryborg continuum

In classic (Haraway) cyborg theory, cyborgs are defined by what they are not [12] — by the multiplicity of categories in which they (refuse to) fit and the boundaries they transgress (Haraway, 1990). Haraway provides four definitions for a cyborg: (1) a “cybernetic organism”; (2) “a hybrid of machine and organism”; (3) “a creature of social reality”; (4) “a creature of fiction” [13]. As with Suchman’s (2011) co-constructed human-esque robots, Haraway’s (1990) cyborgs comprise an artifact/category that challenges existing dualisms between organism and machine and between subject and object. Cyborg bodies and existences are often imagined as those that, through highly integrative technology use, can transcend non-augmented bodily limitations. Such conceptions of the cyborg are critiqued by disabled writers, who dispute Haraway’s notion of the cyborg as “a creature of fiction” [14].

Weise (2018) notes Haraway’s cyborg is a metaphorical creature. Too, in common usage, Weise notes that the cyborg represents technological augmentation used to bring about superhuman capacities. Weise (2018) writes that these popular conceptions discount the embodied experiences of the disabled people who, for years, have assimilated variegated technologies to facilitate their continued existence. These disabled people, cyborgs by Haraway’s (1990) second definition, predate Haraway’s cyborg fictions, which imagine the cyborg as something not yet here or something one might choose to be (Weise, 2018). Technology use makes critical bodily functions possible for these disabled people, whom Weise (2018) refers to as common cyborgs. Disability studies scholars and common cyborgs point out the importance of recognizing that cyborg non-fictions are not always as shiny and exciting as cyborg-as-superhuman narratives would have us believe (Earle, 2019; Kafer, 2013; Shew, 2020, 2017; Weise, 2018). Cyborg existence often entails various costs, limitations, and painful bodily sensations associated with one’s cyborg appendages. Weise (2016) refers to people who expand their capabilities via technology for reasons other than necessity (e.g., leisure, aesthetics, productivity, or other luxuries) as tryborgs. These not-quite-cyborgs, cyborgs of metaphor and aspiration, differ from disabled cyborgs because of the ways privilege and power differentiate their technology usage (see Nelson, 2020). Tryborgs imagine metaphorical cyborg futures without recognizing the historical and continuing existence of non-metaphorical disabled cyborgs (Weise, 2018, 2016).

Weise has used prosthetic legs for decades (Weise, 2016), but identifies a specific moment in 2010 as marking when cy [15] became a cyborg — when cy transitioned from a purely mechanical prosthetic knee to a computerized one (Wong, 2019). Although Weise’s identification as a cyborg relates to integrating computerized parts with cy’s body, cy is open to all disabled people claiming cyborg (Wong, 2019). For example, Shew considers herself a cyborg, despite not having any computerized parts (Wong, 2019). Weise, Shew, and other scholars thus construct differing definitions of what it means to be a cyborg. Earle (2019) delimits cyborgs to those disabled people with necessary (for facilitating activities and functions of daily living) technological artifacts that are “within the body, strapped to it, or carrying the body in some way” [16]. Yet, Weise’s cyborg construct includes those whose continued existences are maintained by less mechanical technologies, like anti-depressants. Williams and Gilbert (2019) define cyborgs as “those of us whose life and cognition are reliant and co-constituted by the existence of an interface and technology” [17]. This definition is helpful within the context of autism, as it explicitly includes people with learning, intellectual, and cognitive disabilities whose cognitive functioning is technologically aided.

There are myriad instances of autistic technology use in non-interventional contexts to facilitate daily functioning. In the case of mobile computers as augmentative and alternative communication (AAC; see Alper, 2017; Zisk and Dalton, 2019) devices, essential communicative capacities are moderated by whether an autistic person has access to appropriate technologies. Harrison and colleagues (2019) explore ways autistic people appropriate off-the-shelf mobile computing devices to mediate undesirable sensory experiences and curate desirable ones for stress management. Zisk and Dalton (2019) address how autistics effectively use these same devices to communicate non-verbally (see, also, Alper, 2017). Considering Williams and Gilbert’s (2019) cyborg alongside that of Weise (2018), these autistic technology use cases can be understood as cyborgian.

Importantly, disabled people are not alone in cyborgian necessary technology use. Transgender people represent another example of cyborg becoming. Transness shares some parallels with disability, and with autism in particular. First, trans is not a category completely separate from disability: many trans people are also disabled, and many disabled people are also trans (Bumiller, 2008; Warrier, et al., 2020). Sex and gender are sometimes used as lenses to understand autism, such as via the common rhetoric (conceived by Baron-Cohen) that autism is a case of “extreme male brain” (Jack, 2011). Further, dominant medical establishments often consider both autism and transness as traits to extinguish, and ABA shares some common history with trans “conversion” therapy efforts (via Lovaas, who was involved in both) (Gibson and Douglas, 2018). In their resistance to these conversion/extinction efforts, trans rights and disability rights movements also share parallels (Bumiller, 2008).

Trans people often become cyborgian by incorporating technology to fundamentally change their bodies (O’Shea, 2020). Some trans people incorporate technology into their bodies as part of medical gender transition processes, like surgeries and hormones (Gill-Peterson, 2014). Many trans surgeries involve implants incorporated into a person’s body to physically manifest gender change. Hormones used for gender change are a technology that fundamentally changes a person’s biology. In other cases, technologies can augment or expand trans people’s experiences, bodies and identities (Haimson, et al., 2020). As a few examples, in Cárdenas’ projects Becoming Dragon (2010) and Becoming Transreal (2012), virtual reality systems enable gender exploration, and for Nelson (2020), hand chip implants are queer cyborgian approaches. Stone (1995) described trans people as cyborgs because of how they sometimes incorporate technology into communication practices (in what she called “prosthetic communication”). Too, Stone (1995) suggests trans people are rendered cyborgs by their status as “boundary creatures,” occupying liminal spaces not only related to gender but also at the edges of the human/machine boundary. Intersecting trans and disabled identities mean that some people are cyborgs across multiple dimensions.

While non-disabled and cisgender people also sometimes use technologies that augment their experiences, these augmentations do not rise to the level of cyborg status (Weise, 2016). As Nelson (2020) articulates, two unique elements — which trans and disabled people share — are necessary for claiming cyborg: (1) marginality (i.e., augmentations to one’s body are intertwined with one’s marginalized identity/ies); and (2) change (augmenting one’s body changes its fundamental makeup, whether physically, biologically, or in terms of how one relates to the world).

In some cases, other marginalized groups, such as women and racial minorities, may also be considered cyborgs. For instance, Nakamura (2014) drew attention to the gendered labor of technology work by discussing how a community of Navajo women may be regarded as “natural” cyborgs in relation to their technologically mediated factory work. And of course, Haraway (1990) famously described how women could be cyborgs, particularly women of color. Yet some critique Haraway’s cyborg theory as ignoring power differentials between white women and women of color (DeCook, 2021; Puar, 2020; Sandoval, 2000; Schueller, 2005; Wilkerson, 1997). Thus, applying Haraway’s theory in disability and trans contexts, which must meaningfully involve intersectionality, is fraught. Cyborg status, then, may involve a third element: (3) multiple layers of marginality.

Within disability studies, some, like Kafer (2013), express concerns about the degree to which Harawayan cyborg theory risks portraying technological relationships as shared, yet depoliticized, phenomena devoid of references to power differentials. Still, Kafer (2013) finds utility in cripping the cyborg. Like queer(ing), crip(ping) is both a noun and a verb (see Sandahl, 2003). Hamraie and Fritsch write that to crip is to assert “the noncompliant, anti-assimilationist position that disability is a desirable part of the world” [18]. Kafer writes, “Cripping the cyborg, developing a non-ableist cyborg politics, requires understanding disabled people as cyborgs not because of our bodies (e.g., our use of prosthetics, ventilators, or attendants), but because of our political practices ... Cripping the cyborg, in other words, means recognizing that our bodies are not separate from our political practices; neither assistive technologies nor our uses of them are ahistorical and apolitical” [19]. Kafer’s (2013) reconceptualization of the crip cyborg (or “cripborg,” to borrow a term coined by Stevens in Nelson, et al., 2019) requires that we look beyond the interfaces between disabled bodyminds and machines (e.g., autistic children and humanoid robots) to the effects and assumptions of and within these technologies and their disabled usage.

Autistic cripborgs (and what their existence means for HCI): Centering disabled people’s needs and knowledges

A critical distinction between things and people is that we attribute agency and self-determination to one but not the other. Within robot-assisted autism interventions, Spiel and colleagues (2019) note that autistic children have little or no agency in determining how they interact with the intervening machines, nor are they permitted to engage in sense-making of or with the robots. Autism-robot research appears unconcerned with achieving mutual intelligibility (see Suchman, 2006) between autistic children, robots, and/or non-autistic clinicians and caretakers. Instead, a primary purpose of robot studies can be understood as an effort to render the autistic child intelligible to the non-autistic other. The socialized robot now plays a role in the ablenormative socialization of the (robot-like) autistic child. The significance of imbalances in the order and direction of these relationships goes unquestioned in this literature. A hierarchy seems to exist in which humans and robots can act upon one another, but wherein autistics can only be acted upon. Autistics are configured as so “epistemically absented” [20], so distant from intelligibility, that they have less agency and capacity for sense-making than the robots to which they are so often compared.

In contrast to (dominant constructions of) autistic robots, or robotic autistics, cripborgs have agency. Cripborgs have politics (Kafer, 2013). Autistic technology use is inherently political, if only because systemic power structures and the AIC construct autistics-as-robots discourses as apolitical. Claims of “natural affinities” may obfuscate the politics of autistic embodiment and practice through convoluted citational practices and inadvertent influence of broader cultural tropes and market forces. However, making essentialized claims about minoritized people will always be political. Importantly, autistics are not cripborgs solely because of the technological practices imposed upon them by behaviorists and technology researchers. Autistics are cripborgs because of the ways they appropriate existing technologies to make space for and community with themselves and to advocate for their recognition as agentic.

The Internet has been an integral technological component of autistic communities and the autistic self-advocate movement for decades. Sinclair (2012a) writes about how, as the autistic self-advocacy movement was first coalescing, autistic community members shared online spaces with parents of autistic people. Parents led the prominent autism organizations, organized the autism conferences, and set the agendas for policy- and research-oriented advocacy (Sinclair, 2012a). Sinclair (2012a) recalls themself and fellow autistics posting on a parent-dominated forum in the early 1990s in the days following a large autism conference. The parents rebuked the autistic forum users, saying their posts were of no interest to them. They had trouble grasping the notion that autistics might conceivably be interested in ongoing discussions about themselves and that the topics of importance to autistics might be relevant to organizations that ostensibly exist to support them.

Before long, an autistic person configured a forum by and for autistic people. Parents were allowed to join, but this forum was autistic space, and parents were subject to community norms (Sinclair, 2012a). To our knowledge, this event — along with the year-long autistic vs. parent flame-war that followed — is the first example in a long history of autistics leveraging digital spaces to reclaim narratives about themselves. We can see current-day examples of this kind of online activity on many forums, personal blogs, and popular social media sites (see, e.g., Osorio, 2020; Seidmann, 2020). Before Murray and Lesser (1998b, 1998a) ever wrote about autistic biology sharing similarities with computer architecture, autistics were using computers not because of an affinity for computers but out of an affinity for one another. They used computers and communication technologies to organize. To plot out a political agenda (Sinclair, 2012a). Maybe autistics do have a “natural affinity for technology;” perhaps confirmatory evidence for this claim will emerge from a groundbreaking study tomorrow. But this autistic computer use would still be cripborgian, agentic, and political, rather than robotic.

Were autism researchers in HCI to understand autistics as cripborgs rather than computers, it would not spell the end for the AIC. Researchers could continue reaping the benefits of autism research (publications, grants, income and accolades) but could also, at least potentially, provide direct benefits to the autistics within their research studies. The frameworks of crip (Hamraie and Fritsch, 2019) and neuroqueer (Rauchberg, 2022) technoscience can help explain this shift from research about autistics in service of non-disabled needs to autism research in favor of autistic needs.

Hamraie and Fritsch (2019) articulate crip technoscience as a means for designers to align themselves and their work with disabled people’s activist-oriented needs and knowledges. They place crip technoscience in opposition to disability technoscience, which refers to more traditional forms of individualizing, isolationistic, biophilanthropic and otherwise (techno)ableist (see Shew, 2017) approaches to disability and technology. We need a shift within autism technology research in HCI that moves autistic people from passive recipients of violent “charity” to leaders in designing their technological futures. Researchers more familiar with robo-autism than cripborgian autism might look to crip technoscience and its four central commitments for inspiration.

The first commitment of crip technoscience is to center the work of disabled people as knowers and makers (Hamraie and Fritsch, 2019). Williams and Gilbert (2019) review disability technologies in HCI and find that HCI researchers struggle to recognize ways their work might adversely impact their users. They suggest this difficulty stems from researchers’ lack of familiarity with the embodied and lived experiences of disabled people. Clinical autism discourses paint autistic people as devoid of self and self-knowledge (e.g., Baron-Cohen, 1995; see, also, Yergeau, 2013). Cripborg autistic existence challenges this notion by highlighting the politics and intentionality of autistic technology use. This commitment to disabled knowledges and to autistics-as-cripborgs suggests HCI researchers should reevaluate their methodological approaches to autism technology research, favoring participatory methods in which autistic people (as opposed to their non-autistic parents, teachers and caregivers) are enrolled as crucial stakeholders and subject-matter experts. Given issues around working with people with intellectual disabilities and the need for parental consent when working with minors, building and navigating these methodologies will require care and nuance. Spiel and colleagues (2018) model the necessary reflexivity and nuance for HCI researchers interested in ensuring the wellbeing of disabled research participants.

The second commitment of crip technoscience is to “access as friction” [21]. This commitment rejects disability design and intervention forms wherein inclusion and accessibility are synonymous with assimilation. Ymous and colleagues (2020), in a paper some HCI researchers felt was contentious, described ways that expectations of non-disablement within HCI are at odds with the realities of existing as a disabled researcher within the field. They call out the violence inherent to sweeping ableism within HCI. It would be entirely understandable for HCI researchers involved in the types of practices Ymous and colleagues (2020) point out to feel uncomfortable. Yet, a commitment to “access as friction” asks these researchers not to look the other way — to stay with the trouble, so to speak (see Haraway, 2016). Places where the demands of disabled people diverge from dominant research priorities and perspectives in HCI can be generative. Williams and Boyd (2019) demonstrate that, by not shying away from these frictions, HCI researchers can build coalitions and work towards defining and practicing alternative types of research mutually beneficial to all parties. Such papers, like the present paper, can be treated as invitations to participate in new (autistic-led) research programs rather than merely as condemnations. In staying with the friction and accepting these invitations, researchers may avoid the pitfalls inherent to dominant approaches to disability design research in HCI (see, e.g., Shew, 2020, 2017).

The third commitment of crip technoscience is to interdependence as a political technology (Hamraie and Fritsch, 2019). Hamraie and Fritsch write:

We position the crip politics of interdependence as a technoscientific phenomenon, the weaving of relational circuits between bodies, environments, and tools to create non-innocent, frictional access. Mainstream disability technoscience presumes disability as an individual experience of impairment rather than a collective political experience of world-building and dismantling. This perception has two primary consequences. First, disabled people are perceived as dependent and the goal of technoscience becomes to encourage independence. Second, disability and technology are both perceived as apolitical and stable phenomena, rather than material-discursive entanglements that take shape through struggle, negotiation, and creativity. [22]

Traditional approaches to autism technology research support behavioral interventions concerned with rendering autistic people independent. Yet, despite Western society’s extreme valuation of the notion of independence, independence is a lie. None of us are independent. We are all interdependent with one another (Mingus, 2017). As Mingus writes, very few of us grow our own food, produce our own clothing, or create our own energy sources. Interdependence is a fundamental condition of human existence, yet our self-narratives of and preferences for individuality obscure this truth. In committing to interdependence as a political technology, HCI researchers have an opportunity to — in concert with autistic people — define alternative outcomes to autism technology research that promote care and support (see Piepzna-Samarasinha, 2018) rather than independence and normalization (see, e.g., Bennett, et al., 2018). Cripborgian framings of autism can help identify forms of interdependence that might warrant exploring (see, also, Bennett, et al., 2018). As computers and robots, autistics are fundamentally solitary and asocial, sharing more affinities with machines than with people. As cripborgs, autistics share affinities not only with one another, but with other groups of cyborgs. Cross- and trans-identity cyborg affinities represent coalitional and intersectional technology research opportunities with broad impact beyond any single identity group. Working on technology research to support coalitional forms of interdependence can, perhaps, satisfy HCI researchers’ desires to “do good” while also maximizing the number of people who might benefit from their research.

The fourth commitment of crip technoscience is to disability justice (Hamraie and Fritsch, 2019). The term disability justice was coined by disabled queer and trans people of color comprising the Disability Justice Collective (see Piepzna-Samarasinha, 2018). Disability justice differs from disability rights. Rather than focusing on state-granted privileges, it seeks to dismantle the root causes of ableism: the intersecting forces of heteropatriarchy, white supremacy, colonialism, and capitalism (Sins Invalid, 2019). For some academics, particularly those within fields like HCI oriented around industrial progress and innovation (Dourish, 2018), perhaps adopting anti-capitalism as a tenet of good research is a bridge too far. But, researchers might operationalize a commitment to disability justice in autism technology research by ensuring research projects include and are led by those most affected by intersecting forces of oppression. Historical autism research has focused on autism as a male phenomenon (Moore, 2021). Researchers might look for ways to support the techno-political practices of, for example, autistics who are women, queer, trans and/or Black, Indigenous, and people of color. This commitment calls on researchers to sort out what it means to work in close collaboration with autistics with significant intellectual disabilities while remembering always to presume competence (see Bascom, 2014). As Costanza-Chock (2020) explains, design justice is a framework for understanding and designing against hegemonic power structures. The framework is significantly built around the precepts of disability justice (see Costanza-Chock, 2020) but, unlike disability justice, has been adapted to be directly applicable to HCI. Following the tenets of design justice may help HCI researchers operationalize disability justice within their work.

Rauchberg (2022) elaborates neuroqueer (see Roscigno, 2019; Walker, 2021; Yergeau, 2018) technoscience as an extension to crip technoscience, focusing on the liberatory and interdependent technological futures of autistic and otherwise neurodivergent people, particularly with respect to digital and communication technologies. Building from the standpoint that crip technoscience and examples of its implementation (see Hamraie and Fritsch, 2019) privilege neurotypicality, Rauchberg’s framework explicitly extends crip technoscience’s commitments to interdependence and the recognition of disabled people as knowers and makers to disabled minds, as well as bodies. Too, Rauchberg’s (2022) framework goes beyond positioning itself in opposition to disability technoscience, clearly and specifically rejecting curative violence (see Kim, 2017). Holding in mind these tenets of neuroqueer technoscience, and an opposition to curative violence in particular, HCI researchers might recognize the potential of autistic-led autism technology research to champion liberation and adaptation, rather than assimilation and control.




Popular autism discourses are often widespread and deceptively pernicious. They become so naturalized that they go unquestioned. But, as with any assumptions, particularly those about groups of people to which someone does not themselves belong, critical examinations are warranted from time to time. We should avoid justifying research on or with autistic people with claims that merely sound true. While mechanical constructions of autism may align with broader, cultural autism narratives, claims about “natural affinities for technology” lack support and may cause harm. Positioning autistic people as or like robots (i.e., things) may normalize forms of coercive intervention that researchers would question if only they more clearly saw their subjects as people. We suspect that despite any particular benefits they may reap from their research, most autism technology researchers are acting with good intentions. In outlining how those intentions are misguided and complicit in violence, we hope those researchers will reconsider their future research approaches. In providing alternative ways to understand autistic people (as cripborgs, rather than computers) and suggesting crip technoscience as a framework for research grounded within this new understanding, we hope to enroll these well-intentioned researchers in the project of dismantling anti-autistic ableism, both in and beyond HCI.


About the authors

Josh Guberman (he/him) holds a B.S. in psychology from the Illinois Institute of Technology, and is currently a Ph.D. candidate at the University of Michigan School of Information. As an autistic and otherwise disabled scholar-activist, he draws on the work of disability justice activists, critical disability studies, and feminist science and technology studies to interrogate ableism embedded within academic disability-technology research and development.
E-mail: guberman [at] umich [dot] edu

Oliver Haimson is an assistant professor at the University of Michigan School of Information and a recipient of a National Science Foundation CAREER award. He conducts social computing research focused on marginalized individuals’ and communities’ experiences presenting and exploring identity via sociotechnical systems, particularly during times of identity change. He has a Ph.D. in information and computer sciences from the University of California, Irvine.
E-mail: haimson [at] umich [dot] edu



Acknowledgements

Josh gratefully acknowledges Hellen Rottier for suggesting he expand a brief and confused blog post about autistic cyborgs into a full (and fully thought-out) paper. This work was partially supported by the National Science Foundation under grant number DGE156260.



Notes

1. Throughout this paper, we use identity-first language (“autistic person”) over person-first language (“person with autism”). This choice reflects the preference of the first author, who is autistic, respects a general preference among many autistic adults (see Botha, et al., 2021; Brown, 2011; Bury, et al., 2020; Gernsbacher, 2017; Kenny, et al., 2016) and resists the physical, anti-autistic violence person-first constructions may help normalize (McGuire, 2016).

2. Yergeau, 2018, p. 19.

3. Yergeau, 2018, p. 7.

4. Bettelheim, 1959, p. 248, quoted in Williams, 2021.

5. Each of these papers we cite as referring to autistic/technology relationships in terms of “natural affinities” was published within the last 10 years, indicating the continued and contemporary nature of this claim.

6. A complete meta-analytical exploration of this claim is beyond the scope of this paper. However, we informally examined 16 academic HCI-centric publications containing the claim that autistics have a “natural affinity” to/for technology published between 2010 and 2022, the 15 sources they cited for this claim, and the primary sources cited by the secondary sources within that group of 15 papers. This informal examination informs our discussion of how claims about “natural affinities” to/for technology function in autism literature. A more formal and structured meta-analysis of this citational tree is likely warranted, given the evident popularity of “natural affinity” claims.

7. While academic literature frequently uses the word “camouflaging” to refer to social coping strategies whereby autistics try to blend in as non-autistic, this behavior is more commonly referred to as “masking” within the autistic community — a terminology preference that new academic literature is just beginning to reflect (see, e.g., Pearson and Rose, 2021).

8. See Unbound Books (n.d.) for a curated list of first-person accounts about autistic experiences of ABA.

9. Ymous, et al., 2020, p. 3.

10. DiSalvo, 2018, p. 481.

11. Albeit, significantly impactful errors.

12. Similarly to how autistics are defined according to their lacks (see Yergeau, 2018), albeit, importantly, under considerably different power dynamics.

13. Haraway, 1990, p. 149.

14. Ibid.

15. Weise’s pronouns are cy (Cy [@JillianWeise], 2020).

16. Earle, 2019, p. 48.

17. Williams and Gilbert, 2019, p. 3.

18. Hamraie and Fritsch, 2019, n.p.

19. Kafer, 2013, p. 120.

20. Yergeau, 2018, p. 54.

21. Hamraie and Fritsch, 2019, n.p.

22. Ibid.



References

Meryl Alper, 2017. Giving voice: Mobile communication, disability, and inequality. Cambridge, Mass.: MIT Press.
doi:, accessed 12 December 2022.

American Psychiatric Association, 2013. Diagnostic and statistical manual of mental disorders. Fifth edition. Arlington, Va.: American Psychiatric Association.

Duncan E. Astle and Sue Fletcher-Watson, 2020. “Beyond the core-deficit hypothesis in developmental disorders,” Current Directions in Psychological Science, volume 29, number 5, pp. 431–437.
doi:, accessed 12 December 2022.

Simon Baron-Cohen, 1995. Mindblindness: An essay on autism and theory of mind. Cambridge, Mass.: MIT Press.
doi:, accessed 12 December 2022.

Simon Baron-Cohen, Alan M. Leslie and Uta Frith, 1985. “Does the autistic child have a ‘theory of mind’?” Cognition, volume 21, number 1, pp. 37–46.
doi:, accessed 12 December 2022.

Julia Bascom, 2014. “Dangerous assumptions,” Just Stimming..., at, accessed 13 May 2022.

Julia Bascom, 2012. Loud hands: Autistic people, speaking. Washington, D.C.: Autistic Press.

Jonathan S. Beck, Rebecca A. Lundwall, Terisa Gabrielsen, Jonathan C. Cox and Mikle South, 2020. “Looking good but feeling bad: ‘Camouflaging’ behaviors and mental health in women with autistic traits,” Autism, volume 24, number 4, pp. 809–821.
doi:, accessed 12 December 2022.

Cynthia L. Bennett, Erin Brady and Stacy M. Branham, 2018. “Interdependence as a frame for assistive technology research and design,” ASSETS ’18: Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 161–173.
doi:, accessed 12 September 2022.

Bruno Bettelheim, 1959. “Joey: A ‘mechanical boy’,” Scientific American, volume 200, number 3, pp. 116–127.

V. Joy Boitnott, 2012. “The use of technology to teach children with autism spectrum disorder: A meta-synthesis,” thesis, University of Alaska Southeast, at, accessed 3 May 2022.

Monique Botha, Jacqueline Hanlon and Gemma Louise Williams, 2021. “Does language matter? Identity-first versus person-first language use in autism research: A response to Vivanti,” Journal of Autism and Developmental Disorders.
doi:, accessed 9 April 2021.

John Bowers, 2018. “Michel Foucault on the panopticon: A commentary,” In: Jeffrey Bardzell, Shaowen Bardzell and Mark Blyth (editors). Critical theory and interaction design. Cambridge, Mass.: MIT Press, pp. 650–677.

Jorge Brandão, Pedro Cunha, José Vasconcelos, Vítor Carvalho and Filomena Soares, 2015. “An augmented reality GameBook for children with autism spectrum disorders,” ICELW 2015 — International Conference on E-Learning in the Workplace.

Alicia A. Broderick and Robin Roscigno, 2021. “Autism, Inc.: The autism industrial complex,” Journal of Disability Studies in Education, volume 2, number 1, pp. 77–101.
doi:, accessed 12 December 2022.

Lydia Brown, 2011. “Identity-first language,” Autistic Self Advocacy Network, at, accessed 2 June 2019.

Kristin Bumiller, 2008. “Quirky citizens: Autism, gender, and reimagining disability,” Signs, volume 33, number 4, pp. 967–991.
doi:, accessed 12 December 2022.

Simon M. Bury, Rachel Jellett, Jennifer R. Spoor and Darren Hedley, 2020. “‘It defines who I am’ or ‘It’s something I have’: What language do [autistic] Australian adults [on the autism spectrum] prefer?” Journal of Autism and Developmental Disorders.
doi:, accessed 12 December 2022.

Judith Butler, 2004. Undoing gender. New York: Routledge.
doi:, accessed 12 December 2022.

Micha Cárdenas, 2012. The transreal: Political aesthetics of crossing realities. New York: Atropos Press.

Micha Cárdenas, 2010. “Becoming Dragon: A transversal technology study,” CTheory, at, accessed 12 December 2022.

Serena Caria, Fabio Paternò, Carmen Santoro and Valentina Semucci, 2018. “The design of Web games for helping young high-functioning autistics in learning how to manage money,” Mobile Networks and Applications, volume 23, number 6, pp. 1,735–1,748.
doi:, accessed 12 December 2022.

Vítor H. Carvalho, Jorge Brandão, Pedro Cunha, José Vasconcelos and Filomena Soares, 2015. “Tobias in the zoo — A serious game for children with autism spectrum disorders,” International Journal of Advanced Corporate Learning, volume 8, number 3.
doi:, accessed 12 December 2022.

Sarah Cassidy, Louise Bradley, Rebecca Shaw and Simon Baron-Cohen, 2018. “Risk markers for suicidality in autistic adults,” Molecular Autism, volume 9, article number 42.
doi:, accessed 12 December 2022.

S.A. Cassidy, K. Gould, E. Townsend, M. Pelton, A.E. Robertson and J. Rodgers, 2020. “Is camouflaging autistic traits associated with suicidal thoughts and behaviours? Expanding the interpersonal psychological theory of suicide in an undergraduate student sample,” Journal of Autism and Developmental Disorders, volume 50, number 10, pp. 3,638–3,648.
doi:, accessed 12 December 2022.

Sasha Costanza-Chock, 2020. Design justice: Community-led practices to build the worlds we need. Cambridge, Mass.: MIT Press.
doi:, accessed 12 December 2022.

Cy [@JillianWeise], 2020. “If you change your pronoun in the middle of the semester, do you tell your students? What if it is a pronoun that never existed before? Cy/She/Her now. Are my pronouns,” Twitter (20 November), at, accessed 11 May 2022.

Julia R. DeCook, 2021. “A [white] cyborg’s manifesto: The overwhelmingly Western ideology driving technofeminist theory,” Media, Culture & Society, volume 43, number 6, pp. 1,158–1,167.
doi:, accessed 12 December 2022.

Sara Katherine Dees, 2021. “After-school STEM programs and the impact on 21st century skill development,” Master’s thesis, University of Houston-Clear Lake, at, accessed 3 May 2022.

Carl DiSalvo, 2018. “Bruno Latour as sociologist and design theorist?” In: Jeffrey Bardzell, Shaowen Bardzell and Mark Blyth (editors). Critical theory and interaction design. Cambridge, Mass.: MIT Press, pp. 470–484.

Paul Dourish, 2018. “Ideology and interpellation: Althusser’s ‘Ideology and ideological state apparatuses’,” In: Jeffrey Bardzell, Shaowen Bardzell and Mark Blyth (editors). Critical theory and interaction design. Cambridge, Mass.: MIT Press, pp. 407–416.

Joshua Earle, 2019. “Cyborg maintenance: Design, breakdown, and inclusion,” In: Aaron Marcus and Wentao Wang (editors). Design, user experience, and usability. Design philosophy and theory. Lecture Notes in Computer Science, volume 11583. Cham, Switzerland: Springer International, pp. 47–55.
doi:, accessed 12 December 2022.

Leigh Elliott, 2017. “Improving social and behavioural functioning in children with autism spectrum disorder: A videogame skills based feasibility trial,” thesis, Deakin University, Faculty of Health, School of Psychology, at, accessed 3 May 2022.

Sue Fletcher-Watson, 2014. “A targeted review of computer-assisted learning for people with autism spectrum disorder: Towards a consistent methodology,” Review Journal of Autism and Developmental Disorders, volume 1, pp. 87–100.
doi:, accessed 12 December 2022.

Christopher Frauenberger, Judith Good, Geraldine Fitzpatrick and Ole Sejer Iversen, 2015. “In pursuit of rigour and accountability in participatory design,” International Journal of Human-Computer Studies, volume 74, pp. 93–106.
doi:, accessed 12 December 2022.

Christopher Frauenberger, Judith Good, Alyssa Alcorn and Helen Pain, 2013. “Conversing through and about technologies: Design critique as an opportunity to engage children with autism and broaden research(er) perspectives,” International Journal of Child-Computer Interaction, volume 1, number 2, pp. 38–49.
doi:, accessed 12 December 2022.

Christopher Frauenberger, Judith Good, Alyssa Alcorn and Helen Pain, 2012. “Supporting the design contributions of children with autism spectrum conditions,” IDC ’12: Proceedings of the 11th International Conference on Interaction Design and Children, pp. 134–143.
doi:, accessed 7 June 2020.

Uta Frith, 1989. “Autism and ‘theory of mind’,” In: Christopher Gillberg (editor). Diagnosis and treatment of autism. New York: Springer, pp. 33–52.
doi:, accessed 27 October 2022.

Morton Ann Gernsbacher, 2017. “Editorial perspective: The use of person-first language in scholarly writing may accentuate stigma,” Journal of Child Psychology and Psychiatry, and Allied Disciplines, volume 58, number 7, pp. 859–861.
doi:, accessed 12 December 2022.

Margaret F. Gibson and Patty Douglas, 2018. “Disturbing behaviours: Ole Ivar Lovaas and the queer history of autism science,” Catalyst: Feminism, Theory, Technoscience, volume 4, number 2.
doi:, accessed 12 December 2022.

Julian Gill-Peterson, 2014. “The technical capacities of the body: Assembling race, technology, and transgender,” TSQ: Transgender Studies Quarterly, volume 1, number 3, pp. 402–418.
doi:, accessed 12 December 2022.

Karen Guldberg, Sarah Parsons, Kaśka Porayska-Pomsta and Wendy Keay-Bright, 2017. “Challenging the knowledge-transfer orthodoxy: Knowledge co-construction in technology-enhanced learning for children with autism,” British Educational Research Journal, volume 43, number 2, pp. 394–413.
doi:, accessed 12 December 2022.

Oliver L. Haimson, Dykee Gorrell, Denny L. Starks and Zu Weinger, 2020. “Designing trans technology: Defining challenges and envisioning community-centered solutions,” CHI ’20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–13.
doi:, accessed 10 May 2022.

Aimi Hamraie and Kelly Fritsch, 2019. “Crip technoscience manifesto,” Catalyst: Feminism, Theory, Technoscience, volume 5, number 1.
doi:, accessed 12 December 2022.

Donna J. Haraway, 2016. Staying with the trouble: Making kin in the Chthulucene. Durham, N.C.: Duke University Press.
doi:, accessed 12 December 2022.

Donna Haraway, 1990. Simians, cyborgs, and women: The reinvention of nature. New York: Routledge.
doi:, accessed 12 December 2022.

Kristen Harrison, Lia Vallina, Amelia Couture, Halie Wenhold and Jessica D. Moorman, 2019. “Sensory curation: Theorizing media use for sensory regulation and implications for family media conflict,” Media Psychology, volume 22, number 4, pp. 653–688.
doi:, accessed 12 December 2022.

In my language, 2007. at (14 January), accessed 28 May 2019.

Lilly Irani, Janet Vertesi, Paul Dourish, Kavita Philip and Rebecca E. Grinter, 2010. “Postcolonial computing: A lens on design and development,” CHI ’10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1,311–1,320.
doi:, accessed 21 June 2021.

Jordynn Jack, 2011. “‘The extreme male brain?’ Incrementum and the rhetorical gendering of autism,” Disability Studies Quarterly, volume 31, number 3, at, accessed 27 October 2022.

Sheila Jasanoff, 2004. “Ordering knowledge, ordering society,” In: Sheila Jasanoff (editor). States of knowledge: The co-production of science and the social order. London: Routledge, pp. 13–45.
doi:, accessed 12 December 2022.

Alison Kafer, 2013. Feminist, queer, crip. Bloomington: Indiana University Press.

Gal A. Kaminka, 2013. “Curing robot autism: A challenge,” AAMAS ’13: Proceedings of the 2013 International Conference on Autonomous Agents and Multi-agent Systems, pp. 801–804.

Steven K. Kapp, Kristen Gillespie-Lynch, Lauren E. Sherman and Ted Hutman, 2013. “Deficit, difference, or both? Autism and neurodiversity,” Developmental Psychology, volume 49, number 1, pp. 59–71.
doi:, accessed 12 December 2022.

Kay Kender and Katta Spiel, 2022. “FaceSavr™: Designing technologies with allistic adults to battle emotion echolalia,” CHI EA ’22: Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, article number 14, pp. 1–8.
doi:, accessed 12 December 2022.

Lorcan Kenny, Caroline Hattersley, Bonnie Molins, Carole Buckley, Carol Povey and Elizabeth Pellicano, 2016. “Which terms should be used to describe autism? Perspectives from the UK autism community,” Autism, volume 20, number 4, pp. 442–462.
doi:, accessed 12 December 2022.

Os Keyes, 2020. “Automating autism: Disability, discourse, and artificial intelligence,” Journal of Sociotechnical Critique, volume 1, number 1, at, accessed 31 January 2021.

Eunjung Kim, 2017. Curative violence: Rehabilitating disability, gender, and sexuality in modern Korea. Durham, N.C.: Duke University Press.
doi:, accessed 12 December 2022.

Robert E. Kohler, 1991. “Drosophila and evolutionary genetics: The moral economy of scientific practice,” History of Science, volume 29, number 4, pp. 335–375.
doi:, accessed 12 December 2022.

Rivka Hillel Lavian and Orly Alshech, 2015. “Assimilating iPads in special education schools,” Journal of International Scientific Publications: Educational Alternatives, volume 13, number 1000013, pp. 411–418.

Cindy Lin and Silvia Lindtner, 2021. “Techniques of use: Confronting value systems of productivity, progress, and usefulness in computing and design,” CHI ’21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, article number 595, pp. 1–16.
doi:, accessed 12 December 2022.

A.B.A. Majeed, 2017. “Roboethics — Making sense of ethical conundrums,” Procedia Computer Science, volume 105, pp. 310–315.
doi:, accessed 12 December 2022.

Rebecca Mallett and Katherine Runswick-Cole, 2016. “The commodification of autism: What’s at stake?” In: Katherine Runswick-Cole, Rebecca Mallett and Sami Timimi (editors). Re-thinking autism: Diagnosis, identity and equality. London: Jessica Kingsley Publishers, pp. 110–131.

Jennifer Mankoff, Gillian R. Hayes and Devva Kasnitz, 2010. “Disability studies as a source of critical inquiry for the field of assistive technology,” ASSETS ’10: Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 3–10.
doi:, accessed 13 October 2021.

Anne McGuire, 2016. War on autism: On the cultural logic of normative violence. Ann Arbor: University of Michigan Press.

Rachel Menzies, 2011. “Developing for autism with user-centred design,” ASSETS ’11: Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 313–314.
doi:, accessed 7 June 2020.

Damian Milton, 2017. “So what exactly is autism?” Autism Education Trust, at, accessed 12 December 2022.

Mia Mingus, 2017. “Access intimacy, interdependence and disability justice,” Leaving Evidence (6 November), at, accessed 3 February 2020.

David T. Mitchell and Sharon L. Snyder, 2015. The biopolitics of disability: Neoliberalism, ablenationalism, and peripheral embodiment. Ann Arbor: University of Michigan Press.

Isobel Moore, 2021. “At the intersection of autism and gender: Personal identities and professional ideas,” at, accessed 12 December 2022.

Masahiro Mori, 1970. “不気味の谷 (Bukimi no tani, the uncanny valley),” Energy, volume 7, number 4, pp. 33–35.

Dinah Murray, 2018. “Monotropism — an interest based account of autism,” In: Fred R. Volkmar (editor). Encyclopedia of autism spectrum disorders. New York: Springer, pp. 1–3.
doi:, accessed 12 December 2022.

Dinah Murray, 2012. “Autism and information technology: Therapy with computers,” In: Stuart Powell and Rita Jordan (editors). Autism and learning. London: Routledge, pp. 88–103.
doi:, accessed 12 December 2022.

Dinah Murray and Mike Lesser, 1998a. “Autism and computing,” at, accessed 6 May 2022.

Dinah Murray and Mike Lesser, 1998b. “Mind as a dynamical system: Implications for autism,” at, accessed 6 May 2022.

Lisa Nakamura, 2014. “Indigenous circuits: Navajo women and the racialization of early electronic manufacture,” American Quarterly, volume 66, number 4, pp. 919–941.
doi:, accessed 12 December 2022.

Jessica Navedo, Amelia Espiritu-Santo and Shameem Ahmed, 2019. “Strength-based ICT design supporting individuals with autism,” ASSETS ’19: Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, pp. 560–562.
doi:, accessed 3 May 2022.

Ari Ne’eman, 2021. “When disability is defined by behavior, outcome measures should not promote ‘passing’,” AMA Journal of Ethics, volume 23, number 7, at, accessed 12 December 2022.

Mallory Kay Nelson, Ashley Shew and Bethany Stevens, 2019. “Transmobility: Possibilities in cyborg (cripborg) bodies,” Catalyst: Feminism, Theory, Technoscience, volume 5, number 1.
doi:, accessed 12 December 2022.

Sandra Nelson, 2020. “Computers can’t get wet: Queer slippage and play in the rhetoric of computational structure,” dissertation, University of Pittsburgh, at, accessed 12 December 2022.

Nigel Newbutt and Ryan Bradley, 2022. “Using immersive virtual reality with autistic pupils: Moving towards greater inclusion and co-participation through ethical practices,” Journal of Enabling Technologies, volume 16, number 2, pp. 124–140.
doi:, accessed 3 May 2022.

Saoirse Caitlin O’Shea, 2020. “‘I, robot?’ Or how transgender subjects are dehumanised,” Culture and Organization, volume 26, number 1, pp. 1–13.
doi:, accessed 12 December 2022.

Ruth Osorio, 2020. “I am #ActuallyAutistic, hear me tweet: The autist-topoi of autistic activists on Twitter,” Enculturation (26 May), at, accessed 11 March 2021.

Joyojeet Pal, 2017. “CHI4Good or Good4CHI,” CHI EA ’17: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 709–721.
doi:, accessed 30 September 2021.

Sarah Parsons, 2015. “Learning to work together: Designing a multi-user virtual reality game for social collaboration and perspective-taking for children with autism,” International Journal of Child-Computer Interaction, volume 6, pp. 28–38.
doi:, accessed 12 December 2022.

Amy Pearson and Kieran Rose, 2021. “A conceptual analysis of autistic masking: Understanding the narrative of stigma and the illusion of choice,” Autism in Adulthood, volume 3, number 1, pp. 52–60.
doi:, accessed 12 December 2022.

Leah Lakshmi Piepzna-Samarasinha, 2018. Care work: Dreaming disability justice. Vancouver, B.C.: Arsenal Pulp Press.

Jasbir K. Puar, 2020. “‘I would rather be a cyborg than a goddess’: Becoming-intersectional in assemblage theory,” In: Carole McCann, Seung-kyung Kim and Emek Ergun (editors). Feminist theory reader. Fifth edition. New York: Routledge, pp. 405–415.
doi:, accessed 14 November 2020.

Jessica Sage Rauchberg, 2022. “Imagining a neuroqueer technoscience,” Studies in Social Justice, volume 16, number 2, pp. 370–388.
doi:, accessed 12 December 2022.

Kathryn E. Ringland, 2019. “A place to play: The (dis)abled embodied experience for autistic children in online spaces,” CHI ’19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, paper number 288, pp. 1–14.
doi:, accessed 21 May 2020.

Robin Roscigno, 2019. “Neuroqueerness as fugitive practice: Reading against the grain of applied behavioral analysis scholarship,” Educational Studies, volume 55, number 4, pp. 405–419.
doi:, accessed 12 December 2022.

Kieran Rose, 2018. “Masking: I am not OK,” Autistic Advocate (24 July), at, accessed 10 January 2020.

Helen Rottier, Ben Pfingston and Joshua Guberman, 2022. “Ghosts, mice, and robots: DisAppearing the autistic person,” In: Tanya Tichkosky, Elaine Cagulada, Madeleine De Welles and Efrat Gold (editors). DisAppearing: Encounters in disability studies. Toronto: Canadian Scholars, pp. 93–105.

Carrie Sandahl, 2003. “Queering the crip or cripping the queer? Intersections of queer and crip identities in solo autobiographical performance,” GLQ: A Journal of Lesbian and Gay Studies, volume 9, numbers 1–2, pp. 25–56.

Chela Sandoval, 2000. Methodology of the oppressed. Minneapolis: University of Minnesota Press.

Aileen Herlinda Sandoval-Norton, Gary Shkedy and Dalia Shkedy, 2019. “How much compliance is too much compliance: Is long-term ABA therapy abuse?” Cogent Psychology, volume 6, number 1, 1641258.
doi:, accessed 12 December 2022.

Malini Johar Schueller, 2005. “Analogy and (white) feminist theory: Thinking race and the color of the cyborg body,” Signs, volume 31, number 1, pp. 63–92.
doi:, accessed 12 December 2022.

Kyla Schuller, 2018. The biopolitics of feeling: Race, sex, and science in the nineteenth century. Durham, N.C.: Duke University Press.
doi:, accessed 12 December 2022.

Vered Seidmann, 2021. “On blogs, autistic bloggers, and autistic space,” Information, Communication & Society, volume 24, number 15, pp. 2,277–2,292.
doi:, accessed 12 December 2022.

Nurshamshida Md Shamsudin, Melor Mohd Yunos, Faizah Abdul Majid, Shireena Basree, Syamsul Nor Azlan Kamarulzaman and Nurfitriah Alias, 2021. “The use of technology to facilitate home-based learning for children with autism: Systematic literature review,” at, accessed 3 May 2022.

Ashley Shew, 2020. “Ableism, technoableism, and future AI,” IEEE Technology and Society Magazine, volume 39, number 1, pp. 40–85.
doi:, accessed 12 December 2022.

Ashley Shew, 2017. “Technoableism, cyborg bodies, and Mars,” Technology and disability (11 November), at, accessed 25 September 2019.

Jim Sinclair, 2012a. “Autism Network International: The development of a community and its culture,” In: Julia Bascom (editor). Loud hands: Autistic people, speaking. Washington, D.C.: Autistic Press, pp. 22–70.

Jim Sinclair, 2012b. “Don’t mourn for us,” Autonomy, the Critical Journal of Interdisciplinary Autism Studies, volume 1, number 1, at, accessed 19 May 2020.

Judy Singer, 2017. NeuroDiversity: The birth of an idea. Lexington, Kentucky: n.p.

Sins Invalid, 2019. Skin, tooth, and bone: The basis of movement is our people. Second edition, at, accessed 12 December 2022.

Katta Spiel and Kathrin Gerling, 2021. “The purpose of play: How HCI games research fails neurodivergent populations,” ACM Transactions on Computer-Human Interaction, volume 28, number 2, pp. 1–40.
doi:, accessed 12 December 2022.

Katta Spiel, Christopher Frauenberger, Os Keyes and Geraldine Fitzpatrick, 2019. “Agency of autistic children in technology research: A critical literature review,” ACM Transactions on Computer-Human Interaction, volume 26, number 6, article number 38, pp. 1–40.
doi:, accessed 12 December 2022.

Katta Spiel, Emeline Brulé, Christopher Frauenberger, Gilles Bailly and Geraldine Fitzpatrick, 2018. “Micro-ethics for participatory design with marginalised children,” PDC ’18: Proceedings of the 15th Participatory Design Conference: Full Papers, volume 1, article number 17, pp. 1–12.
doi:, accessed 26 January 2020.

Allucquère Rosanne Stone, 1995. The war of desire and technology at the close of the mechanical age. Cambridge, Mass.: MIT Press.

Lucy Suchman, 2011. “Subject objects,” Feminist Theory, volume 12, number 2, pp. 119–145.
doi:, accessed 12 December 2022.

Lucy Suchman, 2006. Human-machine reconfigurations. Second edition. Cambridge: Cambridge University Press.
doi:, accessed 12 December 2022.

Unbound Books, n.d. “Autistic people and ABA survivors on ABA,” at, accessed 17 April 2021.

Katherine Valencia, Cristian Rusu, Daniela Quiñones and Erick Jamet, 2019. “The impact of technology on people with autism spectrum disorder: A systematic literature review,” Sensors, volume 19, number 20, 4485.
doi:, accessed 12 December 2022.

Nick Walker, 2021. “Neuroqueer: An introduction,” at, accessed 18 August 2022.

Varun Warrier, David M. Greenberg, Elizabeth Weir, Clara Buckingham, Paula Smith, Meng-Chuan Lai, Carrie Allison and Simon Baron-Cohen, 2020. “Elevated rates of autism, other neurodevelopmental and psychiatric diagnoses, and autistic traits in transgender and gender-diverse individuals,” Nature Communications, volume 11, number 1 (7 August), article number 3959.
doi:, accessed 12 December 2022.

Jillian Weise, 2018. “Common cyborg,” Granta (24 September), at, accessed 12 December 2022.

Jillian Weise, 2016. “The dawn of the ‘tryborg’,” New York Times (30 November), at, accessed 12 February 2020.

Joseph Weizenbaum, 1983. “ELIZA — A computer program for the study of natural language communication between man and machine,” Communications of the ACM, volume 26, number 1, pp. 23–28.
doi:, accessed 12 December 2022.

Joseph Weizenbaum, 1966. “ELIZA — A computer program for the study of natural language communication between man and machine,” Communications of the ACM, volume 9, number 1, pp. 36–45.
doi:, accessed 12 December 2022.

Abby Wilkerson, 1997. “Ending at the skin: Sexuality and race in feminist theorizing,” Hypatia, volume 12, number 3, pp. 164–173.
doi:, accessed 12 December 2022.

Christine Williams, Barry Wright, Gillian Callaghan and Brian Coughlan, 2002. “Do children with autism learn to read more readily by computer assisted instruction or traditional book methods? A pilot study,” Autism, volume 6, number 1, pp. 71–91.
doi:, accessed 12 December 2022.

Rua M. Williams, 2021. “I, misfit: Empty fortresses, social robots, and peculiar relations in autism research,” Techné: Research in Philosophy and Technology.
doi:, accessed 25 October 2021.

Rua M. Williams and Juan E. Gilbert, 2020. “Perseverations of the academy: A survey of wearable technologies applied to autism intervention,” International Journal of Human-Computer Studies, volume 143, 102485.
doi:, accessed 25 October 2021.

Rua M. Williams and LouAnne E. Boyd, 2019. “Prefigurative politics and passionate witnessing,” ASSETS ’19: Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, pp. 262–266.
doi:, accessed 26 January 2020.

Rua M. Williams and Juan E. Gilbert, 2019. “Cyborg perspectives on computing research reform,” CHI EA ’19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, paper number alt13, pp. 1–11.
doi:, accessed 21 June 2021.

Adam Wojciechowski and Raed Al-Musawi, 2017. “Assistive technology application for enhancing social and language skills of young children with autism,” Multimedia Tools and Applications, volume 76, number 4, pp. 5,419–5,439.
doi:, accessed 12 December 2022.

Alice Wong, 2019. “Cyborgs” (18 December), at, accessed 10 December 2020.

Rebecca Wood, 2021. “Autism, intense interests and support in school: From wasted efforts to shared understandings,” Educational Review, volume 73, number 1, pp. 34–54.
doi:, accessed 12 December 2022.

Melanie Yergeau, 2018. Authoring autism: On rhetoric and neurological queerness. Durham, N.C.: Duke University Press.
doi:, accessed 12 December 2022.

Melanie Yergeau, 2013. “Clinically significant disturbance: On theorists who theorize theory of mind,” Disability Studies Quarterly, volume 33, number 4, at, accessed 27 February 2019.

Melanie Yergeau, 2010. “Circle wars: Reshaping the typical autism essay,” Disability Studies Quarterly, volume 30, number 1, at, accessed 5 January 2020.

Anon Ymous, Katta Spiel, Os Keyes, Rua M. Williams, Judith Good, Eva Hornecker and Cynthia L. Bennett, 2020. “‘I am just terrified of my future’ — Epistemic violence in disability related technology research,” CHI EA ’20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–16.
doi:, accessed 21 June 2021.

Alyssa Hillary Zisk and Elizabeth Dalton, 2019. “Augmentative and alternative communication for speaking autistic adults: Overview and recommendations,” Autism in Adulthood, volume 1, number 2, pp. 93–100.
doi:, accessed 12 December 2022.


Editorial history

Received 16 November 2022; accepted 12 December 2022.

This paper is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Not robots; Cyborgs — Furthering anti-ableist research in human-computer interaction
by Josh Guberman and Oliver Haimson.
First Monday, Volume 28, Number 1 - 2 January 2023