
The (Post-)Digital Condition — An Interview with Felix Stalder

Felix Stalder teaches digital culture and network theories at the Zurich University of the Arts and works as an independent researcher/organizer with groups such as the Institute for New Cultural Technologies and the technopolitics group in Vienna. In his most recent book — The Digital Condition (Polity Press, 2018) — he argues that referentiality, communality, and algorithmicity have become the characteristic cultural forms of today’s digital condition. Felix has been observing the unfolding of social and political developments vis-à-vis digital media technologies since the 1990s. At the end of the 1990s, he pursued a Ph.D. at the University of Toronto (completed in 2001) and then worked as a post-doc with the Surveillance Project, a transdisciplinary research initiative based in the Department of Sociology at Queen’s University in Kingston, Ontario, Canada (completed in 2002). Since 1998, he has been moderating the Nettime mailing list together with Ted Byfield.

1) For the upcoming conference “Digital Cultures: Knowledge/Culture/Technology” at the Centre for Digital Cultures (CDC) in Lüneburg [1], you are preparing a talk about the “Digital Condition”. What do you mean by this notion?

The digital condition is what we find ourselves in, now that digitization — the transformation of data and informational processes from analog to digital — has passed its apex. Most social activities — from the singular to the collective, from the private to the public, from the civic to the commercial, from the peaceful to the violent — already rely on digital infrastructures for sensing, communication, and coordination. One of the things that is particular about this infrastructure is that it is capable of handling much higher volumes of information than previous infrastructures (e.g., print, broadcast, physical libraries). This is both a technical and an organizational fact, in the sense that the amount of data created by humans and by machines has been growing exponentially while the costs of storage and processing power have been falling. Viewed in purely quantitative terms, the amount of data created every year follows an exponential curve.

However, more important for my analysis is the cultural dimension. The sheer amount of data and information, of reference points in circulation, of claims of importance and relevance, is overwhelming existing methods and institutions of culture, that is, established ways of producing shared meaning. Take, for example, the evening news on TV. No matter how well it is made, within its 15 minutes of air time it can only cover a very limited number of topics, and those only from a very limited number of perspectives. By implication, all the rest is rendered unimportant. Now, for a lot of people, it is no longer acceptable to have their interests and realities labeled as unimportant. As a consequence, they withdraw from the old institutions and mechanisms for creating meaning and are looking for new ones.

However, what is important to note is that this crisis of the institutions of meaning and the desire to develop alternative ones have not been caused by digital technologies. Rather, they are a consequence of multiple trajectories of large-scale social transformation that have existed for more than 50 years. For example, the creation of a knowledge economy has yielded more and more people who are skilled in communication and cultural production, while at the same time social liberalization has made it possible for previously suppressed groups and positions to enter the “mainstream” — just think of how much more varied the forms of “family” accepted today are than, say, in the mid-1960s. And, as a third major social driver, globalization means that development can no longer be thought of in terms of more or less developed countries, or as necessarily following the Western model.

What the Internet has provided, and why it was so enthusiastically adopted by many people and cultural groups, is an infrastructure to deal with the drastically increased volumes of information that this proliferation of speakers and positions produced. Suddenly, it is possible to debate, on scales small and large, about things that would have never appeared in the broadcast media. And it is not just about debating, but also about practical ways to organize one’s life according to different criteria and realities produced by these information flows. The Internet as an infrastructure, in other words, has accelerated existing trends, deepened the existing crises of cultural institutions of the previous era, and is providing the necessary condition for the creation of new ones.

This, I want to stress, is a structural view — neither good nor bad, but simply a transformation in how people learn about the world, relate to the world, and how they can act in the world. So, the focus here is on “how” things are done, not on “what” is being done.

 

Figure 1: Aram Bartholl, Keepalive, 2015. Outdoor sculpture. Rock, steel, router, USB key, thermoelectric generator, fire, software. Creative Commons by-nc-sa. Project URL: https://arambartholl.com/keepalive/.

 

2) In your new book “The Digital Condition” [2] you argue that the emergence of a new cultural landscape in the West has been shaped by three specific forms of culture: referentiality, communality, and algorithmicity. In what sense are algorithms “cultural”? What are the implications of an “algorithmic culture”?

I use the term “digital condition” in the singular because I see certain patterns in how people orient themselves and act in this changed informational environment. In the book I focus on three cultural forms creating these patterns:

“Referentiality” is a method with which individuals can inscribe themselves into cultural processes and constitute themselves as producers. They do this by filtering out of the chaotic information sphere certain things they believe, for whatever reason, to be worth attention: their own attention, but also the attention of the people they are connected to. One like, one retweet, one share at a time. Because culture is shared social meaning, such an undertaking cannot be limited to the individual. Rather, it takes place within a larger framework whose existence and development depend on communal formations. In this “communality”, meaning is generated by validating each other’s contributions — putting a like under a shared post — but also by expanding the individual selection of references with those of others, thereby creating a shared horizon of things relevant to the group, be that cat pictures, political opinions, conspiracy theories, jihadi propaganda, or K-Pop.

Finally, “algorithmicity” denotes those aspects of cultural processes that are (pre-)arranged by the activities of machines. Algorithms transform the vast quantities of data and information that characterize so many facets of present-day life into scales and formats that can be registered by human perception. It is impossible to read billions of Web sites. We therefore turn to services such as Google’s search algorithms, which reduce the data flood (“big data”) to a manageable amount and translate it into a format that humans can understand (“small data”). Without these algorithms, human beings could not comprehend or do anything within a culture built around digital technologies. Yet, the very same algorithms influence our understanding and activity in an ambivalent way. They create new dependencies by pre-sorting and making the (informational) world available to us, and, simultaneously, shape our agency by providing the preconditions that enable us to act.

Thus, we are already living in an algorithmic culture. One dimension is that algorithms are products of culture; they embody values and assumptions about what is important and what is irrelevant in order to solve a certain problem (like satisfying user, advertiser, and corporate interests through a search engine). A second is that they themselves are creating culture, in so far as they shape the way we see the world and how we can act in it. And a third is that they constitute a new source of power, because they unequally enable agency under the condition of large-scale information flows.

3) In light of such large-scale information flows, you speak of the growth of the knowledge economy and the related problem of our limited attentional resources. What kinds of new attentional forms and practices do you see emerging to cope with the ever-increasing information flows? What role will algorithms play in this new “ecology of attention”? [3]

 

Figure 2: Aram Bartholl, 3V, 2017. Site-specific installation. Aluminium, acrylic glass, thermoelectric generator, electronics, LEDs, tea candles, steel chain. Creative Commons by-nc-sa. Project URL: https://arambartholl.com/3v/.

 

I would differentiate two aspects here: First, algorithmic data analysis translates between the scale of the system (for example, in the case of the search engine, billions of documents) and the scale of the human (say, the 10 documents returned by the search engine on the first page). This is not entirely new. Libraries and archives beyond a certain scale cannot be used without organizing metadata. The Internet as “library” is so vast and dynamic that standard catalogs do not function anymore — they need to become dynamic themselves. Hence the need for algorithms. They make human attention a valuable resource to begin with, because they allow us to focus attention, rather than be overwhelmed by a chaotic infosphere. Second, because algorithms produce attention, controlling the algorithms means controlling attention, and this is done in a manner that is hard to detect. It’s not about trying to shift attention from one thing to another, but simply about pushing things into the space of attention, while everything else remains invisible.

4) Where do you see the political challenges in applying automated filter mechanisms? How can we discriminate (i.e., filter information from data) without being discriminatory?

Filtering is necessary; otherwise we drown in information. The brain already filters out sensory data in order to construct an intelligible world, based on the difference between “things I need to pay attention to” and “things I can ignore”. In the brain, this happens largely unconsciously, through a combination of genetic and cultural factors. In mass media this is quite explicit. Not only can we point at the entity doing the filtering (the editorial board) and try to hold it to account, but there are competing filters (different news outlets), and the filtering is also restricted to easily recognizable mass media content. But now, things are very different. First, filtering has become ubiquitous, extending into the capillary system of communication. Facebook is filtering which posts of your friends you see. Second, there are very few entities doing the filtering. Within the world of Facebook, it’s only one. When searching online, it’s predominantly Google. This, of course, is a very powerful position, because filtering information is part of the construction of the world: certain things are pushed into the focus of attention, while others get ignored. And filtering is never without biases (it cannot be), because there must be certain criteria to determine what is important and what is not. And in the case of social media, these are mostly commercial criteria, reproducing the gender, class, and race discrimination that is already embedded in, and produced by, the market. Positions and attitudes dominant in the market are also dominant in search results (and in most other algorithms shaping social behavior). But these results are presented as a neutral account of the world, thereby naturalizing discrimination. Safiya Noble made this point very explicitly [4].

The problem here is not filtering itself, but the unaccountability and concentration of power that goes along with it. Therefore, we should not call for “neutral” or “objective” filters. That’s an oxymoron. Rather, we need to create mechanisms of accountability (like we had for editorial decisions) and we need to counter the monopoly tendencies. So, rather than having only one filter, we need to be able to compare different filters and thus see how they work. We cannot judge the filtered against the unfiltered view; we can only judge the results of one filter against those of another.

5) While not a week goes by without another former Silicon Valley manager apologizing for “what they have created” [5], the digitization hype in academic institutions seems to be unabated. What do you make of the urge in “digital humanities” and “digital sociology” to do and build stuff without reflecting on the why?

I think there are at least two main drivers of this in the social sciences and the humanities. One is the promise that by using new methods and new data sources, new insights can be generated. And since we are in the early days of this approach, there is still low-hanging fruit to be picked, so success comes easily. This is, of course, attractive, and there is nothing wrong with it. The danger is that, since there are still so many easy targets, the value of these new, quantitative methods gets widely overstated, and some people dream of a “social physics” to finally uncover the laws of social motion. In some ways, this harks back to the very beginnings of social science in the mid-nineteenth century, and such laws have never been discovered not only because of the lack of available data, but also for more fundamental reasons, such as the questionable quality of any data (which is, contrary to its etymology, not “given” but “made”) or the “interactivity” between scientific research (both as activity and as product) and its subject (society). So, while not discarding the possibility of doing research differently, we need to be aware of the main limitations of quantitative methods, which have been well understood since the days of Behaviorism.

The second driver is more prosaic: there is funding attached to digital humanities and digital sociology, and new industry partnerships to be created. This is extremely attractive for humanities and social science departments that are chronically underfunded and have come under a lot of pressure in recent years to justify their relevance. Hence, there is a lot of activity that is primarily geared towards satisfying these funding criteria, rather than towards interesting research.

6) In your book you write: “We have reached a point where Leibniz’s distinction between creative mental functions and ‘simple calculations’ is becoming increasingly fuzzy” [6]. In light of this statement, do you think that artificial intelligence in the form of machine learning is merely an improved statistical method, or does it warrant a deeper rethinking of intelligence?

I think both. On a “material” level, machine learning is just advanced regression analysis, but that is about as correct as saying that, on a “material” level, the brain is just chemistry. If you look at it phenomenologically, the chemistry of the brain produces a “mind” which is quite different from the observable output of other chemical processes. The same could be said about advanced statistics. For a long time, it was assumed that the application of “simple calculations” was limited to a particular class of problems. And now we see that it can be applied to other classes of problems as well, and thus the boundary between what machines can do and what they cannot do needs to be redrawn. Where exactly it will run, we don’t know yet. Perhaps the search for this boundary will contribute to replacing the Cartesian model, which splits body and mind, with one that reintegrates the two in a way that will be hard for machines to emulate. If we continue to insist on this split, we might come to a point where the enthusiasts of artificial intelligence, the singularity, and the like already are: all mental processes can be simulated and hence humans can be fully replaced by machines. But this, in my view, is a very impoverished notion of what the human experience is about.

7) In “Updating to Remain the Same” [7] Wendy Chun speaks of network science as the science of neoliberalism. And in your book you mention the socio-economic transformation due to networks. Has the “network” become the central diagram of the twenty-first century? If so, what role do institutions play in the (post-)digital age?

The hallmark of a network is its flexibility, whereas most other institutional forms are defined by a longer duration of the patterns of interaction. What changed through digital technology is that the trade-off between scale and flexibility has more or less disappeared. Social networks always existed, but before the Internet, they couldn’t scale because of the complexity of the internal coordination. Digital technology removed this limit and now we have networks on all scales, competing with other forms of organization and often eliminating them in this process. In this sense, I think Manuel Castells was right to call the network the dominant social morphology of the present. Under the dominance of neoliberalism as a political system, it is not surprising that this flexibility was articulated within a neoliberal framework and that the dominant way of researching networks — network science — is embodying a neoliberal perspective.

 

Figure 3: Aram Bartholl, Obsolete Presence, 2017. Site-specific installation. 4C print on forex, wood, mirror. Creative Commons by-nc-sa. Project URL: https://arambartholl.com/obsolete-presence/.

 

I think one of the core issues for the next decade of thinking about, and doing, digital media is how to create institutions which are able to combine this flexibility of networking with the longer duration necessary for social and biological processes to unfold. This cannot be done from within digital networks alone, because the technical horizon of the digital is instant change. Ideally, everything that can be done should be done now. But society doesn’t work like this, and biology doesn’t work like this either. So, how can we think of networks that respect and embody different temporalities? Temporalities of digital media, but also those of social interaction and reproduction, as well as those of biological change, which unfolds over very long time scales. If we can do that, networks, and network science, will also cease to be defined by neoliberalism and its relentless privileging of instantaneous market transactions, driven by competition with a short-term focus.

8) You identify two trajectories formatting our current cultural, political, and social situation: post-democracy and commons. Given these opposing forces, which forms of (self-)governance do you see emerging?

The institutions of liberal, representative democracy have been in a crisis for a long time. This has to do with a more general crisis of representation, which is accelerated by the new forms of generating meaning and orientation described above as referentiality, communality, and algorithmicity — all of which are non-representative. Rather, they are performative. The slow loss of legitimacy of these liberal institutions — just think of the extremely low trust in politicians as a profession and in parliaments as a system — opens the space for alternatives. One is the reduction of the institutions of democracy to empty shells for the sake of efficiency, which I put under the term of “post-democracy”. The other is the expansion and renewal of democracy, which I put under the term of the “commons”.

What we are seeing at the moment is that central institutions of liberal democracy are crumbling at an astonishing speed. The race to replace them has kicked into high gear. The main events driving forward a revitalization of authoritarian politics are taking place on a national level: the vote by the United Kingdom to leave the European Union, the election of Donald Trump to the office of President of the United States of America, and the rise of neo-fascists to power in Austria and Italy. The list is depressingly long. Conversely, the main events driving the renewal of democracy are taking place on a metropolitan level, namely the emergence of a network of “rebel cities,” led by Barcelona and Madrid. There, community-based social movements established their candidates in the highest offices. These cities are now putting in place practical examples that other cities could emulate and adapt. One of the most important concepts put forward is that of “technological sovereignty”: to bring the technological infrastructure, and its developmental potential, back under the control of those who are using it and are affected by it, that is, the citizens of the metropolis.

9) Does this “technological sovereignty” also include new coalitions between human and non-human actors?

For practical political reasons, I wouldn’t speak about coalitions between human and non-human actors. That’s too theoretical. However, in order to redefine democratic participation, we need new institutions that realize the potential of digital infrastructures for flexible mass participation. It cannot be that democracies work only on the basis of nineteenth-century technology, such as pen-and-paper ballots or physical assemblies in parliaments. In order to establish and maintain democracies, there must be an experience of solidarity. The experience of alienated work, so central to the working-class movement, no longer binds people together. Nineteenth-century nationalism tried to create that through the “imagined community” of the nation state, and this is having a full nostalgic revival. The twenty-first-century municipal movements work on the level of the experience of a shared locality, the city that is used by all. That’s the opposite of a privatized gated community, where you do not need to care about what happens outside the gate because there is an armed guard at it. I don’t think it’s a coincidence that municipalism is strongest in cities like Barcelona that are proud of their civic traditions.

10) But what about non-urban areas? Where is the potential for new politics beyond the metropolis?

I think urban areas, where the majority of people are already living anyway, will lead this development, because the innovative capacity necessary to make this happen is located in the city. But since much of the municipal technology is open source, and the technical innovation is also a social and institutional one, transfers can easily go out of, but also into, urban areas. There is also the increasing importance of direct cooperation between actors in the global South, sometimes in cities, sometimes outside of them. Here we can see different patterns of development, where rural actors can use and build upon existing technological infrastructures to bypass (neo-)colonial patterns of exchange, both in terms of knowledge and, increasingly, in terms of trade. Here, the network structure, which from a user’s point of view does not rely on central nodes but enables direct communication between any nodes, offers a chance to connect and expand indigenous knowledge, not as a way of preserving traditions of the past, but as a way of developing alternative trajectories into the future.

This interview was conducted by Clemens Apprich, author of Technotopia — A Media Genealogy of Net Cultures (Rowman & Littlefield International, 2017) and co-author of Pattern Discrimination (University of Minnesota Press/meson press, 2018).

 

Notes

1. International Conference “Digital Cultures: Knowledge/Culture/Technology”, Leuphana University of Lüneburg, 19–22 September 2018, co-hosted by the Centre for Digital Cultures/Leuphana University of Lüneburg and the Institute for Culture and Society/Western Sydney University. You can follow the conference via live-stream: https://digitalculturesconference.org.

2. Stalder, Felix. 2018. The Digital Condition, translated by Valentine A. Pakis. Cambridge: Polity Press.

3. Citton, Yves. 2017. The Ecology of Attention, translated by Barnaby Norman. Cambridge: Polity Press.

4. Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.

5. See, for example, the account by Noah Kulwin in the New York Magazine: http://nymag.com/selectall/2018/04/an-apology-for-the-internet-from-the-people-who-built-it.html.

6. Stalder 2018, p. 107.

7. Chun, Wendy Hui Kyong. 2017. Updating to Remain the Same: Habitual New Media. Cambridge, Mass.: MIT Press.

 


Editorial history

Received 27 August 2018; accepted 28 August 2018.


Creative Commons License
This interview is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

The (Post-)Digital Condition — An Interview with Felix Stalder.
First Monday, Volume 23, Number 8 - 6 August 2018
http://firstmonday.org/ojs/index.php/fm/article/view/9409/7574
doi: http://dx.doi.org/10.5210/fm.v23i8.9409






© First Monday, 1995-2018. ISSN 1396-0466.