The typical approach to digital sustainability is related to digital business endurance, the persistence of digital records and objects, digital corporate communication and advertising, governance, and knowledge access. In this article we offer a different approach, using an ecological perspective that relates the concepts of the “informavore” and of platforms as “digital ecosystems” to human information behavior. We also carry out a domain analysis to follow the evolution of digital sustainability and its main discourses. We find that only one very recent article has introduced a definition of digital sustainability consistent with the ecological approach. Taking that article as a point of inflexion, we complement its definition of digital sustainability with some new elements. In conclusion, we propose several breakthroughs required to advance the understanding of digital sustainability, and use the informavore/information-behavior perspective as a basis for some measures toward it.
2. Information behavior and the informavore
3. Technology, platforms, and digital ecosystems
4. Adaptation perils in the digital ecosystem
5. Sustainability and the digital
6. Results and interpretation
Digital sustainability is a very recent concept that has been in development since 2002, as this study will show. The typical approach to this field has been related to digital business endurance, the persistence of digital records and objects, digital corporate communication and advertising, governance, and knowledge access. Unfortunately, such perspectives have not moved the field forward, but rather have replicated old paradigms about technology and sustainability.
In this article we offer a different approach, using an ecological perspective that relates the concepts of “informavore” and of platforms as “digital ecosystems”, all articulated through the field of human information behavior. We also produce a domain analysis to examine previous discourses and find points of agreement with this proposal. Recent studies have brought new ideas about the digital as governance for sustainability and knowledge/technology access, but only one recent article has introduced a definition of digital sustainability that was consistent with an ecological approach.
Taking that article as a point of inflexion, we have complemented its definition with new elements and proposed several advances in the understanding of digital sustainability. The ecological perspective on informavores and information behavior within the digital ecosystem can breathe new life into this debate; as an example, we use this perspective to propose some measures for digital sustainability.
2. Information behavior and the informavore
Humans are informational animals. Our species seeks, gathers, exchanges, consumes, and applies information to resolve our relationship with nature and ourselves. It is our basic adaptive toolkit. Humans are information foragers, permanently feeding upon a self-created information environment and transforming it into ideas and convictions that shape our collective actions and adaptation (Brockman, 2009; Pirolli, 2007; Pirolli and Card, 1999).
The information-foraging theory points out the similarities between human information seeking-and-consumption patterns and animal food-foraging strategies. The behavioral ecology of feeding is an optimization process: there must be a positive relationship between the energy provided by food and the energy invested in seeking, selecting, capturing, and consuming it. Information-foraging theory argues that the same applies to the acquisition of information.
To maximize fitness, the organism develops behaviors that ensure the most benefit at the least cost. Understanding such a pattern helps to predict the sustainability of the individual’s adaptive strategy and its efficiency. Traditionally, it was assumed that the most economical pattern would optimize the outcome. Informavores, as described by Pirolli, would follow this pattern.
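Pirolli and Card (1999) formalize this cost-benefit pattern with a rate-of-gain criterion borrowed from optimal foraging theory: the average rate of gain R equals the total information gained G divided by the time spent moving between patches (T_B) plus the time spent foraging within them (T_W). A minimal sketch in Python; the patch values and strategies below are hypothetical illustrations of ours, not Pirolli's data:

```python
# Minimal sketch of the rate-of-gain criterion from information-foraging
# theory, R = G / (T_B + T_W): information gained divided by the time spent
# moving between patches (T_B) plus the time foraging within them (T_W).
# All numbers below are hypothetical.

def rate_of_gain(gain, between_time, within_time):
    """Average gain per unit of total foraging time."""
    return gain / (between_time + within_time)

# Two hypothetical strategies for the same one-hour session.
skim_many = rate_of_gain(gain=12, between_time=30, within_time=30)  # many shallow patches
read_few = rate_of_gain(gain=10, between_time=5, within_time=55)    # few deep patches

# The optimizing forager adopts whichever strategy maximizes R.
best = max([("skim_many", skim_many), ("read_few", read_few)], key=lambda s: s[1])
print(best[0], round(best[1], 3))
```

Under these invented numbers, shallow skimming wins; the point of the model is that the forager's behavior tracks the ratio, not the absolute amount of information obtained.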
Pirolli’s observations match Zipf’s principle of least effort, a classical model from library and information science (LIS) about human information ecology. This principle states that information behavior responds to an energy-saving imperative: in general, we will always choose the easiest path in seeking information (Adamic and Huberman, 2002). Mooers (1986) extended this notion to computing systems, noting that users tend to use a system only to the extent that it does not feel too demanding.
Following the same path, the technology acceptance model (TAM) applied the concept of “perceived ease of use” as a condition for technology appropriation (Davis, 1989; Venkatesh, 2000). Zipf’s least-effort principle has been used successfully to explain the distribution of Internet data transmission, how we select Web sites, content, and sources, and bias in data collection (Adamic and Huberman, 2002; Baeza-Yates, 2018).
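The concentration that Zipf-like behavior predicts can be made concrete with a short simulation: if the probability of visiting the site at rank r falls off as 1/r, a handful of top-ranked sites absorbs most of the traffic. This is an illustrative sketch with synthetic numbers, not a reproduction of Adamic and Huberman's measurements:

```python
# Illustrative sketch of a Zipf-like distribution of site visits:
# P(rank r) is proportional to 1/r. The numbers are synthetic,
# not Adamic and Huberman's data.

def zipf_shares(n_sites):
    """Visit share of each rank under frequency proportional to 1/rank."""
    weights = [1 / r for r in range(1, n_sites + 1)]
    total = sum(weights)
    return [w / total for w in weights]

shares = zipf_shares(1000)
top10 = sum(shares[:10])
print(f"Top 10 of 1000 sites capture {top10:.0%} of visits")
```

In this toy model the ten most popular sites out of a thousand receive roughly two fifths of all visits, which is the kind of heavy concentration the least-effort principle describes.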
But does the most economical information behavior ensure sufficient quality of information and decisions to grant sustainability in the long term? To answer this, we need to complement the informavore model with other characteristics that have been researched by LIS within the field of information behavior.
This area of research began in the 1980s, when the term “information behavior” was coined by Thomas D. Wilson (2000), referring to actions related to seeking, selecting, and using sources and channels of information, in an active or passive way. Asking why we seek and use information refers us to information needs. Two main drivers have been proposed: cognitive needs and gratification.
The cognitive approach states that when the subject faces a problem that cannot be solved using previous cognition, this triggers a loop of seeking, retrieving, and testing, until a suitable solution is reached. This driver has been named “anomalous states of knowledge” (Belkin, 1980), “uncertainty” (Ford, 2004; Kuhlthau, 1993), or “cognitive dissonance” (Festinger, 1957; Harmon-Jones and Harmon-Jones, 2007; Harmon-Jones and Mills, 2019).
The cognitive approach evolved into a multifaceted perspective that proposes sense-making as the central incentive for information consumption. There is a constant necessity for situational awareness, that is, building an understandable order in regard to space, time, society and culture that can be projected into the future (Dervin, 2015, 1998; Savolainen, 1995). However, other gratifications are also implied.
Gratification models imply that people (henceforth referred to as “informavores”), mainly those less qualified and skilled, would be less willing to explore multiple information sources, and would confine themselves to those which are accessible, familiar, and offer immediate answers to their concerns. The degree of familiarity and acceptability of the information would be judged by the family, reference groups, microculture, and tribe. Being an active or passive informavore is defined by the social group, creating what Chatman called informational “small worlds” (Chatman, 1999, 1991; Dawson and Chatman, 2001).
An additional contribution of LIS to defining human information behavior has been to explore the role of emotion in information seeking and consumption. Early models of information behavior focused on the cognitive, following a rationalist bias that neglected emotion (Nahl and Bilal, 2007). Emotional and affective variables shaping information behavior include preference, attitude, motivation, expectancies, uncertainty, self-efficacy, optimism, relevance, satisfaction, acceptance, loyalty, and anxiety (Julien, et al., 2011, 2005; Nahl, 2004).
Taken together, Pirolli’s model and LIS research define information behavior as an adaptive toolkit used by the human informavore, with the following characteristics:
- Information seeking and processing as a central adaptation mechanism; we are informational animals, discursive beings that live within authoritative narratives that address “reality”.
- Information behavior follows an optimization tendency that maximizes fitness by avoiding extensive information-seeking or highly complex narratives. As stated by Herbert Simon’s (1990) “bounded rationality” experimental work, human reasoning is limited by competence, interest, available information, and time. In information and decision terms, we live with “just enough” rather than what is “very good.”
- Cognitive dissonance is a main driver of information behavior; sense-making/situational awareness is a constant and pervasive need, so urgent that dissolving dissonance may bond our judgement to partial conclusions. Kahneman and Tversky experimentally found that risk aversion and uncertainty imposed limitations on information interpretation and decision-making in the form of anchoring (bias), error, and contradiction (Kahneman and Tversky, 1979; Tversky and Kahneman, 1974).
- Prevalence of gratification in the form of accessible, familiar and immediate answers, defining what kind of information will be ingested.
- Reference groups and their microcultures will define and filter what information is familiar and acceptable, and what level of information-seeking activity and consumption is tolerable, and impose these perceptions on the individual, reinforcing group thinking.
- Emotion will strongly drive information behavior bias in seeking and consumption, based on preferences, attitudes, motivation, expectancies, uncertainty, self-efficacy, optimism, relevance, satisfaction, acceptance, loyalty, and anxiety. Altered emotional states produced by high uncertainty and social pressure regulate the informavore’s seeking and ingestion of information.
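Simon's "just enough" tendency is commonly operationalized as satisficing: rather than ranking every option, the bounded forager accepts the first source that clears an aspiration threshold. The sketch below is a hedged illustration; the source names, quality scores, and threshold are hypothetical:

```python
# Satisficing, after Herbert Simon's bounded rationality: a satisficer
# stops at the first option that clears an aspiration threshold, paying
# far less search cost than an exhaustive optimizer. All scores are
# hypothetical.

def satisfice(options, threshold):
    """Return (choice, number of options examined) for a satisficing forager."""
    for examined, (name, quality) in enumerate(options, start=1):
        if quality >= threshold:
            return name, examined
    return None, len(options)

sources = [("tabloid", 0.3), ("forum post", 0.5), ("news site", 0.7),
           ("review article", 0.9), ("meta-analysis", 0.95)]

choice, cost = satisfice(sources, threshold=0.6)
best = max(sources, key=lambda s: s[1])  # what exhaustive search would pick
print(choice, cost, best[0])
```

The satisficer settles for an adequate source after examining only part of the list, while the highest-quality source would be found only by paying the full search cost, which is precisely the trade-off "just enough" names.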
This approach to human information behavior enables us to understand the informavore’s limitations, sidestepping the almost eschatological debate about human “rationality” as something that escapes natural laws and animal limitations. Our information behavior tends to optimize and stay within a comfort zone rather than ensure detail and reliability. As informavores act in herds, reference groups filter and guide what we seek, gather, and consume; emotion binds the structure and affects its direction.
With the evolution to the information society and big data, the information ecosystem has turned into an environment that requires permanent and competent decisions about what information to consume, amid massive volumes of content that change by the second. Understanding how the digital informavore behaves has become critical, as the virtual ecosystem includes predators, and mixes nutritious with poisonous forage.
3. Technology, platforms, and digital ecosystems
If humans have been successful in colonizing and reproducing themselves in almost all habitats on Earth, what might be wrong with this information behavior optimization tendency in adaptive terms?
A key feature of human adaptability is technology; we are technological beings that create complex tools as no other animal does (Aunger, 2010). Technology — the application of knowledge for practical purposes — is a permanent human activity, so that technological devices are symbiotic with us. During the second half of the twentieth century, the pioneers of computing science and industry named this phenomenon the “man-computer symbiosis” (Licklider, 1960). More recent expressions include “man-machine symbiosis” and “human-machine symbiosis.”
Our adaptation to ecosystems has involved a sequence of evolving tools that mediate our activities, making technology a “natural” extension of our body and nervous system. As entire societies appropriate these devices, they change our collective relationship with the environment (Logan, 2010; McLuhan, 1994, 1977; McLuhan and Fiore, 1996; Strate, 2008).
This symbiosis with technology, or “mediation” as a social and contextual articulator, implies that the skills, literacies, and competences involved in mastering it determine the adaptive success of an individual or group within a society or a natural environment.
The permanent application of technology in what we do and what we are has blurred the boundaries between what is synthetic and what is human (Serres, 2003). Foucault and Deleuze would say that technologies, through “dispositives” (devices), discipline us about what we can perceive and express; our collective shared thinking and personal subjectivity; the rhythm of our lives and how we integrate them into each other (Deleuze, 1992; Foucault, 2000). We never know to what extent we are the machine or the machine is us.
Examples of this include the traffic lights in a city: how many people’s lives are synchronized by this technology? E-calendars and task apps control our routines, giving power to others to schedule our time. What is the extent of our memory, if we include all the hard and cloud drives in our life? How do we remember? Is it by following algorithms that recover memories such as photographs or music?
Technological transformation reframes our way of symbolizing nature and transforms our representation of the universe and ourselves. Our collective memories and what makes sense to us are changed. Interests and values are implanted within such symbolic representations, so that technologies also embody judgements about what should be (Feenberg, 2009).
How do we remember our children? Is it in the way that Facebook organizes our memories and presents them to us each time we enter the platform? How do we dress when going out? As is consistent with the AccuWeather report of the day? How do we judge people? Maybe in the way that Tinder has taught us to do? Multiply these examples by entire generations and you will see the extent to which technology reframes our representation of the universe and ourselves.
As performance and profit have been central interests in the development of knowledge and technology in our recent history, performativity — using language as a form of social action that controls change and defines identity — has legitimized and inserted these ideas deeply in our current technological representation of the world (Ellul, 1978, 1964; Lea and Nicoll, 2002; Lyotard, 2004; MacKenzie and Wajcman, 1985; Muniesa, 2014).
As information technologies (IT) extended our senses and fed our brains, human informavores became symbiotic with them. When algorithms took control of words and the ideas they express, technological devices became discursive machines (Berrío-Zapata, et al., 2015). Mechanical gears were replaced by electronic algorithms, which in turn gave way to operating systems. When the network society was born, operating systems were translated into global platforms. In the information society of the twenty-first century, digital platforms such as Google, Apple, Amazon, and Facebook carried out a sociotechnical modulation of software and its users, with a wide range of socio-cultural effects at all levels (IT for Change, 2017).
Digital platforms have enhanced data homogenization and interoperability across multiple settings, e.g., APIs, artificial intelligence, big data, and cloud computing. They have articulated software over such a massive audience at the global level that these ensembles of innumerable technical artifacts have turned into ecosystems (Bosch, 2009; de Reuver, et al., 2018; Manikas and Hansen, 2013).
This global digital ecosystem is the new synthetic habitat of the human informavore. Informavores, in turn, became man-machine symbioses, foraging for information in tailor-made niches with fresh, unlimited, and attractive content. Platforms optimized information rumination by simplifying seeking processes and by offering efficiency, integration, interactivity, and rapidity (Rodrigues, 2017; Santaella and Lemos, 2010).
But the digital ecosystems of platforms also represent a governance that alters and aligns the political, economic and cultural scenario of the informavore community. This governance facilitates or hinders certain practices and distributions of power at global level (Kenney and Zysman, 2016; Tiwana, 2014). The discipline of algorithms reconfigures the ecosystem, its actors, and their forms of operation (Cohen, 2017; Tiwana, et al., 2010).
Niches are specific ensembles of physical and environmental conditions serving a set of beings, including their interactions for survival and thriving (Pocheville, 2015). In information terms, a niche serves differentiated audiences, content creators, and functionalities for a distinct public. It is a space for informational gratification that sustains the particular and differentiated narrative of specific communities (Dimmick, 2014; Yousuf, 2016).
Algorithms produce niches in digital ecosystems, where the informavore’s optimization tendency is pushed to its maximum. Platforms are designed to please their niche hosts’ desires and capture their attention, using gratification techniques that affect their emotions and cognition. Algorithmic discipline feeds dependency upon niches, making such fidelity a strategic asset controlled by the owners of the platform (Adner and Kapoor, 2010; Gawer, 2014; Gawer and Cusumano, 2014; Sakuda, 2016).
Like ants closed inside their formicary, informavores become trapped in the comfort-zone provided by their niches, so that the synthetic becomes more real than the real (Serres, 2003). Digital ecosystems translate into socioeconomic, political and cultural control; power distribution does not serve the promotion of justice or social inclusion, but profit. Digital ecosystems become social disciplining devices for the benefit of dominant actors (Deleuze, 2006; Fuchs, 2009; Gurumurthy and Bharthur, 2018).
4. Adaptation perils in the digital ecosystem
We have defined and built upon the concepts of informavore and information behavior. We have explained the symbiosis between humans and IT and described how digital platforms have produced a new information ecosystem, whose design and governance benefits dominant actors and focuses on captivating audiences for profit. Digital niches have turned into comfort zones, keeping the retrieval and consumption of information easy, fast, and complacent. In terms of optimization we live in heaven, as information foraging has never been so easy. However, heaven often comes at a cost:
- Hyperconnectivity and big-data floods of information have produced information anxiety, hampering effective rumination due to info-intoxication and stronger cognitive dissonance (Bawden and Robinson, 2009; Blundell and Lambert, 2014; Katopol, 2012, 2010; Ojo, 2016; Wurman, 1990).
- As algorithms have fractured audiences into niches to preserve their comfort zones, they have also created echo chambers, in which complacent narratives reverberate (Barberá, et al., 2015; Oxford University Press, 2020). Informavores have become incapable of foraging diversified information or feeding outside their niches.
- Permanent, repetitive, and intrusive messages hammering out all kinds of discourses have become normal, as they have been automated by robots and artificial intelligence (Rodgers and Thorson, 2017; Truong and Simmons, 2010). Discourse marketing has become a global persuasion industry that forces ideas into entire communities, using repetition, pervasiveness, and psychological targeting.
- Massive monitoring of online behavior and digital footprint drifts between law enforcement and totalitarian control (Kalathil and Boas, 2001; Morozov, 2011; Zuboff, 2013). In either case, such monitoring contributes to the surgical efficiency of psychological targeting and discourse marketing.
- The digital ecosystem has turned into a habitat where audiences (informavore herds) are hunted and, using algorithmic filter bubbles, confined in comfortable niches, where they are gratified by whatever fits them best (Spohr, 2017), to serve the profit interest of the platform.
These characteristics drain human adaptive capacity as a whole, as they prevent the development of collective intelligence (Malone and Bernstein, 2015) in using the digital. Collectively intelligent groups who gather and share knowledge, data, and skills to solve urgent problems are a minority, struggling to compensate for the poor education and low informational skills of the majority.
Most informavores are victims of their own information behavior flaws, amplified by the complexity of the ecosystem and the persistent siege of algorithms that seek to entice them into small worlds, reinforcing simplistic and familiar discourses so that they will feel comfortable and consume. The clearest example of this problem is the digital potential for becoming informed, cooperating, and coordinating that was wasted in the erratic information behavior of citizens during the COVID-19 pandemic.
In the face of a global state of uncertainty about an unknown virus, entire populations opted to retrieve and consume fake news and conspiracy theories that supported negation. Emotional distress made people believe that 5G telecom towers spread the virus (Sorkin, 2020), that it could be cured by drinking bleach (Smith-Schoenwalder, 2020), or that vaccines would inject microchips into people to control them (Lee, 2021). Misinformation about COVID-19 cost at least 800 lives and injured an estimated 5,800 people around the world (Coleman, 2020).
False cures such as hydroxychloroquine gained sufficient popularity and legitimacy to be purchased in massive quantities by governments. What would have happened if COVID-19 had been even more transmissible and lethal, like smallpox or Ebola? Would human informavores have been able to inform themselves in a sufficiently fast and efficient way as to be able to act collectively in a knowledgeable, disciplined, and coordinated manner to control an outbreak?
Reliable, detailed, and timely information feeds quality decisions and makes possible the articulation of efforts. In a complex ecosystem like the digital one, this is not easily achieved. It demands effort, time, persistence, and various skills, including reading comprehension, technological literacy, and competence with information and data. Such skills are not developed by most informavores, except when education and training reduce the influence of natural biases and limitations. That is why, if we wish to have high-quality information foraging, optimization is not a good policy.
Nonetheless, what is wrong with optimization of information behavior if humans have been successful in almost all other habitats on Earth? The problem is that we have never before had to deal with such a huge and complex information ecosystem. This global synthetic “reality” that defines our relationship with the Earth’s natural habitat has affected the behavior of the majority of humans. The COVID-19 crisis provided a sample of its dark side.
We need to ask ourselves: what does it mean to make humans and their symbiosis with information technology sustainable in time, so that the digital global ecosystem remains adaptive to the natural ecosystem, not turning against human sustainability itself?
To address this question, which is central to this work, we reviewed the concept of sustainability in relation to the digital, to find out what alternatives have been discussed within the scientific community in regard to the meaning of digital sustainability.
5. Sustainability and the digital
To produce this survey, we used domain analysis (Hjørland, 2017, 2002; Hjørland and Albrechtsen, 1995), as this method studies knowledge domains as thought/discursive communities that embody the social advance of ideas and practices. Discourse can be defined as systems of thought — ideas, attitudes, beliefs, and practices — that become authoritative, dominant, and natural, and in so doing, they construct the human subject and its world (Lessa, 2006). Discourse creates accepted knowledge, social practice, objectivity, subjectivity, and power relations.
The empirical identification of discourses was carried out by applying content and bibliometric analysis to the literature review. The collection of literature was produced by searches for the terms “digital” AND “sustainability” in Web of Science (WOS), Scopus, and Google Scholar.
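When results from several databases are pooled in this way, overlapping records must be deduplicated (for example, by DOI) before any bibliometric counting. A minimal sketch of that step; the identifiers below are hypothetical placeholders, not our actual corpus:

```python
# Hedged sketch of pooling search results from several databases and
# deduplicating them by a shared identifier (e.g., DOI) before counting.
# The DOIs below are hypothetical placeholders, not the study's corpus.

wos = {"10.1000/a1", "10.1000/a2", "10.1000/a3"}
scopus = {"10.1000/a2", "10.1000/a4"}
scholar = {"10.1000/a3", "10.1000/a5", "10.1000/a6"}

indexed = wos | scopus        # unique records found in WOS or Scopus
pooled = indexed | scholar    # plus the extra records from Google Scholar
print(len(indexed), len(pooled))
```

Representing each database's hits as a set makes the union operator do the deduplication, so the pooled count grows only by records not already seen elsewhere.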
As our definition of sustainability, we took the statement made by the UN’s Brundtland Report in 1987: “development that meets the needs of the present without compromising the ability of future generations to meet their own needs” (Brundtland, 1987). Sustainability can thus be understood as the behavioral capacity of a subject or community to safeguard the future stock of resources by using them knowledgeably and rationally in the present.
For the meaning of “digital” we took the general understanding of the term, referring to any electronic binary machines and networks that use algorithms to aid, automate, and connect people’s information seeking and processing: in short, computers connected via the Internet.
WOS and Scopus together reported only 31 publications up to the present; with the addition of Google Scholar, the count rose to 52. This is a field that began in 2002, producing an average of two articles per year, until 2020, when it leapt to seven. There is a low concentration of authorship, with no visible authorities so far. The average number of citations per document is eight, indicating that the domain still does not have significant visibility. Almost all the publications are from European institutions.
The first work referring to “digital sustainability” suggested that the advance of the digital economy in Europe would release pressure on the environment (Alakeson and Wilsdon, 2002). In 2007, Bradley defined digital sustainability in terms of record preservation (Bradley, 2007). After this, the different discourses about digital sustainability can be grouped into the following categories:
1. Digital business sustainability and sustainable digital entrepreneurship.
2. Digital preservation of records and historical objects.
3. Security measures to protect digital documents.
4. Sustainability of digital artifacts, platforms, and ecosystems.
5. Sustainable digital communication and advertising.
6. Digital governance as a tool to promote sustainability.
7. Sustainable access to knowledge and knowledge creation for sustainability.
Only one recent work advances what we considered a more ample discussion. Pan and Zhang (2020) describe how COVID-19 has reframed our ideas about sustainability in digital terms, listing the following topics:
- Information systems research has neglected to balance work on the positive impacts of IT with work on its adverse outcomes. For example, surveillance technologies, so useful for tracking patients infected with COVID-19, have been little researched when it comes to protecting the privacy of individuals by controlling those carrying out the surveillance.
- Infodemic bursts — information flooding without quality control, creating harsh information-seeking conditions — created by global threats that produce high cognitive dissonance which is fertile soil for fake news and conspiracy theories.
- Complementary and coordinated systems that can “orchestrate data ecosystems” in efficient and reliable ways. Infodemics are facilitated by multiple uncoordinated and unmonitored sources, elevating the complexity of information seeking.
- Preparing citizens to face “new normals”, complex information ecosystems and infodemic chaos, so that they can seek, evaluate and share in an efficient and skilled way. How to detect their weak spots and reinforce them, and how to develop resilient information behavior for times of crisis.
- Teach and help people to develop their digital workplace so they can secure their jobs and income without risking their lives. What do we lack in terms of infrastructure and access? How do we reinforce people’s creativity, their spirits, and their capacity to act and protect themselves online?
This list results in a definition of digital sustainability: “the convergence of digital and sustainability imperatives that involves a trans-disciplinary approach of deploying digital technologies in tackling sustainability issues” (Pan and Zhang, 2020).
6. Results and interpretation
Sustainability is said to have three pillars: the economic, the ecological and the social (Brundtland, 1987; Bunnell and Sonntag, 2000; Wellner, 2019).
Discourses 1 to 5 focus on the economic: the conservation of technology itself, or its economic exploitation. These discourses assume the ecological benefit of the digital and the neutrality of the ecosystems that technology provides.
Discourse 6 considers the digital as a governance tool to promote sustainable practices through design, e.g., the creation of smart factories and cities. From this perspective, new generations of control systems will solve the problem, as a control mechanism will disclose non-sustainable practices (Gebhardt, 2017; Roblek, et al., 2016). This view neglects to consider that technology creators and their societies must first change; otherwise, old paradigms will keep infusing new technologies with old values.
Discourse 7 addresses sustainable access to knowledge and knowledge creation for sustainability. It accepts that the harm to the environment of digital technologies has been underestimated, although their impact in this area remains unclear (Lee, 2007). It argues that asymmetries of access to knowledge and technology in a digital-based economy are a central problem that must be solved if digital sustainability is to be addressed seriously (Aurongzeb, 2020). Knowledge acts as a link to ecological protection by clarifying the consequences of non-sustainability, on the assumption that IT users and creators are rational.
All these discourses treat digital sustainability as something separate from the human capacity to master IT, and do not discuss how such competence — or lack of it — may undermine our adaptive success, not only in the digital ecosystem but also in the natural one.
This review shows that we are only starting to attain a clear understanding of the implications of digital sustainability. There seems to be an excess of confidence in the capacity of IT to solve the problem, but the way in which technologies are conceived and framed inhibits their potential for environmental sustainability (Alvarez-Pereira, 2019).
Our understanding of sustainability is distorted by our feeling that humans are not subject to natural laws. Our analysis tends to be self-referential, presuming that the world is organized according to our needs, so that we confuse our social and synthetic ecosystems with those of the natural world. We are more aware than we were of our footprints in the biosphere, but not yet aware enough (Bunnell and Sonntag, 2000).
Digital technologies are still regarded as means, not as central elements of our behavior and ecosystems. These technologies are playing an ever larger role in our lives and environment (Wellner, 2019), and are affecting all three pillars of sustainability, as they affect the nature of the human informavore.
Population growth, increasing concentration of technological power and unsustainable economic practices have led us to a breaking point: if we do not release the pressure on the planet’s ecosystem, there will be no future. Defining digital sustainability has become crucial for that endeavor, but this field is recent and requires some breakthroughs:
- Technology is not only about technical and economic performance; it extends to the social, the cultural and to human nature itself.
- There is no neutrality in technology; it is a human product.
- There are no clear boundaries in the bond between technology and humans; technological devices are extensions that change how we perceive, process, represent and act at the individual and social level.
- As information behavior is a central toolkit for human adaptation, IT is symbiotic and determines how we inform ourselves and define our actions for survival, and therefore our sustainability.
- We are just another species on the planet that needs to solve the puzzle of its survival. Understanding our inherent informational limitations will prevent our technological symbiosis from turning against us.
- The value of an ecological perspective as presented here is its return to basics, avoiding ideological or anthropocentric debates. Conflict, domination, restraints, and flaws all are natural elements of adaptive attempts to find sustainability.
We presented the informavore, ruminating in niches scattered across the digital ecosystem, to draw attention to the flaws in its information behavior and to its ecological situation:
- Without careful assessment of the consequences, we will tend to choose the easiest path to information.
- The seeking, retrieval, and consumption of information (information behavior) is a permanent activity driven by sense-making needs, social acceptability and emotional gratification.
- Information behavior is biased, bonding our representations of the universe and ourselves to optimized non-rational judgements, in line with the law of least effort.
- The creation of global digital ecosystems has been driven by interests, mainly profit; this requires audiences to be captivated and kept loyal. To ensure this, informavore herds are fed in simplistic, complacent, and emotionally binding niches.
- These tailor-made comfort zones are filter bubbles, nourishing the inability to feed upon complex and heterogeneous sources, to accept alternative discourses, and to validate content.
The algorithmic reality of these niches has made them “more real than the real”, stimulating consumerism, breeding banality, and fostering an increasing tolerance for polarization, denialism, and conspiracy theories. Niche discourse-marketing based on psychological profiling has become a growing industry with dubious ethical limits, at the service of anyone who can pay for it.
As a result, human informavores have been monitored, manipulated, and rendered unable to exploit the digital as a tool for collective intelligence and adaptiveness. Comfort and dependence benefit the owners of the platforms and other dominant groups, but not the users.
The COVID-19 pandemic was an acid test of competent information behavior on a global scale. On the one hand, small communities of scientists proved the strength of digital collective intelligence in developing and sharing knowledge to combat the virus. On the other hand, the average global citizen was unable to filter the discourses of certain leaders and charlatans, at the cost of many lives. The ecological perspective relating the user (the informavore) to the technology (its symbiosis) and the ecosystem (global digital platforms) made this outcome predictable, and it calls for countermeasures to minimize such weaknesses:
- To grant access to digital technology without access to knowledge is to feed the digital ecosystem with prey and to promote maladaptation. Access to knowledge is incomplete without training in information and data competence — the ability to identify needs and to seek, retrieve, evaluate, organize, store, interpret, discuss, and communicate information (Institute for Information Competence and Information Infrastructure [IICIIS], 2017). Permanent training is the only way to counteract the law of least effort.
- The relationship between the development of the digital ecosystem and consumerism, psychological profiling, planned obsolescence, undisclosed monitoring, radical speech, bullying, and any other attempt to place the individual interest over the general interest, must be studied, characterized, and made subject to international legal control.
- Infodemic control must be strengthened, and public and scientific sites reconfigured as friendly and understandable sources of information, in coordination with other high-quality sources.
- Technology capable of identifying, tracking, and blocking digital sources dedicated to fake news, cyberbullying, and hate speech must be researched and developed. Fact-checking and content analysis should provide an empirical basis for criminalizing these activities internationally and holding those who abuse cyberspace accountable.
- Since “new normals” will become more frequent, and given the natural fragility of the informavore, citizen digital training should become a central activity. It should teach how to handle cognitive dissonance and altered emotional states when informing ourselves, and how to prepare a home office or safe workplace. At the same time, universal access should be provided to the infrastructure and services that keep the economy running while reducing risk.
Figure 1: Digital sustainability from a human ecology perspective, as elaborated by the authors.
As sustainability implies three pillars — economic, social, and ecological — digital sustainability contains at least four areas of development:
- Sustainability of digital machines and infrastructure (battery disposal, electronic waste, energy consumption).
- The digital as a source of information and education about sustainability.
- The digital as a governance tool to monitor and control unsustainable behavior in organizations and individuals.
- The digital-human symbiosis: researching the natural tendencies that impair information behavior and must be overcome to achieve information competence, through skills development, literacy access, and competence formation.
Digital sustainability must study the informavore’s information behavior as part of an ensemble of the natural ecosystem and the human synthetic ecosystem, integrating the socio-cultural, the economic, and the technological.
From the perspective of the informavore’s symbiosis with digital technology, digital sustainability can be defined as the development of the competences that turn us into actors knowledgeable about the complex challenges and actions required for general sustainability, giving us the capacity to act as a single global intelligence that coordinates and complements efforts and talents to respond to anything that might compromise the planet’s sustainability.
About the authors
Cristian Berrío-Zapata is a Professor in the Graduate Program in Information Science and the Faculty of Archival Science (Institute of Applied Social Sciences) at the Federal University of Pará (UFPA), Brazil. He is currently a postdoctoral researcher in the Department of Library and Documentation at Carlos III University (UC3M), Spain, with a project on digital sustainability in the Amazon.
E-mail: berriozapata [at] ufpa [dot] br
Ester Ferreira da Silva is an M.A. graduate of the Graduate Program in Information Science at the Federal University of Pará (UFPA), Brazil.
E-mail: ester [dot] f [dot] da [dot] silva [at] gmail [dot] com
Marise Teles Condurú is a Professor in the Graduate Program in Information Science and the Faculty of Library Science (Institute of Applied Social Sciences), and in the Graduate Program in Natural Resources Management and Local Development in the Amazon (Environmental Research Center), both at the Federal University of Pará (UFPA), Brazil.
E-mail: marise [at] ufpa [dot] br
This article was supported by the project Rede Transamazônica de Cooperação em Informação e Conhecimento para o Desenvolvimento Sustentável (RTCIC-SD), sponsored by CAPES PROCAD Amazônia, and by the Graduate Program in Information Science at UFPA.
Lada A. Adamic and Bernardo A. Huberman, 2002. “Zipf’s law and the Internet,” Glottometrics, volume 3, number 1, pp. 143–150, and at https://www.hpl.hp.com/research/idl/papers/ranking/adamicglottometrics.pdf, accessed 11 October 2021.
Ron Adner and Rahul Kapoor, 2010. “Value creation in innovation ecosystems: How the structure of technological interdependence affects firm performance in new technology generations,” Strategic Management Journal, volume 31, number 3, pp. 306–333.
doi: https://doi.org/10.1002/smj.821, accessed 11 October 2021.
Vidhya Alakeson and James Wilsdon, 2002. “Digital sustainability in Europe,” Journal of Industrial Ecology, volume 6, number 2, pp. 10–12.
doi: https://doi.org/10.1162/108819802763471744, accessed 11 October 2021.
Carlos Alvarez-Pereira, 2019. “Anticipations of digital sustainability: Self-delusions, disappointments and expectations,” In: Roberto Poli and Marco Valerio (editors). Anticipation, agency and complexity. Cham, Switzerland: Springer International, pp. 99–120.
doi: https://doi.org/10.1007/978-3-030-03623-2_7, accessed 11 October 2021.
Robert Aunger, 2010. “What’s special about human technology?” Cambridge Journal of Economics, volume 34, number 1, pp. 115–123.
doi: https://doi.org/10.1093/cje/bep018, accessed 11 October 2021.
Deeder Aurongzeb, 2020. “Digital sustainability: A high dimensional ML problem,” Bulletin of the American Physical Society, volume 65, number 1, at https://meetings.aps.org/Meeting/MAR20/Session/M71.273, accessed 11 October 2021.
Ricardo Baeza-Yates, 2018. “Bias on the Web,” Communications of the ACM, volume 61, number 6, pp. 54–61.
doi: https://doi.org/10.1145/3209581, accessed 11 October 2021.
David Bawden and Lyn Robinson, 2009. “The dark side of information: Overload, anxiety and other paradoxes and pathologies,” Journal of Information Science, volume 35, number 2, pp. 180–191.
doi: https://doi.org/10.1177/0165551508095781, accessed 11 October 2021.
Nicholas J. Belkin, 1980. “Anomalous states of knowledge as a basis for information retrieval,” Canadian Journal of Information Science, volume 5, number 1, pp. 133–143.
Cristian Berrío-Zapata, Fábio Mosso Moreira, and Ricardo César Gonçalves Sant’Ana, 2015. “Barthes’ rhetorical machine: Mythology and connotation in the digital networks,” Bakhtiniana, volume 10, number 2, pp. 147–170.
doi: http://dx.doi.org/10.1590/2176-457321910, accessed 11 October 2021.
Shelley Blundell and Frank Lambert, 2014. “Information anxiety from the undergraduate student perspective: A pilot study of second-semester freshmen,” Journal of Education for Library and Information Science, volume 55, number 1, pp. 261–273.
Jan Bosch, 2009. “From software product lines to software ecosystems,” SPLC ’09: Proceedings of the 13th International Software Product Line Conference, pp. 111–119.
Kevin Bradley, 2007. “Defining digital sustainability,” Library Trends, volume 56, number 1, pp. 148–163.
doi: http://dx.doi.org/10.1353/lib.2007.0044, accessed 11 October 2021.
John Brockman, 2009. “The age of the informavore,” Edge (25 October), at https://www.edge.org/conversation/frank_schirrmacher-the-age-of-the-informavore, accessed 11 October 2021.
Gro Harlem Brundtland, 1987. Report of the World Commission on Environment and Development: Our common future. New York: United Nations, and at https://sustainabledevelopment.un.org/content/documents/5987our-common-future.pdf, accessed 11 October 2021.
Pille Bunnell and Nicholas Sonntag, 2000. “Becoming a sustainable species,” Reflections: The SoL Journal on Knowledge, Learning, and Change, volume 1, number 4, pp. 66–72.
Elfreda A. Chatman, 1999. “A theory of life in the round,” Journal of the American Society for Information Science, volume 50, number 3, pp. 207–217.
Elfreda A. Chatman, 1991. “Life in a small world: Applicability of gratification theory to information-seeking behavior,” Journal of the American Society for Information Science, volume 42, number 6, pp. 438–449.
Julie E. Cohen, 2017. “Law for the platform economy,” University of California Davis Law Review, volume 51, pp. 133–204, and at https://lawreview.law.ucdavis.edu/issues/51/1/Symposium/51-1_Cohen.pdf, accessed 11 October 2021.
Alistair Coleman, 2020. “‘Hundreds dead’ because of COVID-19 misinformation,” BBC News (12 August), at https://www.bbc.com/news/world-53755067, accessed 11 October 2021.
E. Murell Dawson and Elfreda A. Chatman, 2001. “Reference group theory with implications for information studies: A theoretical essay,” Information Research, volume 6, number 3, at http://www.informationr.net/ir/6-3/paper105.html, accessed 11 October 2021.
Mark de Reuver, Carsten Sørensen, and Rahul C. Basole, 2018. “The digital platform: A research agenda,” Journal of Information Technology, volume 33, number 2, pp. 124–135.
doi: https://doi.org/10.1057/s41265-016-0033-3, accessed 11 October 2021.
Gilles Deleuze, 2006. “Post-scriptum sobre las sociedades de control,” Polis. Revista Latinoamericana, at https://journals.openedition.org/polis/5509, accessed 11 October 2021.
Gilles Deleuze, 1992. “What is a dispositif,” In: Timothy J. Armstrong. Michel Foucault: Philosopher: Essays translated from the French and German. New York: Routledge, pp. 159–168.
Brenda Dervin, 2015. “Dervin’s sense-making theory,” In: Mohammed Nasser Al-Suqri and Ali Saif Al-Aufi (editors). Information seeking behavior and technology adoption: Theories and trends. Hershey, Pa.: IGI Global, pp. 59–180.
doi: https://doi.org/10.4018/978-1-4666-8156-9.ch004, accessed 11 October 2021.
Brenda Dervin, 1998. “Sense-making theory and practice: An overview of user interests in knowledge seeking and use,” Journal of Knowledge Management, volume 2, number 2, pp. 36–46.
doi: https://doi.org/10.1108/13673279810249369, accessed 11 October 2021.
Jacques Ellul, 1978. “Symbolic function, technology and society,” Journal of Social and Biological Structures, volume 1, number 3, pp. 207–218.
doi: https://doi.org/10.1016/0140-1750(78)90023-4, accessed 11 October 2021.
Jacques Ellul, 1964. The technological society. Translated by John Wilkinson, with an introduction by Robert K. Merton. New York: Vintage Books.
Andrew Feenberg, 2009. “Critical theory of technology: An overview,” In: Gloria J. Leckie and John E. Buschman (editors). Information technology in librarianship: New critical approaches. London: Libraries Unlimited, pp. 146–153.
Leon Festinger, 1957. A theory of cognitive dissonance. Stanford, Calif.: Stanford University Press.
Nigel Ford, 2004. “Modeling cognitive processes in information seeking: From Popper to Pask,” Journal of the American Society for Information Science and Technology, volume 55, number 9, pp. 769–782.
doi: https://doi.org/10.1002/asi.20021, accessed 11 October 2021.
Michel Foucault, 2000. Vigilar y castigar: Nacimiento de la prisión. Traducción de Aurelio Garzón del Camino. 12a edición española. Madrid: Siglo XXI de España.
Cristian Fuchs, 2009. “Information and communication technologies and society: A contribution to the critique of the political economy of the Internet,” European Journal of Communication, volume 24, number 1, pp. 69–87.
doi: https://doi.org/10.1177/0267323108098947, accessed 11 October 2021.
Annabelle Gawer, 2014. “Bridging differing perspectives on technological platforms: Toward an integrative framework,” Research Policy, volume 43, number 7, pp. 1,239–1,249.
doi: https://doi.org/10.1016/j.respol.2014.03.006, accessed 11 October 2021.
Annabelle Gawer and Michael A. Cusumano, 2014. “Industry platforms and ecosystem innovation,” Journal of Product Innovation Management, volume 31, number 3, pp. 417–433.
doi: https://doi.org/10.1111/jpim.12105, accessed 11 October 2021.
Christiane Gebhardt, 2017. “Humans in the loop: The clash of concepts in digital sustainability in smart cities,” In: Thomas Osburg and Christiane Lohrmann (editors). Sustainability in a digital world: New opportunities through new technologies. Cham, Switzerland: Springer, pp. 85–93.
doi: https://doi.org/10.1007/978-3-319-54603-2_7, accessed 11 October 2021.
Anita Gurumurthy and Deepti Bharthur, 2018. “Policy frameworks for digital platforms: Moving from openness to inclusion: Research framework,” at https://itforchange.net/platformpolitics/wp-content/uploads/2018/04/Platform_Policies_Research_Framework20181.pdf, accessed 11 October 2021.
Eddie Harmon-Jones and Judson Mills, 2019. “An introduction to cognitive dissonance theory and an overview of current perspectives on the theory,” In: Eddie Harmon-Jones (editor). Cognitive dissonance: Reexamining a pivotal theory in psychology. Second edition. Washington, D.C.: American Psychological Association, pp. 3–24.
Eddie Harmon-Jones and Cindy Harmon-Jones, 2007. “Cognitive dissonance theory after 50 years of development,” Zeitschrift für Sozialpsychologie, volume 38, number 1, pp. 7–16.
doi: https://doi.org/10.1024/0044-3522.214.171.124, accessed 11 October 2021.
Birger Hjørland, 2017. “Domain analysis,” In: Birger Hjørland and Claudio Gnoli (editors). Encyclopedia of knowledge organization, at https://www.isko.org/cyclo/domain_analysis.htm, accessed 1 June 2021.
Birger Hjørland, 2002. “Domain analysis in information science: Eleven approaches — traditional as well as innovative,” Journal of Documentation, volume 58, number 4, pp. 422–462.
doi: https://doi.org/10.1108/00220410210431136, accessed 11 October 2021.
Birger Hjørland and Hanne Albrechtsen, 1995. “Toward a new horizon in information science: Domain-analysis,” Journal of the American Society for Information Science, volume 46, number 6, pp. 400–425.
Institute for Information Competence and Information Infrastructure (IICIIS), 2017. “Defining information competence” (20 December), at https://iiciis.org/international/2017/12/20/defining-information-competence/, accessed 4 June 2021.
IT for Change, 2017. “Background paper: Policy frameworks for digital platforms — Moving from openness to inclusion,” at https://itforchange.net/background-paper-policy-frameworks-for-digital-platforms-moving-from-openness-to-inclusion, accessed 11 October 2021.
Heidi Julien, Jen Pecoskie, and Kathleen Reed, 2011. “Trends in information behavior research, 1999–2008: A content analysis,” Library & Information Science Research, volume 33, number 1, pp. 19–24.
doi: https://doi.org/10.1016/j.lisr.2010.07.014, accessed 11 October 2021.
Heidi Julien, Lynne E.F. McKechnie, and Sandra Hart, 2005. “Affective issues in library and information science systems work: A content analysis,” Library & Information Science Research, volume 27, number 4, pp. 453–466.
doi: https://doi.org/10.1016/j.lisr.2005.08.004, accessed 11 October 2021.
Daniel Kahneman and Amos Tversky, 1979. “Prospect theory: An analysis of decision under risk,” Econometrica, volume 47, number 2, pp. 263–292.
doi: https://doi.org/10.2307/1914185, accessed 11 October 2021.
Shanthi Kalathil and Taylor C. Boas, 2001. “The Internet and state control in authoritarian regimes: China, Cuba and the counterrevolution,” First Monday, volume 6, number 8, at https://firstmonday.org/article/view/876/785, accessed 11 October 2021.
doi: https://doi.org/10.5210/fm.v6i8.876, accessed 11 October 2021.
Patricia Katopol, 2012. “Information anxiety and African-American students in a graduate education program,” Education Libraries, volume 35, numbers 1–2.
doi: https://doi.org/10.26443/el.v35i1-2.313, accessed 11 October 2021.
Patricia Katopol, 2010. “Information anxiety, information behavior, and minority graduate students,” Proceedings of the American Society for Information Science and Technology, volume 47, number 1, pp. 1–2.
doi: https://doi.org/10.1002/meet.14504701327, accessed 11 October 2021.
Martin Kenney and John Zysman, 2016. “The rise of the platform economy,” Issues in Science and Technology, volume 32, number 3, pp. 61–69, at https://issues.org/rise-platform-economy-big-data-work/, accessed 11 October 2021.
Carol C. Kuhlthau, 1993. “A principle of uncertainty for information seeking,” Journal of Documentation, volume 49, number 4, pp. 339–355.
doi: https://doi.org/10.1108/eb026918, accessed 11 October 2021.
Mary R. Lea and Kathy Nicoll (editors), 2002. Distributed learning: Social and cultural approaches to practice. London: Routledge.
Benjamin C. Lee, 2007. “Flattening the world efficiently: Digital sustainability for the 21st century,” at https://www.seas.upenn.edu/~leebcc/documents/lee2007-stgallen.pdf, accessed 11 October 2021.
Bruce Y. Lee, 2021. “As COVID-19 vaccine microchip conspiracy theories spread, here are responses on Twitter,” Forbes (9 May), at https://www.forbes.com/sites/brucelee/2021/05/09/as-covid-19-vaccine-microchip-conspiracy-theories-spread-here-are-some-responses/, accessed 11 October 2021.
Iara Lessa, 2006. “Discursive struggles within social welfare: Restaging teen motherhood,” British Journal of Social Work, volume 36, number 2, pp. 283–298.
doi: https://doi.org/10.1093/bjsw/bch256, accessed 11 October 2021.
J.C.R. Licklider, 1960. “Man-computer symbiosis,” IRE Transactions on Human Factors in Electronics, number 1, pp. 4–11.
doi: https://doi.org/10.1109/THFE2.1960.4503259, accessed 11 October 2021.
Robert K. Logan, 2010. Understanding new media: Extending Marshall McLuhan. New York: Peter Lang.
Jean-François Lyotard, 2004. La condición postmoderna: informe sobre el saber. Traducción de Mariano Antolín Rato. 8a edición española. Madrid: Ediciones Cátedra.
Donald MacKenzie and Judy Wajcman (editors), 1985. The social shaping of technology: How the refrigerator got its hum. Milton Keynes: Open University Press.
Thomas W. Malone and Michael S. Bernstein (editors), 2015. Handbook of collective intelligence. Cambridge, Mass.: MIT Press.
Konstantinos Manikas and Klaus Marius Hansen, 2013. “Reviewing the health of software ecosystems — A conceptual framework proposal,” Fifth Workshop on Software Ecosystems (IWSECO), pp. 33–44, and at http://ceur-ws.org/Vol-987/3.pdf, accessed 11 October 2021.
Marshall McLuhan, 1994. Understanding media: The extensions of man. Cambridge, Mass.: MIT Press.
Marshall McLuhan, 1977. “Marshall McLuhan: The medium is the message,” ABC Radio National Network, at https://www.abc.net.au/rn/legacy/features/mcluhan/, accessed 11 October 2021.
Marshall McLuhan and Quentin Fiore, 1996. The medium is the massage: An inventory of effects. Berkeley, Calif.: Gingko Press.
Calvin N. Mooers, 1996. “Mooers’ Law or why some retrieval systems are used and others are not,” Bulletin of the American Society for Information Science and Technology, volume 23, number 1, pp. 22–23.
doi: https://doi.org/10.1002/bult.37, accessed 11 October 2021.
Evgeny Morozov, 2011. The net delusion: The dark side of Internet freedom. New York: PublicAffairs.
Fabian Muniesa, 2014. The provoked economy: Economic reality and the performative turn. London: Routledge.
doi: https://doi.org/10.4324/9780203798959, accessed 11 October 2021.
Diane Nahl, 2004. “Measuring the affective information environment of Web searchers,” Proceedings of the American Society for Information Science and Technology, volume 41, number 1, pp. 191–197.
doi: https://doi.org/10.1002/meet.1450410122, accessed 11 October 2021.
Diane Nahl and Dania Bilal (editors), 2007. Information and emotion: The emergent affective paradigm in information behavior research and theory. Medford, N.J.: Information Today.
Olufemi J. Ojo, 2016. “Information anxiety and information overload of undergraduates in two universities in south-west Nigeria,” Library Philosophy and Practice, volume 2, number 1, at https://digitalcommons.unl.edu/libphilprac/1368/, accessed 11 October 2021.
Shan L. Pan and Sixuan Zhang, 2020. “From fighting COVID-19 pandemic to tackling sustainable development goals: An opportunity for responsible information systems research,” International Journal of Information Management, volume 55, 102196.
doi: https://doi.org/10.1016/j.ijinfomgt.2020.102196, accessed 11 October 2021.
Peter L.T. Pirolli, 2007. Information foraging theory: Adaptive interaction with information. New York: Oxford University Press.
doi: https://doi.org/10.1093/acprof:oso/9780195173321.001.0001, accessed 11 October 2021.
Peter L.T. Pirolli and Stuart K. Card, 1999. “Information foraging,” Psychological Review, volume 106, number 4, pp. 643–675.
doi: https://doi.org/10.1037/0033-295X.106.4.643, accessed 11 October 2021.
Arnaud Pocheville, 2015. “The ecological niche: History and recent controversies,” In: Thomas Heams, Philippe Huneman, Guillaume Lecointre, and Marc Silberstein (editors). Handbook of evolutionary thinking in the sciences. Dordrecht: Springer, pp. 547–586.
doi: https://doi.org/10.1007/978-94-017-9014-7_26, accessed 11 October 2021.
Vasja Roblek, Maja Meško, and Zlatka Meško Štok, 2016. “Digital sustainability in the fourth industrial revolution,” Proceedings of the ENTRENOVA — ENTerprise REsearch InNOVAtion Conference, volume 2, pp. 142–146, at http://hdl.handle.net/10419/183711, accessed 11 October 2021.
Shelly Rodgers and Esther Thorson (editors), 2017. Digital advertising: Theory and research. Third edition. London: Routledge.
doi: https://doi.org/10.4324/9781315623252, accessed 11 October 2021.
J.C. Rodrigues, 2017. Plataformas digitais para profissionais de marketing e comunicação. Segunda edição. Rio de Janeiro: E-book Kindle.
Luiz Ojima Sakuda, 2016. “Plataformas como novo tipo de governança de cadeias globais de valor: estudo na indústria de jogos digitais,” Tese de Doutorado, Universidade de São Paulo, USP, São Paulo.
doi: https://doi.org/10.11606/T.3.2016.tde-18082016-132259, accessed 11 October 2021.
Lucia Santaella and Renata Lemos, 2010. Redes sociais digitais: a cognição conectiva do Twitter. São Paulo: Paulus.
Reijo Savolainen, 1995. “Everyday life information seeking: Approaching information seeking in the context of ‘way of life’,” Library & Information Science Research, volume 17, number 3, pp. 259–294.
doi: https://doi.org/10.1016/0740-8188(95)90048-9, accessed 11 October 2021.
Michel Serres, 2003. Hominescências: O começo de uma outra humanidade? Rio de Janeiro: Bertrand Brasil.
Herbert A. Simon, 1990. “Bounded rationality,” In: John Eatwell, Murray Milgate, and Peter Newman (editors). Utility and probability. London: Palgrave Macmillan, pp. 15–18.
doi: https://doi.org/10.1007/978-1-349-20568-4, accessed 11 October 2021.
Cecelia Smith-Schoenwalder, 2020. “CDC: Some people did take bleach to protect from coronavirus,” U.S. News & World Report (5 June), at https://www.usnews.com/news/health-news/articles/2020-06-05/cdc-some-people-did-take-bleach-to-protect-from-coronavirus, accessed 11 October 2021.
Amy Davidson Sorkin, 2020. “The dangerous coronavirus conspiracy theories targeting 5G technology, Bill Gates, and a world of fear,” New Yorker (24 April), at https://www.newyorker.com/news/daily-comment/the-dangerous-coronavirus-conspiracy-theories-targeting-5g-technology-bill-gates-and-a-world-of-fear, accessed 11 October 2021.
Lance Strate, 2008. “Studying media as media: McLuhan and the media ecology approach,” MediaTropes eJournal, volume 1, pp. 127–142, and at https://mediatropes.com/index.php/Mediatropes/article/view/3344, accessed 11 October 2021.
Amrit Tiwana, 2014. Platform ecosystems: Aligning architecture, governance, and strategy. Waltham, Mass.: Morgan Kaufmann.
doi: https://doi.org/10.1016/C2012-0-06625-2, accessed 11 October 2021.
Amrit Tiwana, Benn Konsynski, and Ashley A. Bush, 2010. “Research commentary — Platform evolution: Coevolution of platform architecture, governance, and environmental dynamics,” Information Systems Research, volume 21, number 4, pp. 675–687.
doi: https://doi.org/10.1287/isre.1100.0323, accessed 11 October 2021.
Yann Truong and Geoff Simmons, 2010. “Perceived intrusiveness in digital advertising: Strategic marketing implications,” Journal of Strategic Marketing, volume 18, number 3, pp. 239–256.
doi: https://doi.org/10.1080/09652540903511308, accessed 11 October 2021.
Amos Tversky and Daniel Kahneman, 1974. “Judgment under uncertainty: Heuristics and biases: Biases in judgments reveal some heuristics of thinking under uncertainty,” Science, volume 185, number 4157 (27 September), pp. 1,124–1,131.
doi: https://doi.org/10.1126/science.185.4157.1124, accessed 11 October 2021.
Galit Wellner, 2019. “Digital cultural sustainability,” In: Róisín Lally (editor). Sustainability in the Anthropocene: Philosophical essays on renewable technologies. London: Lexington Books.
Thomas D. Wilson, 2000. “Human information behavior,” Informing Science, volume 3, number 2, pp. 49–56.
doi: https://doi.org/10.28945/576, accessed 11 October 2021.
Richard Saul Wurman, 1990. Information anxiety: What to do when information doesn't tell you what you need to know. New York: Bantam.
Shoshana Zuboff, 2013. “Be the friction: Our response to the New Lords of the Ring,” Frankfurter Allgemeine, at http://www.faz.net/aktuell/feuilleton/the-surveillance-paradigm-be-the-friction-our-response-to-the-new-lords-of-the-ring-12241996.html, accessed 11 October 2021.
Received 10 September 2021; accepted 8 October 2021.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
The technological informavore: Information behavior and digital sustainability in the global platform ecosystem
by Cristian Berrío-Zapata, Ester Ferreira da Silva, and Marise Teles Condurú.
First Monday, Volume 26, Number 11 - 1 November 2021