The new world of data: Four provocations on the Internet of Things
First Monday

by Steven Weber and Richmond Y. Wong



Abstract
The development and deployment of the Internet of Things (IoT) is rapidly emerging as the next major step in the ongoing evolution of the digital society and economy. To gain better insight and foresight into key characteristics that will differentiate this more intensely connected future from the present, we shift the focus of attention in this paper from the Internet of ‘things’ per se, to the data that the Internet of Things will generate. We put forward four provocations about IoT data that pose what we argue will be the most critical questions about business models, privacy, economic geography, and security. The next phase of Internet development will raise new challenges — in particular, around risk, governance, and responsibility — that we articulate here as a forward-looking research and action agenda aimed at maximizing the upside potential of IoT over the next decade.

Contents

Introduction
What is the Internet of Things?
Four provocations on how IoT will evolve
1. IoT business models will include open and closed logics at different stages of production
2. Privacy debates will center around managing personal data flows
3. There will be many IoTs rather than one IoT
4. The security landscape will expand and shift
IoT and the new phase of the Internet: Next steps in the political economy of data

 


 

Introduction

“The Internet of Things is here,” read recent claims in the technology, education, and business press (Hudson, 2016; Perton, 2015; Smith, et al., 2016). Engineers, researchers, governments, companies, and consumers have celebrated the possibilities presented by ‘connected’ homes, ‘smart’ devices, and the ubiquitous placement of sensors in contexts as varied as vehicles, street-lamps, refrigerators, and wearable devices. In an important sense, however, the Internet has always been an ‘Internet of things’. Originally conceived as a way to create network inter-operability among computers running different operating systems, it is the connections among devices, machines, and other ‘things’ that are the essence of Internet technology. So why has the phrase Internet of Things (IoT) become the techno-obsession of recent years? What do these “things” do that is different? What does IoT’s generation of new data mean in practice, and what are some of the most important implications of how this new phase of Internet development will unfold?

In this paper, we engage with the implications of IoT development and, most importantly, the data that are and will be generated in these settings. Our goal is not to predict how IoT environments will develop — indeed, the history of technology predictions and prognostications suggests that attempts to predict the development of complex techno-social systems are notoriously wrong and most notable for the big, important things they miss (Pogue, 2012). Rather, we introduce four provocations that isolate and highlight some of the most important and most uncertain driving forces that will bear on the future IoT, and develop some reflections about how the political economy of this next phase of the Internet will interact with privacy, security, politics, and business model issues at a global scale. In some respects, this discussion builds on the extensive work that raises questions and concerns about the broader political economy of ‘big data,’ such as questioning of the assumptions embedded in the discourses of big data, raising concerns about state power and the role of citizens in data collection and use, and addressing issues about privacy and surveillance (boyd and Crawford, 2012; Sadowski and Pasquale, 2015; Lange and Waal, 2013; Book and Bronk, 2016; Tufekci, 2014). The development and deployment of IoT technologies will extend the scale and scope of the data economy in important ways; but it will also present conceptually new challenges beyond the horizon of existing debates about data.

Our effort here is not intended as simply a critique of IoT. Rather, we introduce these provocations to highlight important questions that we believe have not been adequately discussed and should be addressed in the development and deployment of IoT systems, to cast light on pivotal social, legal, economic, and technical decisions that must be made in relation to IoT.

Acknowledging that the Internet of Things represents a complex assemblage of technologies, people, practices, and institutions, we present a set of arguments that highlight how the different configurations of IoT development and deployment can raise new challenges in a world with IoT data. In the following sections, we describe a working definition for the ‘Internet of Things’, introduce four provocations that highlight uncertainties in how IoT will be developed and deployed, and end by discussing some key implications of these insights.

 

++++++++++

What is the Internet of Things?

The Internet started by connecting computers; in its second major phase it connected people and organizations. A third major phase of connectivity now emerging is about connections between ‘things’. While there is no precise and agreed definition of IoT, there is a proliferation of descriptive phrases implying that something resembling a difference in kind (not just a difference of degree) is happening or is about to happen at the intersection of these things and the networks that connect them (Bassi and Horn, 2008). Importantly, the ‘things’ that IoT will connect subsume and go beyond devices with computational capabilities, to include any and potentially all devices that have some ability to sense their environment or generate data about their interactions with other devices and/or people. This represents an order-of-magnitude step function in connectivity, since the world has people and computers in the billions, but devices in the trillions or more. Just as the second phase of Internet connectivity (involving people) was categorically different from the first phase, the IoT phase will be just as different, and its interactions with and impacts on society, economy, markets, politics, and life just as profound. We think a simple, pragmatic definitional approach to IoT highlights two components, each of which is descriptive on a particular dimension but which together add up to a practical, usable definition.

Component 1 is about how data flows. Older distinctions (like Machine-to-Machine or Business-to-Consumer, M2M or B2C) are becoming obsolete in the IoT, where data moves in a more truly networked fashion that disregards most of those boundaries. Component 2 is about the granularity of those data flows. Think for a moment of any contemporary process where the data collected, transmitted, and processed are not very granular in location, time, or similar dimensions (for example, blood pressure readings taken once a year at the doctor’s office and irregularly at home). Now imagine a data stream with orders of magnitude greater granularity — such as blood pressure readings from a wearable device that takes a measurement every minute.
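
To make the scale of that jump concrete, a back-of-the-envelope calculation helps (a minimal Python sketch; the sampling rates are illustrative assumptions rather than measurements from any particular device):

```python
# Illustrative granularity comparison: one clinic blood pressure reading
# per year versus a hypothetical wearable sampling once per minute.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

clinic_readings_per_year = 1
wearable_readings_per_year = MINUTES_PER_YEAR

ratio = wearable_readings_per_year / clinic_readings_per_year
print(f"Wearable: {wearable_readings_per_year:,} readings per year")
print(f"Roughly {ratio:,.0f}x the clinic baseline (about five orders of magnitude)")
```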

The Internet of Things combines these two definitional components. As data flows move toward becoming continuous, 24/7, and correlated through networked connectivity with many other data flows of similar granularity, we have something that is more likely to demonstrate a difference of kind and thus be called IoT. That difference is also a characteristic of the data sets that IoT will enable and create. At a certain point, these become functionally new data assets with fundamentally different affordances, rather than simply familiar data sets at a finer scale. This is not to say the IoT network ‘map’ will lack any relevant topological differentiation, only that the conventional and expected differentiators are likely to be supplanted by new ones, for example distinctions between sensors that collect data about mechanical systems only and sensors that collect data directly about human behavior or people’s bodies (Gan, et al., 2011; Jing, et al., 2014). Those boundaries too are likely to be broken down in many or most cases, but will also be subject to different levels of scrutiny (as we discuss later) because of their inherently greater promise and risk.

Our definition foregrounds the ‘sensing’ side of the IoT not the ‘acting’ side; it puts sensors rather than actuators at the center of the discussion. That is a pragmatic choice that we believe makes sense for now, in part for the simple reason that the effects of sensors will be experienced before the effects of actuators, and in part because most current IoT devices tend to make more use of sensors than actuators. Examples of these types of sensors include wearable health sensors and patient health monitoring devices, urban environmental sensors as part of “smart city” projects, and sensors that provide data and analysis of business supply chains or manufacturing equipment (Fitbit, 2016; Array of Things, 2016; Intel, 2016). The distinction between sensors and actuators probably will not hold up over the long term as the technologies progress and the value of direct linkage between the equivalent of sensory and motor neurons (through the analogy of reflex pathways) becomes increasingly evident. But because the choice to create reflex pathways is one of the major decisions about the evolution of IoT that we want to illuminate, keeping the two separate for now makes analytic sense as well.

Prior research in ubiquitous computing foreshadows some of the concerns raised in the Internet of Things. Ubiquitous computing, first outlined by Mark Weiser, described the future of computing beyond desktop terminals. Ubiquitous computers would be always present and in the background, weaving themselves “into the fabric of everyday life until they are indistinguishable from it”; computers would “reside in the human world and pose no barrier to personal interaction” (Weiser, 1991). Ensuing research not only studied the technical means of ubiquitous computing, but also the sociotechnical aspects, such as ubiquitous computing’s privacy implications (Bellotti and Sellen, 1993; Hong and Landay, 2004). Interest in ubiquitous computing spanned business and government (Fano and Gershman, 2002; Scholtz, et al., 2001). Dourish and Bell (2011) discuss ways in which ubiquitous computing has developed in various countries — not always in the way Weiser envisioned. This work helped develop the technical foundations necessary for the Internet of Things, but also began to identify some of the sociotechnical issues that would become more pressing and readily apparent with the widespread deployment of IoT.

Why then, if the Internet has always been an Internet of things, and ubiquitous computing research and deployments have occurred for decades, has the phrase IoT burst onto the scene in the last few years? One hypothesis is that it is mainly a function of the hype cycle in modern information technology (Gartner, 2016), intentional or otherwise. Another hypothesis is that the technology has entered the ‘frenzy’ phase in Carlota Perez’s (2002) model, which is driven by financial cycles rather than media hype per se.

However, we think there is something simpler and more profound in play that marks IoT as something different. Five developments have come together in time to make today’s IoT (and tomorrow’s) something different, possibly in the same way that the development of the World Wide Web was transformative and more than simply an application running on the Internet.

The first ingredient is the rapid decline in cost and size of sensors. The second is nearly ubiquitous and inexpensive wireless connectivity. The third is distributed ‘super’-computing, conveniently masquerading as mobile phones. The fourth is the recent development and widespread deployment of software tools for managing and working with very large data sets. Fifth is the development of an ecosystem of knowledge, techniques, institutions, and capital that purport to make valuable use out of large quantities of data.

Putting those five ingredients together creates a network that has a distinctively higher level of interdependencies and new potential classes of applications, and thus deserves a new label like IoT (Zelenkauskaite, et al., 2012). Many potential applications have been described in speculative stories that capture a kind of excitement similar to what the Web generated in its earliest years. As with many ‘new’ technology trajectories, it is easy and seductive to generate visions of proposed uses. It is equally easy and almost as seductive to generate hypotheticals about downside risks, in particular privacy and security concerns that could conceivably emerge as the IoT unfolds and new data sets are both created and combined. We are confident that almost all of these visions will be wrong in their particulars. The system under development is being driven forward by a variety of causes that will intersect with each other in ways that are difficult or impossible to predict or model. Indeed, the hypotheticals on both the upside and the downside create a socio-technical-economic narrative about IoT that is itself one of the driving forces of change, and self-reflexive systems like that are even harder to model [1].

Our approach in the next section is more modest. We introduce several statements that isolate and highlight four of what we believe to be key driving forces of the IoT ecosystem, without making point predictions about how they will manifest. Even if we knew precisely how any single driving force will play out, the IoT will evolve as a function of complex permutations among these. The evolution will be further influenced by other exogenous shocks and factors that we have not foreseen, as was the case with the World Wide Web and most other complex technology trajectories. These caveats are an essential aspect of the partial model we will propose in the final section, where we put forward an argument about the most important and most uncertain variables in the IoT story.

 

++++++++++

Four provocations on how IoT will evolve

We propose four provocations about key driving forces that will come together in the evolution of the IoT. These are not meant to be seen as predictions, since the outcome of each driving force is uncertain, and, even more importantly, the permutation of ways these forces interact with each other is unknown. But by isolating them to start, we hope to ground a discussion about core elements of how we want to, and how we should, proceed in an IoT world. The first key driving force is about business models, particularly the open-closed dimension around standards, data, and intellectual property. The second is an expanded and revised notion of what is commonly called ‘privacy’, which we prefer to call personal data flows (PDFs). The third is a structural, political economy question: will there be one IoT, or many IoTs distinct from each other along national or sectoral lines? The last is about the security dynamic distinctive to IoT systems extended across a massive, unprecedented attack surface.

 

++++++++++

1. IoT business models will include open and closed logics at different stages of production

The question ‘what will be the business model for the IoT’ is over-aggregated and too abstract to be useful (Hase, 2012; Chan, 2015; Turber, et al., 2014). It cannot be answered at that level because ‘things’ is not a category of value creation. More important, the central queries that one would normally pose about a business model (where is value created, where are rents extracted, and what is the relationship between those two variables and the overall vitality of the business system) need to be broken down and made more specific in order to make sense.

A better starting place is the simple concept of a value chain, and the question of how value chains will emerge in the IoT. Consider an analogy to a contemporary broadband value chain for over-the-top TV, which includes at least the following elements:

  • content creation (making the TV show)
  • storage (storing the bits for delivery when needed)
  • search (finding the show that a customer wants)
  • delivery (getting the stream to the customer’s device when needed)
  • presentation (transforming the bits into video and sound)
  • customer management and billing (payment processing, marketing, advertising, etc.)

At each step along that value chain there are firms that make things (a camera, a TV set, a fiber optic cable); firms that provide services (caching, editing); firms on the periphery whose auxiliary inputs support the main value creation system (insurance, IP protection); and others.

IoT value chains will have similar nodes and modules. The question we want to put in the foreground (and which will become a more important question over time) is: what will be the open-closed characteristics of each of those nodes? Open and closed in this respect is more a continuum than a dichotomous choice, but the distinction still matters. In the broadband TV value chain example, there is ‘professional’ content creation by a partially vertically integrated firm like Netflix that lies toward the closed end of the spectrum. And there is the YouTube model of distributed content creation with essentially no barriers to entry, which lies toward the open end. Business models are quite different depending on which value chain you are in, and analogous distinctions will be seen in IoT value chains. This will be especially important when it comes to the permissions and affordances that attach to a particular data set created by or associated with a particular node in an IoT system.

To see this clearly, consider what would happen if any of the broadband value chain components were to change their position on the open-closed dimension in a significant way. Imagine for example that Internet neutrality regulation goes away and broadband providers like Comcast and Verizon charge differential rates for quality of service; or imagine that Amazon Web Services decides as a matter of policy that it will not provide storage services for video other than Amazon video. Or imagine that current wireless spectrum allocation and assignment debates play out such that more frequencies are licensed to AT&T and Verizon to build out their subscriber mobile wireless systems at the expense of smaller competitor wireless companies’ systems or at the expense of frequencies that allow for Wi-Fi or other unlicensed usage. These kinds of moves toward the closed end of the continuum impact the entire business system, which would take on new characteristics and evolve in significantly different ways. Customers might, for example, be under much greater pressure than they are today to ‘choose sides’ and at least partially lock into semi-walled gardens where they would receive a much higher level of personalized service.

Now consider an IoT analogy to those kinds of changes, for example in what Andrew Brooks has called ‘instrumented consumer experiences’ (Brooks, 2012): the value chain that today centers on wearable ‘fitness tracking devices’ but soon will likely incorporate an entire suite of IoT functions (ranging from mood measurement and mood stimulus to health indicators and drug delivery). Possibly the most important decision that organizations which build these devices (such as Fitbit, Nike Fuel, Garmin X, or even the Apple Watch) have to make is how they will treat the data streams that the devices create. Those streams could, for example, be offered in an open manner through an API for anyone to build additional applications and services. Alternatively, they could be locked into a closed ecosystem where only the user and the firm have access. They could be licensed for use on relatively open terms, or relatively closed terms.
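
To make the open-closed continuum concrete, the sketch below models the policy choice a wearable maker faces. The class, access tiers, and party names are our own illustration under assumed terms, not any vendor's actual API:

```python
from dataclasses import dataclass, field
from enum import Enum

class Access(Enum):
    OPEN_API = "open_api"    # anyone may read via a public API
    LICENSED = "licensed"    # third parties read under negotiated terms
    CLOSED = "closed"        # only the user and the firm have access

@dataclass
class DataStreamPolicy:
    """Toy model of the open-closed choice attached to one device data stream."""
    stream_name: str
    access: Access
    licensees: set = field(default_factory=set)

    def may_read(self, party: str, is_user_or_firm: bool = False) -> bool:
        if self.access is Access.OPEN_API:
            return True
        if self.access is Access.LICENSED:
            return is_user_or_firm or party in self.licensees
        return is_user_or_firm  # CLOSED: user and firm only

# A wearable maker experimenting with a middle position on the continuum:
heart_rate = DataStreamPolicy("heart_rate", Access.LICENSED, {"research_lab"})
print(heart_rate.may_read("research_lab"))  # True: a licensed third party
print(heart_rate.may_read("ad_network"))    # False: no license granted
```

Where a firm sets these access flags, and whether users or regulators can contest them, is precisely the open-closed decision at stake.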

As Brooks has shown, a wide variety of de facto experimental practices is developing around these choices, precisely because no one is certain where value will be best created or how the rent extraction part of the system is best configured so as not to reduce that value creation. The shape of the overall business system, including the question of what services get provided, how much they cost, who pays for them, and who benefits, is going to be an outcome of how these decisions unfold. The same dynamic is present in the industrial IoT. If GE, for example, were to close the data stream coming off instrumented jet engines and keep its data science, maintenance, financing, and ‘thrust as a service’ offerings tightly tied to GE businesses, the system would tend toward one kind of IoT landscape that would likely coalesce around a small number of very large players, approaching natural monopolies. If GE were to fully open that data stream and invite IBM, Google, or anyone else to work with it; or if GE allowed others to place sensors on its jet engines; or if GE created an open marketplace for data-enabled maintenance systems; and so on — we would end up with a very different kind of IoT landscape, one that would be more diverse but also likely more fragmented and difficult to manage. The consumer version of this distinction and some of its consequences are currently being seen in competing visions of the connected home offered by a variety of device manufacturers, software companies, and service providers. There is a common belief that, all things being equal, open ecosystems create more value (Schaffers, et al., 2011). But, at the same time, it is openness, diversity, fragmentation, and multiple standards that are being blamed at least partially for the slow uptake of these systems by consumers.

Firms will not be the only actors determining which parts of the value chain are more open or more closed. Users also adopt, adapt, and create new technologies at various parts of the value chain in ways that affect ‘openness’. Bijker’s account of the social construction of the safety bicycle by groups of users (Bijker, 1995), the user-development of mountain bicycles, and theories of user-led innovation (Hippel, 2005) all describe different ways in which users may develop or actively refine products and services. Sometimes these ideas migrate back to firms (such as using the iPhone’s camera flash as a flashlight, which was first implemented by an independent app developer and later implemented by Apple) (Bort, 2015). IoT has the potential to involve many communities of users who are already involved in different aspects of the value chain, such as: software developer communities like GitHub; electronic hardware communities such as those around Raspberry Pi or Arduino; and other hardware groups such as home do-it-yourself (DIY) communities. Interactions between these user groups and firms may take on more adversarial forms (such as iPhone “jailbreakers”), or more cooperative forms (such as online app stores and independent app developers). Decisions about open vs. closed parts of the value chain can also affect how users experience IoT devices and services, and where in the value chain it is seen as legitimate for users to participate. The connections in these vertical (and horizontal) value chains reflect the interdependencies that in part define IoT. However, if users believe that they are only interacting with one device or service, these value chains and interdependencies may be hidden from them, which may be more likely if the user-facing elements lean toward the closed end of the continuum. Ultimately, decisions and debates about openness and closure will also be about who has legitimate access and who does not at different stages of the value chain. Corporate strategies and regulatory action strongly influence but do not always determine what is legitimate.

The open vs. closed dimension will be one of the most important characteristics of how IoT value chains evolve. Common ideological dispositions to prefer the more open system topologies notwithstanding, business systems do not always equilibrate where aggregate value is greatest, particularly when very large players with market power have the capability to enforce taking a larger slice of a smaller pie. Regulators and users also have a voice in these choices. The most likely outcome is experimental and evolving mixes of open and closed elements at different stages of production and deployment — put differently, a highly varied topology where all parties are trying to figure out empirically what they actually prefer and what they can accept. It may be quite a long time before this settles down to equilibrium. This extends in an interesting way debates within data science and ‘big data’ around access to particular data sets, whose permissions and affordances may be increasingly subject to (or even a function of) the business models of the IoT players who develop and deploy physical hardware devices.

 

++++++++++

2. Privacy debates will center around managing personal data flows

Where IoT is first revealing itself as a difference in kind rather than degree may very well be in debates about privacy. Many of these debates, already at fever pitch, have been characterized either by the development of semi-worst-case hypotheticals that are intended to stir up consumer and regulatory action; or by particular firm and/or government decisions which cross some (often previously unforeseen) threshold of public attention. The nature of IoT applications makes it inevitable that both will occur more frequently and at higher intensity. Put simply, it is easy to create somewhat sensational hypotheticals about privacy intrusion by pervasive IoT deployments, particularly when people think about the possible insights that might come from merging different IoT datasets. It is just as easy to project that firms and governments will cross lines that are difficult to define in advance, because their business models (in the case of firms) and their data collection efforts, as well as law enforcement and security applications (in the case of governments), will create almost irresistible incentives to do so. If Target could predict a customer’s pregnancy in 2012 and the NSA could do some of the things that Edward Snowden revealed in 2013, one can imagine what kinds of perceived lines will be crossed as IoT deployments move forward, where data collection is orders of magnitude broader, greater, and more granular than it was in 2012 and 2013 [2]. It becomes easy to see how potential IoT deployments could precipitate numerous privacy harms including disclosure of sensitive information, unwarranted surveillance, unwanted identification, opaque aggregation of data, or deep intrusions into private affairs (Solove, 2006). One of the most important insights emerging from debates about data science in this context is that these harms may come about regardless of intentionality, as unforeseen types of findings surface from experimental manipulation and connection of data sets that may not have been designed or intended for the purposes to which they are eventually put.

What is harder to see is how rules and norms that would constrain collisions can be developed in advance of these kinds of publicity-generating events. The problem is not just the speed at which the technology is advancing, though that is considerable. The more profound problem is that some of the core principles that serve as a rough guide in today’s privacy debates do not easily translate to an IoT world.

Consider for example the notion of data minimization — that organizations should generally seek to limit the data they collect to those that are ‘strictly necessary’ for whatever task or service they offer (U.S. Federal Trade Commission (FTC), 2013; Organisation for Economic Co-operation and Development (OECD), 2013). This is a hard principle to apply under any circumstances, because the value of the next piece or class of data a firm might collect is often not simply high or low on its own, but depends upon many other variables in play. The role of a particular type of data is almost impossible to define in IoT applications, where the real promise lies in combinations and permutations of data that multiply insight in sometimes surprising ways — a fundamental notion behind modern data science that is deeply embedded in the culture of that science. Concretely, an IoT thermostat can certainly justify collecting data about the temperature in a house at any given moment on the ‘strictly necessary’ criterion. But can it justify collecting data about the residents’ ages, emotional status, health history, and eating habits? Arguably, those other data types could affect the temperature at which the residents will feel comfortable in their living room, but it is far less clear how to decide how ‘necessary’ those data are.
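
One way to see why ‘strictly necessary’ is so slippery: any enforcement of data minimization amounts to an explicit whitelist that someone must defend field by field. A minimal sketch, with hypothetical field names:

```python
# Hypothetical thermostat telemetry record and a data-minimization filter.
# Field names are illustrative. The point: 'strictly necessary' has to be
# encoded as an explicit, contestable whitelist, field by field.
STRICTLY_NECESSARY = {"timestamp", "room_temp_c", "setpoint_c"}

def minimize(record: dict) -> dict:
    """Keep only the fields on the declared whitelist."""
    return {k: v for k, v in record.items() if k in STRICTLY_NECESSARY}

raw = {
    "timestamp": "2017-01-15T08:30:00Z",
    "room_temp_c": 20.5,
    "setpoint_c": 21.0,
    "occupant_age": 34,         # arguably improves comfort prediction ...
    "occupant_heart_rate": 72,  # ... but is it 'necessary'?
}
print(minimize(raw))
```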

In this context the heterogeneity of IoT data conceals what may become a kind of privacy trap. It is tempting to draw a simple distinguishing line between IoT systems that do not touch people directly and those that do — for example, between a flow sensor in a municipal water purification facility, and a wearable device that tracks sleep — and apply more stringent privacy or data protection standards to the latter. But this is probably not a sustainable distinction in which users, developers, or researchers should feel confident, as scientific, economic, and other drivers align to incentivize the discovery of unforeseen relationships that bear on human behavior (even if the original intent of data collection was not for those purposes).

Consider as a second core principle “notice and choice,” the notion that companies must employ fair information practices that give users the power to protect their privacy. These ideas were first outlined in the 1973 U.S. government report “Records, computers and the rights of citizens,” meant to protect citizens’ privacy as records started to become digitized. Since then, the U.S. Federal Trade Commission and other federal agencies have reiterated and evolved these ideas through several instantiations of fair information practices (FIPs) (Gellman, 2016). Notice and choice are two of the core FIPs, which include providing notice to users about how their personal information will be used, allowing them to have some choice about how their data is used, and allowing them access to see what data has been collected about them. These principles have been re-expressed internationally as “purpose specification” and “individual participation” through the OECD privacy framework (Organisation for Economic Co-operation and Development (OECD), 2013).

As tricky as the notice and choice model has been in the age of click-through licenses (Solove, 2012), complex privacy policies, and terms of service agreements that would take hundreds of hours to read annually (McDonald and Cranor, 2008), the IoT takes this problem to a new order of magnitude. If an IoT application is operating (as it often will and should) in the ‘background’, constantly optimizing some process without ongoing human intervention or even awareness, how would notice and choice look in practice? When IoT applications are deployed ubiquitously, there may no longer be any type of visible user interface that could display a notice or privacy policy to the user. Even if there were, the time it would take for users to be informed in a meaningful way about such a multiplicity of systems would be sufficiently burdensome to undermine the whole point of the system. The interdependent nature of IoT frustrates the principle of notice and choice. Given the push by firms to design user experiences that are ‘simple’ and ‘intuitive’, users may think that they are providing data to a device or service for one purpose, while the interdependencies with other service providers mean that users’ data are processed for several other purposes or disseminated to third-party data processing companies.
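
A rough calculation suggests the scale of that burden. Every parameter in the sketch below is an assumption chosen only for illustration, but even conservative numbers make the point:

```python
# Back-of-the-envelope burden of 'notice and choice' at IoT scale.
# Every parameter below is an assumption chosen for illustration.
devices = 50                # connected devices/services in one household
words_per_policy = 2500     # assumed privacy policy length
reading_wpm = 250           # assumed adult reading speed
revisions_per_year = 2      # each policy re-read when it changes (assumed)

minutes = devices * revisions_per_year * words_per_policy / reading_wpm
print(f"~{minutes / 60:.0f} hours per year spent just reading notices")
```

And that counts only reading the notices, not understanding how the services interdepend.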

The FTC made a heroic effort in 2013 to define some parameters here by distinguishing between ‘expected’ and ‘unexpected’ uses of data; and data that lies ‘within the context of the interaction’ versus that which does not (U.S. Federal Trade Commission (FTC), 2013). But these distinctions will be impossible to sustain — they are highly contextual and very much dependent on who is doing the ‘expecting’ or defining the boundaries of the interaction. Furthermore, the notice and consent regime rests on the assumption that at the point of data collection, both the data collector and the users know and understand the value of the data and the types of knowledge and information that can be obtained from it. The world of data aggregation and machine learning in which IoT operates upends the assumption that individuals understand what can be learned about them when their data is collected or disclosed. The same is true of the organization, private sector or otherwise, that is collecting the data. Today, power imbalances and information asymmetries already exist such that most firms collecting data have a better understanding of that data’s significance and worth than users do (despite the goal of notice and choice to provide a more level playing field). These asymmetries will multiply as IoT spreads, and their extent may even change depending on which aspects of the system are open and which are closed. Together, these imply that for the notice and choice principle as well, it will likely be salient events, rather than proactive or intentional strategies, that drive public and regulatory attention in a catch-up mode.

Transparency as a core principle is problematic for similar reasons. IoT applications operating in the background could in principle be made ‘transparent’, at least to someone, if queried. But when IoT applications combine data from many different sources with different owners and different property rights, the permissions equations around transparency soon become too complex to manage. Jenna Burrell discusses opacity in machine learning algorithms, suggesting that while some forms of transparency can be achieved through computer literacy education or by compelling companies through regulation to make proprietary algorithms public, in many cases the inner workings of machine learning algorithms are inherently opaque due to the large scale at which they must operate to be useful (Burrell, 2016). Recent work on ‘accountable algorithms’ represents an important effort here, but by itself is insufficient as a widespread solution for alleviating privacy concerns (Kroll, et al., 2017).
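
The combinatorics behind that claim are easy to state: with n separately owned data sources, the number of non-empty source combinations that a full transparency accounting would need to cover is 2^n - 1, as this small sketch illustrates:

```python
# If an application combines data from n separately owned sources, a full
# transparency accounting would need to cover every non-empty combination
# of sources: 2**n - 1 of them.
for n in (5, 10, 20, 30):
    print(f"{n:>2} sources -> {2**n - 1:,} combinations to account for")
```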

What happens next is almost entirely predictable. In practice, users will be asked to make assumptions about the fiduciary responsibility and generally benign intentions of IoT applications. Firms will stretch the limits under the rationale of promoting innovation. There will be widespread calls for the largest and most influential firms to work harder at anticipating public concerns that will follow their products [3]. But intentionally or unintentionally, IoT applications will cross boundaries that probably could not have been seen in advance, and provoke outrage.

Currently, a variety of inductive methods have been proposed as ways to escape this unfortunate loop. These methods are rooted in notions of context, user expectations, mental models, or user-centered design (Nissenbaum, 2011). The idea is that if systems do not violate users’ expectations of how data flows occur, privacy is supposedly protected. However, this is much easier and more practical to apply when there are analogous off-line contexts from which to infer these expectations, such as building on expectations of privacy in traditional postal mail to inform expectations of privacy in e-mail messages. IoT challenges this process by moving beyond conventional boundaries of information flows and collapsing traditional contextual boundaries, such as the difference between online and off-line. The point is that new types of data sets will likely create new user expectations and mental models, but predicting those expectations at scale in advance is nearly impossible. Even with good heuristic tools to identify potential privacy problems, it remains unclear how those problems can be meaningfully addressed. In these situations, the suite of privacy engineering and design tools and methods as currently developed will almost certainly fail to manage the new contexts in which IoT operates.

In the face of incentive structures that entice firms and governments to engage in greater and more intense personal data flows, some users may respond by attempting to take personal data flow management into their own hands through defensive obfuscation measures. These might include using mechanisms that introduce data “noise”, providing false data patterns, or obfuscating an individual’s data by having multiple people share a single account or device (Brunton and Nissenbaum, 2015). All of these strategies have been used against online data trackers; one example is a Web plugin that clicks every ad on a Web page in order to confound online advertisers (AdNauseam, at https://adnauseam.io). It is reasonable to expect that some users will decide to implement these strategies in the IoT space in an attempt to maintain control over their personal data flows. These obfuscation responses may work for a small number of users, and they do tend to receive a great deal of attention, but they do not add up to a comprehensive strategy for most users or a policy or business model solution for governments and firms.
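
As a minimal sketch of the noise-injection idea (not a description of how AdNauseam or any other particular tool works), a device-side filter might perturb each reading before transmission:

```python
import random

def obfuscate(reading: float, noise_scale: float) -> float:
    """Add zero-mean Gaussian noise to a reading before it leaves the device.

    Individual values become unreliable to an observer, while long-run
    averages remain roughly informative to the user.
    """
    return reading + random.gauss(0.0, noise_scale)

true_steps_per_hour = [412, 388, 95, 0, 531]
reported = [round(obfuscate(s, noise_scale=50.0)) for s in true_steps_per_hour]
print(reported)  # the noisy values a tracker would see
```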

Altogether these expectations around IoT personal data flows add up to a largely reactive dynamic that is most strongly responsive to politics around events that happen to break through the noise and generate public attention and wrath. An intentional evolution of policy, norms, and market constraints around personal data flows would be a more desirable outcome, but it is difficult to see how that can emerge and be sustained. The notion of ‘privacy by design,’ that privacy should be embedded in technical design and business practices throughout the full lifecycle of products, offers one vision of a proactive approach (Rubinstein, 2012; Mulligan and King, 2012), though it is still nascent and not yet clear how it would be implemented in practice.

 

++++++++++

3. There will be many IoTs rather than one IoT

The notion that there will be a ‘single’ IoT with interoperable standards across industry sectors and national borders is this technology generation’s hopeful version of the global ‘free and open’ Internet story. As with that story, the aggregate benefits of a single IoT would likely be an order of magnitude or more greater than those of a fragmented IoT. But this issue of IoT topology is not an aggregate benefit question; it is a question about political economy. Boundaries and fragmentation barriers could form around sectors (a medical IoT, an industrial IoT, a home IoT); around countries (a U.S. IoT, a Chinese IoT); around particular firms (a Google IoT, a GE IoT); and probably other formations yet to be seen (for example, operating systems).

Whether and how such boundaries emerge could be a function of security politics writ broadly (more on this in the following section). But the more subtle, and probably more likely, determinant of boundaries will be the political economy of data sharing. Put differently, boundaries that form in the IoT will trace the logic of how the IoT’s principal product — data — is shared and compartmentalized among countries, firms, devices, and other entities. While this is also a question about the political economy of data science, asking about the political economy of the IoT foregrounds the inter-connectedness between data and physical hardware (and their relationship with the ecosystem of tools, standards, laws, business practices, knowledge production processes, capital, and so forth). In other words, who makes, deploys, operates, and owns the devices and the network infrastructure will likely depend on how each one of those questions bears on control of data and data flows.

To see this from a microeconomic perspective, consider the following questions: If data is really the most valuable asset that a firm (or a country) possesses, why would that entity build an infrastructure that enables sharing data with competitors or possible competitors? If data is pushing aside intellectual property as the most potent value-differentiator that confers competitive advantage on one organization over others, why share? And most importantly why do that right now, in a moment of rapid technology change and explosive growth in tools that promise to extract meaning from massive data sets, a moment where no one can be certain about what really is lurking in data and what it will take to release that nascent value for competitive advantage?

In this respect, today’s IoT economy is similar to oil concessions in the early part of the twentieth century, where the various players in the ecosystem did not really know how to judge the value that might lie beneath the Earth’s crust or what it would take to drill, refine, and get its products to market. Sharing of oil concessions was not a favored strategic response to that situation (Yergin, 1991).

In practice not all data will (or should) be shared, either within or between organizations. Analogous to transaction cost theories of the firm (in which the boundaries that delineate a firm from the market are explained as a function of asset specificity and transaction costs), a fully articulated theory of data sharing would mark off the presumptive boundaries that delineate where shared data should be expected to end and non-shared data expected to begin. We are some distance from having such a theory at this point (Weber and Saxenian, 2012). But some considerations that would likely be a part of that theory are easy to see.

If data-sharing turns out to be a source of value creation that is big enough to matter on a national scale, it cannot help but become a locus of contention in national competitiveness debates over the next few years. Should national boundaries be barriers to some types of data-sharing? This question goes beyond the obvious first-tier national security questions. For example, if health IoT data becomes a critical input to new drug development (a quite plausible proposition) then it would be reasonable to expect many governments to enclose those data within national borders just as they today seek to protect intellectual property — not first and foremost because health data is a privacy concern but even more importantly because it is a competitiveness concern. Particularly in the current global macroeconomic environment where governments everywhere are struggling deeply with employment and economic growth, these lines of argument that connect data, competitiveness, and job creation will directly affect what happens with data-sharing and by implication with the IoT.

Other constraints and boundaries around data-sharing will become visible as experimentation mounts in various sectors. Organizations may encounter a worrying ‘ratchet effect’ — once data is shared, can it ever be ‘taken back’ to individual ownership if that becomes desirable? A ratchet effect (once shared, always shared) could make the stakes of sharing seem inappropriately high and limit the willingness to experiment. There may also be proliferating efforts to establish shared data systems around separately controlled standards, creating the risk of an unhelpful competition between them, splitting up the energy and efforts devoted to sharing and leaving each data ecology beneath effective scale. It is possible that cooperative relationships along the value chain that become established in the context of data sharing would transform over time into highly competitive situations as the players expand into adjacent parts of the value chain and find themselves in new forms of competition with their erstwhile data ‘partners’.

The legal and social contexts in which IoT systems operate will also affect what data can be collected and, most importantly, what can be shared. What has happened with regulation of drones (unmanned aircraft systems) inside the United States is an early indicator: as of 2016, a complex landscape of varying federal and state regulations and laws has been passed that constrains the use of drones and the data they may collect and use (National Conference of State Legislatures, 2016). Other parts of IoT systems will probably encounter an equally complicated regulatory environment depending on state and municipality and the concerns raised by those regions’ citizens (California Legislature, 2015; Megerian, 2015). Users may also decide in unpredictable ways that they do not want their data to be shared by and among firms, and engage selectively or en masse in data obfuscation techniques, lessening much of the value gained through sharing data. None of these constraints and boundaries are inevitable or immovable. But the ways in which they manifest in practice as experimentation with data sharing mounts are likely to be a principal determinant of whether IoT deployments are designed for interoperability and compatibility, which creates one IoT; or for separation, which creates what will de facto be many IoTs. The promise of aggregate benefits from compatibility notwithstanding, political economy considerations around data in 2017 favor the trend toward fragmentation.

 

++++++++++

4. The security landscape will expand and shift

Two important vectors around security will converge in IoT over the next few years. The first vector is simply the emergence of cybersecurity as one of the most salient considerations in many corporate, government, and (increasingly) public discussions of digital technologies. The second vector is a distinctive characteristic of consumer-oriented IoT in particular (although it is not strictly confined there): that the engineering, design, and particularly the business culture of IoT generally places functionality, speed to market, price, and usability ahead of security concerns. Concretely, start-up firms that offer US$40 ‘smart’ switches and US$20 home sensors are focused on gaining first-mover advantages in the connected home; they are less focused on the security risks of vulnerable operating systems, firmware updates, and unencrypted communications [4]. And they are not likely to pay much attention to the consequence that fast product cycles will leave lots of legacy devices (likely unsupported for security updates at some point) in consumer networks.

This stands in sharp contrast, for example, to the fast development of cloud services, which were driven first and foremost by lines of business rather than consumer demand. Original equipment manufacturers of physical devices and products have taken a greater lead on the consumer side of the IoT, and these tend to be smaller companies with smaller budgets and less time and expertise to invest in security. A Hewlett Packard study looked at 10 of the most popular consumer IoT devices, all of which included a mobile app and most of which connected to a cloud service as well: nine devices collected at least one piece of personal information; eight failed to enforce generally acceptable standards for password length and complexity; seven did not use encryption when transmitting data to the home network; six did not use encryption when downloading software updates (Hewlett Packard, 2015). Indeed, recent reports attribute distributed denial of service attacks to networks of infected ‘smart’ devices such as DVRs, cameras, and Internet-connected refrigerators (e.g., Limer, 2016).
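
As a sketch of the kind of practice the surveyed devices lacked, consider integrity verification of a software update before installation. Real products should rely on vendor-signed firmware with asymmetric keys; the shared-secret HMAC below is a simplified stand-in for the verification step:

```python
import hashlib
import hmac

# Minimal sketch of integrity-checking a software update before install.
DEVICE_KEY = b"provisioned-at-manufacture"  # illustrative placeholder

def verify_update(firmware: bytes, received_mac: bytes) -> bool:
    expected = hmac.new(DEVICE_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, received_mac)

firmware = b"...firmware image bytes..."
mac = hmac.new(DEVICE_KEY, firmware, hashlib.sha256).digest()
print(verify_update(firmware, mac))              # True: install
print(verify_update(firmware + b"tamper", mac))  # False: reject
```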

The types of security threats that IoT faces go beyond data security threats. In addition to the release or manipulation of data sets in the application layer, infected IoT devices also provide computational power and the ability to enact change in the physical world. Attacks can occur at multiple layers (application, transport, and perception) (Gan, et al., 2011; Jing, et al., 2014), and the interconnected, interdependent, and boundary-crossing nature of the IoT broadens the attack surface (Zelenkauskaite, et al., 2012). Any vision of the smart home, the smart road, the smart building, the smart medical implant and so on (particularly if these smart devices are connected to each other) has to grapple with the observation that as the IoT expands, the attack surface expands as well. That expansion might be proportional; it is unlikely to be less than proportional. It might be at a faster rate (envision IoT deployments as the diameter of a sphere and the attack surface as the surface area of the sphere). Yet this expanded attack surface will not be ‘even,’ as different individuals, firms, and other actors face different risks and costs. For example, the cost of a compromised IoT home security system is borne principally by the homeowner, while the cost of a compromised IoT smart road (that might increase traffic delays or worse) is spread across a wider community. As people will be using IoT devices in a multitude of environments outside the immediate oversight of chief information security officers, security specialists, and help desks, they will also be outside the technology controls and corporate employee policies that help maintain information security practices at the workplace. And outside of these more controlled environments, people are of course known to use computers in ways that are insecure, sometimes due to a lack of technical understanding, but often because security considerations are outweighed by other values such as convenience or maintaining system stability (Felt, et al., 2012; Lampson, 2009). Thus, in this environment especially, technical understandings of security need to be accompanied by study of the human factors and social dimensions of users, their practices, and how they view risks and tradeoffs, in order to reasonably address some of the most basic security concerns.
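
The geometric intuition behind the sphere analogy can be written down directly:

```latex
% Take deployment scale as diameter d and the attack surface as the
% sphere's surface area A.
\[
  A \;=\; 4\pi r^{2} \;=\; \pi d^{2}
  \qquad\Longrightarrow\qquad
  \frac{A(2d)}{A(d)} \;=\; 4 .
\]
% Doubling the 'diameter' of deployment quadruples the surface to defend:
% growth faster than proportional, as the text suggests.
```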

However this particular dynamic plays out, it seems a good bet that IoT vulnerabilities are on the cusp of intersecting, perhaps dramatically, with the broader cybersecurity debate in the public, private, and governmental spheres. IoT could then become the principal driver of those debates, in which case the relative ubiquity of IoT would turn ‘cybersecurity’ into such a broad phenomenon that it converges onto and melds with ‘security’ per se. In this shift, the standard concepts of ‘confidentiality,’ ‘integrity,’ and ‘availability’ from information security (National Institute of Standards and Technology, 2004) may not be enough to ensure protection against a broader range of security threats. This is not just a rhetorical twist (Wæver, 1995). When ‘cybersecurity’ becomes simply ‘security’, a different and broader set of actors is mobilized into the game. Budgets change; politics change; business models may be forced to change. The history of security shocks from the non-cyber world suggests one observation about what this is likely to mean — that the precise timing and the particular targets and damage from politically salient attacks have an outsized influence on the dynamics of response. In other words, we should not expect a rational long-term strategy for securing the IoT to emerge out of crisis; such things rarely do. Ironically, it is precisely the ubiquity of IoT and the proliferation of attack surfaces that make a reasoned and balanced approach to security less likely than a spasmodic response to an unforeseen crisis.

 

++++++++++

IoT and the new phase of the Internet: Next steps in the political economy of data

A survey paper of this kind about an emerging technology landscape requires a number of assumptions about what will not change as profoundly, in order to highlight causes and consequences that we believe will change in more impactful ways. Two of those assumptions deserve special and explicit attention because of their broad significance. The first is straightforward: We have assumed no major technology discontinuities, on the upside or downside [5]. That assumption seems reasonable for a few years but even in that time frame it is possible that Moore’s Law could stall; or new chip materials and processes could lead to a vast acceleration of processing power beyond expectations. Wireless bandwidth could become congested; or it could expand with new protocols to a point where connectivity becomes essentially costless. Beyond a few years, technology extrapolations become much more problematic and the IoT of the next decade’s technologies could make some of the trade-offs and arguments that we have put forward obsolete, while elevating others.

The second assumption reiterates our point about sensors and actuators. We have emphasized for the purposes of this paper the sensing (and by implication the data) side of IoT based on a view of comparative developments in computation and robotics. That distinction may very well collapse over time, in which case critical choices about autonomous decision making and where humans do and do not belong ‘in the loop’ will rise to the fore and could re-cast many of the issues we’ve raised. In the longer term the relationship between human and machine autonomy may be the most important choice that IoT presents to individuals and societies; research here needs an even greater sense of urgency along with more systematic and disciplined analysis going forward.

One of the most important potential consequences of IoT is the stress it places on debates about technology and risk, in the areas of privacy and security most directly but elsewhere as well. Widespread IoT deployment creates a digital attack surface that could encompass nearly everything; the operational question may shift from what is the attack surface to what is not an attack surface, and the answer may be very little. When perceived vulnerabilities approach that kind of asymptote, there is almost no limit to the range of hypothetical attacks a creative thinker can conjure up. The downside consequence of that dynamic could be a massive race to articulate worst-case scenarios that compete for attention on the basis of what is most frightening — and that is almost never a good way to inform decision- and policy-making. The upside alternative could be a more thoughtful and serious conversation about how we ought to prioritize and communicate our thinking about risk — as individuals, organizations, and societies. The sooner that IoT champions step up and frame that conversation, the better their chances to preserve a reasonable space for innovation and experimentation (Pidgeon and Fischhoff, 2011). The more open and public that conversation, the better the chances that consumers will be activated to drive markets for IoT products toward acceptable standards for how personal data flows are used and secured.

There is a second domain of political economy risk that IoT makes more urgent, touching on basic issues of governance and responsibility. It is axiomatic that, all things equal, governance systems function better when they have more precise and more accurate data about what citizens want and what they do. Amid the concerns about privacy and surveillance, it is important not to lose sight of what can be achieved with accurate and specific data about what citizens desire, and how governmental and other interventions are working to address those needs and preferences. To see this, imagine a world where everyone expresses their preferences to politicians by what they do, not by what they say or by what spreads most quickly on social media; in which public opinion pollsters are dis-intermediated by facts collected through secure distributed sensors; in which lobbyists’ pleas are constrained by data.

The attractiveness of that world depends greatly on how level a playing field IoT can create for citizens of different incomes, ages, and levels of technical knowledge. The simplest example illustrates the issue: if public sector services become increasingly data-driven and reliant on IoT sensors, then the very basic question of where those IoT sensors are located matters greatly. There is a well-known anecdote about the Boston pothole app, which notified city service providers about potholes and queued them up for repair — but of course, the notification rate was meaningfully higher in parts of the city with higher percentages of smartphone users. An IoT that tilts the playing field to favor rich or technologically sophisticated users is not an IoT that fulfills this upside promise. At the same time, it would slow experimentation and progress enormously to take away reasonable innovation permissions and require universal access for all IoT applications, as if it were ‘plain old telephone service’ again (Nuechterlein and Weiser, 2013). Avoiding the next iteration of a ‘digital divide’ — this one not about access per se but rather about data and IoT applications — is a major public policy and economic challenge, possibly one of the most important to be faced in the next decade.
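
A toy simulation makes the bias mechanism explicit (every number below is an assumption chosen for illustration; this is not Boston data):

```python
import random

random.seed(1)  # reproducible illustration

# Two districts with the same number of potholes but different
# smartphone penetration, hence different reporting rates.
potholes_per_district = 100
smartphone_rate = {"affluent district": 0.80, "lower-income district": 0.35}

for district, rate in smartphone_rate.items():
    reported = sum(random.random() < rate for _ in range(potholes_per_district))
    print(f"{district}: ~{reported} of {potholes_per_district} potholes reported")
```

Equal need, unequal queues: the repair backlog tilts toward the district with more smartphones.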

Realistically, no widespread implementation of IoT can be completely ‘even’, and not everyone will experience the same IoT, because this set of technologies (like all others) enters a socio-economic system that is not a blank slate. Rather, a ‘landscape’ of IoT deployment will emerge that may narrow some current socioeconomic and technological divides, but will almost certainly reflect other inequalities and create new ones (Hargittai, 2008). This is not solely an issue for technology providers — it is also a consequence of the ways in which different groups interpret and adopt technologies. For instance, some groups — such as many lower-income single mothers — are likely to remain suspicious of data-collecting devices, because their prior exposure to them has been through surveillance-like interactions that dictate the terms on which they can obtain government benefits (Eubanks, 2011) — even though they may be among those who could benefit most from the insights gained through new forms of IoT data collection.

In most of our provocations, we highlighted the active roles users play — as innovators, data obfuscators, or particularly salient drivers of regulation — sometimes in tandem with, and sometimes in conflict with, firms, governments, and other actors. Maximizing the upside promise of an IoT world will require users to do even more — precisely because IoT technology is an increasingly intimate and integrated part of users’ experience. ‘Big data’ tends to operate in the background for most users; the IoT, at least in its early manifestations, will be much more visibly present in the average person’s everyday experience. The most important prerequisite to more desirable interactions with IoT is broader data literacy among citizens and scrupulous neutrality among data scientists. This data literacy should include not only quantitative skills in how to read and interpret data, but a suite of qualitative humanistic skills as well, such as understanding at least the basic terms of debate around data ethics, the legal limits and protections on certain data uses, and how social constructs manifest themselves in data collection and analysis [6]. To our minds, the worst-case scenarios to fear the most are not centrally about surveillance. They are about a corrosive cleavage between data sets that are selected, packaged, and visualized in order to promote ideologies; in practice, something like ‘liberal’ and ‘conservative’ data sets and IoT applications. The last thing a democracy needs is to have IoT, and the data coming off it, enveloped in ideological garb.

This is particularly important when it comes to inconvenient truths — when data suggest (or come close to proving) hypotheses that overlap uncomfortably with deep normative commitments around sensitive societal issues. There are things that most societies would prefer not to know about themselves at any given moment in time. Those spaces of ignorance are not all bad news and do not necessarily need to be eliminated: they often provide valuable wiggle room that can help avoid conflict. IoT will inevitably intrude on some of these sensitive areas and take away some of that wiggle room. That, in turn, will place significant new demands on political processes to manage the consequences of using broader and better data.

The knowledge and meaning implications of those data will remain contentious, simply because there is not enough data in the universe to put an end to politics. Rather, politics is going to have to get better and more just — not only in how it produces, manages, and analyzes data from a nearly ubiquitous IoT, but also in how it uses those data.

 

About the authors

Steven Weber is Professor at the School of Information and Department of Political Science, University of California Berkeley. He is the author of The success of open source (Cambridge, Mass.: Harvard University Press, 2004) and co-author of Deviant globalization: Black market economy in the 21st century (London: Bloomsbury, 2011). He studies the political economy of data and is currently writing How to organize a global enterprise: Economic geography in the post financial crisis world.
E-mail: steve_weber [at] berkeley [dot] edu

Richmond Wong is a Ph.D. student in the BioSENSE Group at the University of California Berkeley School of Information whose work focuses on how emerging technologies are imagined through public discourse, cultural media and policy-making processes.
E-mail: richmond [at] ischool [dot] berkeley [dot] edu

 

Notes

1. By ‘self-reflexive’ system here we mean a system in which discourse impacts technology evolution — put simply, what people believe and say about the technology directly affects how the technology evolves in a rapid feedback cycle.

2. One does not have to believe that the Target and NSA examples were the most privacy-intrusive acts of recent years (they almost certainly were not, though they were distinguished by their publicity profile) to imagine what kinds of perceived lines will be crossed as IoT deployments move forward.

3. This is particularly true for companies like Google and Facebook that have ‘crossed the line’ with their customers on numerous occasions and have had to reverse course, in part due to fines and additional privacy programs mandated by enforcement actions brought by the U.S. Federal Trade Commission. A reasonable expectation is that the product leadership in these firms will have learned from these experiences. But the general applicability of principles extracted from those ‘lessons’ is suspect and has yet to be tested in the IoT environment.

4. These price numbers are meant to be indicative not precise; the point is that the market for home IoT devices is highly price sensitive.

5. Non-technology discontinuities are of course also possible. The obvious example is a major military conflict involving highly disruptive and/or destructive cyberattacks, which could dramatically shift the environment for IoT development and deployment.

6. We emphasize here the importance of users in addition to the focus on designers, which is more common — for example, see http://en.itu.dk/about-itu/press/news-from-itu/researchers-want-to-make-internet-of-things-ethical.

 

References

AdNauseam, at http://adnauseam.io, accessed 21 August 2016.

Array of Things, 2016, at http://arrayofthings.github.io/, accessed 21 August 2016.

Alessandro Bassi and Geir Horn, 2008. Internet of Things in 2020: A roadmap for the future. Brussels: European Commission, at http://www.smart-systems-integration.org/public/documents/publications/Internet-of-Things_in_2020_EC-EPoSS_Workshop_Report_2008_v3.pdf, accessed 10 January 2017.

Victoria Bellotti and Abigail Sellen, 1993. “Design for privacy in ubiquitous computing environments,” ECSCW ’93: Proceedings of the Third Conference on European Conference on Computer-Supported Cooperative Work, pp. 77–92.

Wiebe E. Bijker, 1995. Of bicycles, bakelites, and bulbs: Toward a theory of sociotechnical change. Cambridge, Mass.: MIT Press.

Theodore Book and Chris Bronk, 2016. “I see you, you see me: Mobile advertisements and privacy,” First Monday, volume 21, number 6, at http://firstmonday.org/article/view/6154/5215, accessed 10 January 2017.
doi: http://dx.doi.org/10.5210/fm.v0i0.6154, accessed 10 January 2017.

Julie Bort, 2014. “This 19-year-old developer is so successful, he turned down Apple,” Business Insider (21 September), at http://www.businessinsider.com/successful-19-year-old-turns-down-apple-2014-9, accessed 22 August 2016.

danah boyd and Kate Crawford, 2012. “Critical questions for big data,” Information, Communication & Society, volume 15, number 5, pp. 662–679.
doi: http://dx.doi.org/10.1080/1369118X.2012.678878, accessed 10 January 2017.

Andrew L. Brooks, 2012. “Information & social networks: Engineering attitudes & behaviors,” CSCW ’12: Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work Companion, pp. 311–314.
doi: http://dx.doi.org/10.1145/2141512.2141608, accessed 10 January 2017.

Finn Brunton and Helen Nissenbaum, 2015. Obfuscation: A user’s guide for privacy and protest. Cambridge, Mass.: MIT Press.

Jenna Burrell, 2016. “How the machine ‘thinks’: Understanding opacity in machine learning algorithms,” Big Data & Society, volume 3, number 1, at http://journals.sagepub.com/doi/pdf/10.1177/2053951715622512, accessed 10 January 2017.
doi: http://dx.doi.org/10.1177/2053951715622512, accessed 10 January 2017.

California Legislature, 2015. “AB-856 Invasion of privacy” (2 July), at http://www.leginfo.ca.gov/pub/15-16/bill/asm/ab_0851-0900/ab_856_cfa_20150713_143110_sen_comm.html, accessed 21 August 2016.

Hubert C.Y. Chan, 2015. “Internet of Things business models,” Journal of Service Science and Management, volume 8, number 4, pp. 552–568.
doi: http://dx.doi.org/10.4236/jssm.2015.84056, accessed 10 January 2017.

Paul Dourish and Genevieve Bell, 2011. Divining a digital future: Mess and mythology in ubiquitous computing. Cambridge, Mass.: MIT Press.

Virginia Eubanks, 2011. “Technologies of citizenship,” In: Virginia Eubanks. Digital dead end: Fighting for social justice in the information age. Cambridge, Mass.: MIT Press, pp. 49–80.

Andrew Fano and Anatole Gershman, 2002. “The future of business services in the age of ubiquitous computing,” Communications of the ACM, volume 45, number 12, pp. 83–87.
doi: http://dx.doi.org/10.1145/585597.585620, accessed 10 January 2017.

Adrienne Porter Felt, Elizabeth Ha, Serge Egelman, Ariel Haney, Erika Chin, and David Wagner, 2012. “Android permissions: User attention, comprehension, and behavior,” SOUPS ’12: Proceedings of the Eighth Symposium on Usable Privacy and Security, article number 3.
doi: http://dx.doi.org/10.1145/2335356.2335360, accessed 10 January 2017.

Fitbit, 2016. “Fitbit,” at http://www.fitbit.com/, accessed 21 August 2016.

Gang Gan, Zeyong Lu, and Jun Jiang, 2011. “Internet of Things security analysis,” Proceedings of the 2011 International Conference on Internet Technology and Applications (iTAP), pp. 1–4.
doi: http://dx.doi.org/10.1109/ITAP.2011.6006307, accessed 10 January 2017.

Gartner, 2016. “Gartner hype cycle,” at http://www.gartner.com/technology/research/methodologies/hype-cycle.jsp, accessed 23 August 2016.

Robert Gellman, 2016. “Fair information practices: A basic history,” version 2.17 (22 December), at http://bobgellman.com/rg-docs/rg-FIPshistory.pdf, accessed 10 January 2017.

Eszter Hargittai, 2008. “The digital reproduction of inequality,” In: David B. Grusky (editor). Social stratification: Class, race, and gender in sociological perspective. Third edition. Boulder, Colo.: Westview Press, pp. 936–944.

Jürgen Hase, 2012. “The Internet of Things. Alternative business models and best practices,” IoT Week 2012 — IoT Economics Workshop, at https://iotforum.files.wordpress.com/2013/07/alternative-business-models-and-best-practices.pdf, accessed 10 January 2017.

Eric von Hippel, 2005. Democratizing innovation. Cambridge, Mass.: MIT Press.

Jason I. Hong and James A. Landay, 2004. “An architecture for privacy-sensitive ubiquitous computing,” MobiSys ’04: Proceedings of the Second International Conference on Mobile Systems, Applications, and Services, pp. 177–189.
doi: http://dx.doi.org/10.1145/990064.990087, accessed 10 January 2017.

Hewlett Packard (HP), 2015. “Internet of things research study,” at http://www8.hp.com/us/en/hp-news/press-release.html?id=1909050#.WHerX7GZN0c, accessed 23 August 2016.

Florence Hudson, 2016. “The Internet of Things is here,” EDUCAUSE Review, volume 51, number 4, at http://er.educause.edu/articles/2016/6/the-internet-of-things-is-here, accessed 10 January 2017.

Intel, 2016. “A guide to the Internet of Things infographic,” at http://www.intel.com/content/www/us/en/internet-of-things/infographics/guide-to-iot.html, accessed 21 August 2016.

Qi Jing, Athanasios V. Vasilakos, Jiafu Wan, Jingwei Lu, and Dechao Qiu, 2014. “Security of the Internet of Things: Perspectives and challenges,” Wireless Networks, volume 20, number 8, pp. 2,481–2,501.
doi: http://dx.doi.org/10.1007/s11276-014-0761-7, accessed 10 January 2017.

Joshua A. Kroll, Joanna Huey, Solon Barocas, Edward W. Felten, Joel R. Reidenberg, David G. Robinson, and Harlan Yu, 2017. “Accountable algorithms,” University of Pennsylvania Law Review, volume 165.

Butler Lampson, 2009. “Privacy and security: Usable security: How to get it,” Communications of the ACM, volume 52, number 11, pp. 25–27.
doi: http://dx.doi.org/10.1145/1592761.1592773, accessed 10 January 2017.

Michiel de Lange and Martijn de Waal, 2013. “Owning the city: New media and citizen engagement in urban design,” First Monday, volume 18, number 11, at http://firstmonday.org/article/view/4954/3786, accessed 10 January 2017.
doi: http://dx.doi.org/10.5210/fm.v18i11.4954, accessed 10 January 2017.

Eric Limer, 2016. “How hackers wrecked the Internet using DVRs and webcams,” Popular Mechanics (21 October), at http://www.popularmechanics.com/technology/infrastructure/a23504/mirai-botnet-internet-of-things-ddos-attack/, accessed 4 January 2017.

Aleecia M. McDonald and Lorrie F. Cranor, 2008. “The cost of reading privacy policies,” I/S: A Journal of Law and Policy for the Information Society, volume 4, number 3, pp. 540–565.

Chris Megerian, 2015. “Gov. Jerry Brown approves new limits on paparazzi drones,” Los Angeles Times (6 October), at http://www.latimes.com/local/political/la-pol-sac-brown-drones-paparazzi-20151006-story.html, accessed 22 August 2016.

Deirdre K. Mulligan and Jennifer King, 2012. “Bridging the gap between privacy and design,” University of Pennsylvania Journal of Constitutional Law, volume 14, number 4, at http://scholarship.law.upenn.edu/jcl/vol14/iss4/4/, accessed 10 January 2017.

National Conference of State Legislatures (NCSL), 2016. “Current unmanned aircraft state law landscape” (5 January), at http://www.ncsl.org/research/transportation/current-unmanned-aircraft-state-law-landscape.aspx, accessed 21 August 2016.

National Institute of Standards and Technology (NIST), 2004. “Standards for security categorization of federal information and information systems,” Federal Information Processing Standards (FIPS) publication, number 199, at http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.pdf, accessed 30 August 2016.

Helen Nissenbaum, 2011. “A contextual approach to privacy online,” Daedalus, volume 140, number 4, pp. 32–48, and at http://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf, accessed 10 January 2017.

Jonathan E. Nuechterlein and Philip J. Weiser, 2013. Digital crossroads: Telecommunications law and policy in the Internet age. Second edition. Cambridge, Mass.: MIT Press.

Organisation for Economic Co-operation and Development (OECD), 2013. “The OECD privacy framework,” at http://www.oecd.org/sti/ieconomy/oecd_privacy_framework.pdf, accessed 4 January 2017.

Carlota Perez, 2002. Technological revolutions and financial capital: The dynamics of bubbles and golden ages. Cheltenham: E. Elgar.

Marc Perton, 2015. “CES 2015: The Internet of Things is here, and it may even be useful,” Big Think, at http://bigthink.com/think-tank/ces-2015-internet-of-things, accessed 21 August 2016.

Nick Pidgeon and Baruch Fischhoff, 2011. “The role of social and decision sciences in communicating uncertain climate risks,” Nature Climate Change, volume 1, number 1 (29 March), pp. 35–41, and at http://www.nature.com/nclimate/journal/v1/n1/full/nclimate1080.html, accessed 10 January 2017.
doi: http://dx.doi.org/10.1038/nclimate1080, accessed 10 January 2017.

David Pogue, 2012. “The future is for fools,” Scientific American, volume 306, number 2, p. 29.
doi: http://dx.doi.org/10.1038/scientificamerican0212-29, accessed 10 January 2017.

Ira Rubinstein, 2012. “Regulating privacy by design,” Berkeley Technology Law Journal, volume 26, number 3, pp. 1,409–1,457, and at http://scholarship.law.berkeley.edu/btlj/vol26/iss3/6/, accessed 10 January 2017.
doi: http://dx.doi.org/10.15779/Z38368N, accessed 10 January 2017.

Jathan Sadowski and Frank Pasquale, 2015. “The spectrum of control: A social theory of the smart city,” First Monday, volume 20, number 7, at http://firstmonday.org/article/view/5903/4660, accessed 10 January 2017.
doi: http://dx.doi.org/10.5210/fm.v20i7.5903, accessed 10 January 2017.

Hans Schaffers, Nicos Komninos, Marc Pallot, Brigitte Trousse, Michael Nilsson, and Alvaro Oliveira, 2011. “Smart cities and the future Internet: Towards cooperation frameworks for open innovation,” In: John Domingue, Alex Galis, Anastasius Gavras, Theodore Zahariadis, Dave Lambert, Frances Cleary, Petros Daras, Srdjan Krco, Henning Müller, Man-Sze Li, Hans Schaffers, Volkmar Lotz, Federico Alvarez, Burkhard Stiller, Stamatis Karnouskos, Susanna Avessta, and Michael Nilsson (editors). The future Internet. Lecture Notes in Computer Science, volume 6656. Berlin: Springer Verlag, pp. 431–446.
doi: http://dx.doi.org/10.1007/978-3-642-20898-0_31, accessed 10 January 2017.

Jean Scholtz, Marty Herman, Sharon Laskowski, Asim Smailagic, and Dan Siewiorek (organizers), 2001. “Workshop on evaluation methodologies for ubiquitous computing,” at http://zing.ncsl.nist.gov/ubicomp01/, accessed 21 August 2016.

Robert F. Smith, Michael Gregoire, Michael McNamara, Andreas Raptopoulos, and T.K. Kurien, 2016. “The Internet of Things is here,” World Economic Forum (22 January), at https://www.weforum.org/events/world-economic-forum-annual-meeting-2016/sessions/the-internet-of-things-is-here, accessed 21 August 2016.

Daniel J. Solove, 2012. “Privacy self-management and the consent dilemma,” Harvard Law Review, volume 126, number 7, pp. 1,880–1,903, at http://harvardlawreview.org/2013/05/introduction-privacy-self-management-and-the-consent-dilemma/, accessed 10 January 2017.

Daniel J. Solove, 2006. “A taxonomy of privacy,” University of Pennsylvania Law Review, volume 154, number 3, pp. 477–564, and at https://www.law.upenn.edu/journals/lawreview/articles/volume154/issue3/Solove154U.Pa.L.Rev.477(2006).pdf, accessed 10 January 2017.

Zeynep Tufekci, 2014. “Engineering the public: Big data, surveillance and computational politics,” First Monday, volume 19, number 7, at http://firstmonday.org/article/view/4901/4097, accessed 10 January 2017.
doi: http://dx.doi.org/10.5210/fm.v19i7.4901, accessed 10 January 2017.

Stefanie Turber, Jan vom Brocke, Oliver Gassmann, and Elgar Fleisch, 2014. “Designing business models in the era of Internet of Things: Towards a reference framework,” In: Monica Chiarini Tremblay, Debra VanderMeer, Marcus Rothenberger, Ashish Gupta, and Victoria Yoon (editors). Advancing the impact of design science: Moving from theory to practice. Lecture Notes in Computer Science, volume 8463. Cham, Switzerland: Springer International, pp. 17–31.
doi: http://dx.doi.org/10.1007/978-3-319-06701-8_2, accessed 10 January 2017.

U.S. Federal Trade Commission (FTC), 2013. “Internet of Things — Privacy and security in a connected world” (19 November), at https://www.ftc.gov/news-events/events-calendar/2013/11/internet-things-privacy-security-connected-world, accessed 10 January 2017; more recent version at https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf, accessed 10 January 2017.

Ole Wæver, 1995. “Securitization and desecuritization,” In: Ronnie D. Lipschutz (editor). On security. New York: Columbia University Press, pp. 46–86.

Steven Weber and AnnaLee Saxenian, 2012. “Probing the value of shared data in the modern economy.” Kansas City, Mo.: Kauffman Foundation.

Mark Weiser, 1991. “The computer for the 21st century,” Scientific American, volume 265, number 3, pp. 94–104, and at https://www.scientificamerican.com/article/the-computer-for-the-21st-century/, accessed 10 January 2017.

Daniel Yergin, 1991. The prize: The epic quest for oil, money, and power. New York: Simon & Schuster.

Asta Zelenkauskaite, Nik Bessis, Stelios Sotiriadis, and Eleana Asimakopoulou, 2012. “Interconnectedness of complex systems of Internet of Things through social network analysis for disaster management,” INCOS ’12: Proceedings of the 2012 Fourth International Conference on Intelligent Networking and Collaborative Systems, pp. 503–508.
doi: http://dx.doi.org/10.1109/iNCoS.2012.25, accessed 10 January 2017.

 


Editorial history

Received 5 September 2016; revised 4 January 2017; revised 9 January 2017; accepted 10 January 2017.


Creative Commons License
This paper is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

The new world of data: Four provocations on the Internet of Things
by Steven Weber and Richmond Y. Wong.
First Monday, Volume 22, Number 2 - 6 February 2017
http://firstmonday.org/ojs/index.php/fm/article/view/6936/5859
doi: http://dx.doi.org/10.5210/fm.v22i2.6936.




