Information and communication technologies (ICT) are changing the way people interact with each other. Today, every physical device can have the capability to connect to the Internet (digital presence) to send and receive data. Internet-connected cameras, home automation systems, and connected cars are all examples of the interconnected Internet of Things (IoT). IoT can bring benefits to users in terms of monitoring and intelligent capabilities; however, these devices collect, transmit, store, and potentially share vast amounts of personal and individual data that encroach on private spaces and can be vulnerable to security breaches. The ecosystem of IoT comprises not only users, various sensors, and devices but also other stakeholders of IoT such as data collectors, processors, regulators, and policy-makers. Even though the number of commercially available IoT devices is on a steep rise, the uptake of these devices has been slow, and abandonment rapid. This paper explains how stakeholders (including users) and technologies form an assemblage in which these stakeholders are cumulatively responsible for making IoT an essential element of day-to-day living and connectivity. To this end, this paper examines open issues in data privacy and security policies (from the perspectives of the European Union and North America) and their effects on stakeholders in the ecosystem. The paper concludes by explaining how these open issues, if unresolved, can lead to another wave of digital division and discrimination in the use of IoT.
During the past few decades, information and communication technologies (ICT) have changed the patterns in which humans interact with and use machines. The concept of the Internet of Things (IoT) became popular in 1999 with Kevin Ashton, who introduced the phrase “Internet of Things” and at that point saw radio-frequency identification (RFID) as one of the prerequisites for IoT. Since then various definitions and descriptions of IoT have emerged. However, essentially, the Internet has always been an “Internet of Things” (Weber and Wong, 2017). More recently, ICT has enabled connecting more and more devices, even very small ones, to the Internet and to the Cloud. These devices, along with smartphones, tablets, and computers, generate twice as much data today as they did two years ago, and the trend is expected to continue. Hence, the world is undergoing a Big Data evolution. This evolution will become more prominent and bigger as IoT devices become capable of collecting data passively, without any human intervention. Imagine a situation in which a restless sleeper, who tosses and turns subconsciously while sleeping, uses an Internet-connected mattress. In a normal, non-connected mattress this trait of restlessness might not be monitored and registered. However, in a mattress that is connected to a larger IoT infrastructure, this data will not only be collected but also shared or sold to other stakeholders in an ecosystem to generate insights about sleep patterns and provide recommendations about the right mattress for a given individual. On one hand, Big Data analytics will continue to discover these sorts of hidden patterns, predictions, and correlations in large datasets, which will in turn influence human activities and decisions in a plethora of fields, such as health and wellness, infrastructure and energy management, agriculture, transportation systems, medical research, and home automation.
On the other hand, it raises notable concerns in terms of privacy, data security, and consumer protection in general (Garg, et al., 2017).
The Ericsson mobility report of 2017 forecasted around 29 billion connected devices by 2022, of which around 18 billion will be related to IoT. Connected IoT devices include cars, machines, meters, sensors, point-of-sale terminals, consumer electronics, and wearables. These devices affect not only their manufacturers, operators, and users but also other stakeholders like network providers, regulators, and policy-makers, and even those who do not use them at all. In this paper, I argue that IoT, specifically consumer IoT, is a network of interconnected devices and stakeholders. I analyze several steps, from technical, policy, and regulatory perspectives, that are yet to be taken by these stakeholders for the successful development, deployment, and use of IoT. To this end, I first identify the stakeholders and examine their responsibilities in an ecosystem of the Internet of Things, then underline open issues from regulatory and policy perspectives that affect these stakeholders in an IoT ecosystem, and finally discuss how these issues, if unresolved, can lead to a new wave of digital division and discrimination of users within this ecosystem.
Constituents of an Internet of Things ecosystem
The Internet has been the product of people. It has data (in the form of text, images, videos, etc.) generated by, for, and about people. The IoT is transformative in the sense that it follows the “anything connected” vision of the International Telecommunication Union (ITU). In other words, IoT enables the pervasive and ubiquitous presence of a variety of things and objects around us, which are connected to the Internet and are able to sense the environment, interact and share data with one another, cooperate, and act to reach common goals (Atzori, et al., 2010). The ecosystem of IoT thus comprises a complex network of technologies, the data these devices generate, and stakeholders (people/users, organizations, and regulators) along with their practices and responsibilities. According to Mark Weiser’s (1999) vision, as reflected below, machines that fit the human environment, rather than forcing humans to enter theirs, will be most successful in penetrating the daily lives of people.
“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it. Consider writing, perhaps the first information technology: The ability to capture a symbolic representation of spoken language for long-term storage freed information from the limits of individual memory. Today this technology is ubiquitous in industrialized countries. Not only do books, magazines and newspapers convey written information, but so do street signs, billboards, shop signs and even graffiti. Candy wrappers are covered in writing. The constant background presence of these products of ‘literacy technology’ does not require active attention, but the information to be conveyed is ready for use at a glance. It is difficult to imagine modern life otherwise.” 
IoT devices, by design, are meant to disappear into the fabric of everyday life by enabling passive data collection without users’ involvement or intervention, thereby aggravating privacy and security concerns (Garg, et al., 2017). Even though the number of commercially available devices is on a steep rise, the uptake of these devices has been slow, and abandonment rapid. In fact, according to a recent study, 40 percent of U.S. activity tracker (one of the most popular IoT devices) owners stop using the device within the first six months of ownership (NPD Group, 2017). According to the Gartner hype cycle of 2017, IoT and its associated security, platforms, and services are at the peak of inflated expectations, and have been there since 2013 (Velosa, et al., 2017). This means there is more buzz than actual use of the technologies, which is specifically true for the consumer segment of the IoT market. This is the case despite declines in the cost and size of sensors, ubiquitous wireless connectivity, software tools capable of analyzing the large data sets these devices are capable of producing, and the several advantages that IoT adds to the lives of its users (Weber and Wong, 2017).
This paper, therefore, aims to discuss IoT in light of these questions: What are the implications associated with the use of IoT? What steps are necessary from the various stakeholders in an IoT ecosystem to drive and manifest the use of these devices to its fullest extent? To answer these questions, it is important to identify stakeholders in the ecosystem and understand their respective roles, responsibilities, and liabilities. As the IoT is as much about data (Zaslavsky, et al., 2013), if not more, as about connected things, I will introduce here the relevant stakeholders in association with data flows in its ecosystem, as shown in Figure 1. Data in the IoT flows through three main components:
- Hardware (sensors, actuators, processors), systems, and user software applications (for automation, notifications etc.) that sense and collect data.
- Network components, which include networking technologies, protocols (Bluetooth, Wi-Fi, cellular), and equipment (gateways) that provide connectivity to ensure devices are functional and interconnected to share data or information.
- Cloud and cloud services, which include access servers, data storage services, and data analysis services. Clouds offer the computing and storage capabilities needed to process data from sensors and to actuate the smart devices as needed.
Figure 1: Data flow in an IoT ecosystem.
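The three components above can be sketched as a minimal data pipeline. This is an illustrative sketch only: the names, readings, and thresholds are hypothetical, and the network step stands in for what would in practice be an encrypted (e.g., TLS) hop through a gateway.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """A single data point produced by the device layer."""
    device_id: str
    metric: str
    value: float

def device_collect() -> SensorReading:
    """Device layer: sensors sense and collect data (values fabricated)."""
    return SensorReading(device_id="mattress-42", metric="movement", value=0.87)

def network_transmit(reading: SensorReading) -> dict:
    """Network layer: serialize the reading for transport; in practice this
    payload would travel encrypted via a gateway to the cloud."""
    return {"device": reading.device_id, reading.metric: reading.value}

def cloud_process(payload: dict) -> str:
    """Cloud layer: store and analyze data to actuate or recommend."""
    if payload.get("movement", 0.0) > 0.5:
        return "restless sleeper: recommend firmer mattress"
    return "normal sleep pattern"

print(cloud_process(network_transmit(device_collect())))
```

The sketch mirrors the connected-mattress example from the introduction: a raw movement reading leaves the device, transits the network, and only becomes an actionable insight once the cloud layer processes it.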
Each of these components of an IoT ecosystem (as shown in Figure 1) can be mapped to one or more stakeholders (organizations, institutions, users, regulators, and policy-makers) who manage these components or are affected by them or by the ecosystem itself. This ecosystem of IoT, its stakeholders, and their interrelations can be further understood through Actor-Network Theory (ANT). ANT explores how networks/assemblages of relationships between human or non-human actors (which can be understood as nodes within these networks) are built or assembled and maintained to achieve a specific objective (Latour, 2005, 1996, 1992). ANT also affords an equal amount of agency to any actant/node (human or non-human) within the assemblage. In the IoT ecosystem, smart things, users, organizations, and regulators can be interpreted as actants with equivalent degrees of agency. Technologies “do” things as much as people and organizations “do”, and only by conceptualizing these actants as connected nodes in a dynamic network of interacting entities can the roles and responsibilities of each actant be fully understood (Levy, 2015). With the noteworthy exceptions of Levy (2015) and Sassene (2009), the actor-network perspective has rarely been applied to the networks that exist in the use of technology. My aim here, however, is not to test or validate the applicability of ANT within this context, but rather to use ANT to explain that (a) each stakeholder has equal responsibility within the IoT ecosystem; (b) the relationships between stakeholders are transient and their responsibilities need to be repeatedly “performed” or the network (IoT ecosystem) will dissolve and not function to its full capacity; and, (c) only through studying stakeholders as interconnected entities does a picture of their full engagement in the IoT ecosystem crystallize.
The responsibilities of different stakeholders towards protecting the data as it is generated, processed, and stored are shown in Table 1. Although every stakeholder has a role to play in privacy and data protection, the interests and preferences of these stakeholders can be varied, and sometimes even conflicting. For example, an IoT data collector has the responsibility of providing “notice and choice” to users to obtain their consent on the type of data that can be collected, and regulators and policy-makers have a role in mandating organizations to provide “notice and choice” for more sensitive and private data collection. However, an IoT collector might not prefer this, not only because of its technical infeasibility (e.g., the absence of a user interface or the volatility of the data collected), but also because restricting the data collected can constrain the benefits that can be derived from the data. In other words, the role and responsibility of one stakeholder can contradict the interests of one or several other stakeholders in the ecosystem. Therefore, relationships between stakeholders are transient, and their responsibilities need to be repeatedly negotiated in the IoT ecosystem to safeguard user data. Only by prioritizing users’ interest in protecting their sensitive data can it be ensured that IoT is being used and not abandoned by its adopters, specifically due to the privacy implications emerging from continuous and passive data collection. This, however, will only happen when there are stricter and well-defined regulations on how data is collected, stored, shared, and processed.
Table 1: Mapping of stakeholders to components and their roles in an IoT ecosystem.

Users (component: devices, systems, and applications)
- Using a device for purposes such as automation, connectivity, or security, thereby generating data (including sensitive personal data)

Network and network equipment provider (component: network)
- Providing connectivity and data security (encryption) as data transits from devices to the cloud and vice versa

IoT data collector (component: devices, systems, and applications)
- Transparency during data collection
- Notices in privacy policies
- Data security for data at rest and in transit
- Defining policies for sharing data with third parties
- Collection of de-identified data (specifically personal data)
- Data mapping to be aware of data flows

IoT data collector (component: cloud and cloud services)
- Data mapping to be aware of data flows
- Defining policies for sharing data with third parties
- Securing data at rest
- Logically separating data in case of multi-tenancy

IoT data processor (component: cloud and cloud services)
- Processing collected data for knowledge discovery or performing pre-defined tasks
- Designing a scalable and automated audit trail so that sensitive data (at rest and in motion) and data access can be monitored in real time
- Implementing and testing policies and procedures to ensure transparency during data collection
- Implementing security measures such as encryption and pseudonymization
- Conducting risk assessments
- Notifying users in case of a security breach

Regulators and policy-makers (component: all three components)
- Privacy and security regulations
- Consumer data protections
Privacy and security regulatory and policy status
In order to understand open issues in regulations on safeguarding data, which in turn make negotiations of roles and responsibilities contentious, this paper examines current North American and European data protection and privacy laws.
The United States (U.S.) follows a dual system of federal and state sectoral privacy laws. When Congress enacts a privacy law, it allows states to take further action. Even though this decentralized approach of privacy federalism in the U.S. enables decisions made at different levels to reflect pluralistic concerns, the presence of sectoral and diverse privacy laws leads to gaps in coverage and applicability (Schwartz, 2015).
In the context of the IoT, the U.S. Federal Trade Commission (FTC) hosted a workshop in 2015 and provided some recommendations in the area of privacy and security in a connected world (U.S. FTC, 2015). In particular, it was noted that security risks could be exploited with IoT by (1) enabling unauthorized access and misuse of personal information; (2) facilitating attacks on other systems; and, (3) thereby creating risks to personal safety. Participants also noted that privacy risks may flow from the collection of personal information, habits, locations, and physical conditions over time (U.S. FTC, 2015). The concepts of data minimization and notice and choice from the Fair Information Practice Principles (FIPP) (Gellman, 2017) were discussed in terms of their importance and the associated difficulty of their translation to the IoT world. It was concluded in the workshop that even though the Commission has the authority to take action against some IoT-related practices, legislation at this stage is premature.
In contrast, in the European Union (EU), the harmonized General Data Protection Regulation (GDPR) was adopted by the Council of the European Union in 2016 (European Commission (EC), 2016) with the intention of strengthening and unifying data protection for all individuals in the EU. It becomes enforceable in all EU member states on 25 May 2018, without the need for implementing national legislation. Towards establishing an EU digital single market, the GDPR is a single set of rules applicable to all member states. The EU GDPR is applicable if the data controller (the organization that collects data from an EU resident), the processor (the organization that processes data on behalf of the data controller), or the data subject (person or individual) is based in the EU. It imposes new obligations upon personal data controllers and processors in terms of notice and consent, privacy by design and default (including data minimization), notification of security and data breaches, and the right to be forgotten, thereby granting stronger rights to individuals (Garg, et al., 2017). In addition, an agreement has been made between the U.S. and the EU to provide standards of protection for the exchange of personal information in both regions. The new arrangements provide that personal data transfers will be subject to clear conditions, limitations, and oversight, and that generalized access will be prevented. EU residents will also have the possibility of raising any enquiry or complaint in this context (European Commission, 2016). However, this shifts power to centralized regulatory bodies, and scholars of EU law have argued that member states can and should be given certain leeway to apply rights protection if the need arises (Petkova, 2016).
Even though these two positions held by the U.S. and the EU are grounded in the same principles, they are two opposite approaches to handling the intensifying privacy and security implications of the world of Big Data and IoT. While the former approach fosters the innovative capabilities of the IoT by recommending broad privacy legislation that is flexible and technology neutral, the latter, even though it is technology neutral, is specific enough to be applicable in the IoT ecosystem. Therefore, compliance with the EU GDPR can only be achieved if data privacy and security issues of the IoT, such as unwarranted personal data acquisition and analysis, unwanted identification of the user, opaque aggregation of the data, or undisclosed data storage and use, are appropriately resolved by the stakeholders. All these requirements mandated by the EU GDPR revolve around data, which is the common thread that ties and influences all stakeholders in the IoT ecosystem, as was noted earlier. Declaring legislation premature, as the U.S. FTC did, on the contrary forces stakeholders in the U.S. to be compliant with a patchwork of inconsistent privacy laws, thereby raising compliance costs. Several scholars and researchers have over the last decade reiterated how data, that is, Big Data, is the raw material of new sources of immense technical, social, and economic value (Tene and Polonetsky, 2013; Zaslavsky, et al., 2013; Wu, et al., 2014). Big Data and its value will grow manifold with increases in the number of sensors and devices that will be interconnected and meant to generate, communicate, share, and access data.
As data generated within the IoT ecosystem spreads across organizations, institutions, and international borders, harmonization of data privacy and security compliance requirements, expectations, policies, regulations, and laws across every region and border is absolutely necessary in order to embrace IoT to its full potential. To this end, in the U.S., the challenge is to revitalize federal legislative involvement in privacy regulation, and in the EU, it is important that regulatory bodies are able to draw from living laboratories for the discovery and testing of new and differentiated solutions (Masing, 2012). Therefore, the most prominent open issues in privacy policies and regulations are as follows:
Scope and definition of personal data
According to Weber and Wong (2017), privacy debates will revolve around personal data flows within IoT. This is specifically true in consumer IoT ecosystems, as the sensors installed in these things will generate and collect personal data specific to users’ usage, behavior, or preferences. The EU GDPR defines personal data as “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person” (European Commission (EC), 2016). The only exceptions are for data processed in the context of employment and national security. The U.S. is notable for not having enacted a comprehensive data privacy law, except for a few sector-specific laws such as the Health Insurance Portability and Accountability Act (HIPAA), the Fair Credit Reporting Act (FCRA), and the Children’s Online Privacy Protection Act of 1998 (COPPA).
As Garg, et al. (2017) pointed out, in the U.S. data security is regulated through one of two mechanisms: FTC enforcement or state data-breach notification laws. In 2015, 60 new privacy laws were passed at the state level in the U.S. State privacy laws range from limiting insurers and employers from using information about certain medical conditions to prohibiting an employer’s ability to view employees’ social media accounts. This means that in the U.S. there are no comprehensive federal statutes but instead various divergent laws in the states. For example, state data-breach laws have provisions in terms of “who must comply with the law (e.g., businesses, data/information brokers, government entities, etc.); definitions of ‘personal information’ (e.g., name combined with SSN, driver’s license or state ID, account numbers, etc.); what constitutes a breach (e.g., unauthorized acquisition of data); requirements for notice (e.g., timing or method of notice, who must be notified); and exemptions (e.g., for encrypted information)”. However, neither FTC enforcement nor any state data-breach notification law is clearly applicable to data breaches from IoT devices. For example, if an individual’s biometric data were stolen from a company’s servers, it is contestable whether any state or federal regulator would have the authority to respond.
The reason for this lies in the fact that all the states within the U.S. have different definitions of “personal and sensitive information”. Even though some states, like Arkansas and California, include medical information as part of personal information, this definition only considers an individual’s medical history, or medical treatment or diagnosis by a health care professional, as personal information. This means fitness, health, or sensitive sensor data will not qualify as personal data. States like Missouri also include “information on mental and physical condition” in the definition of “personal information”. Therefore, sensor data generated, for example, by a personal fitness-tracking device might constitute a breach in Missouri but not in other states. Solove and Hartzog (2014) pointed out that U.S. privacy law regulates only specific types of data when collected, processed, and stored for specific purposes.
The IoT will increase both the sources and the amounts of personal data collected. Data aggregation across such multiple data streams makes the data more informative about a specific individual’s preferences and behavior. In other words, information can also become “personal data” if the IoT data processor and collector combine data available from other services or sources and use data mining to create new knowledge about users that might not be revealed by separately examining the underlying data sets. The information collected based on object identifiers, sensor data, and the connection capabilities of IoT systems might therefore reveal information on individuals, their habits, location, interests, and other personal information and preferences stored in systems for ease of use. So, if inconsistency in the definition of personal data continues to exist at a global level, it will further limit the effectiveness of any added-on security or privacy measures, as the applicability of these measures depends on definitions of personal data.
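The aggregation risk described above can be made concrete with a toy linkage sketch: two datasets that are individually non-identifying can re-identify a person once joined on shared quasi-identifiers. All records below are fabricated for illustration.

```python
# Dataset held by a fitness-tracker service: no names, only quasi-identifiers.
fitness_data = [
    {"zip": "47405", "birth_year": 1985, "avg_heart_rate": 88},
    {"zip": "47401", "birth_year": 1990, "avg_heart_rate": 62},
]

# A separate, publicly available list (e.g., a voter or customer roll).
public_data = [
    {"name": "Alice", "zip": "47405", "birth_year": 1985},
    {"name": "Bob", "zip": "47408", "birth_year": 1972},
]

def link(records_a, records_b, keys=("zip", "birth_year")):
    """Join two datasets on shared quasi-identifiers."""
    linked = []
    for a in records_a:
        for b in records_b:
            if all(a[k] == b[k] for k in keys):
                linked.append({**a, **b})
    return linked

# Alice's heart-rate record, "anonymous" on its own, is now tied to her name.
print(link(fitness_data, public_data))
```

Neither dataset names a heart rate and a person together, yet the join does; this is exactly the sense in which separately innocuous IoT data streams can become personal data in combination.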
Therefore, regulators and policy-makers need to harmonize the definition of personal data so that anonymizing or de-identifying such data at the point of collection becomes a requirement. Organizations also need to invest in developing more robust and efficient anonymization techniques, so that adversaries with some contextual and background knowledge are not able to re-identify individuals (Narayanan and Shmatikov, 2008). In cases when a degree of identification is required for a specific purpose, pseudonymous data should be collected and used for analysis. In both cases it should be ensured, however, that IoT processors are able to search, index, and correlate encrypted personal data. In cases of data breaches, this reduces the risk of sanctions and claims for the data collector and processor, and also safeguards the affected individual (Garg, et al., 2017). Each stakeholder in the chain of an IoT ecosystem also needs to clarify their responsibilities in terms of whether they are merely storing or additionally processing the data generated by these sensors. This will enforce appropriate privacy risk management and audits by these stakeholders so that sensitive personal data at rest or in motion is monitored in real time. Overall, the very first step towards standardizing privacy laws is a unanimously agreed definition of the sensitive personal data that the IoT is capable of generating and that must be protected to ensure data protection. Only when this happens can and will IoT data processors and collectors invest effort into examining the best approaches to implementing privacy and security measures for such data.
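As a rough illustration of pseudonymization at the point of collection, one of the options discussed above, a keyed hash can replace a direct identifier while preserving the ability to correlate a user's records for analysis. The key name and values here are hypothetical; real deployments would also need key management and rotation policies.

```python
import hashlib
import hmac

# Hypothetical secret held only by the data collector, never stored with the data.
SECRET_KEY = b"collector-held-secret"

def pseudonymize(user_id: str) -> str:
    """Return a stable pseudonym: the same user always maps to the same token,
    but the token cannot be reversed to the identifier without the key."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

# The stored record carries the pseudonym, not the identifier.
record = {"user": pseudonymize("alice@example.com"), "sleep_movement": 0.87}

# Stability means records can still be correlated across data streams:
assert pseudonymize("alice@example.com") == record["user"]
```

Because the pseudonym is deterministic under the key, the processor can still link and analyze a user's records, while a breach of the dataset alone exposes only tokens.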
Difficulty in implementation of Fair Information Practice Principles (FIPP) and GDPR requirements
The U.S. FTC guidelines, the FIPPs, were established in 1973 and were adopted in the U.S. Privacy Act of 1974, the OECD Privacy Principles, and the Asia-Pacific Economic Cooperation (APEC) Privacy Framework. The core FIPP principles include transparency, individual participation, purpose specification, data minimization, use limitation, data quality and integrity, security, and accountability and auditing, to ensure that consumer data is protected. The key requirements of the EU GDPR include increased strength in notice and consent for data collection and processing, security breach notification, and the right to access data and information about the processing of personal data. Even though both of these sets of principles are well established under U.S. policies and EU regulations (as opposed to the definition of personal data discussed earlier), their implementation and extension in the IoT ecosystem is difficult. The EU GDPR specifically requires that the data controller (IoT data collector) provide the data subject (user) with information about personal data processing in a concise, transparent, and intelligible manner, which is easily accessible, distinct from other undertakings between the controller and the data subject, and in clear and plain language.
Currently, there is a trend among some companies of making their privacy policies so vague as to be inscrutable (Johnston, 2014). The implementation of notice and choice within these privacy policies is even more difficult in an IoT ecosystem because (a) screens are small or absent; (b) data collection is passive; and, (c) data collection and its purpose can be optimized in real time based on the data analytics performed. In addition, data analytics is performed on data collected from IoT with machine learning algorithms, which are inherently opaque in nature and can lead to discrimination against users without anyone knowing about it (Burrell, 2016; Eubanks, 2011). This impenetrable logic of algorithms could be in violation of existing regulatory practices in terms of transparency, accountability, and other non-discrimination requirements. This means that currently the only option the user has is to trust the underlying infrastructure and stakeholders, as well as the implementation of the privacy policies the user has agreed to. Therefore, there is an urgent need to develop scalable techniques that automate the tracking and auditing of data so that stakeholders are fully aware of its location and the purposes for which it is being used. This in turn can facilitate sharing this information transparently with users.
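A minimal version of such an automated audit trail might record who accessed which field, when, and for what stated purpose, every time sensitive data is read. The service, field, and purpose names below are hypothetical; a production system would write to tamper-evident storage rather than an in-memory list.

```python
import datetime

# In-memory stand-in for a tamper-evident audit store.
audit_log = []

def audited_read(record: dict, field: str, accessor: str, purpose: str):
    """Read a sensitive field while logging who, what, when, and why."""
    audit_log.append({
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "accessor": accessor,
        "field": field,
        "purpose": purpose,
    })
    return record.get(field)

user_record = {"user": "pseudonym-1f2e", "sleep_movement": 0.87}
audited_read(user_record, "sleep_movement",
             accessor="analytics-service",
             purpose="sleep-pattern model training")
# The log can then be surfaced to users or auditors to make data use transparent.
```

Even this toy version captures the essential property: every use of the data leaves a record of its purpose, which is the raw material for the transparency obligations discussed above.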
In conjunction, the 2015 U.S. FTC report, “Internet of Things: Privacy & Security in a Connected World,” also states that stakeholders “should examine their data practices and business needs and develop policies and practices that impose reasonable limits on the collection and retention of consumer data. However, recognizing the need to balance future, beneficial uses of data with privacy protection, staff’s recommendation on data minimization is a flexible one that gives companies many options. They can decide not to collect data at all; collect only the fields of data necessary to the product or service being offered; collect data that is less sensitive; or de-identify the data they collect” (FTC, 2015). The problems caused by a lack of uniform privacy and security expectations, laws, and mandates, and the difficulty of their implementation, become even more serious when users themselves grudgingly, willingly, or unknowingly compromise their privacy or security in exchange for state-of-the-art personalized services. Therefore, data protection authorities, both in the U.S. and the EU, have to provide sufficient resources and power to enforce and enable the implementation of a unified data protection law that is agreed upon by all regulators at a global level, since the existence of diverse laws and their different interpretations might lead to different levels of privacy.
The impact of disparities in privacy and security policies on the digital divide and discrimination with IoT
The concept of a digital divide gained headway from the mid to late 1990s, at the time when the Internet and dot-com booms were well under way in the U.S. Since then the digital divide has been considered by some a thing of the past. It has been assumed that people who need ICT can afford it, and that those who do not have access to the Internet or ICT do not need digital technologies (Warschauer, 2003). Therefore, the focus has been on an effective integration of ICT into communities in order to foster social inclusion. However, in the future, it is likely that the IoT will be more prominent in the wealthier nations of the world, reinforcing a global digital divide (Dutton, 2014). People with low annual incomes, struggling to afford a single cellular mobile device, cannot pay additional broadband charges to ensure connectivity for several smart sensors. As billions of everyday objects generate data that can be linked to personally identifiable records, the lack of uniform privacy regulations and requirements enables IoT device providers, data collectors, and processors to transgress personal privacy boundaries. This in turn leads to unjust algorithmic discrimination and loss of user anonymity, resulting in undemocratic shifts in power in an IoT ecosystem (Winter, 2015). For example, data from an accelerometer and a gyroscope, both of which are sensors that measure simple movements, can be combined to infer a person’s level of relaxation. This knowledge can be used by insurance providers or employers to deny certain benefits to the user if relaxation levels are not as expected. Or an organization can analyze data generated by sensors to categorize users as good or troublesome customers. This may sound benign until these categorizations are connected with race or gender.
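The accelerometer-gyroscope example can be sketched as a toy inference: two innocuous movement streams combine into a sensitive "relaxation" score that a third party could then act on. The scoring rule, threshold, and sensor values are fabricated purely for illustration.

```python
def relaxation_score(accel, gyro):
    """Lower combined movement energy maps to higher inferred relaxation (0..1).
    A deliberately crude, hypothetical rule: mean squared magnitude of each stream."""
    energy = (sum(a * a for a in accel) / len(accel)
              + sum(g * g for g in gyro) / len(gyro))
    return max(0.0, 1.0 - energy)

# Fabricated sensor windows for two users.
calm = relaxation_score(accel=[0.1, 0.0, 0.1], gyro=[0.0, 0.1, 0.0])
restless = relaxation_score(accel=[0.9, 0.8, 1.0], gyro=[0.7, 0.9, 0.8])

# A third party could now flag "insufficiently relaxed" users against a threshold:
flagged = restless < 0.5
```

Neither sensor individually claims anything about a person's state of mind, yet a simple combination yields a score that an insurer or employer could use to deny benefits, which is the discriminatory potential the text warns about.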
Peppet (2014) notes that data analysis from the IoT can lead to illegal discrimination against those in protected classes, such as race, age, or gender, as well as hidden forms of economic discrimination. Currently, both traditional discrimination laws and information privacy laws are unprepared for these new forms of discriminatory decision-making. Due to the lack of standardized privacy laws and the discriminatory consequences of using IoT, adopters also tend to abandon these devices, thereby further widening the digital divide. Therefore, this paper claims that until harmonized privacy and anti-discrimination regulations are in place, a digital divide is bound to occur not only because of the affordability of IoT but also because of the discrimination that IoT data analytics can lead to.
The decision to use any technology is a social process. People use and discover new technologies not only based on their personal requirements and preferences but also based on what others have used and recommended to them. Technologies become woven into the fabric of daily life only when they do not require a lot of effort from the user and do not expose them to non-negotiable risks. For IoT users, this process is primarily inhibited by the risks posed by a lack of transparency in data collection and processing practices, a lack of standardized privacy and security measures for IoT device makers, data collectors, and processors, non-harmonized data protection regulations, and possible discriminatory analysis by data processors. Appropriate and relevant technological safeguards can only be identified if sound privacy, data protection, and information risk management is conducted. The complexity of the IoT ecosystem leaves many people behind, opening the possibility of a second digital divide. Those who might benefit most from the connectivity, automation, and security provided by the IoT might not be able to afford these technologies, which in turn can prevent them from participating in many automated daily activities. Meanwhile, others may abandon the IoT for fear of falling into a trap of digital discrimination by employers or other organizations, such as financial institutions and insurance providers.
Therefore, an assemblage of stakeholders has to work towards negotiating and deciding on their responsibilities to make the IoT ecosystem more effective and nuanced by developing standards from technical, policy, and regulatory perspectives. Petkova (2016) noted that “for reasons of consistency and uniformity in consumers’ treatment, but also in order to avoid legal challenges in potential cross-border lawsuits, and to save costs from developing technologically differentiated products or services, in cases of multiple jurisdictions that pose different requirements, organizations tend to voluntarily adopt the higher standard.” Organizations that are believed to have more dominant and powerful interests will actually benefit from uniformity in regulations. Today, when the prevalence of the IoT is demanding new privacy regulations, regulatory bodies should evaluate which strategies have been successful at the state level and develop a consolidated, standardized privacy law. In turn, these regulatory developments will motivate the exploration of technological avenues that are still under-utilized to reduce privacy risks. If these open issues are addressed and implemented, the result will not only benefit individuals in terms of data protection but also provide improved compliance and control over personal data.
About the author
Radhika Garg is an assistant professor at the School of Information Studies at Syracuse University. Her research is about understanding decisions users and organizations make around using new and emerging technologies. She also studies policy and regulatory issues around these emerging technologies and their effects on the decisions of those using these technologies.
E-mail: rgarg01 [at] syr [dot] edu
The author thanks First Monday’s editors and peer reviewers for insights and feedback.
2. RFID uses electromagnetic fields to automatically identify and track tags attached to objects.
5. Weiser, 1999, p. 3.
6. In ANT, an assemblage or network is a process and activities built between things and people.
7. “REGULATION (EU) 2016/679 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation),” Official Journal of the European Union, at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679&from=EN, p. 119/33.
8. “Summary of the HIPAA privacy rule,” at https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html.
9. “Fair Credit Reporting Act,” at https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/fair-credit-reporting-act.
10. “Children’s Online Privacy Protection Rule (‘COPPA’),” at https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule.
11. National Council of State Legislators, “Security breach notification laws” (29 March 2018), at http://www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-laws.aspx; see also National Council of State Legislators, “Digital privacy and security: Overview of resources” (22 March 2018), at http://www.ncsl.org/research/telecommunications-and-information-technology/telecom-it-privacy-security.aspx.
12. “IoT privacy, data protection, information security,” at http://ec.europa.eu/information_society/newsroom/cf/dae/document.cfm?doc_id=1753, p. 3.
13. “Fair Information Practice Principles (FIPPS),” at https://www.dhs.gov/sites/default/files/publications/consolidated-powerpoint-final.pdf. “U.S. Privacy Act of 1974,” at https://www.justice.gov/opcl/privacy-act-1974; “OECD Privacy Principles,” at http://oecdprivacy.org; and, Asia-Pacific Economic Cooperation (APEC) Privacy Framework, at https://www.apec.org/Publications/2005/12/APEC-Privacy-Framework.
14. https://www.waze.com/legal/privacy; https://www.digitaltrends.com/mobile/terms-conditions-waze-privacy-accident/.
Luigi Atzori, Antonio Iera, and Giacomo Morabito, 2010. “The Internet of Things: A survey,” Computer Networks, volume 54, number 15, pp. 2,787–2,805.
doi: https://doi.org/10.1016/j.comnet.2010.05.010, accessed 16 April 2018.
Jenna Burrell, 2016. “How the machine ‘thinks’: Understanding opacity in machine learning algorithms,” Big Data & Society (6 January).
doi: https://doi.org/10.1177/2053951715622512, accessed 16 April 2018.
William H. Dutton, 2014. “Putting things to work: social and policy challenges for the Internet of things,” Info, volume 16, number 3, pp. 1–21.
doi: https://doi.org/10.1108/info-09-2013-0047, accessed 16 April 2018.
Virginia Eubanks, 2011. Digital dead end: Fighting for social justice in the information age. Cambridge, Mass.: MIT Press.
European Commission (EC), 2016. “REGULATION (EU) 2016/679 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation),” Official Journal of the European Union, at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679&from=EN, p. 119/33, accessed 17 June 2017.
Radhika Garg, Corinna Schmitt, and Burkhard Stiller, 2017. “Information policy dimension of emerging technologies,” SSRN (16 August), at https://ssrn.com/abstract=2943451, accessed 16 April 2018.
doi: http://dx.doi.org/10.2139/ssrn.2943451, accessed 16 April 2018.
Robert Gellman, 2017. “Fair information practices: A basic history,” version 2.18 (10 April), at https://bobgellman.com/rg-docs/rg-FIPshistory.pdf, accessed 4 October 2017.
Casey Johnston, 2014. “Snapchat’s bad security shows how data use policies fail,” Ars Technica (6 January), at https://arstechnica.com/tech-policy/2014/01/snapchats-bad-security-shows-how-data-use-policies-fail/, accessed 1 November 2017.
Bruno Latour, 2005. Reassembling the social: An introduction to actor-network-theory. Oxford: Oxford University Press.
Bruno Latour, 1996. “On actor-network theory: A few clarifications,” Soziale Welt, volume 47, number 4, pp. 369–381.
Bruno Latour, 1992. “Where are the missing masses? The sociology of a few mundane artifacts,” In: Wiebe E. Bijker and John Law (editors). Shaping technology/building society: Studies in sociotechnical change. Cambridge, Mass.: MIT Press, pp. 225–258.
Karen E.C. Levy, 2015. “The user as network,” First Monday, volume 20, number 11, at http://firstmonday.org/article/view/6281/5116, accessed 16 April 2018.
doi: http://dx.doi.org/10.5210/fm.v20i11.6281, accessed 16 April 2018.
Johannes Masing, 2012. “Herausforderungen des Datenschutzes,” Neue Juristische Wochenschrift, pp. 2,305–2,311.
Arvind Narayanan and Vitaly Shmatikov, 2008. “Robust de-anonymization of large sparse datasets,” SP ’08: Proceedings of the 2008 IEEE Symposium on Security and Privacy, pp. 111–125.
doi: http://dx.doi.org/10.1109/SP.2008.33, accessed 16 April 2018.
NPD Group, 2017. “U.S. smartwatch ownership poised to catch up with, and potentially surpass, activity trackers, according to the NPD Group,” at http://connected-intelligence.com/about-us/press-releases/us-smartwatch-ownership-poised-catch-and-potentially-surpass-activity, accessed 16 April 2018.
Scott R. Peppet, 2014. “Regulating the Internet of Things: first steps toward managing discrimination, privacy, security and consent,” Texas Law Review, volume 93, pp. 85–176, at https://texaslawreview.org/wp-content/uploads/2015/08/Peppet-93-1.pdf, accessed 16 April 2018.
Bilyana Petkova, 2016. “The safeguards of privacy federalism,” Lewis & Clark Law Review, volume 20, number 2, pp. 595–645, at https://law.lclark.edu/live/files/22074-lcb202art7petkovapdf, accessed 16 April 2018.
Michel J. Sassene, 2009. “Incompatible images: Asthmatics’ non-use of an e-health system for asthma self-management,” In: E. Vance Wilson (editor). Patient-centered e-health. Hershey, Pa.: IGI Global, pp. 186–200.
doi: http://doi.org/10.4018/978-1-60566-016-5.ch014, accessed 13 November 2017.
Paul M. Schwartz, 2015. “The value of privacy federalism,” In: Beate Roessler and Dorota Mokrosinska (editors). Social dimensions of privacy: Interdisciplinary perspectives. New York: Cambridge University Press, pp. 324–346.
doi: http://dx.doi.org/10.1017/CBO9781107280557.018, accessed 16 April 2018.
Daniel J. Solove and Woodrow Hartzog, 2014. “The FTC and the new common law of privacy,” Columbia Law Review, volume 114, pp. 583–676, at https://www.columbialawreview.org/wp-content/uploads/2016/04/Solove-Hartzog.pdf, accessed 16 April 2018.
Omer Tene and Jules Polonetsky, 2013. “Big data for all: Privacy and user control in the age of analytics,” Northwestern Journal of Technology and Intellectual Property, volume 11, number 5, article number 1, at http://scholarlycommons.law.northwestern.edu/njtip/vol11/iss5/1, accessed 16 April 2018.
Zeynep Tufekci, 2014. “Engineering the public: Big data, surveillance and computational politics,” First Monday, volume 19, number 7, at http://firstmonday.org/article/view/4901/4097, accessed 16 April 2018.
doi: http://dx.doi.org/10.5210/fm.v19i7.4901, accessed 16 April 2018.
U.S. Federal Trade Commission (FTC), 2015. “Internet of Things: Privacy & security in a connected world,” at https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf, accessed 24 October 2017.
Alfonso Velosa, W. Roy Schulte, and Benoit J. Lheureux, 2017. “Hype cycle for the Internet of Things, 2017,” Gartner, Inc. (24 July), at https://www.gartner.com/doc/3770369/hype-cycle-internet-things-, accessed 24 October 2017.
Mark Warschauer, 2003. Technology and social inclusion: Rethinking the digital divide. Cambridge, Mass.: MIT Press.
Steven Weber and Richmond Y. Wong, 2017. “The new world of data: Four provocations on the Internet of Things,” First Monday, volume 22, number 2, at http://firstmonday.org/article/view/6936/5859, accessed 16 April 2018.
doi: http://dx.doi.org/10.5210/fm.v22i2.6936, accessed 16 April 2018.
Mark Weiser, 1999. “The computer for the 21st century,” ACM SIGMOBILE Mobile Computing and Communications Review, volume 3, number 3, pp. 3–11.
doi: http://dx.doi.org/10.1145/329124.329126, accessed 16 April 2018.
Jenifer Winter, 2015. “Algorithmic discrimination: Big data analytics and the future of the Internet,” In: Jenifer Winter and Ryota Ono (editors). The future Internet. Cham, Switzerland: Springer International, pp. 125–140.
doi: https://doi.org/10.1007/978-3-319-22994-2_8, accessed 16 April 2018.
Xindong Wu, Xingquan Zhu, Gong-Qing Wu, and Wei Ding, 2014. “Data mining with big data,” IEEE Transactions on Knowledge and Data Engineering, volume 26, number 1, pp. 97–107.
doi: http://dx.doi.org/10.1109/TKDE.2013.109, accessed 16 April 2018.
Arkady Zaslavsky, Charith Perera, and Dimitrios Georgakopoulos, 2013. “Sensing as a service and Big Data,” arXiv, arXiv:1301.0159 (2 January), at https://arxiv.org/abs/1301.0159, accessed 16 April 2018.
Received 16 November 2017; revised 1 February 2018; revised 19 February 2018; accepted 20 February 2018.
Copyright © 2018, Radhika Garg. All Rights Reserved.
Open data privacy and security policy issues and its influence on embracing the Internet of Things
by Radhika Garg.
First Monday, Volume 23, Number 5 - 7 May 2018