What happens to my data? A novel approach to informing users of data processing practices
by Bibi van den Berg and Simone van der Hof



Abstract
Citizens increasingly use the Internet to buy products or engage in interactions with others, both individuals and businesses. In doing so they invariably share (personal) data. While extensive data protection legislation exists in many countries around the world, citizens are not always (sufficiently) aware of their rights and obligations with respect to sharing (personal) data. To remedy this, users ought to be better informed of companies’ data processing practices. In the past, various research groups have attempted to create tools to this end, for example through the use of icons or labels similar to those used on food packaging. However, none of these tools have gained extensive adoption, mostly because capturing privacy legislation in simple, accessible graphics turns out to be a complicated task. Moreover, we believe that the tools developed so far do not align closely enough with the preferences and understanding of ordinary users, precisely because they are too ‘legalistic’.

In this paper we discuss a user study conducted to gain a better understanding of the kinds of information users wish to receive with respect to companies’ data processing practices, and the form this information ought to take. On the basis of this user study we developed a new approach to communicating this information, in which we return to the OECD’s Fair Information Principles, which form the basis of (almost all) data protection legislation. We end the paper with a rudimentary proposal for an end user tool to be used on companies’ Web sites.

Contents

Introduction
1. The user study
2. Existing work on improving the accessibility of privacy statements
3. Pros and cons of existing work
4. Developing an alternative approach to communicating companies’ data processing practices: A first iteration
Conclusions

 


 

Introduction

Data protection law stipulates that online companies have an obligation to communicate their data processing practices to end users. They commonly do this by posting a privacy statement [1] on their Web site. However, research reveals that such statements are rarely read by end users, or that end users find the information contained in them too complicated to understand, or too lengthy to read carefully (Arcand, et al., 2007; Beldad, 2011; Bolchini, et al., 2004; Graf, et al., 2010; Jensen and Potts, 2004; Lichtenstein, et al., 2003; Milne and Culnan, 2004; Pan and Zinkhan, 2006; Sheehan, 2005; Turow, 2001) [2]. Therefore, if companies are to share their data processing practices with end users in an effective and meaningful way, it is necessary to improve the way this information is communicated, both in terms of content (what is communicated to end users) and in terms of format (how it is communicated).

To date there is little knowledge of end users’ requirements in this respect: little research has been done on the kinds of information end users wish to receive about companies’ data processing practices, or on their preferences for how this information is presented. Therefore, the first step in this research was to conduct an empirical study (section 1) among end users to gain insight into their informational preferences in this area. Using an online survey, we gathered data on the type of information end users want to receive in two popular online contexts: online shopping and social network sites. Moreover, we investigated when and how they want to receive this information.

The second step in this research was to use the outcomes of the survey to define a set of social requirements that companies and their technical systems ought to meet in order to satisfy users’ informational and communicational needs with respect to companies’ data collection and processing practices.

Finally, we translated the findings of the survey and the social requirements into a way of visualizing such communication. As noted, companies generally use privacy statements on their Web sites to inform users of their data processing practices. In recent years, several research groups have investigated alternative ways of communicating this information (section 2) — ways that are more accessible and appealing to consumers. The use of icons is often proposed, as is the use of tables comparable to the nutrition tables on food packaging. However, one of the most surprising findings of our survey was that users prefer to be informed through the use of ordinary language, possibly with examples, rather than through imagery such as icons. Hence, we attempted to develop an alternative approach to communicating companies’ data collection and processing procedures (section 4).

 

++++++++++

1. The user study

In order to test end user expectations with regard to companies’ communication of data processing practices, we developed an online survey to collect data among audiences in various EU Member States.

a. Setup and respondents

The survey [3] was distributed through social media (e.g., Facebook, LinkedIn, Twitter, and the Dutch social network site Hyves), through mailing lists (e.g., Nettime), by announcement in university courses, through e–mail messages to students, and via flyers on campus. A total of 568 respondents (N=568) completed the survey in a period of three weeks in May 2011. Fifty–five percent of the respondents were male, 45 percent were female. Respondents came from various countries in the EU, but the majority came from three countries: Austria (27 percent), Spain (27 percent) and the Netherlands (25 percent) [4]. The respondents’ average age was 26.

The survey consisted of three parts:

  1. Uncovering respondents’ privacy and trust disposition:
    The first part of the questionnaire was dedicated to collecting data on participants’ privacy and trust dispositions, i.e., to measuring the respondents’ perceptions of privacy and trust. The questions we used had been tested in earlier questionnaires to measure these variables (see, for example, Van de Garde–Perik, 2009). We adjusted them only marginally.

    In this part of the survey we also collected data on the frequency with which respondents look for, and/or read privacy statements, and on how much time they are prepared to invest in reading them. These questions were included to probe the respondents on their experience with privacy statements, and to learn how important privacy statements are in their contacts with online businesses. They also aimed at verifying whether the privacy statement usage of our participants was in line with earlier research, which indicates that most people do not read privacy statements (cf., Arcand, et al., 2007; Graf, et al., 2010; Jensen and Potts, 2004).

  2. Types of communication, timing of communication:
    The main part of the survey focused on gathering data on the kinds of information respondents would like to receive from companies, when they would wish to receive this information (timing) and the ways in which they would wish to receive this information (means and form).

    The questions in this section were tailored to two different situations: online shopping and online social network sites. We chose to distinguish between these two situations because we hypothesized that users’ expectations with regard to the communication of data processing practices may differ depending on the type of relationship (legal versus non–legal; formal versus informal) they have with an online shop or a social network site.

  3. Demographics:
    In the final part of the questionnaire, demographic information was collected.

b. Results

The first section of the survey revealed that the respondents who participated in this study are (very) concerned about their privacy. For example, 65 percent of the participants said they are concerned that a person can find private information about them on the Internet, while only 12 percent state they are (entirely) unconcerned about this. Moreover, 74 percent are concerned about submitting information on the Internet, because of what others might do with the information, while nine percent are (entirely) unconcerned about this. Finally, 77 percent of the respondents are concerned about submitting information on the Internet, because it could be used in ways they did not foresee, while only seven percent do not worry about this.

When asked whether or not they read companies’ privacy statements on their Web sites, a surprising 88 percent of the respondents claim to check whether companies have such a statement, although the frequency with which they check whether privacy statements are available differs (see Table 1). What is even more remarkable is that almost 82 percent of the respondents state they have read privacy statements. This contrasts with earlier research on this topic (see the Introduction), which revealed that users do not, or only rarely, read privacy statements. Having said that, the frequency with which privacy statements are read diverges (see Table 2). One possible explanation for this finding is that the respondents to our study are (very) privacy–aware — maybe more so than other end users. However, the finding could also be explained by the fact that end users in general may gradually be becoming more concerned about their privacy, and may have started developing strategies to protect their privacy when sharing personal information on the Internet, owing to the more widespread discussion in the media in recent years of privacy issues in, for example, social network sites [5].

 

Table 1: ‘How often do you check whether companies have a privacy statement on their Web sites?’

Answer       Count   Percentage
Always          33         6.19
Regularly      123        23.08
Sometimes      163        30.58
Rarely         148        27.77
Never           66        12.38

 

 

Table 2: ‘How often do you read privacy statements on companies’ Web sites?’

Answer       Count   Percentage
Always          21         3.94
Regularly       83        15.57
Sometimes      160        30.02
Rarely         171        32.08
Never           98        18.39

 

In the main section of the survey, we asked respondents about which types of data collection and/or processing by companies they would like to be informed, in the two contexts discussed above. The results can be found in Table 3. Interestingly, the survey reveals that the kinds of information respondents say they would like to receive align neatly with the kinds of information businesses are required to communicate under (data protection) law. Respondents are most interested in being informed about:

  1. Which of their personal data are collected;
  2. How these data are used (i.e., for what purposes);
  3. Whether or not their data are passed on to third parties;
  4. How the security of their personal data is handled by the company; and,
  5. Whether or not they can object to the use of their personal data [6].

 

Table 3: Types of information (in percentages; in each cell, the first value concerns the online store context, the second the social network site context).

When interacting with an online store/a social network site I want to be informed on:

Type of information                                         (Very) relevant   Neutral   (Completely) irrelevant
Which of my personal data are collected                         93 / 90        5 / 6            2 / 3
Whether my behavior on the Web site is monitored                74 / 78       18 / 17           8 / 5
Whether cookies are stored on my computer                       66 / 68       24 / 21          10 / 10
Whether my IP address is stored                                 71 / 72       19 / 18           9 / 10
How my personal data are used                                   93 / 93        6 / 4            2 / 2
Whether my personal data are sold to other companies            96 / 94        3 / 5            1 / 1
How the security of my personal data is handled                 86 / 87       11 / 10           3 / 2
Who is responsible for my personal data                         61 / 70       26 / 23           8 / 7
How I can correct personal data                                 79 / 84       17 / 12           5 / 4
Whether I can object to the use of my personal data             88 / 90        9 / 8            3 / 2
Whether I can object to the creation of consumer profiles       76 / 79       18 / 15           6 / 6

 

Next, we asked respondents in what form they would like to be informed of the types of information they had indicated in the previous questions. Respondents could choose from several options: ‘legal text’, ‘everyday speech’, ‘with examples’, ‘with visual information’ [7], or ‘no answer’. The results are presented in Table 4. Please note that, because respondents could choose more than one option, the numbers refer to the total count per option rather than a percentage.

 

Table 4: Preferred form of communication (counts).

Activity                                   Legal text   Everyday speech   With examples   With visual information   No answer
Visit an online store                           99            285              189                 125                 51
Open an account at an online store             143            315              219                 130                 12
Buy at an online store                         163            297              204                 131                 15
Open an account at a social network site       118            322              216                 132                 23
Edit an online profile                          83            311              208                 134                 38

 

While we did not ask participants to rank their preferences, Table 4 does display some form of ordering: respondents state that everyday speech is the most preferred of all the forms of communication we presented them with. The use of examples came in as the second–best option, and visual information — surprisingly, in light of all the research into privacy icons and labeling — came in third. This latter finding is highly relevant, since it reveals a weakness in much of the current work on improving the accessibility of privacy statements conducted by several different research groups around the world. We will turn to a discussion of that work, and its strengths and weaknesses, in the next section.

When opening an account at an online store and when buying products or services online, however, legal text is perceived to be more important than visual information, which may be related to the fact that a legal relationship with the online company is, respectively, anticipated or actually entered into.

These findings formed the basis of a novel approach we are currently developing to communicate companies’ data collection and processing practices to end users. Before presenting the current version, we first discuss existing work by others in the same area.

 

++++++++++

2. Existing work on improving the accessibility of privacy statements

“Designing a user interface for specifying privacy preferences is challenging for several reasons: privacy policies are complex, user privacy preferences are often complex and nuanced, users tend to have little experience articulating their privacy preferences, users are generally unfamiliar with much of the terminology used by privacy experts, and users often do not understand the privacy–related consequences of their behavior. Designing a user interface for informing users about privacy policies is challenging for many of the same reasons. In addition, this task is complicated by the fact that users have differing expectations about the type and extent of privacy policy information they would like to see.” [8]

Despite these difficulties — or perhaps because of them — several research groups around the world have, over the past decades, set out to develop (prototypes of) tools to improve the accessibility of privacy statements for end users. We will discuss a number of them in turn.

a. Machine–readable privacy statements

In the early years of the twenty–first century the so–called ‘Platform for Privacy Preferences’ (P3P) was developed to remedy end users’ lack of understanding of, and willingness to read, privacy policies. This platform provides “a standard machine–readable format for website privacy policies” [9], which should make it easier for end users to make informed choices about sharing personal information with a company through its Web site. Lorrie Cranor, one of the key players in this field, explains the workings of P3P as follows:

“P3P policies were designed both to provide information about website privacy policies that a human might use to make decisions (such as whether or not to shop at a particular website or whether to exercise ‘opt–out’ options), and to facilitate automated decision–making (such as whether to display a privacy warning or whether to block cookies at a particular website).” [10]

P3P focuses predominantly on notice. The machine–readable format can be translated into consumer–friendly privacy statements through what are called ‘user agents’ (Hochheiser, 2002; Reagle and Cranor, 1999). One famous example thereof is the ‘Privacy Bird’ [11] (Cranor, et al., 2006), a plug–in for Internet Explorer which can be used to communicate a Web site’s (machine–readable) privacy policy to end users. The plug–in consists of a small bird that is added to the browser’s menu bar, which changes color (red, yellow, green) and has a speech bubble that changes, depending on the (mis)match between a user’s personal privacy settings and the Web site’s privacy policy. At present Privacy Bird can only be installed as an add–on to Internet Explorer for Windows PCs, which for the time being rather limits its usefulness. Moreover, it requires users to be actively involved by setting privacy preferences and downloading the software. This may also inhibit take–up.
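To make the matching mechanism concrete, the sketch below shows, in deliberately simplified form and with a hypothetical vocabulary (not the actual P3P schema), how a Privacy Bird–style user agent could compare a site’s machine–readable policy against a user’s stored preferences and derive a traffic–light verdict:

    // A minimal sketch of policy-preference matching, in the spirit of P3P user
    // agents such as Privacy Bird. The practice vocabulary and all names below
    // are hypothetical simplifications, not the real P3P specification.
    type Practice =
      | "collects-contact-data"
      | "shares-with-third-parties"
      | "uses-data-for-marketing";

    interface SitePolicy {
      site: string;
      practices: Practice[]; // practices the site declares in its policy
    }

    interface UserPreferences {
      unacceptable: Practice[]; // practices the user refuses to accept
    }

    type Verdict = "green" | "yellow" | "red";

    // Red if any declared practice conflicts with the user's preferences,
    // yellow if the site publishes no machine-readable policy, green otherwise.
    function evaluate(policy: SitePolicy | null, prefs: UserPreferences): Verdict {
      if (policy === null) return "yellow";
      const conflict = policy.practices.some((p) => prefs.unacceptable.includes(p));
      return conflict ? "red" : "green";
    }

    // Example: a user who refuses third-party sharing visits a site that shares.
    const verdict = evaluate(
      { site: "shop.example", practices: ["collects-contact-data", "shares-with-third-parties"] },
      { unacceptable: ["shares-with-third-parties"] },
    );
    console.log(verdict); // "red": the bird would turn red and warn the user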

b. Labeling

Another means of communicating privacy policies draws a parallel with the food industry: using labeling [12] comparable to that on the packaging of food products. Some research has been done in this area to see whether privacy policies could be translated into this type of visualization, and if so, which requirements such labeling should meet. This research revealed that statements should be short and present no more than seven issues, should use everyday speech, and should use common graphical interfaces that aid easy memorization of notices for later use (Abrams and Crompton, 2005). In connection with P3P, a privacy nutrition label was developed that builds on insights from, amongst others, nutrition labeling. Information on personal data processing was shown in a grid with colors, providing simplified information by means of easy–to–memorize symbols on three issues, i.e., types of information processed, data processing purposes, and data sharing. Users rated the privacy nutrition label higher than natural–language notices and even found it enjoyable to use (Kelley, et al., 2009). Further research on the nutrition–label approach has shown that information finding is positively impacted in terms of accuracy, speed and reader enjoyment (Kelley, et al., 2010).
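The grid behind such a label can be captured in a small data model. The sketch below is our own illustration of the three dimensions named above (types of information processed, purposes, and sharing); the type names and cell values are hypothetical, not the CyLab label specification:

    // An illustrative data model for a privacy 'nutrition label': rows are types
    // of information, columns are processing purposes, and each cell states
    // whether (and under which consent regime) that combination occurs.
    type Cell = "yes" | "opt-in" | "opt-out" | "no";

    interface PrivacyLabel {
      infoTypes: string[];  // rows, e.g., "contact data", "cookies"
      purposes: string[];   // columns, e.g., "marketing", "profiling"
      grid: Cell[][];       // grid[row][col]: how infoTypes[row] is used for purposes[col]
      sharedWith: string[]; // categories of third parties receiving data
    }

    // Render the label as fixed-width text lines, mimicking the at-a-glance
    // tabular layout of a food label.
    function renderLabel(label: PrivacyLabel): string {
      const header = "".padEnd(16) + label.purposes.map((p) => p.padEnd(10)).join("");
      const rows = label.infoTypes.map(
        (t, i) => t.padEnd(16) + label.grid[i].map((c) => c.padEnd(10)).join(""),
      );
      return [header, ...rows].join("\n");
    }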

c. Layered privacy notices

Yet another approach to making privacy notices easier to understand and more accessible to end users is the idea of making them layered. Layered privacy notices present information at an increasing level of detail:

“[t]he initial layer, to be used when collecting information where space is tight, alerts the individual to the collection, the major purpose of the collection, and where they can go for additional information. The second layer, condensed notices, assist the individual in understanding a company’s practices and comparing them to other companies’ practices, while the third layer, [a] longer notice, acts as a complete guide for compliance purposes.” [13]

The short notice (the initial layer) can be used on mobile devices. The condensed notice can be provided on Web sites, and complete notices may be provided at users’ request or via hyperlinks (Abrams and Crompton, 2005). This approach can complement the visualization of data processing practices through icons or labels by allowing end users to receive more detailed information or explanations if they so wish. The Article 29 Data Protection Working Party adopted the layered–notice approach in 2004 (A29DPWP, 2004).
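One simple way to picture this structure, under our own (hypothetical) naming, is as a record with one field per layer, plus a rule that selects the layer appropriate to the display context:

    // A sketch of the three-layer notice structure described by Abrams and
    // Crompton (2005). Field and type names are our own.
    interface LayeredNotice {
      short: string;     // initial layer: what is collected, why, where to learn more
      condensed: string; // second layer: overview of the company's practices
      full: string;      // third layer: the complete notice, for compliance purposes
    }

    type DisplayContext = "mobile" | "web" | "on-request";

    // Pick the layer that fits the available space and the user's interest.
    function noticeFor(notice: LayeredNotice, ctx: DisplayContext): string {
      if (ctx === "mobile") return notice.short;
      if (ctx === "web") return notice.condensed;
      return notice.full; // provided at the user's request
    }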

d. Privacy icons

In recent years several research groups have attempted to develop icon sets with which to communicate privacy–related information (for a complete overview of these initiatives, please see Hansen, 2009). The central idea is that icons and symbols are easy to understand, and provide information at a single glance, rather than through (long lines of) text. One of the earliest attempts to capture privacy–related information in icons comes from Mary Rundle, who created a set of seven icons, which companies could post on their Web sites, for example to communicate to users that they would not use their data for marketing purposes or that they would not trade or sell users’ data (Rundle, 2006). In the PrimeLife project, an EU FP7 project on privacy and identity in the online world, a much larger set of icons was developed and tested among a user base. This set included icons that communicated, for example, whether or not a Web site tracked users’ behaviors, whether it facilitated anonymization, and whether or not data were passed on to third parties. Moreover, even such complex issues as whether or not data were aggregated with personalized third–party information, and whether or not the company’s processing practices fall under EU law or offer equivalent protection, were captured in icons (Hansen, 2009).

 

++++++++++

3. Pros and cons of existing work

When reviewing the results from all four approaches — i.e., P3P, labeling, layered privacy notices and privacy icons — it turns out that improving the accessibility and readability of privacy notices is not easy. All four approaches have clear merits, yet all have also failed to gain enough uptake to have a serious impact. For one, turning privacy statements into machine–readable (P3P) statements is less straightforward than it may appear: all sorts of terminological ambiguities and usability issues have arisen in the past years (for an analysis, please see Hochheiser, 2002). Moreover, the use of icons also has its pitfalls. Most importantly, capturing complex, detailed material such as data protection legislation in one single image, or a (relatively) limited set of images, turns out to be incredibly difficult. For example, the icons that were developed in the PrimeLife project consist of circles that often contain several, rather small elements; the icon in Figure 1, expressing ‘tracking’, is a case in point: in order to express the issue at stake accurately, a rather complicated drawing is required (note that one cannot explain the issues surrounding tracking personal data in one or two words either). Almost all of the icons in this set, and for that matter in the other sets that have been developed as well (Hansen, 2009), unfortunately suffer from this same problem: they attempt to communicate such specific, detailed information that the resulting image becomes too complex to understand at a single glance.

 

Figure 1: One of the icons from the PrimeLife icon set. © Hansen, 2009.

 

The Privacy Bird, a single icon that sits in the browser’s toolbar, does not suffer from this problem. After all, the image of the bird itself does not attempt to communicate any complicated legal matters; it merely indicates whether or not a Web site’s privacy policy matches the settings as defined by the user. However, the Privacy Bird has two problems of its own. First of all, the plug–in may mislead users by giving them an unwarranted sense of protection. As noted, the plug–in merely displays whether or not a Web site’s privacy policy matches the end user’s preferences or, if (s)he has not changed the settings, the default preferences. The color green may lead users to think they are (somehow) in a ‘safe’ environment, but if they have set the preferences to a very ‘unsafe’ standard, this sense of safety has no basis. Moreover, the image of the bird does not communicate anything relating to the protection of personal data in and of itself. Cranor, et al. write that they chose the image of a bird because of connotations with “a little bird told me” and the canaries that were traditionally used as warning systems in coal mines [14]. However, whether end users will recognize these meanings is doubtful, to say the least. Therefore, to our minds, the Privacy Bird insufficiently visualizes what is at stake in data collection and processing procedures on companies’ Web sites, and does not adequately inform users, in a direct and understandable sense, of their options and, more importantly, their rights.

Labeling, which builds on ideas from nutrition labels on food packaging, does not have this problem. The information provided to users is sufficient and refers directly to the data that are collected or processed about them. However, as the image below reveals (see Figure 2), the labeling approach suffers from a problem similar to that of the icons discussed above: too much information is provided all at once, which means that end users must invest time and energy to come to an understanding of the table that is presented to them — time and energy they may not be willing to invest.

 

Figure 2: An example of the ‘labeling’ approach. © CyLab Carnegie Mellon 2012.

 

After studying the related work we concluded that using (sets of) single images, or providing privacy–related information at a single glance, might not be feasible. More importantly, our survey revealed that end users do not even prefer such visualizations. As a matter of fact, of all the different options for communicating information about a company’s data collection and processing procedures, they favor communication through visualizations (icons, symbols etc.) the least. Instead, they prefer to be informed in everyday language, and/or through the use of examples. This is why we set out to develop a tool that avoids the pitfall of over– or under–informing end users through visualizations, and that respects their wish to be informed (primarily) in everyday language.

 

++++++++++

4. Developing an alternative approach to communicating companies’ data processing practices: A first iteration

One of the interesting findings of the survey we conducted (section 1) was that the informational wishes of end users align neatly with the requirements laid down in data protection law [15]: end users tend to want to be informed about the same information processing issues (what information is processed, whether information is passed on to third parties, processing purposes etc.) that companies are legally required to communicate. On some level, of course, this is not surprising: if all goes well, legal requirements mirror the demands of the people they aim to protect, or at least align with these demands. However, other attempts at improving the accessibility of privacy statements, or of companies’ data collection and processing practices, have never started from this finding. As we have seen, for example in the icons developed in the PrimeLife project, these generally start from the assumption that the intricacies of data protection legislation have to be communicated — in great detail — to end users, to inform them of the many hazards and pitfalls they may (legally) encounter when sharing data in online environments. In contrast, our survey reveals that end users’ expectations remain at a much more general, and much less legally detailed, level.

This led us to the idea of going back to the origins of (almost all) of the data protection legislation that is available today: the OECD Guidelines, composed in 1980, on the ‘Protection of privacy and transborder flows of personal data’, also known as the Fair Information Principles (OECD, 1980). These Principles form the basis of the European Data Protection Directive, along with most of the data protection legislation of the Member States. The OECD Guidelines contain eight basic principles, including the Collection Limitation Principle (also known as the data minimization principle: one may only collect those data one needs to complete a certain action, and no more than that), the Data Quality Principle (data should be accurate and up to date), and the Purpose Specification Principle (data may only be collected and processed for specified purposes).

As noted, the survey revealed that users look for precisely these types of information when engaging with companies that set out to collect and process their data. This is why we rephrased the key principles of the OECD Guidelines in everyday language and used those as our starting point.

In the previous section, we concluded that many of the existing initiatives to improve the communication of privacy policies either provided too much information at a single glance for end users to process (icons, labeling) or too little (the Privacy Bird). To avoid this, we decided to opt for a layered approach, which does contain all the information an end user may wish to receive, but not at first glance. Moreover, we decided to use words rather than a single image such as the Privacy Bird to avoid oversimplification. We placed eight core concepts, related to the Fair Information Principles, on the spokes of a wheel, as presented in the image below (see Figure 3). This wheel can be placed on a company’s Web site, as is exemplified in Figure 4. Clicking the wheel makes the spokes rotate, so that each of the eight topics can be studied by end users should they desire to do so.

 

Figure 3: Our alternative approach to communicating information about a company’s data processing practices, based on the Fair Information Principles.

 

 

Figure 4: The wheel, as it would appear on a Web site (here: on the site of one of the partners in the ENDORSE research project).

 

The spokes have the following labels:

  1. Limited collection: this is the OECD’s ‘Collection Limitation Principle’.
  2. Data quality: this is the ‘Data Quality Principle’.
  3. Clear purposes: this is the ‘Purpose Specification Principle’.
  4. Limited use: this refers to the fact that data shall not be used for purposes other than the ones specified, but also to the fact that data shall be stored for a limited period of time.
  5. Safe & secure: this refers to the OECD’s Security Safeguards Principle, which stipulates that data should be stored in a safe and secure way.
  6. Consent: this is actually not a part of the Fair Information Principles, yet has become a key feature of existing data protection legislation, which is why we chose to create a separate label for it. If this demonstrator were to be developed further into an online tool, to be posted on companies’ Web sites, one could imagine that clicking this spoke would not only give end users access to the stored consent form regarding their data, but possibly even a direct means to change or revoke their consent.
  7. Third parties: this also is not an explicit part of the Fair Information Principles, yet plays an important role in existing data protection legislation. Moreover, the survey revealed that users attach great value to being informed about whether or not their information is passed on to third parties. This is why we created a separate spoke for this theme.
  8. Hold us accountable: this refers back to the OECD’s Openness and Accountability Principles, which state that users ought to have the right to hold a data controller accountable, and to have insight into what data are processed and by whom.

We have chosen these labels because they are intuitive and easy to understand — even if end users do not click on the spokes to find out more information, they are still informed of a company’s data collection and processing practices at a minimal level.

Clicking on the wheel enlarges the image. Next, clicking on the individual spokes enables users to access second and even third layers of information, where they receive increasingly in–depth information about each specific aspect of the processing. The information becomes more ‘legalistic’ with every layer the end user accesses. What’s more, in some cases end users may even exercise their rights directly through the use of this tool. For example, as is shown in Figure 5, when an end user clicks the spoke that reads ‘Limited collection’, an extra layer appears, which informs the end user that the company “only collect[s] data that are necessary for the service [they] offer, and no more!” This is a translation of the OECD’s Collection Limitation Principle into everyday language. Moreover, in an advanced version of this tool, users could exercise their right to access the information that has been collected about them by clicking on the hyperlink provided.
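To illustrate how such a layered wheel could behave in the browser, consider the sketch below. It is our own illustration, not the ENDORSE implementation; the element structure, class name and layer texts are hypothetical, and only the first spoke is filled in:

    // A sketch of the wheel as a browser widget: eight spokes, one per label,
    // each revealing a further, more detailed (and more 'legalistic') layer of
    // information per click. All names below are hypothetical.
    interface Spoke {
      label: string;    // text on the spoke, e.g., "Limited collection"
      layers: string[]; // layer 0: everyday language; deeper layers: more legal detail
    }

    function buildWheel(container: HTMLElement, spokes: Spoke[]): void {
      for (const spoke of spokes) {
        const el = document.createElement("div");
        el.className = "spoke";
        el.textContent = spoke.label;

        const detail = document.createElement("p");
        el.appendChild(detail);

        let depth = 0; // how many layers the user has opened so far
        el.addEventListener("click", () => {
          if (depth < spoke.layers.length) {
            detail.textContent = spoke.layers[depth]; // reveal the next layer
            depth += 1;
          }
        });
        container.appendChild(el);
      }
    }

    buildWheel(document.body, [
      {
        label: "Limited collection",
        layers: ["We only collect data that are necessary for the service we offer, and no more!"],
      },
      // ... the remaining seven spokes
    ]);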

 

Figure 5: Layered information (and exercising access rights!) through the wheel.

 

Each of the spokes provides extra information and/or the ability for end users to exercise their rights in this way. For example, when clicking on the spoke labelled ‘Consent’ an end user can access a form such as the one presented in Figure 6, in which she gets an overview of the various purposes for which she has consented to the use of her data, and also has the ability to alter this consent: to withdraw consent for purposes previously approved, or to give consent for purposes for which it was not given before. Submitting the form then obligates the company to change these settings in its system as of the date stipulated by the end user.
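A rough sketch of the data such a consent form could submit, again under hypothetical field names and a hypothetical endpoint, might look as follows:

    // A sketch of the consent-update flow behind the 'Consent' spoke: the user
    // toggles consent per purpose and stipulates the date from which the change
    // must apply. The field names and the endpoint are hypothetical.
    interface PurposeConsent {
      purpose: string;  // e.g., "direct marketing by e-mail"
      granted: boolean; // the state chosen by the user in the form
    }

    interface ConsentUpdate {
      userId: string;
      purposes: PurposeConsent[];
      effectiveFrom: string; // ISO date from which the company must apply the change
    }

    // Submitting the form sends the new consent state to the company's system,
    // which would then be obliged to apply it as of the stipulated date.
    async function submitConsent(update: ConsentUpdate): Promise<void> {
      await fetch("/api/consent", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(update),
      });
    }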

 

Figure 6: Providing or withdrawing consent through the wheel.

 

 

++++++++++

Conclusions

In this paper we have presented a novel approach to presenting information on companies’ data processing practices to end users, involving an online clickable wheel with easy–to–understand labels that can be displayed on Web sites. The approach is informed by the results of an online survey into end users’ preferences as to what information on data processing practices they want, by what means and in what form. Interestingly, the survey revealed that end users’ preferences align closely with the information that companies are legally obliged to provide them with. Moreover, users prefer text with examples over visualizations such as icons and labels; when entering into a legal relationship, legal texts gain importance.

The instrument we have developed is based on the OECD Fair Information Principles, which have been rephrased in everyday language for easy understanding. The proposed approach mitigates or removes drawbacks of other approaches that aim to improve the transparency of companies’ data processing practices. The clickable wheel visualizes information relevant to end users, but presents neither too much information for end users to process at first glance (like icons or labels) nor too little (as in the case of the Privacy Bird). Moreover, the use of words rather than a single image avoids oversimplification. The layered approach — clicking on the spokes of the wheel — allows end users to receive more detailed information, or even in–depth information by clicking again, if they so wish.

 

About the authors

Dr. Bibi van den Berg is assistant professor at eLaw@Leiden, the Centre for Law in the Information Society at Leiden University’s Law School.
Web: www.bibivandenberg.nl
E–mail: b [dot] van [dot] den [dot] berg [at] law [dot] leidenuniv [dot] nl

Prof. Dr. Simone van der Hof, LL.M., is professor of Law and the Information Society at eLaw@Leiden, the Centre for Law in the Information Society at Leiden University’s Law School.
E–mail: s [dot] van [dot] der [dot] hof [at] law [dot] leidenuniv [dot] nl

 

Acknowledgments

This work was funded in part by the European Union’s Seventh Framework Programme, in the ENDORSE project, nr. 257063. For more information on ENDORSE, please visit http://ict-endorse.eu/.

While this work was conducted, both authors worked at the Tilburg Institute for Law, Technology and Society (TILT) at Tilburg University in the Netherlands.

 

Notes

1. The terms privacy statements and privacy policies are often used interchangeably, but have different meanings. Privacy policies are “internally focused tools describing how an organization intends to achieve the [data protection] principles set out [in personal data protection legislation] and a clear means to provide for accountability” (Robinson, et al., 2009). By contrast, privacy statements are “externally facing tools supporting objectives of transparency, [which] would alert individuals at an appropriate time and context as to how their personal data is being used” (Robinson, et al., 2009).

2. Note that, from a business perspective, privacy statements may be used as an instrument of (defensive) risk management intending to thwart legal liability, and companies, hence, have an interest in legalistic texts. See Pollach (2007) and O’Neill (2006).

3. The questionnaire was developed in LimeSurvey (http://www.limesurvey.org/) and distributed in four different languages: Dutch, English, German, and Spanish.

4. The reason why these three countries were represented so strongly is because the survey was most actively distributed there through the network of researchers working on the ENDORSE project.

5. Cf. Young and Quan–Haase, 2009, pp. 270–271. See also Special Eurobarometer 359 (“Attitudes on data protection and electronic identity in the European Union”) (published June 2011, at http://ec.europa.eu/public_opinion/archives/eb_special_359_340_en.htm), according to which 58 percent of respondents read privacy statements: a third not only read but also understand them, while a quarter read but do not fully understand them; eight percent ignore them, and five percent do not know where to find them. Seventy percent have adapted their online behavior after reading them by becoming more cautious or by not using the service at least once; 41 percent of those who do not read privacy statements think it is sufficient that Web sites have one.

6. The Eurobarometer, supra note 5, provides data on, among other things, end users’ awareness, perceived control, and expectations of organizations holding their personal data, but does not directly address the question of what information end users want to receive from companies, and how. However, 49 percent of the respondents using social networking or other sharing sites feel sufficiently informed, while 46 percent say they are not.

7. This was clarified as ‘through the use of, for example, symbols or icons’.

8. Cranor, et al., 2006, p. 141.

9. Cranor, et al., 2006, p. 135.

10. Cranor, et al., 2006, p. 139.

11. See www.privacybird.org.

12. See, for example, http://cups.cs.cmu.edu/privacyLabel/.

13. Abrams and Crompton, 2005, p. 2.

14. Cranor, et al., 2006, p. 156.

15. Note that the approach presented in this section is still in the demo version. A working prototype is currently under development.

 

References

Article 29 Data Protection Working Party (A29DPWP), 2004. “Opinion 10/2004 on more harmonized information provisions,” at http://ec.europa.eu/justice/data-protection/article-29/index_en.htm, accessed 27 June 2012.

M. Abrams and M. Crompton, 2005. “Multi–layered privacy notices: A better way,” Privacy Law Bulletin, volume 2, number 1, pp. 1–4.

M. Arcand, J. Nantel, M. Arles–Dufour, and A. Vincent, 2007. “The impact of reading a Web site’s privacy statement on perceived control over privacy and perceived trust,” Online Information Review, volume 31, number 5, pp. 661–681. http://dx.doi.org/10.1108/14684520710832342

A. Beldad, 2011. Trust and information privacy concerns in electronic government. Enschede: University of Twente.

D. Bolchini, Q. He, A. Anton, and W. Stufflebeam, 2004. “I need it now: Improving website usability by contextualizing privacy policies,” In: N. Koch, P. Fraternali and M. Wirsing (editors). ICWE 2004. Lecture Notes in Computer Science, volume 3140. Heidelberg: Springer, pp. 31–44.

L. Cranor, P. Guduru, and M. Arjula, 2006. “User interfaces for privacy agents,” ACM Transactions on Computer–Human Interaction, volume 13, number 2, pp. 135–178. http://dx.doi.org/10.1145/1165734.1165735

C. Graf, P. Wolkerstorfer, K. Kristjansdottir, and M. Tscheligi, 2010. “What is your privacy preference? An insight into users’ understanding of privacy terms,” paper presented at NordiCHI2010 (Reykjavik, Iceland, 16–20 October).

M. Hansen, 2009. “Putting privacy pictograms into practice: A European perspective,” GI Jahrestagung, volume 154, pp. 1,703–1,716.

H. Hochheiser, 2002. “The platform for privacy preference as a social protocol: An examination within the U.S. policy context,” ACM Transactions on Internet Technology, volume 2, number 4, pp. 276–306. http://dx.doi.org/10.1145/604596.604598

C. Jensen and C. Potts, 2004. “Privacy policies as decision–making tools: An evaluation of online privacy notices,” CHI ’04: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 471–478.

P. Kelley, L. Cesca, J. Bresee, and L. Cranor, 2010. “Standardizing privacy notices: An online study of the nutrition label approach,” Carnegie Mellon University, CyLab, Technical Reports, CMU–CyLab–09–014, at http://www.cylab.cmu.edu/research/techreports/2009/tr-cylab09014.html, accessed 27 June 2012.

P. Kelley, J. Bresee, L. Cranor, and R. Reeder, 2009. “A ‘nutrition label’ for privacy,” paper presented at Symposium On Usable Privacy and Security (SOUPS) 2009, at http://cups.cs.cmu.edu/soups/2009/proceedings/a4-kelley.pdf, accessed 27 June 2012.

S. Lichtenstein, P. Swatman, and K. Babu, 2003. “Adding value to online privacy for consumers: Remedying deficiencies in online privacy policies with an holistic approach,” Proceedings of the 37th Annual Hawaii International Conference on System Sciences, pp. 1–10, and at http://dro.deakin.edu.au/view/DU:30005153, accessed 27 June 2012.

G. Milne and M. Culnan, 2004. “Strategies for reducing online privacy risks: Why consumers read (or don’t read) online privacy notices,” Journal of Interactive Marketing, volume 18, number 3, pp. 15–29. http://dx.doi.org/10.1002/dir.20009

OECD, 1980. “Guidelines on the protection of privacy and transborder flows of personal data,” at http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html, accessed 27 June 2012.

O. O’Neill, 2006. “Transparency and the ethics of communication,” In: C. Hood and D. Heald (editors). Transparency: The key to better governance? New York: Oxford University Press, pp. 75–90.

Y. Pan and M. Zinkhan, 2006. “Exploring the impact of online privacy disclosures on consumer trust,” Journal of Retailing, volume 82, number 4, pp. 331–338. http://dx.doi.org/10.1016/j.jretai.2006.08.006

I. Pollach, 2007. “What’s wrong with online privacy policies?” Communications of the ACM, volume 50, number 9, pp. 103–108. http://dx.doi.org/10.1145/1284621.1284627

J. Reagle and L. Cranor, 1999. “The platform for privacy preferences,” Communications of the ACM, volume 42, number 2, pp. 48–55. http://dx.doi.org/10.1145/293411.293455

N. Robinson, H. Graux, M. Botterman, and L. Valeri, 2009. “Review of the European Data Protection Directive,” RAND Technical Reports, number TR–710, at http://www.rand.org/pubs/technical_reports/TR710.html, accessed 27 June 2012.

M. Rundle, 2006. “International data protection and digital identity management tools,” paper presented at IGF 2006: Privacy Workshop, Athens, at http://ssrn.com/abstract=911607, accessed 27 June 2012.

K. Sheehan, 2005. “In poor health: An assessment of privacy policies at direct–to–consumer Web sites,” Journal of Public Policy & Marketing, volume 24, number 2, pp. 273–283. http://dx.doi.org/10.1509/jppm.2005.24.2.273

J. Turow, 2001. “Privacy policies on children’s Web sites: Do they play by the rules?” at http://www.asc.upenn.edu/usr/jturow/release.html, accessed 27 June 2012.

E. van de Garde–Perik, 2009. “Ambient intelligence & personalization: People’s perspectives on information privacy,” Ph.D. dissertation, Department of Industrial Design, Eindhoven University of Technology, Eindhoven.

A. Young and A. Quan–Haase, 2009. “Information revelation and Internet privacy concerns on social network sites: A case study of Facebook,” C&T ’09: Proceedings of the Fourth International Conference on Communities and Technologies, pp. 265–274.

 


Editorial history

Received 19 March 2012; accepted 27 June 2012.


“What happens to my data? A novel approach to informing users of data processing practices” by Bibi van den Berg and Simone van der Hof is licensed under a Creative Commons Attribution–NonCommercial–ShareAlike 3.0 Unported License.

What happens to my data? A novel approach to informing users of data processing practices
by Bibi van den Berg and Simone van der Hof
First Monday, Volume 17, Number 7 - 2 July 2012
https://firstmonday.org/ojs/index.php/fm/article/view/4010/3274
doi:10.5210/fm.v17i7.4010




