Simple online privacy for Australia
First Monday

Simple online privacy for Australia by Margaret Jackson, Jonathan O'Donnell, and Joann Cattlin

Simple Privacy provides a system for Australian organisations to create privacy policies for the personal information they collect online. The privacy policies it creates are legally compliant and easy to understand. We developed this system because small Australian organisations seemed to find privacy policies too complicated to manage with the resources they have available.

This paper describes the framework behind Simple Privacy and discusses the choices that we made during development. These choices balance the requirements of the privacy legislation and the needs of both organisations and customers.


Previous research
Data protection principles
Past work to simplify privacy
The Creative Commons model
Simple Privacy development




[M]ost people simply require a series of simple rules, ultimately backed up by the law. The simpler the rules and the fewer in number, the more likely it is that fair practices will result (Kirby, 1986).

Web users consistently report that they are very concerned about their privacy (Australia Office of the Privacy Commissioner, 2004; Wallis Consulting Group, 2007; Lenhart and Madden, 2007; Madden and Smith, 2010). Their main concern is that others may use their information without their permission or knowledge. Many do not understand what controls they have over their privacy (Debatin, et al., 2009; Raynes–Goldie, 2010).

Privacy policies should provide users with reassurance by outlining what an organisation will do with their information. However, most users report that privacy policies are too long and too complex for them to understand (McDonald and Cranor, 2009; Singh, et al., 2011). Moreover, many businesses, particularly small businesses, often do not understand the requirements of privacy legislation (Shelly and Jackson, 2009).

Based on previous research, the authors constructed a privacy policy generator, Simple Privacy. It provides:

  • Web site owners with legally compliant privacy policies.
  • Web site users with simple policies that they can easily understand, along with more detailed policies if they require further information.

In doing so, we had to make decisions that balanced the usability of the system with the complexity of privacy requirements. We needed to decide what the scope of the system would be, what would be included and what would be excluded.



Previous research

In 2008, the authors reviewed 40 Web sites of small businesses based in Australia. We assessed whether the content on those Web sites conformed with legal requirements, including privacy requirements. The study found, amongst other things, that many of the sites did not comply fully with data collection and privacy obligations. For example, we found instances where organisations stated that they were bound by the Australian Privacy Principles when in fact they were not bound by them at all. As well, some privacy policies had been copied from overseas sites and referred to legislation from other jurisdictions. It seemed that the organisations wanted to do the right thing, but did not understand their obligations (Jackson, et al., 2006; Shelly and Jackson, 2009).

We then met with Web developers to talk about privacy issues. While they were aware of their customers’ need for a privacy statement on a Web site, they expressed confusion over the complexities of privacy law in Australia and how it applied to different sorts of organisations and different sorts of activities, such as offshore transactions. Again, we saw that there was a desire to do the right thing, but a lack of understanding of Australian law.

Much of the complexity in privacy and data protection law, in Australia at least, arises from the exemptions and exceptions built into the legislation and the fact that the public, private and health sectors have different data protection obligations. For example, there are different rules depending on whether the organisation is a health provider, financial institution, real estate agency or telecommunication company, and exemptions from the Act depending on the size of the business. As a result, privacy policies can be difficult to understand and often do not state clearly which laws apply nor what, if any, remedies are available to a data subject if their data is misused (European Commission, 2010).

However, most small businesses only need fairly short privacy policies to describe how they manage the personal information they collect. They, and most consumers, only need a series of simple rules that are in accordance with the law (Jackson, et al., 2006; Shelly and Jackson, 2009). The simpler the rules and the fewer in number, the more likely it is that fair practices will result (Kirby, 1986). This is true for many areas of the law and copyright provides a salient example.

In 2002, copyright was also seen as too complicated and too difficult for most small organisations, particularly in the online environment. Since then, the Creative Commons licence model has simplified copyright enormously (Lessig, 2003; Forsythe and Kemp, 2008). Creative Commons has made it possible for organisations and individuals to correctly apply copyright law to their own works internationally, within the national laws and international treaties governing the area (Hietanen, 2008). While not solving all the issues related to copyright, it has provided greater clarity for both content creators and content users (Elkin-Koren, 2006).

The development and adoption of Creative Commons licences provides a model for creating a simplified legal framework that encourages greater understanding and better implementation of complex legal issues. As Patricia Abril (2010) states, “the Creative Commons model could do for intimacy and human dignity what it has heretofore accomplished with creativity.” In Simple Privacy, we have applied this model to privacy policies. This system allows an organisation to build a sound, legally binding privacy policy. It simplifies the legal responsibilities, eases the burden on organisations and provides greater protection for individuals.



Data protection principles

Privacy policies are based on data protection principles. Data protection principles were developed during the 1970s due to concerns about the use of computer technology, primarily mainframe computers, to store information about citizens and customers. Businesses and governments felt that these concerns might lead to barriers to the free flow of information across borders. Data protection principles seek to establish rules about how an organisation and business can collect, use, store and disclose information that can identify an individual. They also grant rights to individuals to access and to request corrections to their personal information and to be advised when information about them is collected (where practicable). In some jurisdictions, they may grant the right for individuals to deal anonymously with organisations (if lawful and practicable). On the whole, they provide individuals only limited control over whether or not information about them can be collected, used or disclosed.

There are three key sets of data protection principles that have global influence — the 1980 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (revised in 2013), the 1981 Council of Europe Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data and the 1995 European Union Directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data, currently under review. There is a fourth, the APEC Privacy Framework, which has been developed but is not a robust framework at this stage (Greenleaf, 2012). Australia’s privacy principles, contained in the Privacy Act 1988 (Cth), are based on the OECD Principles.

Whatever the source of the principles, they are designed to regulate how personal information is handled by an organisation. They cover matters such as notice about what data is being collected, the purpose of the collection, disclosure possibilities, the rights of access to this data by data subjects and correction of it, retention policies, rights about secondary uses, security measures, transborder transfers, how sensitive data is to be handled, and consent options. They also involve differing rules depending on the type of data being collected.

For organisations, particularly those operating online, the key attraction is the easy collection of customer data that enables them to provide the requested service and, in most cases, to use that data to increase sales, whether by direct marketing or by indirect methods. For individuals, the important principles are: consent to all uses of the data, including secondary uses such as direct marketing; the ability to provide only parts of their personal information, particularly information that falls into the categories of sensitive information; and control over how the information provided is used, thus preserving their personal and digital space, through, for example, monitoring of compliance with privacy policies and statements of use, and built-in privacy-enhancing technologies (PETs). In practice, informed consent from an individual about how their information can be used will generally mean that all data protection requirements have been satisfied, so identifying ways of obtaining consent needs to be a key priority in any data protection approach.



Past work to simplify privacy

Our project is not the first attempt to try to simplify privacy (Raab and Koops, 2009), nor to simplify the law in the online environment. The following sections discuss previous attempts. Where possible, we have tried to build on the lessons of the past in developing Simple Privacy.

Platform for Privacy Preferences (P3P)

In 1996–97, the World Wide Web Consortium (W3C) launched the Platform for Privacy Preferences (P3P) initiative.

P3P sought to allow Web sites to specify personal data use and disclosure practices; Web users to specify their expectations concerning personal data disclosure practices; and software agents to undertake negotiation, on behalf of the parties, in order to reach an agreement concerning the exchange of data between them (Clarke, 1998; Cranor, et al., 2006).

The first formal specification, Platform for Privacy Preferences 1.0 (P3P1.0) Specification, was published on 16 April 2002. With the publication of an update (P3P1.1) as a W3C Working Group Note on 13 November 2006, activity in this area effectively ceased due to ‘insufficient support from current Browser implementers’ (World Wide Web Consortium, 2011).

P3P was important because it introduced the idea that policies could be expressed in machine-readable terms as well as in human-readable terms. It showed that the machine-readable terms could be harvested by search engines, and that they could be used as the basis for automated negotiation between browsers and Web sites.

However, a 2008 review found that most P3P policies had discrepancies with their natural language counterparts. Some of these discrepancies could be attributed to ambiguities, while others caused the two policies to have completely different meanings (Cranor, et al., 2008).

Despite this, P3P remains the only formally defined language for expressing privacy requirements in machine-readable form. As such, research and development continues in specific situations, such as service aggregation (Dong, et al., 2011) and privacy access control models (Bekara, et al., 2010; Ghazinour and Barker, 2011).

During development, we examined the possibility of creating machine-readable Simple Privacy policies using P3P. While we recognised the value of machine-readability and saw that there was some compatibility between Simple Privacy and P3P, we ultimately decided to defer this to future work rather than concentrate resources on it during the initial development phase.
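To illustrate the machine-readable idea, the following sketch shows the kind of automated check a P3P user agent was meant to perform on a user's behalf. This is a conceptual sketch in Python, not real P3P syntax (actual P3P policies are XML), and the vocabulary values are illustrative only.

```python
# A site declares its data-handling practices in machine-readable form.
# (Illustrative vocabulary, not the real P3P XML elements.)
SITE_POLICY = {
    "purpose": {"current", "admin"},   # why data is collected
    "recipient": {"ours"},             # who receives it
    "retention": "stated-purpose",     # how long it is kept
}

# The user's browser holds their privacy preferences.
USER_PREFERENCES = {
    "allowed_purposes": {"current", "admin", "develop"},
    "allowed_recipients": {"ours", "delivery"},
    "allowed_retention": {"no-retention", "stated-purpose"},
}

def policy_acceptable(policy, prefs):
    """Return True if every practice the site declares falls within the
    user's stated preferences -- the automated negotiation a P3P user
    agent was meant to perform before releasing data."""
    return (policy["purpose"] <= prefs["allowed_purposes"]
            and policy["recipient"] <= prefs["allowed_recipients"]
            and policy["retention"] in prefs["allowed_retention"])

print(policy_acceptable(SITE_POLICY, USER_PREFERENCES))  # True for this pair
```

A mismatch on any declared practice (for example, an undisclosed recipient) would cause the check to fail, which is how a browser could warn the user before submitting a form.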

Privacy seals

From 1997, organisations began to introduce privacy seals such as TRUSTe, CPA WebTrust and BBBOnline (Moores, 2005). On a fee-for-service basis, the seal providers warranted that an organisation had a suitable privacy policy in place. If it became apparent that the privacy policies had not been followed, or had proved inadequate, the organisation providing the privacy seal would recommend changes or retract the seal.

These seals were not widely adopted by smaller companies, and the auditors were criticised for being too lenient when breaches occurred (Edelman, 2011).

In 2014, the U.K. Information Commissioner’s Office undertook consultation on introducing an accredited privacy seal process. In 2015, they announced that they would be implementing a system to accredit third party providers, aiming for a starting date in 2016 (Information Commissioner’s Office, 2014, 2016).

Our project does not seek to be an intermediary between information collectors and information providers. Rather, it seeks to provide a free and independent set of policies.

Privacy policy generators

In 1999, the Organisation for Economic Co-operation and Development (OECD) developed a very comprehensive privacy policy generator (Carblanc, 1999; Organisation for Economic Co-operation and Development, 2016; OECD Working Party on Information Security and Privacy, 2006). At about the same time, the Direct Marketing Association (DMA) in the United States also introduced an online tool that would help people to generate a privacy policy (Direct Marketing Association, 2016). Both of these tools provide a privacy policy after a questionnaire has been completed.

Unfortunately, the OECD system was far too complex, and the OECD retired its privacy policy generator around 2010. The U.S. system is modelled on U.S. legislation and does not meet the needs of other countries. Since introducing its original privacy policy generator, the DMA has also introduced generators for children’s privacy and the Gramm-Leach-Bliley Act of 1999. In addition, the DMA is not viewed as a disinterested party, as it represents the interests of direct marketers.

For Australia, there are four existing privacy policy generators: LawPath, LawLive, My Privacy Policy, and iubenda.

LawPath uses a straightforward interface to gather a minimum of information: contact information about your organisation, your Web site and whether you send information offshore. It then generates a PDF that incorporates this information. Due to the paucity of information collected, the resulting privacy policy does not seem adequate for Australian organisations. LawPath is a subscription service that charges AU$49 per month, although the first policy generated is free.

LawLive provides templates for legal documents online, including a Web site privacy policy. The cost is AU$86.90. The privacy policy is provided in Microsoft Word format and can be changed at any time. It covers links to third party sites, user registration, credit card information and collection of IP addresses. While these are useful questions to define the collection of personal information, they do not touch upon the use of the information once it has been collected.

My Privacy Policy uses a straightforward interface to gather information on what services a site provides, such as contact forms, third party sign-in, cookies, RSS feeds, direct marketing, and other services. This provides a good way to audit the information gathering practices of a site. My Privacy Policy then provides text related to each of the services chosen. The cost is AU$99.

Iubenda is an international service that uses a sophisticated interface to gather information on the services that a Web site uses, such as a mailing list, Google Analytics or other automated services that might gather information. This is a clever and effective way of surveying what information is collected, although the range of potential services that can be included is somewhat daunting. It uses a three-layered approach, as we have, and also hosts the privacy policy on its own site so that it cannot be tampered with. If it were customised to Australian law, it would do an excellent job of working out what information is collected. Unfortunately, there is no way to customise the service to a particular jurisdiction, and it does not deal with what happens to the information once it is collected. Iubenda is a subscription service that charges $27 per year, although the first policy generated is free.

These services have chosen to concentrate on the personal information that a site collects, rather than what a site does with that information. We have chosen to focus on what the site does with that information.

Other approaches

There were two other approaches that shaped our thinking: Privacy Icons and Privacy Rulesets.

Mozilla’s Privacy Icons sought to reduce privacy policies to a series of immediately recognizable, machine-readable icons. These icons represented answers to five different privacy requirements: intended use; sale of data; passing to third parties; length of storage; and passing to law enforcement.

From this project, we learnt that icons can be quite complex and not instantly meaningful. This raises issues about how to communicate the meaning of a policy at a glance.

At the same time that Mozilla was developing their Privacy Icon proposal, the Center for Democracy & Technology developed Privacy Rulesets (Cooper, et al., 2010). Privacy Rulesets provide a set of rules that seek to define the minimum requirements for expressing privacy preferences online.

Privacy Rulesets define three main conditions with ten sub-conditions in total. The key elements are:

  • Sharing: internally, with affiliates, with other organisations, publicly.
  • Secondary use: related to primary purpose, that builds on primary purpose, marketing.
  • Retention: 35 days, 35 days plus a short limited time, 35 days plus a long or indefinite time.

Privacy Rulesets provide elements (sets of rules) that can be attached to specific pieces of data, such as an e-mail address, so that people can indicate their privacy preferences to organisations. This reverses the usual model of information collection, where organisations seek to collect information under one uniform policy. While we understand the power of this approach, we chose to work within the standard framework of privacy policies provided by collecting organisations.
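The idea of attaching rules to individual data items can be illustrated with a small sketch. The condition names follow the summary above; the Python encoding itself is our own illustration, not the CDT specification.

```python
# Sharing levels, ordered from most to least restrictive.
# (Illustrative encoding of the Privacy Rulesets "sharing" condition.)
SHARING_LEVELS = ["internal", "affiliates", "other-organisations", "public"]

# A ruleset attached to one specific piece of data: an e-mail address.
email_ruleset = {
    "sharing": "affiliates",                  # share at most with affiliates
    "secondary_use": {"related-to-primary"},  # no marketing use
    "retention": "35-days",
}

def sharing_permitted(ruleset, proposed_level):
    """A proposed sharing level is permitted if it is no broader than
    the level the data subject attached to the item."""
    return (SHARING_LEVELS.index(proposed_level)
            <= SHARING_LEVELS.index(ruleset["sharing"]))

print(sharing_permitted(email_ruleset, "internal"))             # True
print(sharing_permitted(email_ruleset, "other-organisations"))  # False
```

Because the rules travel with the data item, each address or phone number can carry different preferences, which is what reverses the one-uniform-policy model described above.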



The Creative Commons model

While Simple Privacy was influenced by other privacy policy generators, the main inspiration came from Creative Commons. In 2001, Lawrence Lessig established Creative Commons, a non-profit U.S. corporation. Its objective is to make copyright licensing easier. It states:

Creative Commons licenses are not an alternative to copyright. They work along side the copyright and enable you to modify your copyright terms to best suit your needs (Creative Commons, 2016a).

Creative Commons gives the creator of a work four conditions, and combinations of them, under which they can licence their work. These four elements are:

  1. Attribution: All licensees must acknowledge authorship.
  2. Use: Commercial versus non-commercial use.
  3. Modification: Verbatim copying versus modifications and derivative works.
  4. Future licensing: Must derivative works carry the same licensing conditions as the originals?

An author answers some simple questions about the type of copyright use they might allow, from which one of six standard form licences is generated for them. These licences are represented by icons (Creative Commons, 2016c).

Each licence has three layers. The first is described as ‘legal code’ and is the formal legal agreement. The second is the ‘human readable’ version which is the same agreement written in plain English. The third layer is a ‘machine readable’ version that has the key obligations in each agreement written in a format that can be understood by search engines and other software (Creative Commons, 2016b).

The licences are stored on the Creative Commons server, which prevents organisations and individuals from changing the text and rendering them legally invalid.

For our project the three most important elements of the Creative Commons model are: limited choice in licence conditions; standard licence agreements stored on a central server; and a layered approach with icons, simple statements and comprehensive legal agreements.



Simple Privacy development

We adopted the three key aspects of the Creative Commons model: limited options, standard agreements and layered policies, as the starting point for our Framework. We also adopted the multilayered approach for privacy policy development and presentation recommended by the 25th International Data Protection Conference 2003 and the European Union’s Article 29 Working Party in 2004 (Center for Information Policy Leadership, 2015; Abrams and Crompton, 2005). Our aim was to generate an icon, a short summary of each privacy policy able to be viewed on one screen, with a link to the detailed privacy policy.

The Simple Privacy framework consists of:

  • Questions about the organisation: business engaged in; governing jurisdiction; and organisational contact.
  • Questions about information handling: privacy contact details; information collected; security practices; process to change information; and information disclosure practices.

By answering these questions, Web site owners gain access to:

  • An icon, P0 to P6, that provides experienced users with information about information handling practices at a glance.
  • A short privacy statement that encapsulates the key features of their information handling practices in one page.
  • A full privacy policy that sets out the rights and responsibilities of the Web site owner and the consumer.

The short privacy statement and the full policy are held on the Simple Privacy site, and linked to from the Web site owner’s site. The Web site owner cannot change the text. This was a deliberate decision, modelled on the Creative Commons framework, because changes by Web site owners can render the policy legally non-compliant.

We divided the research steps into four main stages. The first stage involved the development of a three-layered privacy framework. The second stage involved testing the Framework with a small number of individuals and organisations, using prototyping. We then incorporated the testing data back into the framework. Our third stage was to present the Framework to a broader audience, including privacy experts and officers, and to incorporate feedback from this stage. Our final stage was to go live and to collect feedback from users.

Stage 1: Development of the privacy framework

The first step in the development of the Framework was to identify the key minimum privacy policy options relating to the collection of personal information. The work by Margaret Jackson on minimum international privacy standards in 2004 formed the basis for this stage. Four key principles were identified as being the most important: disclosure, access by the data subject, retention and security (Jackson, 2004).

Three questions about these minimum principles were developed:

  • Disclosure: Do you pass or sell personal information to other organisations?
       ▸If yes, what is the purpose for this disclosure?
  • Retention: How long do you retain collected information?
  • Security: How is the information you collect secured?

We made it a requirement of every policy that data subjects be given access to amend, or find out more about, the data collected about them, and that a contact person be identified. To use our policies, organisations must allow people to update their information.

The development of the privacy framework required us to make a number of decisions about how much information we could collect from businesses to understand what personal information they collected, without asking too many questions. We were conscious that each new question or option would expand the number of policies that would be generated. We also wanted to ensure that users did not have to scroll through pages and pages of questions to generate their policy. Keeping it simple was challenging, so we chose to limit the number of options provided.

So, for example, if we had allowed users to opt to keep the personal data they collected for a month, three months, six months, 18 months or forever, then we would multiply our policies by five. So we chose to state that businesses using our policies must only store data while a person is a customer (with a default of six months), plus any period required by the law.
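The arithmetic behind this decision can be sketched as follows. The option values are illustrative, but they show how each independent question multiplies the number of distinct policies that must be drafted and maintained.

```python
from itertools import product

# The two questions the framework actually asks (see Table 1):
disclosure = ["none", "expected-purpose", "any-purpose"]
security   = ["encrypted", "secured"]

# The retention options we chose NOT to offer:
retention  = ["1 month", "3 months", "6 months", "18 months", "forever"]

# Every combination of answers is a separate policy to draft.
with_retention  = len(list(product(disclosure, security, retention)))
fixed_retention = len(list(product(disclosure, security)))

print(with_retention)       # 30 policies if retention were a free choice
print(fixed_retention + 1)  # 7 policies as built (+1 for the P0 policy)
```

Fixing retention at a single rule (data kept only while a person is a customer, plus any legally required period) is what keeps the generated set down to seven policies.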

We also made it non-negotiable that if financial data was being collected, the business had to encrypt data. This is in keeping with the terms of use of most online credit service providers.

In step two, we developed some questions around the operation of the data protection legislation in Australia. As mentioned earlier, Australian data protection laws are complicated by the number of exceptions and exemptions contained in the Privacy Act and by the fact that the Act does not cover state or territory government agencies. This complexity has to be built into any consideration of how an organisation handles personal information. The information will be handled differently depending on the type of organisation, its size, the type of information and whether the information or organisation is exempted from the operation of the relevant legislation.

The questions that an organisation needs to consider when designing a privacy policy are as follows:

  • What type of personal information are you collecting?
    (personal, financial, medical, sensitive, social media)
  • Is your organisation part of the federal government, state or territory government or the private sector?
  • Is your annual turnover $3 million or less? (as turnover of $3 million is the trigger for whether the business is covered by the Act or not)
  • Who is your privacy contact person?
  • How do data subjects update/access their information?

In step three, we developed seven possible privacy policies that could be generated from the questions posed. An icon, a capital P with a number from zero to six, designates each option. Table 1 sets out the options built into each policy.


Table 1: Options covered by Simple Privacy policies.
    No information collected:                     P0

    Information is collected:                     Encrypted    Secured
      Information is not passed to others         P1           P2
      Information passed for expected purpose     P3           P4
      Information passed for any purpose          P5           P6


The P0 policy covers sites that collect no personal information at all. Many Web sites are keen to let their visitors know this. At first glance, it seems strange to create a privacy policy when no personal information is collected, but the P0 (or P*) policy provides a formal way for these sites to delineate what they are doing. For example, the full policy provides contact information if someone feels that the site is actually collecting personal information.

The first two options, P1 and P2, cover the collection of personal information only. No information is passed outside the collecting organisation, for any purpose. In the case of P1, the information is encrypted. In the case of P2, the information is secured, but not encrypted.

An example of P1 might be a site that sells digital files (e.g., images or music). Financial information is collected, so the information must be encrypted. However, the digital files are downloaded directly, so there is no need to pass personal information to a third party.

An example of P2 might be a site that maintains an e-mail list. Personal information is collected, and is protected by a password. This information is not passed to any third party.

Policies P3 and P4 cover the collection of personal information that is passed on or disclosed to a third party to allow the agreed service or purchase to be provided. An example of the P3 policy might be a site that sells physical objects, and passes information to a postal company for delivery. An example of the P4 policy might be a hobby site that allows hobbyists who are close to one another to meet. Address information is provided to nearby hobbyists in accordance with the norms established by the site. Both P3 and P4 deal with activities that a user or customer would expect to be happening with their personal data.

Policies P5 and P6 cover the collection of personal information that is passed on or disclosed to a third party for any purpose at all, such as selling the information to advertisers or to data profilers. Policy P5 covers sites that encrypt their data (which is required if they collect financial information), while policy P6 covers sites that secure their information but do not encrypt it.
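The mapping from a site's answers to the seven icons can be expressed as a simple decision function. This is our reconstruction of the logic in Table 1; the live generator may differ in detail.

```python
def select_policy(collects_info, passes_info, purpose_limited, encrypted):
    """Return the Simple Privacy icon for a site's answers.

    collects_info   -- does the site collect personal information?
    passes_info     -- is information disclosed to third parties?
    purpose_limited -- if disclosed, only for the expected purpose?
    encrypted       -- is the information encrypted (required whenever
                       financial data is collected)?
    """
    if not collects_info:
        return "P0"                        # no personal information at all
    if not passes_info:
        return "P1" if encrypted else "P2"  # kept within the organisation
    if purpose_limited:
        return "P3" if encrypted else "P4"  # disclosed for expected purpose
    return "P5" if encrypted else "P6"      # disclosed for any purpose

# e.g., an online shop that passes addresses to a courier,
# without encrypting its customer database:
print(select_policy(True, True, True, False))  # P4
```

Because only these four answers matter, the full policy space stays small enough for every variant to be drafted, reviewed and hosted centrally.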

Step four involved the development of Short Privacy Statements containing the bare minimum of information: the name of the organisation; whether the information collected is passed on to third parties; the retention approach; how the information is secured; the right of access; the relevant jurisdiction (in this case, Australia); and the contact details of the privacy officer.

Full privacy policies were also drafted which expand each short statement. Once generated, these policies cannot be changed by businesses. If a business needs to change the type of policy they are operating under, they will need to redo the questions and generate a new policy. By keeping the process as simple as possible, we have sought to avoid the complication of maintaining a database of organisations that tracks their privacy policies.

Any changes due to changes in the legislation will be made on the Framework Web site.

Stage 2: Prototyping of privacy framework

To obtain feedback on the icons and the Simple Privacy Statements, we ran two focus groups with individuals who use the World Wide Web for purchasing and other activities.

The focus groups confirmed for us that many people do not read privacy notices and instead rely on the reputation or perceived trustworthiness of the site when providing their personal information. Many participants expressed confusion about the nature of the information collected by sites, and in particular the use of cookies and monitoring programs that enabled advertising to be tailored to individuals, drawing on personal information already provided. A number of people indicated that they curtailed their use of particular sites, or entered inaccurate information, such as age or location information, in order to circumvent tracking programs. Focus group participants said they rarely read a site’s privacy policy in order to determine how their information was used, as they found the language impenetrable and too legalistic.

The focus groups raised the issue of where the icon would be located and whether it would be repeated on each page that related to the collection of information, as well as being on the home page. They also queried whether different icons related to the collection of different types of data would appear together on the front page.

We slightly amended the simple privacy statements as a result of the focus group feedback. Generally, though, the overall response was positive with all participants agreeing that the one page statement gave them a clear understanding of how the organisation would treat their personal information once collected.

Next, we trialled the framework with a number of small organisations that covered a wide range of activities — online retailers, exempt and non-exempt small business, health providers and government — all collecting personal information online.

We met with the following types of businesses:

  1. Hospital
  2. Project Management Consultants
  3. Bookshop
  4. Research Centre
  5. Research Institute
  6. Technology Consultants
  7. Vineyard

The business managers we interviewed provided us with a range of experiences in developing and using privacy policies. We found that for some, developing a privacy policy involved researching the policies of similar businesses and organisations and adapting them to their own needs. This sometimes resulted in policies that were inaccurate or not entirely appropriate. Many policies had also been developed in the early stages of the business and had not been amended or changed since then. None of the exempt small businesses were aware that they were exempt from the Privacy Act, or why.

We also reviewed the current privacy policies of these organisations. We were able to point out areas that needed improvement in every policy that we reviewed. Most did not specify the period of time for which information was retained and did not have a disposal policy. Some also did not provide a facility for users to access and update their information.

To stimulate discussion, we showed the interviewees printouts (paper prototypes) showing how our new system might work. Paper prototyping is a recognised method used in Web development to test how users will actually use a site (Snyder, 2003). Pages are printed for each screen of the new site, showing the design, mock content (including forms) and the navigational links. Users are assisted by the researcher to view the pages, select links and fill in forms as they would do on a computer screen. The user provides feedback on why they chose to act in the manner that they did and what they were thinking as they used the layout. The business participants were asked to simulate using the site, working through the paper prototypes to search for and create a privacy policy. They were asked for their views on the system, on its strengths and weaknesses, and on whether they would find it useful (Nielsen, 2000).

All managers welcomed the simplified privacy template and found it easy to answer the questions required to build their privacy policy.

Stage 3: Feedback from privacy officers

In Stage 3, we discussed our Framework with privacy experts and asked them to work through the questions and generate a policy. Four experts tested the questions, assumptions and policies. We also gave public presentations on the Framework attended by privacy experts and discussed the issues and comments raised.

Stage 4: Online evaluation

Finally, we launched Simple Privacy for a six-month trial in July 2015 to gather feedback and questions. To promote the service, our university featured Simple Privacy in a research supplement that it published in a major national newspaper. It was also included in a government guide on computer security that was distributed to small businesses. This publicity led to a small number of people contacting us with questions about the system.

We have been gratified by the reception that it has had. The comments and feedback we have received from users include messages simply telling us that they were using it and thanking us for the service, questions about the meaning of certain elements, and reports when things have gone wrong.

While Simple Privacy itself has worked quite well, we have struck unexpected issues with the bureaucracy that sits around it. The project was initially developed under the auspices of the Smart Services Cooperative Research Centre. When that centre came to a close, we sought to have the service hosted by our own university. However, the bureaucracy around that simple change has been overwhelming. At the moment, it is being hosted by our developer. Because of this change, the site needs a new security certificate to operate properly. Again, the bureaucracy around having our university issue a certificate has defeated us for the time being.

As such, the next steps are to develop an independent governance group for the service, and move it out of the university regime. That will be the work of the next stage of the project.




Our project explored whether it is possible to make online privacy policy development simpler, both for businesses and for the individuals who deal with them.

We have developed a three-layered approach to online privacy policies which provides a simple framework for businesses to use. This framework, based on the Creative Commons approach, contains icons, simple privacy statements and plain language privacy policies.
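To illustrate how such a layered framework can be generated from an organisation's answers, the sketch below maps yes/no answers to icon names and one-line statements, with the full policy assembled from the selected statements. It is a toy illustration only: the question keys, icon names and statement wording are invented placeholders, not the actual outputs of Simple Privacy.

```python
# Toy sketch of a layered privacy policy generator, in the spirit of the
# framework described above. All keys, icon names and wording below are
# hypothetical placeholders, not the real Simple Privacy content.

# Each (question, answer) pair selects an icon (layer 1) and a one-line
# simple privacy statement (layer 2); the plain-language policy (layer 3)
# is assembled from the selected statements.
STATEMENTS = {
    ("shares_data", False): (
        "no-sharing",
        "We do not share your personal information with anyone else."),
    ("shares_data", True): (
        "sharing",
        "We share your personal information with the partners listed in our policy."),
    ("retains_data", False): (
        "no-retention",
        "We delete your personal information once your transaction is complete."),
    ("retains_data", True): (
        "retention",
        "We keep your personal information for the period stated in our policy."),
}

def generate_policy(answers):
    """Build (icons, policy text) from an organisation's yes/no answers."""
    icons, lines = [], []
    for question, answer in answers.items():
        icon, statement = STATEMENTS[(question, answer)]
        icons.append(icon)
        lines.append(statement)
    return icons, "\n".join(lines)

icons, policy = generate_policy({"shares_data": False, "retains_data": True})
print(icons)   # ['no-sharing', 'retention']
print(policy)
```

The design mirrors the Creative Commons approach: a small fixed menu of choices, each of which carries a machine-readable marker (the icon), a human-readable summary (the statement) and legally meaningful text (the policy).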

Our initial prototype testing indicated that businesses would use our simple privacy framework and that it would make it easier for individuals to understand how Web sites handle the data they collect. We found that choices had to be made to reduce complexity, but that by making them it is possible to produce policies that are both simple and legally compliant. The challenge, though, is to ensure that the policies are not so simple that they fail to cover the data protection principles underpinning data privacy.


About the authors

Margaret Jackson is Emeritus Professor, College of Business, RMIT University in Melbourne, Australia.
Direct comments to: margaret [dot] jackson [at] rmit [dot] edu [dot] au

Jonathan O’Donnell is Senior Advisor for Research Grant Development (DSC) at RMIT University.

Joann Cattlin is Project Manager of the Innovative Learning Environments and Teacher Change (ILETC) project at the University of Melbourne.



This project was funded by the Co-operative Research Centre for Smart Services, Australia.



Marty Abrams and Malcolm Crompton, 2005. “Multi–layered privacy notices — A better way,” Privacy Law Bulletin, volume 2, number 1, pp. 1–4, at, accessed 16 June 2016.

Patricia Sánchez Abril, 2010. “Private ordering: A contractual approach to online interpersonal privacy,” Wake Forest Law Review, volume 45, pp. 689–727, and at, accessed 16 June 2016.

Australia. Office of the Privacy Commissioner, 2004. “2004 community attitudes towards privacy in Australia,” at, accessed 16 June 2016.

Kheira Bekara, Yosra Ben Mustapha and Maryline Laurent, 2010. “XPACML eXtensible privacy access control markup language,” 2010 Second International Conference on Communications and Networking, pp. 1–5.
doi:, accessed 16 June 2016.

Anne Carblanc, 1999. “Activities of the OECD 1997–2000 — Global privacy protection builds trust in electronic commerce and global networks,” 21st International Conference on Privacy and Personal Data Protection.

Center for Information Policy Leadership (CIPL), 2015. “Multi-layered notices explained,” at, accessed 26 March 2016.

Roger Clarke, 1998. “Platform for privacy preferences: An overview,” Privacy Law & Policy Reporter, volume 5, number 2, pp. 35–39, and at, accessed 16 June 2016.

Alissa Cooper, John Morris and Erica Newland, 2010. “Privacy rulesets (editor’s draft),” World Wide Web Consortium (6 October), at, accessed 16 June 2016.

Lorrie Faith Cranor, Serge Egelman, Steve Sheng, Aleecia M. McDonald and Abdur Chowdhury, 2008. “P3P deployment on Websites,” Electronic Commerce Research and Applications, volume 7, number 3, pp. 274–293.
doi:, accessed 16 June 2016.

Lorrie Faith Cranor, Brooks Dobbs, Serge Egelman, Giles Hogben, Jack Humphrey, Marc Langheinrich, Massimo Marchiori, Martin Presler-Marshall, Joseph Reagle, Matthias Schunter, David A. Stampley and Rigo Wenning, 2006. “The platform for privacy preferences 1.1 (P3P1.1) specification,” W3C Working Group Note (13 November), at, accessed 16 June 2016.

Creative Commons, 2016a. “About — Creative Commons,” at, accessed 9 March 2016.

Creative Commons, 2016b. “About the licenses — Creative Commons,” at, accessed 9 March 2016.

Creative Commons, 2016c. “Choose a license — Creative Commons,” at, accessed 9 March 2016.

Bernhard Debatin, Jennette P. Lovejoy, Ann-Kathrin Horn and Brittany N. Hughes, 2009. “Facebook and online privacy: Attitudes, behaviors, and unintended consequences,” Journal of Computer-Mediated Communication, volume 15, number 1, pp. 83–108.
doi:, accessed 16 June 2016.

Direct Marketing Association (DMA), 2016. “Privacy policy generators,” at, accessed 9 March 2016.

Liju Dong, Yi Mu, Willy Susilo, Peishun Wang and Jun Yan, 2011. “A privacy policy framework for service aggregation with P3P,” ICIW 2011: Sixth International Conference on Internet and Web Applications and Services, pp. 171–177, and at, accessed 16 June 2016.

Benjamin Edelman, 2011. “Adverse selection in online ‘trust’ certifications and search results,” Electronic Commerce Research and Applications, volume 10, number 1, pp. 17–25.
doi:, accessed 16 June 2016.

Niva Elkin-Koren, 2006. “Creative Commons: A skeptical view of a worthy pursuit,” In: Lucie Guibault and P. Bernt Hugenholtz (editors). The future of the public domain: Identifying the commons in information law. Alphen aan den Rijn, the Netherlands: Kluwer Law International, pp. 325–345.

European Commission, 2010. “A comprehensive approach on personal data protection in the European Union,” COM(2010) 609 final, at, accessed 16 June 2016.

Lynn M. Forsythe and Deborah J. Kemp, 2008. “Creative Commons: For the common good?” University of La Verne Law Review, volume 30, number 2, pp. 346–369.

Kambiz Ghazinour and Ken Barker, 2011. “Capturing P3P semantics using an enforceable lattice-based structure,” PAIS ’11: Proceedings of the Fourth International Workshop on Privacy and Anonymity in the Information Society, article number 4.
doi:, accessed 16 June 2016.

Graham Greenleaf, 2012. “Independence of data privacy authorities (Part I): International standards,” Computer Law & Security Review, volume 28, number 1, pp. 3–13.
doi:, accessed 16 June 2016.

Herkko Hietanen, 2008. “The pursuit of efficient copyright licensing — How some rights reserved attempts to solve the problems of all rights reserved,” dissertation at the Lappeenranta University of Technology, at, accessed 16 June 2016.

Information Commissioner’s Office, 2016. “Privacy seals” (22 January), at, accessed 16 June 2016.

Information Commissioner’s Office, 2014. “Privacy seals: Draft framework criteria” (3 October), at, accessed 16 June 2016.

Margaret Jackson, 2004. “A data protection framework for technology development,” Telecommunication Journal of Australia, volume 54, number 2, pp. 43–51; version at, accessed 16 June 2016.

Margaret Jackson, Julian Ligertwood, Jonathan O’Donnell and Marita Shelly, 2006. “Small business: Issues of identity management, privacy and security,” paper presented at AOIR International Conference, Brisbane.

Michael D. Kirby, 1986. “Access to information and privacy: The ten information commandments,” Government Information Quarterly, volume 3, number 4, pp. 333–344.
doi:, accessed 16 June 2016.

Amanda Lenhart and Mary Madden, 2007. “Teens, privacy and online social networks,” Pew Research Center (18 April), at, accessed 16 June 2016.

Lawrence Lessig, 2003. “The Creative Commons,” Florida Law Review, volume 55, pp. 763–777.

Mary Madden and Aaron Smith, 2010. “Reputation management and social media,” Pew Research Center (26 May), at, accessed 16 June 2016.

Aleecia M. McDonald and Lorrie Faith Cranor, 2009. “The cost of reading privacy policies,” I/S: A Journal of Law and Policy for the Information Society, volume 4, number 3, pp. 540–565, and at, accessed 16 June 2016.

Trevor Moores, 2005. “Do consumers understand the role of privacy seals in e-commerce?” Communications of the ACM, volume 48, number 3, pp. 86–91.
doi:, accessed 16 June 2016.

Jakob Nielsen, 2000. “Why you only need to test with 5 users,” Alertbox (19 March), at, accessed 16 June 2016.

Organisation for Economic Co-operation and Development (OECD), 2016. “OECD privacy statement generator,” at, accessed 9 March 2016.

OECD Working Party on Information Security and Privacy, 2006. “Making privacy notices simple,” OECD Digital Economy Papers, number 120, at, accessed 16 June 2016.
doi:, accessed 16 June 2016.

Charles Raab and Bert-Jaap Koops, 2009. “Privacy actors, performances and the future of privacy protection,” In: Serge Gutwirth, Yves Poullet, Paul De Hert, Cécile de Terwangne and Sjaak Nouwt (editors). Reinventing data protection? Rotterdam: Springer Netherlands, pp. 207–221.
doi:, accessed 16 June 2016.

Kate Raynes–Goldie, 2010. “Aliases, creeping, and wall cleaning: Understanding privacy in the age of Facebook,” First Monday, volume 15, number 1, at, accessed 16 June 2016.

Marita Shelly and Margaret Jackson, 2009. “Doing business with consumers online: Privacy, security and the law,” International Journal of Law and Information Technology, volume 17, number 2, pp. 180–205.
doi:, accessed 16 June 2016.

Ravi Inder Singh, Manasa Sumeeth and James Miller, 2011. “A user-centric evaluation of the readability of privacy policies in popular Web sites,” Information Systems Frontiers, volume 13, number 4, pp. 501–514.
doi:, accessed 16 June 2016.

Carolyn Snyder, 2003. Paper prototyping: The fast and easy way to design and refine user interfaces. San Diego, Calif.: Morgan Kaufmann.

Wallis Consulting Group, 2007. “Community attitudes to privacy 2007,” at, accessed 16 June 2016.

World Wide Web Consortium, 2011. “P3P: The platform for privacy preferences,” at, accessed 20 April 2016.


Editorial history

Received 1 April 2016; accepted 16 June 2016.

This paper is licensed under a Creative Commons Attribution 4.0 International License.

Simple online privacy for Australia
by Margaret Jackson, Jonathan O’Donnell, and Joann Cattlin.
First Monday, Volume 21, Number 7 - 4 July 2016

