Privacy Principles and Rules - A legal analysis

By John Borking.

Defining Privacy

Identity

Privacy presupposes an identity of the individual. Identity is people’s source of meaning and experience. Identity requires self-knowledge in order to make a distinction between self and others and to be known as such in specific ways by others; self-discipline, i.e. controlling our thinking and our feelings; introspection; and subsequently expression of what has been found in oneself. It constitutes a human being as a self-sufficient (autarkic) unity and leads to the most central value in our moral and political life, namely the freedom to control and to understand who we are and to know what our real self is. People have a psychological need for some control over how they are seen by the surrounding world. Castells describes identity as people’s source of meaning and experience, constructed through a process of individuation, which creates a primary identity (an identity that frames the others) that is self-sustaining across time and space. Society can legitimize an identity, and this generates a civil society.

Legally speaking, the identity of the individual is acknowledged when he or she is registered in the birth register. All data concerning this identity will be considered as belonging to that individual. Individual privacy protects our default position of anonymity in society, our secrets that we share with no one, and our need to be apart from other human beings or our desire to be ‘cut off’, by wish or by circumstance, from our usual associates, and it limits the power of the state and of private organizations to intrude into our autonomy.

Definition of privacy

In 1890 Louis Brandeis articulated privacy as the individual’s right to be left alone. He pointed out that privacy is essential to protect the personality and the individual’s independence, dignity and integrity. His concept of privacy infringement as a tort became, over time, part of US common law.

Many definitions of privacy exist. In 1990 the Calcutt Committee in the UK stated that it had not found a wholly satisfactory definition of privacy. Of all the human rights, privacy is perhaps the most difficult to define, and the term has been considered problematic due to the lack of consensus about its scope and delineation. Privacy has been defined by Westin as: “the claim of individuals...to determine for themselves when, how and to what extent information about them is communicated to others”. From the above-mentioned definitions it follows that privacy has two distinct characteristics: 1) the right to be left alone and 2) the right to decide for oneself what to reveal about oneself. The definition of privacy suggested for this research project (a combination of the definitions of Brandeis and Westin) is:

The claim of individuals to be left alone, free from surveillance or interference from other individuals, organizations or the state and to determine for themselves when, how and to what extent information about them is communicated to others.

Informational privacy

Four kinds of privacy have been discerned: bodily privacy, privacy of communications, territorial privacy and informational privacy. This research project deals with informational privacy. Informational privacy involves the establishment of rules governing the collection and handling of personal data such as credit information and medical and governmental records. It is also known as “data protection”. Looking at the world map, many differences with regard to the protection of privacy are noticeable. There are countries with a high level of privacy protection, countries that protect privacy only partly, and many countries where privacy is still terra incognita (about 100 states, primarily in Africa and Asia).

In those countries that have recognized privacy as a human right, informational privacy comprises at least the following four elements:

A. Principle of Existence of Privacy: A data subject possesses an identity and other pertinent and variable information that this person may consider to belong to his or her privacy domain. This collection of information is called personal data.

B. Principle of Withholding: The data subject has the right, and should be equipped with the ability, to withhold some or all of his personal data from other persons and organizations at this person's private choice.

C. Principle of Trusted Usage: The person or organization that receives the personal data and stores it is called the controller or collector. This collector of information has the obligation to keep to the constraints on dissemination and processing of personal data as stated, for example, in the EC privacy directives and Norwegian legislation, or, where such a piece of law does not exist, according to the privacy preferences of the data subject. Furthermore, this collector has the obligation to inform the person involved of its possession of personal data and to provide the opportunity for change. If so permitted, the collector may copy the personal data to one or more processors for further processing.

D. Principle of Controlled Dissemination: The data subject has the right to disclose some or all of his or her personal data to other persons and organizations, the collectors of the personal data, at this data subject's own choice. This data subject may issue constraints on the dissemination of the personal data to one or more processors for further processing of this data, and has the right to change the personal data, to extend and restrict it, to withdraw this information and to change the constraints.
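
By way of illustration only, the four elements can be read as a simple data model: personal data belonging to a subject's privacy domain (A), a withholding choice per item (B), dissemination constraints issued by the subject (D), and a collector that must honour those constraints (C). The following minimal sketch makes no claim about how PETweb implements this; every name and field is an illustrative assumption.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a toy model of the four elements of informational privacy.

@dataclass
class PersonalDataItem:          # A. Existence: data the subject considers part of the privacy domain
    name: str                    # e.g. "date_of_birth"
    value: str
    withheld: bool = True        # B. Withholding: withheld by default, released only by the subject's choice

@dataclass
class DisseminationConstraint:   # D. Controlled Dissemination: subject-issued constraints on further processing
    allowed_processors: set[str] = field(default_factory=set)
    allowed_purposes: set[str] = field(default_factory=set)

@dataclass
class Collector:                 # C. Trusted Usage: the controller/collector must honour the constraints
    name: str

    def may_forward(self, item: PersonalDataItem, processor: str,
                    purpose: str, constraint: DisseminationConstraint) -> bool:
        """Forwarding is allowed only for non-withheld data, to a permitted processor, for a permitted purpose."""
        return (not item.withheld
                and processor in constraint.allowed_processors
                and purpose in constraint.allowed_purposes)

# Example use (all values hypothetical):
item = PersonalDataItem("email", "alice@example.org", withheld=False)
constraint = DisseminationConstraint({"billing-service"}, {"billing"})
print(Collector("web-shop").may_forward(item, "billing-service", "billing", constraint))  # True
print(Collector("web-shop").may_forward(item, "ad-network", "marketing", constraint))     # False
```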

Personal data

So, although privacy is a situation that is wanted by an individual, in fact it comprises a set of rules of conduct between the individual person and the person's environment with respect to the manipulation of personal data. Personal data can be defined as the collection of all data that is or can be related to the individual. This includes factual data such as identification data, physical data, behavioral data, social data, financial data and whatever other data may be at stake.

Privacy facilitation principles

The four elements of informational privacy mentioned under 3.1.3 can be elaborated in the direction of information processing, to produce rules that govern the handling of personal data in more detail. These rules will be called the Privacy Facilitation Principles, although it has to be borne in mind that, based on the existing legal system and culture, another set of privacy facilitation principles may prevail in other privacy regimes. The privacy facilitation principles are: Intention and Notification, Transparency, Legitimate Ground for Processing, Finality and Purpose Limitation, Data Quality, Data Subject's Rights, Security, Processing by a Processor, and Transfer of Personal Data Outside the EU.

The Privacy facilitation principles explained

  1. Intention and Notification: The processing of personal data must be reported in advance to the Data Protection Authority (where applicable) or an organization’s privacy officer (where applicable), unless the processing system in question has been exempted from notification.
  2. Transparency: The person involved must be aware of who is processing his personal data and for what purpose. Thus any collection of personal data implies prior supply of certain information to the individual concerned. Two situations may be distinguished: (i) the personal data are collected from the data subject, and (ii) the personal data are collected in another manner.

In the case of situation (i), the data subject must be provided with at least the following information before the data are collected: the identity of the collector/controller (which includes the name as well as the physical and electronic address), and the intended purpose(s) of the processing. It also has to be determined whether further information (e.g. the recipients of the data, whether replies to the questions are obligatory and the possible consequences of failure to reply, and the existence of the right of access to and the right to rectify the data) has to be provided to guarantee fair processing. In the case of situation (ii), the data subject must be given this information at the latest at the moment when the personal data are recorded or first disclosed to a third party. It should be noted that the requirement to provide information does not apply where it would be impossible or would involve a disproportionate effort, or if the recording or disclosure of the data is expressly laid down by law. Any other information required by law (e.g. that a disclosure of personal data to third parties is foreseen) has to be provided in a comprehensible form.
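
As an illustrative sketch only (the record layout and field names are assumptions, not prescribed by the Directive), the information items that must be supplied in situation (i) can be captured as a small structure that has to be complete before collection starts:

```python
from dataclasses import dataclass

# Illustrative sketch: minimum information to supply to the data subject before
# collecting data directly from him or her (situation (i)). Field names are assumptions.

@dataclass
class CollectionNotice:
    controller_name: str               # identity of the collector/controller
    controller_address: str            # physical and electronic address
    purposes: tuple[str, ...]          # intended purpose(s) of the processing
    recipients: tuple[str, ...] = ()   # further information, where needed for fair processing
    replies_obligatory: bool = False
    access_and_rectification_info: str = ""

    def is_sufficient(self) -> bool:
        """At least identity, address and purposes must be given before collection."""
        return bool(self.controller_name and self.controller_address and self.purposes)

notice = CollectionNotice("Example Ltd.", "1 Example Street / privacy@example.org", ("order handling",))
assert notice.is_sufficient()
```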

  3. Legitimate Ground for Processing: The processing of personal data must be based on a foundation permitted by national legislation, such as consent, contract or some other legal obligation. For special data, such as health data, stricter limits prevail. This means that for each processing of personal data – collection, recording, storage, adaptation, alteration, retrieval, consultation, disclosure, dissemination, etc. – the collector/controller has to verify whether the processing falls under one of the criteria for making data processing legitimate.
  4. Consent: The default position is that the right to privacy prohibits any processing of personal data. To make data processing legitimate, legal criteria were set out in Article 7 of Directive 95/46/EC (DPD). Consent is one of the most important privacy facilitation principles.

Therefore consent management is also one of the most important design principles. According to the DPD, a data subject's consent means “any freely given specific and informed indication by which the data subject signifies his agreement to personal data relating to him being processed”. The data subject must be given the opportunity to give his consent in a clear, unambiguous way. Many national privacy laws use other adjectives to describe consent: “freely”, “specific”, “informed”, “unambiguous” and “explicit”. The importance of the individual's consent implies that interface elements for making privacy decisions (such as giving consent) should be prominent and obvious. It is crucial that users understand when they are entering into a contract for goods or services and the implications of that contract. They should also be aware of and understand the special cases where their personal data may be processed without their consent or without a contract.

Thus there are three aspects to be taken into account concerning consent. First, the data subject must freely express (without being put under pressure) his or her wishes; if there is no free will, then any consent given will be void. Secondly, the consent must be aimed at a specified purpose of processing and at specific data; a general-purpose consent is not acceptable under the DPD. Thirdly, the consent must be free of double meaning, must be clear and must be certain. An affirmative answer can be given to the legal question whether the ISA can give the unambiguous consent required by the law for legitimate processing of the PII on behalf of the user.

More complicated is the fact that the DPD prohibits the processing of sensitive or special categories of data, such as data revealing race or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, health or sexual life. This prohibition is lifted when processing occurs after explicit consent of the data subject. The required consent may be given in an electronic form combined with a biometric string of the user, so that the non-repudiation and explicitness are beyond doubt, provided that an accompanying timestamp with the instruction of explicit consent is recent, i.e. not older than 24 hours. Article 8 paragraph 2 of the DPD defines exemptions to the prohibition, such as situations in which data are processed for medical care. By way of an example, the French data protection law requires “express consent” for the processing of sensitive data, and that has been interpreted as requiring that the consent be expressed in writing. The French Commission Nationale de l'Informatique et des Libertés (CNIL), the French Data Protection Authority, accepted that, with regard to processing of sensitive data on the Internet, one might substitute a “double click” for this consent (i.e. one “click” to confirm that one is aware of the proposed processing, and a further one to “expressly” consent to it).
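
A minimal sketch of the timestamp check described above, assuming the explicit-consent record carries a signing timestamp and a flag for the accompanying biometric string; all names, fields and the record format are illustrative assumptions, and only the 24-hour freshness rule and the purpose/data specificity come from the text:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Illustrative sketch: checking that an electronic explicit-consent record is usable
# for processing sensitive data, under the 24-hour freshness rule described above.

MAX_CONSENT_AGE = timedelta(hours=24)

@dataclass
class ExplicitConsent:
    purpose: str                       # consent must be aimed at a specified purpose
    data_categories: tuple[str, ...]   # ...and at specific data
    timestamp: datetime                # when the explicit-consent instruction was given
    biometric_bound: bool              # whether a biometric string accompanies the consent

def consent_is_valid(consent: ExplicitConsent, requested_purpose: str,
                     requested_category: str, now: Optional[datetime] = None) -> bool:
    now = now or datetime.now(timezone.utc)
    fresh = (now - consent.timestamp) <= MAX_CONSENT_AGE
    specific = (consent.purpose == requested_purpose
                and requested_category in consent.data_categories)
    return consent.biometric_bound and fresh and specific

consent = ExplicitConsent("medical-care", ("health",), datetime.now(timezone.utc), True)
print(consent_is_valid(consent, "medical-care", "health"))   # True while the consent is fresh
```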

  5. Finality and Purpose Limitation Principle: Personal data may only be collected for specific, explicit and legitimate purposes and not further processed in a way that is incompatible with those purposes. In other words, in the absence of a legitimate basis for processing, personal data may not be collected or processed and the individual concerned must remain anonymous. Users have to be aware that they have the right to object to processing of their personal data for the purposes of direct marketing. Actually, the law, like Directive 95/46/EC, protects them to an even further extent: they must opt in to processing for such purposes; otherwise their personal data may not be used. If the use of direct marketing techniques is foreseen, the possibility to opt in should be offered during registration. A “just-in-time click-through agreement” can be used for final acceptance of such use.
  6. Data Minimisation/Data Avoidance: The processing of personal data should be limited to data that are adequate, relevant and not excessive in relation to the purpose for which they are collected/processed. Data should only be kept in a form that permits identification of the data subject for no longer than is necessary for the purposes for which the data were collected or for which they are further processed.

As far as electronic communications are concerned, of particular relevance is Article 14(3) of Directive 2002/58/EC, which underlines that, where required, measures may be adopted to ensure that terminal equipment is constructed in a way that is compatible with the right of users to protect and control the use of their personal data. Recital 30 states that systems for the provision of electronic communications networks and services should be designed to limit the amount of personal data necessary to a strict minimum.
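
To make the minimisation and limited-retention rule concrete, a small sketch is given below; the record layout, the purposes and the retention periods are invented for the example, and only the "collect no more than needed, keep identifiable data no longer than needed" rule comes from the text.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Illustrative sketch of data minimisation and limited retention: collect only the
# fields needed for the stated purpose, and drop the identifying form once the
# retention period for that purpose has elapsed. Purposes and periods are invented.

RETENTION = {"order-handling": timedelta(days=365), "support": timedelta(days=90)}
NEEDED_FIELDS = {"order-handling": {"name", "address"}, "support": {"email"}}

@dataclass
class Record:
    purpose: str
    fields: dict[str, str]
    collected_at: datetime

def minimise(purpose: str, offered: dict[str, str]) -> Record:
    """Keep only the fields that are adequate and relevant for the purpose."""
    allowed = NEEDED_FIELDS[purpose]
    return Record(purpose, {k: v for k, v in offered.items() if k in allowed},
                  datetime.now(timezone.utc))

def anonymise_if_expired(record: Record, now: Optional[datetime] = None) -> Record:
    """After the retention period, keep the record only in a non-identifying form."""
    now = now or datetime.now(timezone.utc)
    if now - record.collected_at > RETENTION[record.purpose]:
        return Record(record.purpose, {}, record.collected_at)   # identifiers erased
    return record

r = minimise("support", {"email": "alice@example.org", "name": "Alice", "phone": "123"})
print(r.fields)   # {'email': 'alice@example.org'} - excessive fields are never stored
```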

  7. Data Quality: The personal data must be correct, accurate, sufficient, to the point and not excessive in relation to the purpose in question. The following issues also have to be taken into account: 1. authorization for the inspection of data input; 2. storage terms; 3. periodical clearing; 4. information on the disclosure of corrected data to third parties to whom these data have been previously disclosed; and 5. final inspection of automated decisions.
  8. Data subject’s rights (see also Transparency): The data subjects involved have the right to access and to correct their data. The right of access and the right to rectify the data serve to guarantee fair processing. A data subject also has the right to object, on compelling and legitimate grounds relating to his particular situation, to the processing of data relating to him. This right to object must at least cover the cases where processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority, and where processing is necessary for the purposes of the legitimate interests pursued by the controller. A special legal regime applies to the processing of personal data for the purposes of direct marketing, and in particular to unsolicited commercial communications sent electronically such as “spam”. Of course, adequate security measures should be taken in order to guarantee that only the data subject has on-line access to information concerning him.

In order to be able to exercise their rights, users must know what rights they have and understand them. The human computer interface should provide obvious tools for exercising the data subject's rights.
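
As a rough sketch of such tools (the operations and names below are illustrative assumptions, not a prescribed interface), a minimal service for exercising the rights of access, rectification and objection could expose:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a toy in-memory service exposing the data subject's
# rights of access, rectification and objection. Method names are assumptions.

@dataclass
class DataSubjectRightsService:
    store: dict[str, dict[str, str]] = field(default_factory=dict)   # subject id -> personal data
    objections: set[str] = field(default_factory=set)                # subjects who objected to processing

    def access(self, subject_id: str) -> dict[str, str]:
        """Right of access: return a copy of everything held about the subject."""
        return dict(self.store.get(subject_id, {}))

    def rectify(self, subject_id: str, key: str, value: str) -> None:
        """Right to rectification: correct a stored item."""
        self.store.setdefault(subject_id, {})[key] = value

    def object(self, subject_id: str) -> None:
        """Right to object: record the objection so further processing can be blocked."""
        self.objections.add(subject_id)

svc = DataSubjectRightsService()
svc.rectify("alice", "address", "1 Example Street")
svc.object("alice")
print(svc.access("alice"), "alice" in svc.objections)
```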

  9. Security: Providing appropriate security for personal data held within ICT systems is one of the cornerstones of informational privacy protection. Measures of a technical and organizational nature, suitable and proportional to the sensitivity of the personal data and the nature of the possible risks, have to be taken to avoid potential harm should the PII be misused or disclosed in an unauthorized manner. Such suitable measures of a technical and organizational nature make up the necessary tailpiece of lawful processing.

Whilst the privacy facilitation principles express the general measures to be taken to protect privacy, they are not the prime candidates for the start of a threat identification process. The original threats have to be found before an information system is built. Therefore a privacy threat analysis has to be executed.

  10. Processing by a Processor: If processing is outsourced to a processor (acting on behalf of the controller/collector), it must be ensured that a contract (or another legal act) in writing (or in another equivalent form) is concluded between the two parties, binding the processor to the controller and stipulating, in particular, that:
    1. The processor shall only act on instructions from the controller; and
    2. The obligations concerning security shall also be applicable and be binding on the processor irrespective of the country in which the data processing is taking place.
  11. Transfer of Personal Data Outside the EU: In principle, the transfer of personal data to a country outside the EU is permitted only if that country offers adequate protection. Personal data may flow freely between all the 25 Member States of the EU and the three European Economic Area (EEA) states (Norway, Liechtenstein and Iceland). In addition, the Commission may, by means of a decision, determine that a third country ensures an adequate level of protection. The effect of such a decision is that personal data can flow from the 25 EU Member States plus the EEA to that third country without any further safeguard being necessary. The Commission has so far recognized Switzerland, Canada, Argentina, Guernsey, the Isle of Man, the US Department of Commerce's Safe Harbor Privacy Principles, and the transfer of Air Passenger Name Records to the United States' Bureau of Customs and Border Protection as providing adequate protection. However, on 30 May 2006 the EU Court of Justice in Luxembourg ruled in the joined cases C-317/04 and C-318/04 that the decision of the Commission on adequacy does not fall within the scope of the DPD, and the Court therefore annulled the decision on adequacy. The result of this ruling is that Passenger Name Records cannot be transferred to the U.S. after 30 September 2006. On 4 October 2006 an agreement was signed between the EU and the US concerning a provisional transfer and use of PNR data by the US Department of Homeland Security.
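
A minimal sketch of how an application might gate transfers on adequacy is given below; the country lists merely mirror the examples named in this section, are abbreviated, and would need to be kept current, so nothing here should be read as a legal source.

```python
# Illustrative sketch: allow a transfer of personal data outside the EU/EEA only if
# the destination lies within the EU/EEA, has been recognized as offering adequate
# protection, or further safeguards are in place. Lists are incomplete examples only.

EEA_AND_EU = {"NO", "IS", "LI"} | {"AT", "BE", "DE", "FR", "NL", "SE"}   # abbreviated for the example
ADEQUATE_THIRD_COUNTRIES = {"CH", "CA", "AR", "GG", "IM"}                # per the decisions cited above

def transfer_allowed(destination_country: str, safeguards_in_place: bool = False) -> bool:
    """Transfer is allowed inside the EU/EEA, to recognized third countries,
    or when further safeguards (e.g. contractual clauses) are in place."""
    return (destination_country in EEA_AND_EU
            or destination_country in ADEQUATE_THIRD_COUNTRIES
            or safeguards_in_place)

print(transfer_allowed("NO"))        # True - EEA state
print(transfer_allowed("US"))        # False without further safeguards
print(transfer_allowed("US", True))  # True when additional safeguards apply
```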

Four Requirements from the Directive 2002/58/EC as Data subject’s rights

a. Confidentiality of Communications

According to Article 5 of this Directive, the confidentiality of communications by means of a public communications network and publicly available electronic communications services (and of the related traffic data) is ensured through national legislation in a way analogous to the centuries-old secrecy of correspondence. In particular, listening, tapping, storage or other kinds of interception or surveillance of communications and the related traffic data by persons other than users, without the consent of the users and except when legally authorized to do so, are prohibited. However, legal authorization for the monitoring of electronic communications is possible when it constitutes a necessary, appropriate and proportionate measure within a democratic society to safeguard national security, defense, public security, and the prevention, investigation, detection and prosecution of criminal offences or of unauthorized use of the communications system.

b. Traffic Data

Article 6 requires that traffic data (i.e. any data processed for the purpose of the conveyance of a communication) relating to subscribers and users, processed and stored by the provider of a public communications network or publicly available electronic communications service, must be erased or made anonymous when they are no longer needed for the purpose of the transmission of a communication. Traffic data necessary for the purposes of subscriber billing and interconnection payments may be processed, but such processing is permissible only up to the end of the period during which the bill may lawfully be challenged or payment pursued.
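
The Article 6 rule lends itself to a simple decision function; the sketch below is illustrative only, and the record layout and field names are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Illustrative sketch of the Article 6 rule described above: traffic data are kept
# only while needed for transmission, or - if needed for billing - until the period
# in which the bill may be challenged has ended.

@dataclass
class TrafficRecord:
    caller: str
    callee: str
    needed_for_transmission: bool
    needed_for_billing: bool
    billing_dispute_deadline: Optional[datetime] = None

def must_erase_or_anonymise(rec: TrafficRecord, now: Optional[datetime] = None) -> bool:
    now = now or datetime.now(timezone.utc)
    if rec.needed_for_transmission:
        return False
    if rec.needed_for_billing and rec.billing_dispute_deadline and now <= rec.billing_dispute_deadline:
        return False                     # billing retention still permissible
    return True                          # otherwise erase or make anonymous

rec = TrafficRecord("A", "B", needed_for_transmission=False, needed_for_billing=False)
print(must_erase_or_anonymise(rec))      # True
```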

c. Location Data

Article 9 defines location data as data processed in an electronic communications network, indicating the geographic position of the terminal equipment of a user. Such data may only be processed when they are made anonymous, or with the consent of the users or subscribers, to the extent and for the duration necessary for the provision of a value added service. The service provider must inform the users or subscribers, prior to obtaining their consent, of the type of location data other than traffic data which will be processed, of the purposes and duration of the processing and whether the data will be transmitted to a third party for the purpose of providing the value added service. Where consent of the users or subscribers has been obtained for the processing of location data other than traffic data, the user or subscriber must continue to have the possibility, using a simple means and free of charge, of temporarily refusing the processing of such data for each connection to the network or for each transmission of a communication.
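
A compact sketch of the location-data rule (anonymisation or prior consent, with a per-connection possibility to temporarily refuse); the function and parameter names are illustrative assumptions.

```python
# Illustrative sketch of the Article 9 rule described above: location data (other
# than traffic data) may be used for a value added service only if anonymised or
# covered by prior consent, and the user may temporarily refuse per connection.

def may_use_location(anonymised: bool, consent_given: bool,
                     temporarily_refused_for_this_connection: bool) -> bool:
    if anonymised:
        return True
    return consent_given and not temporarily_refused_for_this_connection

print(may_use_location(False, True, False))   # True  - consent given, no refusal
print(may_use_location(False, True, True))    # False - user refused for this connection
```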

d. Unsolicited Communication

Article 13 makes clear that the use of electronic mail for the purposes of direct marketing is only allowed in respect of subscribers who have given their prior consent (opt-in). As an exception to this general rule, it remains possible for merchants to send electronic mail to their own customers for direct marketing of similar products or services, provided that customers are clearly and distinctly given the opportunity to object. Other types of unsolicited communications than electronic mail (for instance, SMS) for purposes of direct marketing are subject either to an opt-in or an opt-out system (at the discretion of the Member States).
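
As a minimal sketch of the Article 13 rule for e-mail (the function and flag names are illustrative assumptions):

```python
# Illustrative sketch: direct-marketing e-mail requires prior consent (opt-in), except
# towards existing customers for similar products or services, and never once the
# customer has objected.

def may_send_marketing_email(opted_in: bool, existing_customer: bool,
                             similar_product: bool, has_objected: bool) -> bool:
    if has_objected:
        return False
    if opted_in:
        return True
    return existing_customer and similar_product    # the "own customers" exception

print(may_send_marketing_email(False, True, True, False))   # True  - exception applies
print(may_send_marketing_email(False, False, True, False))  # False - no opt-in, not a customer
```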

Legislation

Privacy is a human right

Due to the horrors of the Second World War, privacy (respect for someone’s private life) was recognized internationally as a human right in 1948, in Article 12 of the Universal Declaration of Human Rights, adopted by General Assembly Resolution 217A (III). The 1950 European Convention for the Protection of Human Rights and Fundamental Freedoms protects, in Article 8, the fundamental right of the individual to respect “for his private and family life, his home and his correspondence”. No interference by a public authority is allowed unless the interference is in accordance with the law and is necessary in the interests of national security, public safety or the economic well-being of the country, the prevention of disorder or crime, the protection of health or morals, or the protection of the rights and freedoms of others.

The European Commission of Human Rights and the European Court of Human Rights (ECHR) have many times interpreted the protection of Article 8 expansively and interpreted its restrictions narrowly. Unauthorized processing of personal data has been considered by the ECHR an infringement of the right to the protection of privacy.

Privacy Facilitation principles in the legislation

A further major step towards privacy protection in Europe was the release in 1980 of the Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data by the Organization for Economic Co-operation and Development (OECD) and, in 1981, the adoption of Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data by the Council of Europe. These legal instruments have incorporated the privacy facilitation principles to a greater or lesser extent, and subsequently these principles have been implemented in the EU directives and in the national legislation that follows the EU directives on privacy protection.

In the OECD Guidelines the following principles have been formulated: a. Collection Limitation Principle, b. Data Quality Principle, c. Purpose Specification Principle, d. Use Limitation Principle, e. Security Safeguards Principle, f. Openness Principle, g. Individual Participation Principle, and h. Accountability Principle.

Today in the EU, informational privacy protection for individuals is articulated by means of different EU Directives, namely Directive 95/46/EC on the protection of individuals with regard to the processing of personal data (the Data Protection Directive, hereafter DPD), the 1999 Digital Signature Directive 99/93/EC and Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector (hereafter DPEC). The EU Commission regards the directives 95/46/EC and 2002/58/EC as one piece of legislation for the protection of privacy online and offline. This kind of legislation defines a set of rights concerning personal data accruing to individuals irrespective of the sector of application, and creates obligations concerning the processing of data by third parties.

As a consequence of the requirement in Article 25 of Directive 95/46/EC for adequate protection of personal data when transferred outside the EU, the Safe Harbor Agreement was concluded on 26 July 2000 between the EU and the US. Organizations must comply with the seven Safe Harbor principles before they can process personal data transferred from the EU. These principles are: Notice, Choice, Onward transfer, Access, Security, Data integrity and Enforcement.

Since 9/11, more than ever, at the heart of the debate is the balance between human rights and civil liberties, including privacy, and the intrusion on these rights in the name of public safety. Because of the war against terrorism, Directive 2006/24/EC (hereafter DRD) on the retention of data processed in connection with the provision of public electronic communication services has been adopted for the harmonization of the legislation of the Member States. According to the recitals of the Directive, one of the compelling reasons for it is that, because of the significant growth of the possibilities of electronic communications, data relating to the use of electronic communications are particularly important and therefore a valuable tool in the prevention, investigation, detection and prosecution of crime and criminal offences, in particular organized crime. The Directive aims to harmonize the obligations of the providers of publicly available electronic communications services or of public communications networks with respect to the retention of certain data which are generated or processed by them, in order to ensure that the data are available for the purpose of the investigation, detection and prosecution of serious crime, as defined by each Member State in its national law. The data that have to be retained are:

a) Data necessary to trace and identify the source of a communication;

b) Data necessary to identify the destination of a communication;

c) Data necessary to identify the date, time and duration of a communication;

d) Data necessary to identify the type of communication;

e) Data necessary to identify users’ communication equipment or what purports to be their equipment;

f) Data necessary to identify the location of mobile equipment.

g) Data concerning […] Internet e-mail: the date and time of the log-in and log-off of the Internet e-mail service, based on a certain time zone.

The retention period for these data is not less than six months and not more than two years from the date of the communication. Each Member State has the obligation to transpose this Directive into national legislation not later than 15 September 2007, but may postpone the application of this Directive to the retention of communications data relating to Internet access, Internet telephony and Internet e-mail until 15 March 2009.
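
A minimal sketch of the retention-period bound just described; the day counts are approximations chosen for the example, and only the six-month and two-year limits come from the text.

```python
from datetime import timedelta

# Illustrative sketch: validating a nationally chosen retention period against the
# bounds of the Data Retention Directive described above (>= 6 months, <= 2 years).
# Six months and two years are approximated in days for the example.

MIN_RETENTION = timedelta(days=183)      # roughly six months
MAX_RETENTION = timedelta(days=730)      # roughly two years

def retention_period_allowed(period: timedelta) -> bool:
    return MIN_RETENTION <= period <= MAX_RETENTION

print(retention_period_allowed(timedelta(days=365)))   # True  - one year
print(retention_period_allowed(timedelta(days=90)))    # False - shorter than six months
```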

Norwegian legislation

The transfer of personal data to a country outside the EU is permitted only if that country offers adequate protection. Personal data may flow freely between all the 25 Member States of the EU and the three European Economic Area (EEA) states (Norway, Liechtenstein and Iceland). Norway is a party to the 1992 Agreement on the European Economic Area (EEA) and, as such, is required to comply with the EU Directives once they are formally incorporated into the EEA Agreement. The Personal Data Act of 2000 was approved on 14 April 2000. It was designed to update Norwegian law and closely follows the EU Directive, even though Norway is not a member of the EU. New in relation to the EU Directive is that the act imposes a duty to inform the data subject when, on the basis of a personal profile, either the data subject is approached or contacted, or a decision directed at the data subject is made. In such a case, the data subject must be automatically informed of the data controller’s identity, the data constituting the profile and the source of these data. The Electronic Communications Act of 2003 and its accompanying regulations implement the requirements of the EU Directive 2002/58/EC. The law and its sequels stipulate that all electronic communication providers must keep records of all their end users.

The Data Inspectorate (Datatilsynet) is an independent administrative body set up under the Ministry of Justice in 1980. The Inspectorate accepts applications for licenses for data registers and evaluates the licenses, enforces the privacy laws and regulations, and provides information. The Inspectorate can conduct inspections and impose sanctions. Decisions of the Inspectorate can be appealed to the Ministry of Justice.

Privacy Principles for IT systems

While designing PETweb applications and prototypes, the privacy facilitation principles discussed under 3.1 have to be kept in mind. These principles are derived from the relevant EU laws (in particular, the Data Protection Directive 95/46/EC), as well as from the OECD guidelines on privacy and Norwegian data protection legislation. The default position of the PETweb design starts from maximum privacy. This means that interactions are a priori anonymous or pseudonymous. Privacy and anonymity are also ensured with respect to system operators, unless the law requires otherwise.
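
As a sketch of what this privacy-by-default position might look like as configuration (every option name below is invented for illustration; nothing here is specified by PETweb):

```python
from dataclasses import dataclass

# Illustrative sketch: privacy-by-default settings for an application, reflecting the
# design position described above. Every option name here is an invented assumption.

@dataclass(frozen=True)
class PrivacyDefaults:
    interaction_identity: str = "anonymous"     # a priori anonymous (or "pseudonymous")
    visible_to_operators: bool = False          # privacy also ensured towards system operators
    marketing_opt_in: bool = False              # no direct marketing without explicit opt-in
    data_minimisation: bool = True              # collect only what the purpose requires

DEFAULTS = PrivacyDefaults()

def identified_processing_allowed(legal_obligation: bool) -> bool:
    """Deviate from the anonymous default only where the law requires identification."""
    return legal_obligation or DEFAULTS.interaction_identity == "identified"

print(DEFAULTS)
print(identified_processing_allowed(False))   # False - stay anonymous by default
```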

a. Principles concerning the fundamental design of products and applications:

  1. Data minimization (maximum anonymity and early erasure of data)
  2. Transparency of processing
  3. Security

b. Principles concerning the lawfulness of processing:

  1. Legality (e.g. consent)
  2. Special categories of personal data
  3. Finality and purpose limitation
  4. Data quality

c. Rights of the data subject:

  1. Information requirements
  2. Access, correction, erasure, blocking
  3. Objection to processing

d. Data traffic with third countries

e. Notification requirements

f. Processing by a processor – responsibility and control

g. Requirements resulting from Directive 2002/58/EC and the Norwegian legislation, i.e. Security, Confidentiality of (tele)communications, Traffic data, Location data, Unsolicited communications (spam)