Bygrave, Lee A --- "Core principles of data protection" [2001] PrivLawPRpr 9; (2001) 7(9) Privacy Law and Policy Reporter 169

Core principles of data protection

Lee A Bygrave

This is the second of a series of articles entitled ‘An international data protection stocktake @ 2000’, presenting a transnational perspective on the central features of data protection laws at the start of the new millennium. Part 1, ‘Regulatory trends’, appeared in (2000) 6(8) PLPR 129 — General Editor.

The core principles applied by the data protection laws of many jurisdictions to the processing of personal data can be categorised under eight headings: fair and lawful processing; minimality; purpose specification; information quality; data subject participation and control; disclosure limitation; information security; and sensitivity. As we shall see, these categories are not always hard and fast; considerable overlap exists between them. Further, each of them is, in reality, a constellation of multiple principles.

This article analyses the constituent elements of these principles and discusses the main similarities and differences in their formal manifestation in the various data protection instruments. Detailed analysis of the scope and content of the principles and of the range of legal exemptions to their implementation is beyond the scope of the article.

The principles are primarily abstractions which denote the basic thrust of a set of legal rules. At the same time, they have a normative force of their own. This force is achieved in several ways. First, the principles (or a selection of them) have been expressly incorporated into certain data protection laws as fully fledged legal rules in their own right (though not always using exactly the same formulations as given in this article). Second, the principles function as standards which guide the balancing of interests by, for example, data protection authorities in the exercise of their discretionary powers. Finally (and closely related to the latter function), the principles help to shape the drafting of new data protection laws. This is most obviously exemplified by the considerable influence the 1980 OECD Data Protection Guidelines (OECD Guidelines) have had on the drafting of the legislation of certain OECD member states, particularly Australia and New Zealand.

Fair and lawful processing

The primary principle of data protection laws is that personal data should be ‘processed fairly and lawfully’.[1] This principle is ‘primary’ because, as shown below, it both embraces and generates the other core principles of data protection laws. Concomitantly, the twin criteria of fairness and lawfulness are implicit in all of these principles, even if they are expressly linked in some instruments only to the means for collection of personal data[2] or not specifically mentioned at all.[3]

Of the two notions ‘fairly’ and ‘lawfully’, the latter is relatively self-explanatory. The notion of fairness is less obvious in meaning but potentially broader. An exhaustive explication of the notion of fairness probably cannot be achieved in the abstract. Moreover, general agreement on what is fair will inevitably change over time. Nevertheless, at a very general level, the notion of fairness undoubtedly means that, in striving to achieve their data processing goals, data controllers must take account of the interests and reasonable expectations of data subjects; controllers cannot ride roughshod over these. This means that the collection and further processing of personal data must be carried out in a manner that does not, in the circumstances, intrude unreasonably upon the data subjects’ privacy or interfere unreasonably with their autonomy and integrity. In other words, fairness requires balance and proportion. These requirements are applicable not just at the level of individual data processing operations; they are equally applicable to the way in which the information systems supporting such operations are designed and structured.

In light of these requirements, fairness also implies that a person is not unduly pressured into supplying data on him or herself to a data controller or accepting that the data be used by the latter for particular purposes. Arguably, fairness therefore implies a certain protection from abuse by data controllers of their monopoly position. While very few data protection instruments expressly address the latter issue,[4] some protection from abuse of monopoly can be read into the relatively common provisions on data subject consent, particularly the requirement that such consent be ‘freely given’.[5]

The notion of fairness further implies that the processing of personal data be evident to the data subject.[6] Fairness not only militates against surreptitious collection and further processing of personal data; it also militates against deception of the data subject as to the nature and purposes of the data processing.[7] Arguably, another requirement flowing from the link between fairness and transparency is that, as a point of departure, personal data shall be collected directly from the data subject, not from third parties. This requirement is expressly laid down in some but not the majority of data protection instruments.[8]

As mentioned above, fairness implies that data controllers must take some account of the reasonable expectations of data subjects. This has direct consequences for the purposes for which data may be processed. It helps to provide the ground rules for the purpose specification principle (dealt with in more detail below) and sets limits on the secondary purposes to which personal data may be put. More specifically, it means that when personal data obtained for one purpose are subsequently used for another purpose which the data subject would not reasonably anticipate, then the data controller may have to obtain the data subject’s positive consent to the new use.[9]

Minimality

A second core principle of data protection laws is that there should be restrictions on the amount of personal data collected; the amount of data collection should be limited to what is necessary to achieve the purpose(s) for which the data are gathered and processed. This principle is summed up here as ‘minimality’, though it could just as well be summed up in terms of ‘necessity’ or ‘non-excessiveness’. In some data protection instruments, the principle is described as ‘proportionality’.[10]

As with the principle of fair and lawful processing, the principle of minimality is manifest in a variety of provisions. It is most obviously manifest in provisions along the lines of art 6(1)(c) of the EC Directive on Data Protection which stipulates that personal data must be ‘relevant and not excessive in relation to the purposes for which they are collected and/or further processed’. It is also manifest in provisions such as art 6(1)(e) of the Directive, which requires personal data to be erased or anonymised once they are no longer required for the purposes for which they have been kept. The minimality principle is further manifest in the Directive’s basic regulatory premise — embodied in arts 7-8 — which is that the processing of personal data is prohibited unless it is necessary for achieving certain specified goals.

The minimality principle does not shine so clearly or broadly in all data protection instruments as it does in the Directive. For instance, the 1990 UN Data Protection Guidelines (UN Guidelines) and the OECD Guidelines omit an express requirement of minimality at the stage of data collection (though such a requirement can arguably be read into the more general criterion of fairness, as described above). The OECD Guidelines also omit a specific provision on the destruction or anonymisation of personal data after a certain period. Again, though, erasure or anonymisation may be required pursuant to other provisions, such as those setting out the principle of ‘purpose specification’ (see below).[11] Most (but not all)[12] national laws make specific provision for the erasure of personal data once the data are no longer required.

Rules encouraging transactional anonymity are also direct manifestations of the minimality principle. Currently, very few data protection laws contain rules expressly mandating or encouraging transactional anonymity.[13] However, it is arguable that such requirements may be read into the more commonly found provisions (described above) in which the minimality principle is manifest, particularly when these provisions are considered as a totality.

Purpose specification

Another core principle of data protection laws is that personal data should be collected for specified, lawful or legitimate purposes and not subsequently be processed in ways that are incompatible with those purposes. This norm is often termed the principle of ‘purpose specification’.[14] Sometimes the terms ‘purpose finality’ or ‘purpose limitation’ are employed instead.

The principle has three separate components, each of which may be regarded as a principle in itself:

(1) the purposes for which data are collected should be specified/defined;

(2) these purposes should be lawful/legitimate; and

(3) the purposes for which the data are further processed should not be incompatible with the purposes for which the data are first collected.

The term ‘purpose specification’ denotes the first listed principle more aptly than the latter two. Nevertheless, the notion of purpose specification is used here to cover all three principles.

The requirement for purpose specification is prominent in all of the main international data protection instruments.[15] It is also prominent in most (but not all)[16] of the national laws and/or in administrative practice pursuant to them.[17] Some laws stipulate that the purposes for which data are processed shall be ‘lawful’.[18] Other laws, such as the EC Directive and Council of Europe Convention on Data Protection (the CoE Convention), stipulate that such purposes shall be ‘legitimate’.

Fairly solid grounds exist for arguing that the notion of ‘legitimacy’ carries the criterion of social acceptability — personal data should only be processed for purposes that do not run counter to predominant social mores. In other words, the purpose specification principle, insofar as it uses the legitimacy criterion, can arguably be said to harbour a ‘social justification principle’ similar to that proposed by, inter alia, the NSW Privacy Committee and Michael Kirby.[19] Nevertheless, the question remains of how such mores are to be defined. Are they to be defined in terms of procedural norms which hinge on a criterion of lawfulness (for example, that the purposes for which personal data are processed should be compatible with or fall naturally within the ordinary and lawful ambit of the particular data controller’s activities)? Or do they also embrace more than a lawfulness criterion (for example, that the data controller’s activities are socially desirable in the sense that they promote or do not detract from some generally valued state of affairs constituted by, say, a particular balance between privacy related interests and economic interests)?

The bulk of data protection instruments seem prima facie to comprehend legitimacy in terms of procedural norms hinging on a criterion of lawfulness; very few expressly operate or have operated with a broader criterion of social justification.[20] Nevertheless, the discretionary powers given by some laws to national data protection authorities have enabled them to apply a relatively wide ranging test of social justification, particularly in connection with the licensing of certain data processing operations.[21] While this ability is being reduced in line with reductions in the scope of licensing schemes, it will not disappear completely.[22]

Information quality

The principle of information quality stipulates that personal data should be valid and accurate with respect to what they are intended to describe, and relevant and complete with respect to the purposes for which they are intended to be processed. All data protection laws contain rules directly embodying the principle, but they vary considerably in their wording, scope and stringency.

Regarding the first element of the principle (the validity of data), data protection laws use a variety of terms to describe the stipulated data quality. Article 5(d) of the CoE Convention and art 6(1)(d) of the EC Directive state that personal data shall be ‘accurate and, where necessary, kept up to date’.[23] The equivalent provisions of some other data protection instruments refer only to a criterion of accuracy/correctness,[24] while still others supplement the latter with other criteria, such as completeness.[25]

With regard to the principle’s second element, the EC Directive formulates this as a requirement that personal data be ‘adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed’ (art 6(1)(c)).[26] Some data protection instruments refer to the criteria of relevance, accuracy and completeness but do not refer to non-excessiveness.[27]

Finally, variation exists in terms of the stringency with which data protection instruments require checks on the validity of personal data. The standard set by the EC Directive, for example, is in terms of ‘every reasonable step must be taken’ (art 6(1)(d)). By contrast, the UN Guidelines emphasise a duty to carry out ‘regular checks’ (principle 2). Many other data protection instruments, including the OECD Guidelines and CoE Convention, do not explicitly address the issue of quality checks at all, although their requirements that personal data ‘should’ or ‘must’ be of a certain quality imply the need for some sort of checking system.

Data subject participation and control

A core principle of data protection laws is that individuals should be able to participate in, and have a measure of influence over, the processing of data on them by other individuals or organisations. This principle embraces what the OECD Guidelines term the ‘Individual Participation Principle’ (see para 13), though the rules giving effect to it cover more than what is articulated in that particular paragraph.

Data protection instruments rarely contain one special rule expressing this principle in the manner formulated above. Rather, the principle manifests itself more obliquely in a combination of several categories of rules. First, there are rules which aim at making people aware of data processing activities generally. The most important of these rules are those requiring data controllers to provide basic details of their processing of personal data to data protection authorities,[28] coupled with a requirement that the authorities store this information in a publicly accessible register.[29]

Second, and arguably of greater importance, is a category of rules aimed at making people aware of basic details of the processing of data on themselves. This category of rules can be divided into three main subcategories:

(1) rules requiring data controllers to collect data directly from data subjects in certain circumstances;

(2) rules prohibiting the processing of personal data without the consent of the data subjects; and

(3) rules requiring data controllers to inform data subjects directly of certain details of their data processing operations.

As noted above, rules falling under the first subcategory are found only in a minority of data protection instruments, though such rules could and should be read into the more common and general requirement that personal data be processed ‘fairly’. Examples of the second subcategory of rules are provided below.

As for rules belonging to the third subcategory, influential examples of these are arts 10-11 of the EC Directive which, in summary, require data controllers to directly supply data subjects with basic information about the parameters of their data processing operations, independently of the data subjects’ use of their own access rights. None of the other main international data protection instruments lay down such requirements directly.[30] National data protection laws often make this a requirement only in cases when data are collected directly from the data subject.[31] Some national laws have a notification requirement for particular kinds of data processing, such as disclosure of customer data[32] or health research,[33] though notification in such cases has been independent of whether or not the data controller has collected the data directly from the data subject. It is expected that the current notification requirements pursuant to national laws of at least EU and EEA member states will be harmonised and expanded in accordance with the EC Directive. At the same time, some of the newly enacted national laws within Europe stipulate duties of information which go beyond the prima facie requirements of arts 10-11 of the Directive. These duties of information arise in connection with certain uses of personal profiles[34] and video surveillance.[35]

There are also rules which grant individuals the right to gain access to data kept on them by other individuals and organisations. Most, if not all, data protection instruments provide such a right. An influential formulation of this right is given in art 12 of the EC Directive. This provides individuals with a right of access not just to data relating directly to them but also to information about the way in which the data are used, including the purposes of the processing, the recipients and sources of the data, and the ‘logic’ involved in certain automated data processing operations. The right in art 12 is similar to, but more extensive than, the equivalent rights found in the other main international data protection instruments.[36] None of the latter, with the exception of the UN Guidelines, specifically mention the subject’s right to be informed of the recipients of data. None mention the right to be informed of the logic behind automated data processing. Most national laws also omit specification of these rights, though the Directive should soon bring about a change in this situation — at least in Europe.

The third major category comprises rules which allow persons to object to others’ processing of data on themselves and to demand that invalid, irrelevant or illegally held data be corrected or erased. The ability to object is linked primarily to rules prohibiting various types of data processing without the consent of the data subjects. Such rules are especially prominent in the EC Directive, relative to older data protection instruments.[37] Some older instruments make no express mention of a consent requirement,[38] while others often stipulate consent in fairly narrow contexts — for example, as a precondition for disclosure of data to third parties.[39] It is important to note that consent is rarely laid down as the sole precondition for the particular type of processing in question; consent tends to be one of several alternative prerequisites. This is also the case with the EC Directive. The alternative prerequisites are often broadly formulated, significantly reducing the extent to which data controllers are hostage to the consent requirement in practice.

A specific right to object is also laid down in some data protection laws. The EC Directive contains important instances of such a right, namely in art 14(a) (which provides a right to object to data processing generally), art 14(b) (which sets out a right to object to direct marketing) and, most innovatively, art 15(1) (stipulating a right to object to decisions based on fully automated assessments of one’s personal character). These rights to object are not found in the other main international data protection instruments.[40] Neither are they currently found in the bulk of national laws, though this situation will change in the near future — at least in Europe — largely under the influence of the Directive.

With respect to rectification rights, most data protection instruments have provisions which give persons the right to demand that incorrect, misleading or obsolescent data relating to them be rectified or deleted by those in control of the data, and/or require that data controllers rectify or delete such data.[41]

Disclosure limitation

Another core principle of data protection is that data controllers’ disclosure of personal data to third parties should be restricted so that disclosure may occur only upon certain conditions. In practice, disclosure limitation means as a bare minimum that personal data ‘should not be disclosed ... except: (a) with the consent of the data subject; or (b) by the authority of law’.[42]

The principle of disclosure limitation, like that of individual participation and control, is not always expressed in data protection instruments in the manner formulated above. Moreover, neither the CoE Convention nor the EC Directive specifically addresses the issue of disclosure limitation; both treat it as part of the broader issue of the conditions for processing data.[43] Thus, neither of these instruments apparently recognises disclosure limitation as a separate principle; each incorporates it within other principles, particularly those of fair and lawful processing, and purpose specification. The OECD Guidelines incorporate the principle of disclosure limitation within a broader principle termed the ‘Use Limitation Principle’ (para 10), while the UN Guidelines specifically address the issue of disclosure under the principle of purpose specification.

Nevertheless, disclosure limitation is singled out here as a principle in its own right because it tends to play a distinct and significant role in shaping data protection laws. Concomitantly, numerous national statutes expressly delineate it as a separate principle or set of rules.[44]

Information security

The principle of information security stipulates that data controllers should take steps to ensure that personal data are not destroyed accidentally or subject to unauthorised access, alteration, destruction or disclosure. Representative provisions to this effect are art 7 of the CoE Convention and art 17 of the EC Directive.

The principle of information security has occasionally manifested itself in relatively peculiar provisions. Especially noteworthy is s 41(4) of Denmark’s Personal Data Act of 2000. This states that for personal data which are processed for the public administration and which are of special interest to foreign powers, measures shall be taken to ensure that they can be disposed of or destroyed in the event of war or similar conditions.[45]

Sensitivity

The principle of sensitivity stipulates that the processing of data which are especially sensitive for data subjects should be subject to more stringent controls than other data. The principle is primarily manifest in rules that place special limits on the processing of predefined categories of data. The most influential list of these data categories is provided in art 8(1) of the EC Directive, which includes ‘racial or ethnic origin’, ‘political opinions’, ‘religious or philosophical beliefs’, ‘trade union membership’, ‘health’ and ‘sexual life’. Further, art 8(5) makes special provision for data on criminal records and the like. Similar lists are found in other data protection instruments at both international and national level, but these vary somewhat in scope. For instance, the list in art 6 of the CoE Convention omits data on trade union membership, while the list in the UN Guidelines includes data on membership of associations in general (not just trade unions). The lists in some national laws also include, or have previously included, data revealing a person to be in receipt of social welfare benefits.[46] References to this sort of data will, however, have to be dropped from the lists of the data protection laws of EU and EEA member states, given that the list of data categories in art 8(1) of the Directive is intended to be exhaustive.[47]

Singling out relatively fixed subsets of personal data for special protection breaks with the otherwise common assumption in data protection discourse that the sensitivity of data depends on the context in which the data are used. Accordingly, attempts to single out particular categories of data for special protection independently of their context have not been without controversy.[48] Further, not all data protection instruments contain extra safeguards for designated categories of data. This is the case with the OECD Guidelines and many data protection laws of the Pacific Rim countries. Similarly, the older data protection regimes of some European countries — notably Austria, Germany and the UK — have provided relatively little protection for such data.

The absence of such safeguards in the OECD Guidelines appears to be due partly to failure by the Expert Group responsible for drafting the Guidelines to achieve consensus on which categories of data deserve special protection, and partly to a belief that the sensitivity of personal data is not an a priori given but dependent on the context in which the data are used.[49] The absence of extra protections for designated categories of especially sensitive data in national data protection laws would appear to be due to many of the same considerations, plus an uncertainty over what the possible extra protection should involve.[50]

Lee A Bygrave, Research Fellow, Norwegian Research Centre for Computers and Law.


[1] At an international level, see for example art 5(a) of the 1981 Council of Europe Convention on Data Protection (CoE Convention) and art 6(1)(a) of the 1995 EC Directive on Data Protection (EC Directive). At a national level, see for example art 9 of Italy’s 1996 Law on Protection of Individuals and Other Subjects with Regard to Processing of Personal Data and Data Protection Principle 1 in Sch 1 to the UK Data Protection Act 1998.

[2] The case, for instance, with the OECD Guidelines (see para 7).

[3] The case, for instance, with the Norwegian Personal Data Registers Act 1978 (now repealed).

[4] The most notable exception is s 3(3) of the German Teleservices Data Protection Act 1997. Compare also principle 18 of the Australian Privacy Charter of 1994.

[5] See, for example, art 2(h) of the EC Directive.

[6] The link between fairness and transparency is made explicit in, inter alia, recital 38 of the EC Directive.

[7] The connection between fairness and non-deception is emphasised in, inter alia, s 1(1) of Pt II of Sch 1 to the UK Data Protection Act 1998.

[8] Examples of express provision are s 5(1) of Canada’s federal Privacy Act 1982, Information Privacy Principle 2 of the NZ Privacy Act 1993 and National Privacy Principle 1.4 in Sch 3 to Australia’s federal Privacy Act 1988 (as amended).

[9] This line has been taken by the UK Data Protection Tribunal. See especially the Tribunal’s decision of 24.3.1998 in British Gas Trading Limited v Data Protection Registrar (case reference unspecified). Compare also National Privacy Principle 2.1(a)-(b) in Sch 3 to Australia’s federal Privacy Act.

[10] This term is employed by the Council of Europe in several of its data protection instruments: see, for example, para 4.7 of Recommendation No R (97) 18 on the Protection of Personal Data Collected and Processed for Statistical Purposes (adopted 30 September 1997).

[11] A point noted in para 54 of the Guidelines’ Explanatory Memorandum.

[12] The US federal Privacy Act 1974 being an example. However, a requirement of erasure/anonymisation can arguably be read into other provisions of the Act: see 5 USC, s 552a(e)(1) and (5).

[13] The most far reaching requirements for transactional anonymity are laid down in ss 3(4), 4(1), 4(4) and 6(3) of Germany’s Teleservices Data Protection Act 1997. See also National Privacy Principle 8 in Sch 3 to Australia’s federal Privacy Act.

[14] See, for example, para 9 of the OECD Guidelines and Principle 3 of the UN Guidelines.

[15] See art 5(b) of the CoE Convention, art 6(1)(b) of the EC Directive, Principle 3 of the UN Guidelines and para 9 of the OECD Guidelines.

[16] Norway’s Personal Data Registers Act 1978 (now repealed) is an example here. Nevertheless, the principle was enshrined in chapters 2-3 (see especially s 3-1) of the main regulations to the Act. Compare also the principle’s relatively oblique manifestation in the federal privacy legislation of Australia and the US.

[17] As has been the case, for instance, with respect to the Norwegian legislation: see generally LA Bygrave, Personvern i praksis: Justisdepartementets behandling av klager på Datatilsynets enkeltvedtak 1980–1996 Cappelen Oslo 1997.

[18] See, for example, the OECD Guidelines and Data Protection Principle 2 in Schedule 1 to the UK Data Protection Act.

[19] See NSW Privacy Committee Guidelines for the Operation of Personal Data Systems Background Paper 31 Sydney 1977 p 3; M D Kirby ‘Transborder data flows and the “basic rules” of data privacy’ (1981) 16 Stanford Journal of International Law 27 at 46.

[20] A lonely example is s 4(2) of the Netherlands’ Registration of Persons Act 1988 (now repealed) which stated: ‘The purpose of a personal data file may not be in conflict with the law, the maintenance of public order or morality.’

[21] This has been the case, for example, pursuant to s 3(1) of Sweden’s Data Act 1973 (soon to be repealed) and s 10 of Norway’s Personal Data Registers Act (now repealed).

[22] See, for instance, s 33 of Norway’s Personal Data Act 2000 which maintains the possibility for the national data protection authority to undertake a relatively open ended assessment of licensing applications, albeit with respect to a narrower range of data processing operations than was the case under the 1978 legislation.

[23] Identical or near-identical requirements are set down in the provisions of several national laws, including art 9(1)(c) of Italy’s Law on Protection of Individuals and Other Subjects with Regard to Processing of Personal Data and Data Protection Principle 4 in Sch 1 to the 1998 UK legislation.

[24] See for instance art 5 of Switzerland’s federal Data Protection Act 1992.

[25] This is the case, for example, with para 8 of the OECD Guidelines.

[26] Similarly formulated requirements are found in several national laws: see, for instance, art 5(2) of Hungary’s 1992 Act on the Protection of Personal Data and on the Publicity of Data of Public Interest. The equivalent provision in the CoE Convention is almost identical except that it refers only to the purposes for which data are ‘stored’ (art 5(c)).

[27] See for example para 8 of the OECD Guidelines and ss 4-8 of Canada’s federal Privacy Act.

[28] See for example arts 18-19 of the EC Directive, arts 28-30 of the Hungarian Act and arts 11(2)-(3) of the Swiss Act.

[29] See for instance art 21 of the EC Directive, Information Privacy Principle 5 and s 27(1)(g) of the Australian federal Privacy Act and s 22 of France’s 1978 Law Regarding Data Processing, Files and Individual Liberties.

[30] The UN Guidelines’ ‘principle of purpose specification’ (principle 3) stipulates that the purpose of a computerised personal data file should ‘receive a certain amount of publicity or be brought to the attention of the person concerned’. Compare the more generally formulated ‘Openness Principle’ in para 12 of the OECD Guidelines:

There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

Articles 10-11 of the Directive are supplemented by art 21 which requires member states to ‘take measures to ensure that processing operations are publicised’ (art 21(1)) and to ensure that there is a register of processing operations open to public inspection (art 21(2)).

[31] See for example the US federal Privacy Act (5 USC s 552a(e)(3)), Information Privacy Principle 3 of the NZ Privacy Act and art 18(1) of the Swiss federal Data Protection Act (in relation to ‘systematic’ collection by federal government bodies).

[32] See for instance s 4b(2) of Denmark’s Private Registers Act 1978 (now repealed).

[33] See for instance s 40-5 of France’s 1978 legislation.

[34] Section 21 of Norway’s Personal Data Act 2000 states that when, on the basis of a personal profile, the data subject is approached or contacted, or a decision directed at the data subject is made, he or she must be automatically informed of the data controller’s identity, the data constituting the profile and the source of these data. A similar requirement is found in s 23 of Iceland’s Act on Protection of Individuals with regard to the Processing of Personal Data 2000.

[35] See s 40 of the new Norwegian Act and s 24 of the new Icelandic Act. These provisions extend to surveillance operations in which personal data are not actually registered or stored (for example, on film).

[36] See art 8 of the CoE Convention, paras 12-13 of the OECD Guidelines and principle 4 of the UN Guidelines.

[37] See especially art 7(a) of the Directive which stipulates consent as one (albeit alternative) precondition for processing generally.

[38] This is the case with the CoE Convention.

[39] See for example para 10 of the OECD Guidelines, s 4(2) of the Danish Private Registers Act (repealed) and art 19(1) of the Swiss Act.

[40] Compare principles 5.5, 5.6, 6.10 and 6.11 of the ILO’s 1997 Code of Practice on Protection of Workers’ Personal Data which seek to limit the use of automated decision-making procedures for assessing worker conduct.

[41] See for example art 12(b) of the EC Directive, Principle 4 of the UN Guidelines, s 14 of the UK Act, art 13(1)(c) of the Italian Act and Information Privacy Principle 7 of the NZ Act.

[42] Paragraph 10 of the OECD Guidelines.

[43] See especially arts 5(a), 5(b) and 6 of the Convention, and arts 6(1)(a), 6(1)(b), 7 and 8 of the Directive.

[44] See for example the US federal Privacy Act (5 USC s 552a(b)-(c)), s 8 of Canada’s federal Privacy Act, and Information Privacy Principle 11 in both the NZ Privacy Act and Australia’s federal Privacy Act.

[45] A similar rule was found in s 12(3) of Denmark’s Public Authorities’ Registers Act 1978 (now repealed) and s 29 of the Icelandic Protection of Personal Records Act 1989 (also repealed).

[46] See s 6(6) of Finland’s Personal Data Registers Act 1987 (now repealed), s 4(2) of Sweden’s Data Act 1973 (soon to be repealed) and art 3(c)(3) of the Swiss federal Data Protection Act.

[47] See further the discussion of this point in Bygrave L A Data Protection Law: Approaching Its Rationale, Logic and Limits Faculty of Law Oslo 1999, ch 18, section 18.4.3.

[48] For a forceful, highly persuasive critique of such attempts, see Simitis S ‘“Sensitive Daten” – Zur Geschichte und Wirkung einer Fiktion’ in Brem E, Druey J N, Kramer E A and Schwander I (eds) Festschrift zum 65. Geburtstag von Mario M. Pedrazzini Stämpfli & Cie Bern 1990, pp 469-493.

[49] See the Guidelines’ Explanatory Memorandum paras 43 and 51.

[50] See for example Law Reform Commission of Hong Kong, Report on Reform of the Law Relating to the Protection of Personal Data Government Printer Hong Kong 1994 pp 99 and following; Australian Law Reform Commission (ALRC) Privacy Report No 22 AGPS Canberra 1983 vol 2 paras 1218 and following.
