Privacy Law and Policy Reporter
This article is a revised version of a presentation to a plenary session on ‘New Technologies, Security and Freedom’ at the 22nd Annual Meeting of Privacy and Data Protection Officials held in Venice, 27-30 September 2000.
When I had the privilege of giving the keynote address at the annual meeting of this group in Quebec City in 1987, I chose to look towards the future in data protection, with a focus on the then distant year 2000. I never dreamed that in this millennial year I would be standing before you, or that we would have lived through all of the dramatic change of the 1990s as individuals, as official data protectors and as privacy advocates. I do remember, with some misgivings, my fear that by the year 2000 official data protectors would be reduced to a tiny ragbag of individuals with pitchforks trying to hold back the forces of surveillance. Since I remain an optimist, I do not think that our current situation has reached quite that unfortunate state, but the ‘privacy police’, as I am fond of calling you these days, have very finite resources when it comes to monitoring implementation of data protection.
I do want to return to one of my supposedly controversial points in 1987 — the privacy watchdog analogy — because I think it has stood the test of time, despite the public remonstrations of the then president of the Commission Nationale de l’Informatique et des Libertés (CNIL), who took great umbrage at my use of the term ‘watchdog’. Having had the experience of serving as the first Information and Privacy Commissioner for the Province of British Columbia in Canada (1993-1999), I strongly believe that the conception of the data protection commissioner as a privacy watchdog remains a very powerful and relevant image, reminding me, at least, of the continued inadequacies of countries that do not have such independent watchdogs in place.
The realities at the dawn of the 21st century are that privacy and data protection commissioners, and indeed privacy advocates themselves, are facing a continuing stream of technological innovations that have to be evaluated systematically to measure compliance with the fair information practices or data protection principles that are at the heart of all data protection legislation. That problem is the focus of this first plenary session. Data protectors are facing such arduous responsibilities in the face of an increasing work burden, more and more complex and bureaucratic legislation such as the European Directive on Data Protection and its national clones, and a very fast pace of technological innovation. Understanding any such change can be a complex activity that data protectors will wish to approach in a systematic manner.
What I intend to draw to your attention is an additional tool in the arsenal of the data protector in the form of privacy impact assessments. The idea is to require the preparation of privacy impact assessments for new products, practices, databases and delivery systems involving personal information. In the last five years, privacy specialists have developed an assessment model for the application of a new technology or the introduction of a new service which has good potential for raising privacy alarms at an early stage in an organisation’s planning process in either the public or private sectors. Various models exist for privacy impact assessments that can be customised to the needs of any organisation. The essential goal is to describe personal data flows as fully as possible so as to understand what impact the innovation or modification may have on the personal privacy of employees or customers and how fair information practices may be complied with. Ultimately, a privacy impact assessment is a risk assessment tool for decision-makers that can address not only the legal, but also the moral and ethical, issues posed by whatever is being proposed.
What I am proposing — and it will not be a novel suggestion for those of you from North America and New Zealand in particular — is that privacy regulators require, or at least encourage, those being regulated to prepare a privacy impact assessment for personal data systems that are new or enhanced in some significant way, so that their privacy implications can be analysed and addressed in a coherent manner. This idea of using privacy impact assessments is an emerging tool for addressing certain types of data protection problems that was pioneered, in my opinion, by New Zealand and certain Canadian provinces, including Ontario, BC, and Alberta, during the last half decade.
I realised at Stewart Dresner’s superb Privacy Laws and Business conference in Cambridge in July 2000 that whatever other forms of progress in data protection (such as auditing) have occurred in Europe recently, the concept of a privacy impact assessment as an instrument of data protection has not visibly taken root. I believe that the preparation of a privacy impact assessment, in co-operation with a data protection office, can be extremely useful in helping to avoid an overly legalistic, even Talmudic or Jesuitical, focus in the detailed work of privacy protection. That is because the core of an effective privacy impact assessment is a careful description of how a system (or any application of technology to personal information) actually works. In this process, specific privacy issues can be segregated and addressed in a comprehensive manner. Conducting a privacy impact assessment is also an effective method of engaging a team of persons at any organisation, including technology, policy, legal and privacy specialists, to work together to identify and resolve data protection problems.
Simply put, a privacy impact assessment seeks to set forth, in as much detail as is required to promote necessary understanding, the essential components of any personal information system or any system that contains significant amounts of personal information. I find it easiest to indicate what I have in mind by listing the generic categories of information that should be considered for inclusion in an informative and informed privacy impact assessment (see Table 1 on p 87).
Issues of definition and description of the central components of a privacy impact assessment also involve initial questions of whether an organisation really needs to prepare one in specific circumstances. In the spring of 1999, as Information and Privacy Commissioner for British Columbia, I had to deal with an issue involving detailed patient waiting lists for many hospitals in the lower mainland and Vancouver Island, arranged by reference to the relevant specialist. The advice of my staff was that a privacy impact assessment was not necessary, but I was concerned about the accuracy of the information about the medical practices of individual physicians and whether physicians themselves had agreed to, or were at least aware of, the personal data to be disseminated in the context of their patient waiting lists. The British Columbia Ministry of Health was reluctant to do the work involved but relented over a weekend and prepared a privacy impact assessment for our review within several days. Even the Deputy Minister of Health attended the discussion of the privacy impact assessment at our office with my staff. Since we were quite satisfied with the resulting document, we approved it at once and suggested to the Deputy Minister that he post the privacy impact assessment on the Ministry of Health’s website with the announcement of the waiting list registry, which, ironically, happened the next day (because of the politics of waiting lists for physician services). 
If specialised staff of a data protection office have done their homework with their counterparts in organisations, then significant changes in personal information systems will automatically surface and receive appropriate attention, up to and including the most senior staff of the office and the privacy commissioner. I think that it is fruitless to state upfront that a privacy impact assessment is always required, because in my experience it is quite difficult to make such a decision at an early stage in the development of any system. A better approach in my view is simply to indicate to organisations that privacy impact assessments are highly desirable for significant changes to existing personal information systems or the creation of new ones. Ideally, those responsible for central government oversight of compliance with an Act will ensure that organisations prepare such privacy impact assessments on their own initiative, which can ultimately be reviewed by central government and the privacy commissioner’s office at an appropriate later step in the process. A similar model can work in the corporate world. A data protection office has to delegate as much work as possible in order to avoid being swamped.
Organisations must prepare privacy impact assessments in such a manner as to identify key problems, not try to gloss over them or skip by them, since the specialists in the offices of privacy commissioners will focus on them in the long term. I admire the ‘true believers’ who are advocating various enhanced information systems for seemingly laudable purposes, since what they are proposing is clearly in the public interest, but privacy impact assessments must be written with a more critical eye to the sensitive issues. The hard questions must be answered and not glossed over. ‘Solutions’ to such issues as consent, for example, will probably also be transferable from one privacy impact assessment to another if the thought processes of the team involved are insightful and creative.
A variety of informed groups in Canada and the United States have prepared detailed guides on how to prepare privacy impact assessments. These include the US Internal Revenue Service, Treasury Board Canada (which oversees the federal government’s central administration of compliance with the Canadian federal Privacy Act) and the Ontario Management Board of Cabinet (which plays a comparable role with respect to Ontario’s Freedom of Information and Protection of Privacy Act). In British Columbia, the Information, Science, and Technology Agency (ISTA) and the Office of the Information and Privacy Commissioner have published model forms for the completion of privacy impact assessments. My former office prides itself on the model and detailed worksheet, including critical questions, that it has created for those preparing a privacy impact assessment.
My major criticism of the existing guides to conducting privacy impact assessments is that they violate the KISS principle; that is, ‘keep it simple, stupid’. They give the appearance of being too complicated and burdensome for the users at organisations that will be asked to do the actual work. My sense is that looking at some of these forms and the listed requirements would be a discouragement to co-operation in what is, after all, a largely voluntary activity on the part of those being regulated. Suggestions and guidance have to be as user-friendly as possible, which I think the ISTA forms referred to above have achieved to a considerable measure, as have those of my former Office. There is no use trying to persuade busy bureaucrats to assist the task of effective implementation of data protection by filling out privacy impact assessments and then burdening them with so much complex guidance that would try the patience and willingness to follow through on the process of even the most tolerant among them.
As a privacy and information policy consultant working primarily in Canada during the past 14 months, I have found that the preparation and encouragement of privacy impact assessments is one of the services that I can offer to clients in the public and private sectors. In particular, I have prepared a substantial privacy impact assessment for a federal-provincial effort in the public health surveillance field that features an internet display tool for making available appropriate, timely and relevant data to public health officials.
My direct involvement in the preparation of this privacy impact assessment leads me to make the following observations about the process.
This particular privacy impact assessment has been expensive to execute and difficult for me to accomplish in practical terms, starting from the fact that I came from outside the project team and was not one of the developers or proponents. In theory, it would be preferable for someone inside such a project to draft a privacy impact assessment and keep it up to date, but the lack of readily available models and privacy expertise to date has made that approach difficult for any organisation.
I have spent more than 100 hours on this project and produced a 39-page privacy impact assessment with literally hundreds of footnotes to the supporting documentation (the anonymised table of contents is in Table 1). But the cost of the preparation of this privacy impact assessment was less than 1 per cent of the development costs for the complex delivery system.
From the beginnings of system design several years ago, the proponents had every intention of complying with privacy, confidentiality, and security requirements and legislation. But in my judgment the burdens of building the innovative system (with the central help of an IBM Global Services team) meant that this commitment smacked of lip service in terms of the contents of the substantial project reports that I was originally able to review and that served as the basis for my privacy impact assessment.
The project development team itself lacked the trained resources to prepare a proper privacy impact assessment and to resolve critical data protection issues in a systematic manner (although it made a series of correct ad hoc decisions on data protection issues).
Lest I appear to be overly critical as an outsider, let me acknowledge my admiration for, and empathy with, these system designers and project sponsors and all the challenges that they had to overcome. I learned from them to appreciate much more the sheer difficulties of building a sophisticated and innovative data collection and data display system.
I also learned that there was no use building a system that was so privacy compliant, in terms of disclosure avoidance practices in particular, that it would be of absolutely no use to the public health professionals who are the sole intended users. Some pragmatic rules and solutions needed to be found that would serve all sides of the public good. A cost-benefit analysis and a privacy impact assessment are useful vehicles for balancing competing interests.
In the first instance, I based the draft privacy impact assessment on literally thousands of pages of documentation prepared by those building the system. For reasons that are not totally clear to me, the relevant literature was given to me in dribs and drabs, leading me to reflect after the fact that I was participating in some kind of dance of the seven veils. Those promoting and executing a project need to document their activities as much as possible, so that those following in their footsteps, such as in the preparation of a privacy impact assessment, can understand as much as they need to know of how the system operates and the levels of personal microdata involved at each stage of creation, use and disclosure of the data.
In my judgment, a basic function of a draft privacy impact assessment is to ask probing, detailed questions of the proponents, builders and designers in order to promote comprehension. The role is in effect that of a devil’s advocate.
One definite mistake that I made was in not obtaining a demonstration of the system at an earlier stage in my work. That mistake reflected issues of costs and federal-provincial politics, or at least my limited understanding thereof. I conclude that the ideal privacy impact assessment of any project is prepared by someone from inside the project and with an up-front demonstration of just how it works or is supposed to work. On the other hand, my experience with another national agency in Canada is of being asked to criticise privacy impact assessments that staff have prepared. To date, I have found them lacking in sophistication and skipping over large and small data protection issues — which admittedly can be problematic to deal with in a bureaucratic world where everyone seems to have too much work to do. Internal advocates of innovative systems are naturally reluctant to be too critical of their scheme. My argument is that the best protection for such a project is for the difficult data protection questions to be posed and then answered by means of appropriate solutions as required.
My fear is that it is always going to be difficult to find someone building any automated system who knows enough about data protection principles and fair information practices to be able to apply them in a sophisticated manner to the project in question. The evidence is that few persons understand intuitively what fair information practices are all about.
Executing a successful privacy impact assessment for any application also presupposes a capacity to understand and explain security practices in a manner that the lay reader of any privacy impact assessment will be able to understand. Cutting through jargon is an essential task of the activity.
A related technical issue is the all important one of disclosure avoidance practices. It is one thing for a critic to raise specific privacy issues around such questions as the risk of re-identification in the conduct of research and statistical uses of information, for example, but it is much more difficult to measure the real risks and then to decide how to manage them in a reasonable manner. These are methodological issues that require technical assistance from specialists.
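To give a flavour of the kind of methodological work such specialists do, one widely used way of quantifying re-identification risk is to measure k-anonymity: how many records share each combination of quasi-identifiers (attributes, such as postcode and birth year, that could be matched against outside sources). This is a minimal, hypothetical sketch for illustration only; it is not drawn from the project described in this article, and the sample records are invented.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest group size over the given quasi-identifier
    columns; a low value (especially 1) signals re-identification risk."""
    groups = Counter(
        tuple(rec[q] for q in quasi_identifiers) for rec in records
    )
    return min(groups.values())

# Hypothetical patient-level records: postcode and birth year serve as
# quasi-identifiers that an outsider might be able to link to a person.
records = [
    {"postcode": "V6B", "birth_year": 1950, "diagnosis": "A"},
    {"postcode": "V6B", "birth_year": 1950, "diagnosis": "B"},
    {"postcode": "V8W", "birth_year": 1972, "diagnosis": "A"},
]

# The V8W/1972 combination is unique, so k = 1: that record could be
# re-identified by anyone who knows one such person in that area.
print(k_anonymity(records, ["postcode", "birth_year"]))  # → 1
```

Deciding what value of k is acceptable, and whether to suppress or generalise the offending cells, is precisely the kind of judgment that a privacy impact assessment should record rather than leave implicit.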
The primary purpose of a privacy impact assessment is to allow the organisation building or operating a personal information system to decide whether it complies with relevant data protection legislation at any particular stage in time. An important secondary goal is to meet the privacy expectations of the public with respect to moral and ethical considerations. The office of a privacy or data protection commissioner has crucial roles to play in both activities.
A secondary purpose of a privacy impact assessment is to serve as an educational and negotiating tool for the system operators to use for purposes of compliance reviews by senior management and by the external data protection agent or agency. The privacy impact assessment should make it relatively easy for executives and the privacy commissioner and his or her staff to understand how the system works and what the privacy issues and risks are, if indeed there are any. That is why I favour a sophisticated approach to the contents of a privacy impact assessment that delivers all of the necessary details and does not skirt real issues. The completion of an effective and meaningful privacy impact assessment requires a dialogue (not a diatribe) between the regulator and the regulated.
I would like to take issue with the view that a privacy impact assessment cannot be used to obtain a waiver of, or relaxation from, any requirement of relevant data protection legislation. That should be possible in a practical sense that reflects political reality and real costs to taxpayers in particular. Fair information practices need to be customised to work in practice. For example, when I was informed that it would cost half a million dollars for the Workers’ Compensation Board in British Columbia to replace the use of social insurance numbers to keep track of workers in the province whose hearing was tested regularly over a period of years, I agreed that the cost was excessive in terms of the benefits of linking the testing records by an efficient method.
A privacy impact assessment is a protean document in the sense that it is likely to continue to evolve over time with the continued development of a particular system. This is one of its most important characteristics, since the privacy impact assessment can be used to monitor important changes in any system, especially those with potentially negative implications for the privacy of individuals. It is an early warning system for management and responsible ministers or executives.
I urge public bodies and other organisations in the private sector to post any privacy impact assessment on their website so that it is available to anyone and everyone, including privacy advocates who may wish to second-guess the choices that have been made. An effective privacy impact assessment can also be a guide to others seeking to emulate a particular application, especially within the complex federal, provincial and territorial political system in Canada.
One of the perhaps semantic issues with a privacy impact assessment is whether or not a privacy commissioner really has to approve the finished product. The model process for a privacy impact assessment, based on my experience, is for the staff of the privacy commissioner and the staff of the public body to meet and discuss planned innovations in information systems. If the matter is significant enough, the initial meeting may be with the commissioner, who will naturally express a strong interest in fully understanding how the personal information system will work in practice in the form of a developed privacy impact assessment. My repeated experience was that it took a lot of staff time and persistent effort to figure out the flows of personal information in any information system, especially if, as in one instance, the BC Ministry of Human Resources was proposing to have routine access to the central client registry of the Medical Services Plan of the BC Ministry of Health for selected purposes. My considered view is that at the end of the day, any ministry or organisation has the right to be told that it is acting in compliance with the data protection legislation if the staff executed their plans according to the privacy impact assessment developed in co-ordination with the commissioner’s staff. I know that it is customary in such instances to suggest that the commissioner’s views are subject to later revision on the basis of new information or a privacy complaint, but in my six years of experience in British Columbia we never really had to second-guess ourselves with respect to matters of advice giving on a privacy issue. If privacy concerns are to be taken seriously by public bodies and other organisations that are privacy intensive in terms of their use of personal information, then they have a right, after the exercise of due diligence on both sides, to positive expressions about the privacy impact assessment from the privacy commissioner.
I am persuaded on the basis of direct experience that a successful privacy impact assessment can be a very effective instrument in the toolkit of the 21st century data protection commissioner. It can also be very helpful to senior public servants and their elected Ministers who do not wish to be blindsided by privacy disasters, such as happened to the Canadian Minister of Human Resources Development in May 2000. A proper privacy impact assessment that incorporated the informed observations of the Office of the Privacy Commissioner of Canada might have prevented a political and public relations disaster for that particular Minister and the federal Liberal Government.
David H Flaherty, PhD is a Professor Emeritus, University of Western Ontario, and the former Information and Privacy Commissioner of British Columbia. He is now a privacy and information policy consultant.