

Data Retention and the War Against Terrorism – A Considered and Proportionate Response?

 

Professor Diane Rowland
Faculty of Law, University of Wales, Aberystwyth
Diane.Rowland@aber.ac.uk

 

This paper is a substantially revised and updated version of the paper given at the 2003 BILETA Conference.


Abstract

Data protection law has been a feature of the UK legislative landscape for 20 years and has arguably had a significant impact on the way in which personal data are held and treated. Although the law does not of itself proscribe personal profiling, data matching and data sharing, the controls introduced on the circumstances in which this can lawfully take place have fostered a greater awareness of the possibilities of damage arising out of inappropriate processing. Data matching by government has arguably been treated less stringently in some states than that which occurs in the commercial world and, in the UK, specific enactments expressly allow data matching for a number of purposes independently of the wishes and/or expectations of the data subject[i]. Nonetheless, the now generally accepted principles of informational privacy have been challenged by recent developments advocating data retention by communications service providers for far longer periods than are required for business purposes, and allowing subsequent use of that data for a variety of enforcement purposes not contemplated at the time of collection. On the other hand, the collective interest in a safe society in the context of the perceived threat from terrorism, especially since September 11th, has been used to justify this response. This paper will consider the proportionality of the introduction of data retention in the Anti-Terrorism, Crime and Security Act 2001 and the potential effect of the Code of Practice on Data Retention, brought into effect in December 2003, in the context of the existing data protection laws.

Keywords: Informational privacy, data retention, the Anti-Terrorism, Crime and Security Act 2001, the Code of Practice on Data Retention, data protection laws.


This is a refereed article published on: 30 November 2004.

Citation: Rowland, 'Data Retention and the War Against Terrorism – A Considered and Proportionate Response?’, 2004 (3) The Journal of Information, Law and Technology (JILT).<http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/2004_3/rowland/>


1. Introduction

Within the burgeoning discipline of information technology law, data protection law provides a rare example of legislative foresight[1]. As computer technology began to become an accepted feature of the communications landscape, warnings about potential invasions of privacy began to be sounded. In the US there were a number of early observations and pertinent comments about the privacy concerns raised by computers[2]. However, much of this debate arose as an adjunct to discussion of the privacy threats of more physical methods of surveillance in pursuit of the law breaker, suspected felon and perceived troublemaker[3]. It is beyond the scope of this article to chart the different paths which the development of the law took on both sides of the Atlantic, but in the early stages the concerns identified by US scholars were certainly mirrored in the UK[4]. In its report of 1972, the Younger Committee on Privacy, although not considering that the use of computers presented a threat to individual rights at that time, nevertheless, in readiness, formulated ten principles designed to take into account the interests of those whose data were processed by computer[5]. Although the actual quantity of personal data which, within only a few years, would be processed by computer and available over computer networks was probably never envisaged at the time of Younger, in Europe at least the seeds of data protection, as distinct from wider concerns about the impact of surveillance on human rights and personal autonomy, were sown early, allowing them to germinate before the problems themselves had fully matured. This allowed legislation to be in place when problems did begin to manifest themselves, albeit the emerging law did not sufficiently foresee the different threats consequent on internet technology, and the legacy is legislation which is arguably much more relevant to a previous technological era. Notwithstanding the genesis of the debate on both sides of the Atlantic, instruments such as the Council of Europe Convention on the processing of personal data in 1981[6] and eventually the Data Protection Directive in 1995[7] were to result in a much more proactive approach within Europe[8]. This has created different problems from those in some other areas of law, where legislatures have legislated, or have attempted to legislate, in a more reactive way. As a contrasting example, legislative enactments about computer crime in a number of jurisdictions, such as the Computer Misuse Act 1990 in the UK, were a reactive response to the problems caused by computer hacking and the use of the computer as an instrument for the commission of offences. Interestingly, although these statutes emerged to counteract observed problems, this was not always accompanied by increased success in the detection and apprehension of wrongdoers. Illustrative of this is that the Computer Misuse Act, referred to above, has been much criticised for being underused and ineffective[9], whereas there has been rather more enforcement and compliance activity[10] in relation to data protection and a gradually increasing awareness of the nature of the provisions on the part of both individual data controllers and data subjects. Until recently there has been no serious attempt either to dilute or to devalue the rights given to data subjects by successive data protection statutes, except as provided within the exemptions contained in those statutes.
Notwithstanding the narrow construction placed upon the definitions of ‘personal data’ and ‘relevant filing system’ by the Court of Appeal in Durant v Financial Services Authority[11], in principle the 1998 Act, implementing Directive 95/46/EC, both amplifies and augments the rights previously bestowed by the 1984 Act. The ongoing enhancement of individual rights of informational privacy has been replicated in other jurisdictions[12] but, as discussed below, many states are resiling from this position in the face of the perceived terrorist threat.

2. Purpose and Philosophy of Data Protection

Data protection laws are often justified and explained as an attempt to counter the perceived threat to informational privacy consequent on the increasing use of computers. Although frequently now discussed in terms of the rhetoric of rights, the original objectives of data protection rules were arguably much less aspirational. The Data Protection Act 1984 was enacted not so much to provide fundamental rights for data subjects as to protect Britain’s international trade, which was feared to be under threat if the Council of Europe Convention on the protection of personal data could not be ratified. The Convention contained a provision[13] allowing ratifying States to prohibit transborder personal data flow if the state to which the data were to be transferred did not provide equivalent protection for the personal data. The UK government was afraid that, if it failed to ratify the Convention, it could be categorised as unsuitable to receive personal data, with the consequent possibility of an adverse impact on trade. Despite the pragmatic reasons behind the original statute, it was eventually heralded in some quarters as a statute which was not merely another facet of business regulation.[14] The central core of responsibilities is found in the data protection principles originally formulated in the 1984 Act and retained in amended form in the 1998 statute although, as intimated above, their genealogy precedes both of these enactments[15]. In a nutshell, they delineate the boundaries of lawful processing[16]. They require a proportionate response, balancing the interests of the data subject against the necessary uses made of the data by the data controller.

Thus processing of personal data must be notified to the Information Commissioner; personal data can only be collected and further processed for defined purposes; the amount of data must be commensurate with those purposes; the data must be accurate, kept up to date and not kept longer than necessary; and, importantly, whether processing takes place on the basis of consent or some other legitimisation of processing[17], the data subject should be made aware of the purposes and extent of processing. Although myths and false presumptions abound about the requirements and impact of data protection laws, within the given framework the objective and intention of the legislation is not to stifle genuine and justifiable use of the data by the data controller. Indeed, the criteria listed in Schedules 2 and 3 to legitimise processing recognise this by not requiring consent where processing is necessary for certain specified reasons[18].

It is also for the controller to designate the purposes of processing and so on. More stringent rules apply to the processing of sensitive data[19], but there is no blanket ban on either the collection, or retention[20], of particular types of data. The basic requirement is that there is a legitimate reason for the processing; neither is there a prohibition on sharing personal data with others, provided that the data subject is aware of the categories of disclosure which might occur. The original concept of a total package of protection from the type of intrusion into personal privacy that computerisation makes possible has recently been underlined by the interpretation of ‘relevant filing system’ in Durant.[21] This decision restricts the application of the Act to those filing systems from which information can be retrieved as speedily as by a computerised search. This will drastically reduce the number of manual filing systems which are subject to the Act and could, in practice, effectively confine its application almost entirely to personal data held on computer.

3. Existing Exemptions from Data Protection Laws

As with all laws that control information, there are a number of exemptions from some or all of the requirements of the statute. These range from applications which are seen to pose little threat to the individual, making the application of the full rigour of the Act disproportionate, to those which are exempt because there is some other public interest in disapplying all or some of its provisions. In between these extremes are a number of types of processing which are subject to partial exemptions from, or modifications of, the Act’s provisions. So, for example, there are a number of statutory instruments modifying the rules with respect to the processing of data about physical and mental health[22], and personal data processed for research purposes are exempt from the second and fifth data protection principles and the subject access provision in s7, subject to certain provisos[23].

One category of processing which can be exempted from the need to comply with the majority of the individual safeguards provided in the Act is that of personal data processed for the purposes of national security. These rules were intended to provide a sufficient balance between the requirements of individual privacy and the needs of law enforcement and state security. The exemptions for national security (s28) and for crime (s29) can be contrasted in that s28 is capable, in principle, of providing exemption from virtually all of the provisions which provide protection for the individual data subject. Specifically, it can provide exemption from the data protection principles, Parts II, III and V of the Act (dealing with the rights of the data subject, notification and enforcement respectively) and also s55, which creates certain offences with respect to the unlawful obtaining of personal data, ‘if the exemption from that provision is required for the purpose of safeguarding national security’. Evidence that an exemption on national security grounds is necessary is provided by a system of certification by a Minister[24]. If even the data protection principles are deemed inapplicable to personal data processed for national security purposes, there can be no assurance that the processing will be fair or that other guarantees will be provided relating, for instance, to adequacy and relevance. Accepting that there might be corresponding problems with enforcement and the provision of remedies, it is difficult to see what would be lost by requiring adherence to the principles, especially those relating to fair and lawful processing for the purposes for which the data were collected, other than the ability to be cavalier with the personal data of others. However, the wording of the exemption does suggest that exemption should not be granted if compliance with the Act is possible without prejudicing national security, so that, in theory, there is no automatic blanket exemption. In contrast, the crime exemption exempts only from the first data protection principle (except to the extent that it requires compliance with the conditions in Schedules 2 and 3) and from s7, which provides for the right of subject access. Further, this exemption only applies to the extent that the application of those provisions would be likely to prejudice the prevention or detection of crime. So, in many cases, the full force of the Act will apply, and in all cases the police will be required to process personal data in conformity with the majority of the principles, specifically including that relating to data retention. Remedies are also available to those whose rights have been compromised[25].

The scope of the permissible exemption on national security grounds was considered by the UK Information Tribunal (National Security Appeals) in a case involving the MP, Norman Baker[26]. In this case Jack Straw, as Home Secretary, had issued the requisite certificate under s28(2)[27]. The certificate did not provide exemption from all the parts of the statute for which exemption can be given (and in particular did not include exemption from principle 5); nevertheless, as noted by the tribunal, its effect could ‘fairly be described as a blanket exemption’ and, although it was issued ‘for the purposes of safeguarding national security’, its use was not limited to cases ‘where national security is engaged’.[28] The case centred on a data subject access request, and so the analysis is in terms of the denial of this right by virtue of a standard ‘neither confirm nor deny’ response. The salient questions were whether this interfered with the right to respect for private life, whether any such interference was in accordance with the law, pursued a legitimate aim and was necessary in a democratic society. In other words, was it proportionate?[29] It was suggested that this latter attribute was the key issue[30] and any such analysis had to address the difficult question of ‘when does national security take precedence over human rights?’[31] The tribunal’s analysis[32] emphasised the need to apply strict criteria of proportionality in cases of derogations from the rights provided to data subjects, particularly in relation to national defence and security. In particular, it concluded that a blanket exemption was wider than necessary to protect national security; indeed, it could conceive of ‘no positive reason for giving a blanket exemption to all processing by the service’; that the blanket exemption removed any obligation to consider each case on its merits, especially as there would certainly be some personal data held which could be released without prejudicing national security; that there was no reason to suppose that the burden of responding to individual requests would be onerous; and that reference to the practice in other countries did not reveal anywhere with a similar ‘unchallengeable exemption’. In sum, a proportionate and reasonable response would involve ‘a situation where each case is considered on its merits.’

4. The New Regime

The Data Protection Act 1998, which implements Directive 95/46/EC, is of general application. Amplification of the data protection rules for the communications sector was later accomplished in Directive 97/66/EC[33], which referred to the increasing risk connected with automated storage and processing on communications networks and the need for users to be assured that privacy and confidentiality would not be compromised. This directive has now been repealed and replaced by Directive 2002/58/EC on privacy and electronic communications[34], which makes some potentially far-reaching changes relating to data retention. Article 15 of this Directive now allows Member States to adopt legislative measures for the retention of data for a limited period if these are necessary to safeguard national security, defence, public security and the prevention, investigation, detection and prosecution of criminal offences etc., although any such measures are required to comply with the general doctrines of EC law such as proportionality and respect for human rights. If Article 15 is not invoked, then the rules require mandatory destruction of data by virtue of Article 6. It appears that the implicit inference from Article 15, together with the fact that all of these directives are not intended to apply to those areas of law not within Community competence, is that matters of security and law enforcement are likely always to take precedence over matters of individual privacy, but that this should be neither an automatic presumption nor an inevitable outcome. The new provision is widely drafted and potentially makes little distinction between the action which may be taken in response to extreme terrorist activity and that taken in response to more routine criminal behaviour. This makes it difficult to apply the doctrine of proportionality, which is at the heart of EC law and which requires a balancing of risks and consequences. Further, ISPs will bear a significant burden as a result of the new rules in terms of responsibility for the tracking and retention of the relevant data. A joint statement of the European Internet Services Providers Association (EuroISPA) and its US counterpart (US ISPA) on 30 September 2002[35] expressed the view that ‘Governments have not sufficiently demonstrated that the absence of mandatory data retention is detrimental to the public interest’, that mandatory data retention in the absence of any business purpose would ‘impose serious technical legal and financial burdens on ISPs’ and that ‘privacy, due process, transparent procedures and fair and equal access … should not be jeopardized unless there is a compelling and lawful need.’ They call for the replacement of data retention laws with rules on data preservation which, based on the G-8 definition, does not include the prospective collection of data, nor does it require ISPs to collect and retain data not required for ordinary business purposes. Data preservation requires the retention of data only on specific individuals who are under investigation and is the technique currently espoused in the US[36]. The difference between the two is not just one of degree, but of principle. As explained by Crump, ‘data retention aims to change the context of internet activity. … Data retention ‘rearchitects’ the Internet from a context of relative obscurity to one of greater transparency. This manipulation of context influences what values flourish on the Internet.
Specifically, data retention, by making it easier to link acts to actors, promotes the value of accountability, while diminishing the values of privacy and anonymity.’[37] In the UK, the use of data preservation has been specifically rejected as an alternative to data retention, albeit not without dissent.[38] The basis for this is that data preservation is useful only in the case of known suspects, whereas data retention might assist in those cases where an individual subsequently comes under suspicion, although no explanation has been given as to how the vast quantity of data retained will be analysed.

Those who support the regime argue that the objective of combating terrorism is an overriding one which can justify even significant inroads into the privacy of the innocent, if they even acknowledge that the new rules are capable of having that effect. A further argument is that the rules are concerned with the retention of traffic data and not content but, in the context of the internet, traffic data can include, for example, a list of URLs visited, which can easily be correlated with actual content; as Bowden has pointed out, ‘traffic data constitutes a near complete map of private life’[39]. A more technical argument is that such an outcome is not inevitable as the wording of Article 15 is permissive, not mandatory, but independently a number of Member States, including the UK, have already made provision for data retention and there is apparently now in existence a draft Framework Decision on the retention of traffic data and access for law enforcement agencies.[40] The purpose of this draft decision is to ‘make compulsory and harmonise the a priori retention of traffic data in order to enable subsequent access to it, if required, by the competent authorities in the context of a criminal investigation.’ The preamble makes reference to the right to privacy but asserts that ‘a period of a minimum of 12 months and a maximum of 24 months for the a priori retention of traffic data is not disproportionate in view of the needs of criminal prosecutions as against the intrusion into privacy that such a retention would entail.’ However, no details are given of how these figures were arrived at or of what manner of risk assessment has been carried out. The draft decision further defines the categories of data which can be retained – a purely illustrative sketch of such a record is given after the list below – namely those necessary to:

  • follow and identify the source of a communication
  • identify the destination of a communication
  • identify the time of a communication
  • identify the subscriber
  • identify the communication device
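
By way of illustration only, and without suggesting that any provider’s systems actually take this form, the five categories above can be thought of as a single logged record per communication. The following Python sketch models such a hypothetical record, together with a check against the 12 to 24 month window contemplated by the draft decision; all field and function names are assumptions made solely for this example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class TrafficDataRecord:
    """One hypothetical retained traffic-data record, mirroring the five
    categories in the draft Framework Decision: source, destination, time,
    subscriber and communication device. Field names are illustrative only."""
    source: str          # e.g. originating address or calling number
    destination: str     # e.g. destination address or called number
    timestamp: datetime  # date and time of the communication
    subscriber_id: str   # identifier of the subscriber or registered user
    device_id: str       # identifier of the communication device used

def within_retention_window(record: TrafficDataRecord, now: datetime,
                            months: int = 12) -> bool:
    """True while the record falls inside the a priori retention period;
    the draft decision contemplates a minimum of 12 and a maximum of 24
    months (a month is approximated here as 30 days)."""
    if not 12 <= months <= 24:
        raise ValueError("the draft decision contemplates 12 to 24 months")
    return now - record.timestamp <= timedelta(days=30 * months)
```

On this sketch, a provider participating in a retention scheme would simply decline to erase records for which within_retention_window still returns True, rather than destroying them once they were no longer needed for business purposes.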

These measures taken together would create some fairly drastic changes in the approach to the protection of privacy on-line, but they have not met with approval in all quarters. The issues were aired in March 2003 when the Committee on Citizens’ Freedoms and Rights, Justice and Home Affairs of the European Parliament held a public seminar entitled ‘Data Protection since 11 September 2001: What strategy for Europe?’ Among the topics for consideration was the need to strike the appropriate balance between the requirements of freedom and security in the light of ‘undifferentiated access to data of all kinds in order to detect threats of terrorism and organised crime at the earliest possible stage.’[41]

The Article 29 Data Protection Working Party, in its Opinion[42] on the need for a balanced approach in the fight against terrorism, referred to the ‘need to establish a comprehensive debate on the initiative to fight terrorism and the fight against criminality in general, as well as limiting the procedural measures which are invasive to privacy to those really necessary.’ It was particularly concerned about the long term impact of what could be described as ‘knee-jerk’ policies and reactions, especially in the light of the fact that ‘terrorism is not a new phenomenon and cannot be qualified as a temporary phenomenon.’ It called for any legislation which limited the right of privacy to be sufficiently clear in ‘its definitions of the circumstances, the scope and modalities of the exercise of interference measures’, many of which, as we have already seen, are potentially very wide and even uncertain in scope. Finally, it concluded that ‘measures against terrorism should and need not reduce standards of protection of fundamental rights which characterise democratic societies. A key element of the fight against terrorism involves ensuring that we preserve the fundamental values which are the basis of our democratic societies and the very values that those advocating the use of violence seek to destroy.’ These concerns are echoed in a statement from the European Data Protection Commissioners noting with concern the proposals for systematic retention of traffic data, expressing ‘grave doubt as to the legitimacy and legality of such broad measures’ and going so far as to say that ‘systematic retention of all kinds of traffic data for a period of one year or more would be clearly disproportionate and therefore unacceptable in any case.’[43]

In the UK, the Anti-terrorism, Crime and Security Act 2001 contains provisions which allow communications service providers to retain data about their customers’ communications for national security purposes. It allows for the setting of codes of practice in consultation with the Information Commissioner but also gives the Secretary of State powers to make orders connected with the retention of communications data directed to communications providers generally, specifically or particularly.[44] The resulting Code of Practice on Data Retention, which was brought into effect in December 2003[45], is a voluntary code for ISPs. The first draft of this Code, issued for consultation by the Home Office in March 2003[46], was described as being drafted in consultation with the Information Commissioner, despite the fact that the Commissioner remains ‘unconvinced that there is a need for communications service providers to retain data routinely for the prevention of terrorism for any longer than data would normally be retained for business purposes’. The debates in the Standing Committee on Delegated Legislation, which considered this order, demonstrate the polarisation of views on this topic between those who are convinced of the necessity of data retention as a tool to fight terrorism and those who believe that its use in this context is entirely disproportionate[47].

It may be useful at this point to consider how the original statutory framework could be applied to data retention in the absence of further statutory overlay. Where none of the exemptions applies, personal details relating to all computer traffic data should be erased when no longer required for the purpose for which they were collected or, if retention is desired for any reason, anonymised. Further, the nature of the data protection rules is that customers should be apprised of the way in which their data are processed and for what purposes. The primary purpose for which ISPs and communications service providers collect and retain any traffic and/or communications data is billing. Once the bill has been issued and payment received, the data have no further use for business purposes. In addition, such data may not even be collected if the user opts for a pre-paid form of communication, such as a pay-as-you-go mobile telephone arrangement or time purchased in an internet café. Would it be open to ISPs to retain data indefinitely for national security purposes which could be disclosed during any relevant investigation? Could ISPs include ‘national security reasons’ in their notification to the Information Commissioner? The wording of this exemption does not confine its application to the security service or other government departments, but the extent and scope of the exemption and the activities and organisations to which it applies depend on the scope of the certificate issued by the relevant Minister. Certainly, if ISPs were able to rely on this exemption by virtue of suitable certification, they could presumably be exempted from some or all of the data protection principles, specifically principle 5, which requires personal data not to be kept any longer than necessary for the purpose for which they were collected. However, although legitimising data retention by a different mechanism, this would raise exactly the same proportionality concerns as are raised by the new retention rules, and which were so extensively discussed in the Norman Baker case. Further, if exemption were also given from the second principle, there would be no bar to further processing for another purpose, which again would duplicate the provisions of the new regime.
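
A minimal sketch, assuming a provider simply logged billing records of the kind discussed above and wished to comply with the fifth principle in the absence of any retention regime, might look like the following; the record layout, the helper names and the choice of which fields count as identifying are assumptions made purely for illustration.

```python
from datetime import datetime
from typing import Callable, Dict, List

def purge_or_anonymise(records: List[Dict], bill_settled: Callable[[Dict], bool],
                       anonymise: bool = True) -> List[Dict]:
    """Illustrative application of the fifth principle: traffic data kept for
    billing are erased, or stripped of identifying fields, once they are no
    longer needed for that purpose."""
    kept = []
    for record in records:
        if not bill_settled(record):
            kept.append(record)  # still required for the purpose of collection
        elif anonymise:
            # keep only non-identifying information, e.g. for aggregate statistics
            kept.append({"timestamp": record["timestamp"]})
        # else: erase entirely by simply not re-adding the record
    return kept

# Hypothetical usage: a single record whose bill has already been settled
logs = [{"subscriber": "cust-001", "destination": "example.org",
         "timestamp": datetime(2004, 1, 1, 12, 0)}]
print(purge_or_anonymise(logs, bill_settled=lambda r: True))
# -> [{'timestamp': datetime.datetime(2004, 1, 1, 12, 0)}]
```

Blanket retention would, in effect, omit this step for every customer and every record; it is the proportionality of that omission which is examined below.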

5. The Privacy Concerns Raised by Blanket Data Retention Rules

The Joint Committee on Human Rights identified four matters for concern in their scrutiny of the draft code[48]: the fact that retention of data gave rise to obligations under Article 8 ECHR while ISPs were not public authorities for the purposes of the Human Rights Act; the compatibility with the proportionality requirement if the interference with Article 8 rights was to be judged necessary; the subsequent availability of the data for other purposes; and the adequacy of the consultation process. To this can be added the fact that a voluntary, and not a mandatory, code has been chosen.

National security could be viewed as a central objective of any government. Data retention beyond the period necessary for legitimate commercial purposes is justified on these grounds, and yet the function is delegated to private enterprise on a voluntary basis, despite the fact that ‘the rule of law implies that an interference by the executive authority with an individual’s rights should be subject to effective control …’[49]. A further insidious property of such a voluntary code is that, in a real sense, individual rights are being interfered with by a private body and not by an executive authority. Although performing this task could be classified as a quasi-governmental function, there is no proposal at all to make ISPs functional authorities under the Human Rights Act 1998 s6(3)(b) for this purpose because ‘they would not be in a position where they would be carrying out a public duty’[50].

What incentive is there for an ISP to participate in this voluntary code? Both data retention and data preservation schemes would have significant compliance costs for ISPs, which are likely to be of much greater concern to the ISPs themselves than any potential infringement of human rights. Neither is there any doubt that compliance would impose a considerable burden, as it has been acknowledged that ‘there is an accelerating trend in the industry either to reduce the period for which data are kept, or worse still, to stop retaining data at all. That trend is fuelled by the cost of retention and the diminishing need to keep data due to technological advances.’[51] AOL has estimated that retention could cost $40 million to implement and $14 million to maintain. This figure is disputed[52], but it is in any case unlikely that financial assistance which compensates for the true loss will be provided for those who do comply with the new rules. This serves to emphasise the apparent intention of both US and UK governments that ISPs should foot the bill for implementation of these schemes. There can presumably be no sanctions against those who choose not to adhere to a voluntary scheme, but the possibility of wholesale non-participation raises the spectre that it will be made compulsory if no-one adheres to it[53]. Co-operation and consultation then become a chimera. Paradoxically, though, there could be advantages for both ISPs and their customers if the code were to be made mandatory. As the situation stands at present, ISPs could find themselves subject to two opposing schemes: one which requires them to process their data with regard to the data subject’s rights, including that of being informed of the purposes of processing, and one which allows blanket retention of that data for potentially unknown and far-reaching purposes. The two could be reconciled by including a clause in the ISP’s contract with the client that traffic data would be retained for national security purposes to comply with the Code of Practice. This would at least put the data subject on notice that the data were held in this way and would incidentally respond to the concerns expressed in the Norman Baker case about the dangers of blanket provisions. Without such a clause, there would be a danger that people would not know whether or not their rights had been infringed or were likely to be. As long as the Code remains voluntary there is little incentive for ISPs to draft such clauses, as customers might well prefer to take their business to a competitor who had not signed up to the code. A mandatory code would remove this problem and would be capable of providing more transparency for data subjects, even if they were not persuaded of the necessity for the long period of retention. In addition, although it is widely accepted that secrecy can be a necessary tool to protect national security, that is not to say that it is necessary in relation to all measures taken in pursuit of that objective[54], and there seems no reason for retaining all traffic data without the data subject’s knowledge.

Use of Retained Data for Other Purposes

In the debates in the Standing Committee the view was expressed that ‘the problem is that the Government have started to put together in a somewhat unstructured and ad hoc fashion, but not without a great deal of thought, a framework of control and access to communications data which completely undermines the purposes they so grandiloquently set out in the Data Protection Act 1998.’[55] Although most participants in the debate accepted and shared the concerns about the need to take measures to combat terrorism, it was pointed out that the measures which had been adopted were not confined to that objective but were a method of providing access to data ‘for a multiplicity of purposes – in fact, for every conceivable purpose of government.’[56]

The government, however, believes that there is no problem with retained data subsequently being used for another purpose as ‘the Government have always made it clear that it would be possible for data retained under the 2001 Act for the purpose of national security to be accessed for purposes other than national security under other pieces of legislation, including the Regulation of Investigatory Powers Act 2000. However the Government do not believe that disparity will make the retention or accessing of data unlawful’[57]. But it does fly in the face of the accepted tenet of data protection that data processed for one purpose should not be processed (i.e. accessed – disclosed or otherwise made available) for another purpose. Further, the data subject will be unable to ascertain, and will probably also be unaware of, these potential further uses, accesses and disclosures. Although this might be viewed as inimical to the objectives of the existing data protection rules, it could fall within the existing exemption for national security, as discussed above, if an appropriate connection with the prevention of terrorism or national security could be made out. Also, if the certificate under s28 were drafted appropriately, it might also allow access for other purposes – certainly the certificate which was at issue in the Norman Baker case allowed for ‘disclosing or disseminating such data to other United Kingdom government departments, agencies or public authorities’, although, interestingly for the purposes of data retention, it did not provide exemption from principle 5. The government believes that such concerns are satisfied by the fact that it is only public authorities, as defined, who will be able to access the data. They are bound by ECHR provisions and so will need both to have due regard for Article 8 and to comply with the requirements of necessity and proportionality on a case by case basis. In other words, the process of subsequent access and use would be considered on a case by case basis even though the original retention had been achieved by the blanket approach. Interestingly, it has been suggested that the way to obtain robust legislation is to return to the DPA 1998 and reconsider its provisions, as this would be a method of creating consistency, removing the potential disparities and creating more certainty for ISPs[58].

6. Issues of Risk, Risk Perception and Proportionality

The interaction of the social and political requirements which led to data protection regulation, the resulting legal framework for the protection of informational privacy, together with the political, social and legal constraints on the exemptions to the regulation, create an intrinsically complex system with correspondingly complex modes of failure. However, recent years have seen a move away from governmental attempts to understand why social and political systems malfunction with the resultant creation of schism and discontent. Instead, the perceived need for understanding is increasingly being replaced by a more simplistic analysis leading to a culture of zero tolerance and consequential control. The concept of the risk society might outwardly imply a rational assessment of risks and the devising of a proportionate response, but it also masks a deeper felt need in the collective psyche. If there is a risk of harm, however slight, then every effort should be made to remove it – if there is a risk that that harm might be significant, then removal of that risk, by whatever means, becomes a social and political imperative. External threats to the security of the nation state have perhaps always tended to be regarded in this way – there is often an innate suspicion of other nationalities and groups who do not share the same creed or vision for the world, which can lead to a diminution of the usual methods of legislative scrutiny, especially at times of actual or perceived threat. Any reduction in scrutiny is given political legitimacy by the views of governments that they are merely protecting the nation from external threat. This is frequently accepted by the governed, even when it results in curtailment of their own rights, or when the resulting rules also prove to have a draconian effect on the nation’s own citizens. A famous example of this from a previous era is, of course, the enactment of the Official Secrets Act 1911 which, in the light of the perceived threat of German espionage in the period leading up to the First World War, went through all its constituent stages and returned to the Commons in one afternoon. One lesson from the accelerated passage of this statute is that, although the primary objective was ostensibly to provide measures against espionage, s2[59] was capable of significantly wider application. Over subsequent years it was used with draconian effect, not only on those who might be engaged in activities which threatened national security, but also on anyone who revealed government information for whatever reason, there being no defences provided under the Act[60]. The consequences of hasty action are further compounded in the case of so-called ‘emergency legislation’, of which the Anti-terrorism, Crime and Security Act is, of course, an example[61].

This is not to say that all action taken pursuant to threats to national security is inappropriate and unnecessary, but that heightened perceptions can easily lead to the action being impulsive and unconsidered; as Lyon points out ‘tracking down the perpetrators of violence is entirely appropriate and laudable, reinforcing surveillance without clear and democratically defined limits is not.’[62] The polarisation of views can also mean that those who counsel caution are, themselves, then regarded with suspicion. On the other hand, the considered response is, itself, fraught with difficulty. What is the actual threat or risk? How should it be quantified? Where the risk of damage is significant then the likely expectation is that the action must not just match the threat but incorporate some safety margin – i.e. be more than is strictly necessary to counterbalance the threat. Equally, where the perception is that the threat is significant, then political pressure to take action may be almost irresistible. However, as pointed out by Talbot, it is possible for such action to be counter-productive and ‘short-term counter-terrorism tactical gains are only an illusion of effectiveness if the consequence of their imposition is an inflamed sense of grievance and injustice; sentiments that can sustain conflict and terrorism.’[63] Blanket data retention rules provide ‘unprecedented surveillance powers but are there really urgent and pressing counter-terrorism imperatives that require them?’[64] Is the current risk of terrorism now actually greater than in the past, or has perception been heightened by recent events and fuelled by media speculation and conjecture? Quite simply there has been no rational assessment of the numbers of terrorists who might be traced if all traffic data were retained. Would the apprehension of even one suspect justify the wholesale intrusion into the privacy of the millions of customers of ISPs? Some 20 years ago it was noted by the European Court of Human Rights that ‘democratic societies nowadays find themselves threatened by highly sophisticated forms of espionage and by terrorism …’ and that countering such threats could justify the ‘secret surveillance of subversive elements …’[65]. The very use of the word ‘subversive’ suggests prior or additional knowledge resulting in identification of such elements and thus more closely parallels the justification of the use of data preservation rather than data retention. Nevertheless, the Court did not accept that the German government had an unlimited or unfettered discretion to adopt whatever measures they wished in this regard ‘in the name of the struggle against espionage and terrorism’.[66]

The UK Government’s view is that ‘retention of communications data is both necessary and proportionate in the light of advice received from the security and intelligence services and from the police’.[67] The new data retention rules acknowledge that data, once retained, can be used for policing purposes other than those directly concerned with national security and terrorism. This aspect gives rise to genuine concern – what is the basis for allowing such an intrusion without the possibility of suitable counterbalancing safeguards to ensure proportionality in the conduct of more routine police work and investigation? If Bowden’s view that ‘traffic data constitutes a near complete map of private life’[68] is accepted, then the new rules could sanction far more extensive and routine intrusion into private life by the police, and perhaps other authorities, than at present. For ‘if you can no longer feel secure that your telephone, web surfing and electronic communications are in fact private, then that signals a major change in the nature of the society in which we are living.’[69] In the UK, and in many other Member States of the European Union, the data protection standards embodied in Directive 95/46/EC are applied to processing for police purposes even though this is an area outside EC competence. This is particularly true for those states for which data protection has a constitutional basis. A study at the end of the 1990s, on the practical application of these principles to police work, provides a thought-provoking prologue to the present situation. The study[70] notes that the directive is based on the notion of balance, whereby the processing of personal data is an obvious interference with the data subject’s private life which has to be weighed against the purposes of the use. In relation to policing, this intrusion has the potential to be sufficiently serious that such a balance can only be achieved if the processing is necessary for the police purpose and not merely convenient or desirable.[71] The traditional role of the police in a democratic society was assumed to include the investigation and prosecution of specific criminal offences and the countering of real and immediate threats to public order. It was noted, however, that, from the 1970s onwards, what had previously been regarded as a more marginal activity, that of ‘preventive’ work, had expanded (especially in relation to threats of terrorism) and was reaching the same status as the other two purposes.[72] The research finding was that, in reality, actual policing practices in most Member States did not conform to the data protection rules and principles, and that this was particularly true in respect of this ‘preventive’ activity, where the existence of three particular trends gave cause for concern: the collection of data on a wide range of data subjects, not just those under suspicion; the use of more intrusive or secret methods of collection; and the use of more intrusive means of data processing, including ‘Rasterfahndung’, i.e. the examination of various databases not assembled for police purposes.
This resulted in the conclusion that ‘policing in Europe at the end of the 20th and the beginning of the 21st century is more sophisticated, more intrusive, more secret and more centralised than at any time since the second World War’.[73] Despite these findings, the overall conclusion was that ‘the establishment of a regime compatible with the Directive was urgently required in terms of existing constitutional or legal theory in many Member States’; nor was it thought that such a regime would be incompatible with effective police work, the Netherlands being cited as an example.[74] In other words, there was no basis for relaxing the application of data protection rules for general police work, a result at odds with the tenor of the draft Framework Decision referred to above.[75]

7. Conclusion

In the wake of 9/11, when addressing a joint session of Congress and the American people, President Bush made his now notorious assertion that ‘either you are with us, or you are with the terrorists’[76]. The identification of the abrupt discontinuity which this view generates makes it far easier for states to justify measures taken to address terrorist activity, even if they exceed what otherwise might be considered essential. Who can quarrel with actions taken for ‘good’ purposes when the only other alternative is ‘evil’? The potential harm that can be engendered by this type of approach has been described in the following terms[77]:

The legal ambiguity that surrounds the status of the ‘international campaign against terrorism’ and the Manichean terms in which it has been presented to the public (‘good versus evil’, ‘axis of evil’, ‘civilisation versus darkness’, ‘with us or against us’) have created conditions in which it is unusually easy for political authorities to evade legal accountability. In an increasing number of countries, regulations have been reinforced, reinterpreted or suspended. New regulations have been introduced; and there is a greater willingness to consider covert action, in both the civil and military spheres. These measures are justified by governments in terms of the need to oppose terrorism; but many are likely to lead to human rights violations, either immediately or in the future.

In other words, human rights will be an inevitable casualty of government actions pursuant to the ‘war against terrorism’, and this effect will be compounded by the observed inherent problems of legislative responses driven by panic. This is no less the case if such measures apparently enjoy widespread popular support as a demonstration of robust leadership and evidence of effective government. ‘Strong’ government in this respect often results in rapid action without the appropriate consultation expected in a democratic society. Leone and Anrig, noting that many ‘knee jerk’ reactions of the US government to events through the 20th century subsequently backfired, suggest that ‘public deliberation entails controversy that can be painful and time-consuming, but it often prevents bad ideas from taking hold while broadening support for policies that are implemented’[78] and, specifically in the present context, note that, as a reaction to 9/11, the US PATRIOT Act[79] was enacted within only six weeks and was accompanied by an almost complete lack of argument, public deliberation or dissent, even though it incorporates ‘sweeping changes in the ways that the government can monitor and investigate all citizens.’[80] A similarly cautionary opinion is expressed by Lyon: ‘panic responses … that both silence critical discussion and impose restrictions on civil liberties are likely to have long term and possibly irreversible consequences. They permit extraordinary ‘wartime’ measures which include appropriating data on everyday communications and transactions – phone calls, e-mail, the internet – while implicitly discouraging the use of these media for democratic debate.’[81]

Such ‘extraordinary measures’ will frequently threaten human rights by violating proportionality, a fact which may be acceptable for genuine and short-lived emergency measures but should not be tolerated in the long term. Civil liberties themselves have been described as ‘the product of considerable struggle in the modern era’[82]. The development of laws enshrining minimum standards of informational privacy protection provides an example of just such a struggle to gain an enhancement of human rights in the shape of individual autonomy with respect to personal data. The UK government has had a well-documented ‘love affair with secrecy’ and with the desire to control information. The appropriateness of this attitude in a modern democratic society has been subject to challenge in the last 20 years, resulting in advances, not only in data protection law, but also in freedom of information and open government, coupled with a consequent enhancement of human rights and individual autonomy. In this regard, blanket data retention rules represent a retrogressive step which would considerably erode the achieved gains in individual rights without necessarily leading to significant benefits in the wider public interest. A return to the status quo prior to the advances of recent decades, and a consequent denial of the intervening gains in human rights, can be avoided by balancing the twin goals of appropriate privacy protection and the identification of information necessary to take action against suspected terrorists or to apprehend offenders. Finding such a balance is clearly not a new issue. In 1970, Blom-Cooper highlighted the issue by remarking that ‘the law’s problem is to define with the utmost clarity the extent of police powers consistent with the preservation of privacy … it is all too easy to slip into loose talk about the ‘war against crime’ … Such talk is not conducive to a sound application of the principles of justice.’[83]

How is an appropriate balance to be achieved? Proportionality requires an assessment of the necessity of the measure in question and of its suitability for achieving its objective, together with a consequent balancing of the resulting restrictions. New rules, of themselves, need not be inherently sinister or malign, but are they necessary? Is data retention crucial for the protection of national security? It can be argued that the very fact that a voluntary code is in use suggests that the answer is negative. If retention is actually essential then there can surely be no justification for anything but a mandatory code. This, of course, ignores the fact that the reason may be the more prosaic one that the government knows that there is little likelihood of having to compensate ISPs who are economically adversely affected by a code which is voluntary, whereas there could be pressure to compensate them for the unfavourable effects of compliance with a mandatory code. Nevertheless, it is difficult to find proportionality in a measure which involves blanket retention of data with no provision for individual safeguards. The case of Peck v UK[84] has underlined the fact that a measure will not be proportionate merely because there is an identified legal basis and a legitimate aim, in the absence of sufficient safeguards for the individual. Arguably, though, a mandatory code need not provide fewer safeguards for individuals, as it would then be more feasible to inform data subjects of the collection of their data for this purpose and/or to designate ISPs as functional authorities subject to the provisions of the ECHR/HRA. However, such a provision itself might be viewed as an unacceptable burden on ISPs, since requiring them to comply with the HRA would impose the duty on them, rather than on government, to show that the retention is both necessary and proportionate. This highlights the absurdity of imposing primary duties on private enterprises in pursuit of national security objectives which are a fundamental responsibility of government.

Data retention is an indiscriminate and clumsy tool which may, in consequence, lack the necessary precision to achieve the intended objective of retaining vital evidence. Its application may result in the amassing of vast quantities of data which may prove difficult to evaluate without extensive supporting analysis to extract the required information to assist in the apprehension of suspects. Neither do considerations of proportionality provide a basis for the justification of subsequent access for other purposes, since individual rights in cases where national security is not at risk can be dealt with appropriately by the existing data protection framework. As pointed out by the Joint Committee on Human Rights ‘It is not clear to us that Parliament … intended to affect the balance between rights, safeguards and the public interest in relation to access to communications data in cases which are unrelated to national security.’[85]

Interestingly, it is possible that no new enabling rules were necessary to accomplish the objective sought. Data protection laws aim to limit the uses and purposes of data collection and not the collection of data per se. A similar result, but one which would probably have been much less visible, could have been obtained by an application of pre-existing law using the national security exemption of the Data Protection Act 1998 and the certification process. The national security exemption, with its potentially wide scope, could certainly be used to permit data collection and retention on individuals who had been identified as a possible threat, i.e. it could sanction data preservation without the need for further legitimisation. In extreme cases this exemption could perhaps be used for more extensive data retention, subject to the proportionality tests set out in the Norman Baker case and specifically the need to consider each case on its merits. This could have obviated the need for hasty reactions, which are more likely to suffer from being ill-conceived, inappropriate and sometimes even irrational. This is not to say that no action should be taken to combat terrorism, although in the current climate even reasoned critique risks being branded with the mark of the terrorist sympathiser. But, as Thomas remarks, ‘the rule of law, equality, proportionality and fairness are challenged by terrorists and also by ill-conceived carte blanche terrorist legislation’[86]. It is clearly beyond the scope of this paper to make any comment about the wider requirements of anti-terrorism legislation and no suggestion is being made that it is possible to extrapolate this analysis to the other ingredients of such statutes. However, in a more comprehensive study, it has been noted that ‘there is scant evidence that anti-terrorist legislation works to control terrorism.’[87]

Notes and References

[1] That is not to say that the result is without flaw, only that the problem was anticipated before it manifested itself to any significant degree.

[2] See e.g. Edward V Long, The intruders: the invasion of privacy by government and industry New York: Praeger (1966), Alan F Westin ‘Science, Privacy and Freedom: Issues and proposals for the 1970s’ Part I ‘The current impact of surveillance on privacy’ (1966) 66 Col L Rev 1003 and Part II ‘Balancing the conflicting demands of privacy, disclosure and surveillance’ (1966) 66 Col L Rev 1206. Arthur R Miller ‘Personal privacy in the computer age: the challenge of a new technology in an information-oriented society’ (1969) 67 Mich L Rev 1091.

[3] Long, for instance, despite asserting that (above n. 4 p. 49) ‘we are living in the age of the dossier. Never before in our history have such quantities of personal data been collected by so many different groups about so many different people.’ nevertheless concentrates the main thrust of his analysis on surveillance and privacy rather than data protection as such.

[4] See e.g. Malcolm Warner and Michael Stone Data Bank Society London: Allen and Unwin (1970).

[5] Report of the Younger Committee on Privacy Cmnd. 5012 HMSO 1972 paras. 592-600.

[6] Treaty 108 http://conventions.coe.int/Treaty/EN/Treaties/Html/108.htm

[7] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. 1995 OJ L281/31 also at http://europa.eu.int/comm/internal_market/privacy/law_en.htm.

[8] That these developments have also been influential in the US has been noted e.g. by Lyon who comments that ‘since the 1980s, if not before, … European initiatives have had an influence on what happens in North America and this is especially true of the ‘Data Protection Directive …’ David Lyon ‘Surveillance after September 11’ Cambridge: Polity Press (2003) p. 139.

[9] The statistics for prosecutions under the Computer Misuse Act presented by the Home Office to the All Party Internet Group reviewing the Computer Misuse Act show that not only are there a relatively small number of cases in total but also a conviction rate of only just over 50% in 2002 compared with a rate of 74% for all offences in the same year. See, in particular, www.apig.org.uk/Home Office - total conviction rates.xls and www.apig.org.uk/Home Office Appendices.xls. For comment on relevant enforcement issues see e.g. Andrew Charlesworth ‘Between flesh and sand: Rethinking the Computer Misuse Act 1990’ (1995) 9 LC&T Yearbook 31, Diane Rowland and Elizabeth Macdonald Information Technology Law 2nd ed. London: Cavendish (2000) pp 474-478, David Wall ‘Policing the internet: maintaining order and law on the cyberbeat’ Ch. 7 in Yaman Akdeniz, Clive Walker and David Wall eds. The Internet Law and Society Harlow: Longman (2000) p. 173, David Bainbridge Introduction to Computer Law 5th ed. Harlow: Longman (2004). Similar issues have been noted in the US, see e.g. JM Conley and RM Bryan ‘Computer crime legislation in the US’ (1999) 8 ICTL 35.

[10] See e.g. successive annual reports of the Data Protection Commissioner/Information Commissioner which detail actions which have been taken at a number of levels both under the powers given to the Information Commissioner and in the courts. If the more controversial view is accepted that more informal methods of ensuring compliance such as advice and guidance should be viewed as an aspect of enforcement by the Information Commissioner’s Office then the divide may be even greater.

[11] 2003 EWCA Civ 1746. The outcome of this case led Durant to make a complaint to the European Commission that the UK was failing to comply with the Directive and, as a result, the Commission has apparently implemented the initial stage of enforcement proceedings. It is understood that the Commission’s case centres on both the scope of the definition of ‘personal data’ and the lack of a definition of consent in the 1998 Act. See e.g. www.itspublicknowledge.info/newsletter8.htm, www.twobirds.com/english/publications/legalnews/EU_Commission_investigation_UK_data_protection_legislation.cfm.

[12] See e.g. Viktor Mayer-Schönberger ‘Generational development of data protection in Europe’ Ch. 8 in Philip Agre and Marc Rotenberg (eds) Technology and Privacy: The New Landscape Cambridge, Mass: MIT Press (1998).

[13] Article 12(3).

[14] See e.g. the comments of Lord Hoffmann in Brown 1996 1 All ER 545, 555 and the assertion by the then Data Protection Registrar that ‘… data protection legislation is about the protection of individuals rather than the regulation of industry. It is civil rights legislation rather than technical business legislation …’ Tenth Annual Report of the Data Protection Registrar HMSO 1994 para. 2(a).

[15] See e.g. Recommendations of the Younger Committee Cmnd. 5012 paras. 592-600, Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data Treaty No. 108 http://conventions.coe.int/Treaty/EN/Treaties/Html/108.htm, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data http://europa.eu.int/comm/internal_market/privacy/instruments/ocdeguideline_en.htm.

[16] Widely defined; see DPA 1998 s 1(1).

[17] See Directive 95/46/EC Articles 6 and 7, Data Protection Act 1998 Schedules 2 and 3.

[18] However the scope of some of the criteria in Schedules 2 and 3 is somewhat uncertain e.g. vital interests, legitimate interests pursued by the data controller etc.

[19] For definition see Data Protection Act 1998 s 1 and for criteria for legitimate processing of sensitive data, see schedule 3.

[20] The appropriate period for retention of details of police suspects is under scrutiny at the time of writing. See Bichard Inquiry Report following the Soham murders www.bichardinquiry.org.uk/report/ and see e.g. Nigel Wildish and Viv Nissanka ‘A deletion too far: Huntley, Soham and data protection’ (2004) 14 Computers and Law 28.

[21] Above n. 11. See also Guidance issued by the Information Commissioner following this case www.informationcommissioner.gov.uk.

[22] See DPA 1998 s 30 and orders made under this section.

[23] DPA 1998 s 33.

[24] S 28(2), the Minister must be a member of the Cabinet, the Attorney-General or the Lord Advocate. This is essentially a re-enactment of the national security exemption in the 1984 Act but an important new addition provides that anyone affected by the issue of such a certificate can appeal to the Information Tribunal (s28(4)).

[25] See e.g. Macgregor v Procurator Fiscal of Kilmarnock 23 June 1993 (unreported) and the facts of R v Brown 1996 2 WLR 203. On the other hand it can be argued that the investigation of the Soham murders might have been expedited if Huntley’s data had been dealt with in a different way, see above n. 20.

[26] Norman Baker MP v Secretary of State for the Home Department Information Tribunal (National Security Appeals) 1 October 2001 available from www.lcd.gov.uk/foi/bakerfin.pdf .

[27] For details of its content see ibid para. 23.

[28] Ibid para. 25.

[29] Ibid para. 66.

[30] Ibid para. 70.

[31] Ibid para. 76.

[32] Ibid para. 113.

[33] 1998 OJ L24/1.

[34] 2002 OJ L201/37 to be implemented by Member States by October 2003.

[36] It could be argued that specific data retention rules are unnecessary in the US in the absence of generic data protection legislation. The official US stance is certainly against mandatory destruction rules on the basis that traffic data may form an essential part of the evidence necessary to apprehend terrorists and criminals. Post-September 11th, the US enacted the US PATRIOT Act 2001 and the Homeland Security Act 2002 which, in this context, increase dramatically the available powers of electronic surveillance and interception. They are couched in similarly broad terms and are equally capable of unacceptable intrusion on the on-line privacy of the innocent. They allow law enforcement agencies to trace and record computer routing, addressing and signalling information and to gain access to e.g. personal financial information purely on the basis that the information is likely to be relevant to an investigation. The effects of this legislation have been described as ‘both the diminishment of personal privacy and the expansion of government secrecy.’ (Marc Rotenberg ‘Privacy and Secrecy after September 11’ (2002) 86 Minn L Rev 1115).

[37] Catherine Crump ‘Data retention: Privacy, anonymity and accountability online’ (2003) 56 Stan L Rev 191, 194.

[38] Hazel Blears, Minister for crime reduction, policing and community safety, House of Commons Standing Committee on Delegated Legislation 13 November 2003 Col. 004 and see comments of Brian White in Col 019.

[39] Caspar Bowden ‘CCTV for inside your head: Blanket traffic data retention and the emergency anti-terrorism legislation’ (2002) 8 CTLR 21, also published in 2002 Duke L & Tech Rev 5.

[42] Opinion 10/2001 adopted on 14 December 2001 (0901/02/EN/Final) see http://europa.eu.int/comm/internal_market/workinggroup/wp2001/wpdocs01.htm.

[43] See www.fipr.org/press/020911DataCommissioners.html and see also Opinion 5/2002 of the Article 29 Working Party on Data Protection, http://europa.eu.int/comm/internal_market/privacy/workinggroup/wp2002/wpdocs02.htm.

[44] Anti-terrorism, Crime and Security Act 2001 s 104 (1) and (2).

[45] See Retention of Communications Data (Code of Practice) Order 2003 SI 3175/2003.

[47] Compare e.g. the government view expressed by Hazel Blears in Col. 005 and that of Dominic Grieve in Col. 013 HC Standing Committee on Delegated Legislation 13 November 2003.

[48] Joint Committee on Human Rights Sixteenth Report 2002/2003 para. 7, although as noted in para. 4 of the report, the draft Code was not submitted for scrutiny until the draft order for bringing the code into effect had been laid before Parliament.

[49] Klass v Germany 1979-80 2 EHRR 214 para. 55.

[50] Hazel Blears HC Standing Committee on Delegated Legislation 13 November 2003 Col. 008.

[51] Blears ibid col. 007.

[52] ibid cols. 019 and 025.

[53] See also comments of David Kidney ibid Col. 009.

[54] See Norman Baker case above n. 26 para. 35.

[55] Dominic Grieve, HC Standing Committee on Delegated Legislation Col. 013.

[56] Ibid.

[57] Hazel Blears ibid. Col. 010.

[58] Richard Allan ibid Col. 022. However this was not considered to be a practical alternative because of the inevitable cost in time involved in revisiting primary legislation. The widely-held view was that time was of the essence in relation to measures designed to deal with the threat of terrorism. However, more than two years have now elapsed since the attacks of September 11th which prompted the measures in the first place, and it is not obvious that the failure to implement the measures has caused major problems for the enforcement agencies or that they are unable to operate under the existing regime.

[59] Now repealed and replaced by the Official Secrets Act 1989 although whether this statute entirely remedies the problems observed with s2 is open to question. See e.g. Hanbury ‘Illiberal reform of s2’ (1989) 133 SoJo 587.

[60] Adverse criticism of a line of cases starting with the trial of Jonathan Aitken and culminating in R v Ponting eventually led to the repeal of s. 2. For further discussion see e.g. Patrick Birkinshaw Freedom of Information: the law, the practice and the ideal 3rd ed. London: Butterworths (2001) pp 115-121, Geoffrey Robertson Freedom, the Individual and the Law 7th ed. London: Penguin (1993) pp. 158-167.

[61] For a review and discussion of some examples see e.g. Philip Thomas ‘Emergency and Anti-terrorist Powers’ (2003) 26 Fordham Int’l LJ 1193, 1200.

[62] David Lyon above n. 8 p. 1.

[63] Rhiannon Talbot ‘The balancing act – counter-terrorism and civil liberties in British anti-terrorism law.’ Ch. 9 in John Strawson (ed.) Law after Ground Zero London: Glasshouse Press (2002) p. 134.

[64] ibid p. 125.

[65] Klass v Germany above n. 49 para. 48.

[66] Ibid para. 49.

[67] Joint Committee on Human Rights Sixteenth Report 2002/2003 para. 12.

[68] Above n. 41.

[69] Joe Meade, Data Protection Commissioner for Ireland 24/3/2003 www.dataprivacy.ie/7nr240203.htm.

[70] The feasibility of a seamless system of data protection rules for the European Union Luxembourg: OOPEC (1999).

[71] ibid p. 61.

[72] ibid pp 45-6 and referring to the Council of Europe recommendation No. R(87)15 ‘Regulating the use of personal data in the police sector’.

[73] ibid p. 47.

[74] Ibid p. 62.

[75] See above n. 42.

[77] International Council on Human Rights Policy Human Rights after September 11 (2002) p. 19. See also www.ichrp.org.

[78] Richard C Leone and Greg Anrig Jr. (eds) The war on our freedoms: civil liberties in an age of terrorism New York: PublicAffairs Books (2003) p. 2.

[79] See above n. 38.

[80] Leone and Anrig above n. 78 p. 2.

[81] Lyon above n. 8 p. 34.

[82] Ibid p. 43.

[83] Foreword to Alan Westin Privacy and Freedom London: Bodley Head (1970) p. ix.

[84] (2003) 36 EHRR 41.

[85] Above n. 48 para. 24.

[86] Above n. 61 at 1232.

[87] ibid p. 1233.
