
JILT 1997 (2) - Yaman Akdeniz et al.

Cryptography and Liberty:

'Can the Trusted Third Parties be Trusted?
A Critique of the Recent UK Proposals'

Yaman Akdeniz, University of Leeds, lawya@leeds.ac.uk
Oliver Clarke, CommUnity, oliver@pigpen.demon.co.uk
Alistair Kelman, Telepathic Limited, A.Kelman@lse.ac.uk
Andrew Oram, O'Reilly & Associates, andyo@ora.com

Contents
Abstract
How did we write this article?
1. Introduction
2. Cryptography and Encryption
  2.1 What is Encryption?
  2.2 History of Encryption
  2.3 Kinds of Encryption
  2.4 Historical vulnerability of encryption
3. Public Key Encryption
  3.1 RSA
  3.2 Pretty Good Privacy
  3.3 Trusted Third Parties
4. Why do we need Encryption?
  4.1 Securing private communications
  4.2 Validation of users and documents
  4.3 Political Speech and Dissident Movements
  4.4 Development of Online Commerce
  4.5 Voice Communications
5. UK Encryption Policy
  5.1 White Paper--Introduction of TTPs
  5.2 The DTI Consultation Paper
  5.3 Replies and critique
6. US Government Policy
  6.1 Clipper Chip
  6.2 New Proposals for Legislation of Encryption Keys
  6.3 Challenges to US Encryption Policy
7. OECD Proposals / developments in the EU
  7.1 OECD Guidelines
  7.2 Developments within the EU
8. Would prevention of crime justify access to keys?
  8.1 Analogy of the Interception of Communications Act 1985
  8.2 Examination of the 1985 Act
9. A Compromise Proposal
10. Conclusion
References
Links
  Materials
  Organisations


Abstract

Computer encryption is part of the basic infrastructure for modern digital commerce and communications. Recently it has been the subject of various proposals from the U.K. government, as well as governments in several other countries and the European Union as a whole. Whilst these proposals claim to address both the goal of improving commerce through better encryption and that of facilitating access to encrypted communications by law enforcement, the impact of the proposals is in fact to impair the former goal in order to favour the latter. They tend to call for 'key escrow' or 'key recovery' systems that centralise sensitive keys in databases (at 'Trusted Third Parties') and permit government access in a manner similar to that in which phone wiretaps are currently conducted. This paper examines several proposals, especially the March 1997 Consultation Paper from the Department of Trade & Industry entitled 'Licensing of Trusted Third Parties for the Provision of Encryption Services', and assesses their implications. We argue that key escrow represents an unprecedented intrusion on individual privacy, holds back the development of digital communications and commerce, and does not achieve the government's stated goals of helping to prevent crime. As an alternative, to address problems of law enforcement in electronic commerce and to facilitate the prosecution of crimes, we suggest a compromise proposal which we call 'key archiving.'

Keywords: communications, cryptography, digital communications, encryption, Internet, key archiving, key escrow, key recovery, law enforcement, privacy, security, Trusted Third Parties, free speech, anonymity, anonymous speech, key management


This is a refereed article published on 30 June 1997.

Citation: Akdeniz Y et al, 'Cryptography and Liberty: Can the Trusted Third Parties be Trusted? A Critique of the Recent UK Proposals', 1997 (2) The Journal of Information, Law and Technology (JILT). <http://elj.warwick.ac.uk/jilt/cryptog/97_2akdz/>. New citation as at 1/1/04: <http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/1997_2/akdeniz/>.


How did we write this article?

'Cryptography and Liberty,' an article in The Journal of Information, Law and Technology, was co-written by authors in several different disciplines on two continents. Appropriately enough for a paper about the Internet, all drafts and discussion were carried on over electronic mail. It should be noted that the authors have never met in person, yet this did not stop them from developing a relationship based on 'trust', although the authors did not find cryptography itself to be necessary in their own case.

The authors included a lawyer, a legal researcher, a police officer, and a book editor. All of them maintain an online presence to varying degrees (hosting Web sites and online political organizations, for instance) and have been following both the technological and legal aspects of their subject, computer encryption, for some time.

The impetus for the article came from a protest letter to the British government on which three authors collaborated. When author Yaman Akdeniz showed it to the editor of JILT, Dr Abdul Paliwala (for whom the author had written previously), he solicited an article on the subject.

Three authors live in the U.K., and offered a variety of legal and historical contributions. The fourth author, Andrew Oram, lives in the U.S., and therefore could provide details about encryption proposals from the President and Congress in that country. The logistics of writing and editing as four equals over electronic mail required the authors to adopt ad-hoc protocols, such as a round-robin process in which each edited the draft in turn.

Agreeing on key principles—such as the importance of free speech and privacy, the value of strong encryption in protecting those values, and the onerousness of government intrusion into citizens' rights—they still had to use electronic mail to work out differences in their views of current legal precedents and possible solutions. The main thrust of the article was to promote one traditional position on computer encryption (the civil libertarian position, which generally calls for the public's absolute freedom to use any form of encryption, free from government decoding of messages). But they felt the need to address the other traditional position (the law enforcement position, which calls for a weakening of encryption to the point where the government can decode messages to prevent crime and terrorism). Eventually they adopted a novel compromise developed by author Alistair Kelman, which they hope will give their paper a special status and help stimulate debate.

1. Introduction

The impact of Internet technology has raised many privacy issues and will be one of the greatest civil liberty issues of the next century. The Internet does not create new privacy issues, but it does make it more difficult to legislate effectively for existing ones, such as confidentiality, authentication, and the integrity of the information circulated. Privacy is a difficult topic for lawyers and legislators since it is necessary to balance rights and find a socially acceptable consensual solution.

On the one hand, citizens require the 'right to be let alone' [1], articulated by Warren and Brandeis (Warren & Brandeis, 1890) [2] in their seminal essay, which found that the common law implicitly recognised the 'right to privacy.' This was only a negative claim. Over the following century, definitions moved from a negative claim towards a more positive right: a right to control the information about ourselves--to be able to communicate the information or to keep it for ourselves.

According to Ruth Gavison, there are three elements in privacy: secrecy, anonymity, and solitude (Gavison, 1980, p 428). Although it has not been a fundamental and recognised right in English law, the need for some sort of individual privacy has been discussed several times. [3]

'Privacy is not a legal concept directly recognised as a human value, and there is no legal definition for privacy in the English legal system. The early case of Prince Albert v. Strange, (1849) 1 Mac & G 25, 41 ER 1171, 64 ER 293, which was the inspiration of Warren and Brandeis (Warren & Brandeis, 1890), recognised the right to an "inviolate personality" in the context of the plaintiff's right of property.' (Akdeniz, 1997b)

Article 8 of the European Convention on Human Rights states that:

'(1) Everyone has the right to respect for private and family life, his home and his correspondence.

(2) There shall be no interference by a public authority with the exercise of this right except such as is in accordance with law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.' [4]

The Convention, however, does not confer direct rights, including a right to 'privacy', within the English legal system, and is not directly enforceable in England.

Now this may all change, as Britain's first law protecting personal privacy on a more general basis emerged as part of the Queen's Speech on 14 May 1997. The initiative is part of the new Labour Government's policy of 'Bringing Rights Home to Britain.' New protection for privacy will arise through legislation incorporating the European Convention on Human Rights into UK law. If that occurs, a 'right to respect for a private life' will be part of British law for the first time.

But in addition, individual privacy cannot be considered in isolation. Privacy must be weighed alongside freedom of speech and expression (Calcutt Committee Report, 1990, para 3.12, p 7). Cyber-Rights & Cyber-Liberties (UK) stated that:

'That is why privacy rights have developed better in the countries which have constitutional protection for freedom of speech, or the countries who have adopted the European Convention on Human Rights. Freedom of speech rights will also be incorporated into English law by the pending legislation.' (Akdeniz, 1997b)

A contrary process is working against privacy, however: a modern technological civilisation depends upon sharing information quickly and effectively, to give proper health care to citizens, to arrest and prosecute criminals, and to engage in commercial activities. Such information sharing tends to lead to the collection of many types of information on individuals, and to its storage in centralised databases where snooping can compromise privacy. Balancing the right of privacy with the functional needs of society and civilisation creates a conundrum.

For most of this century the modern functioning of civilisation has, of necessity, eroded the privacy rights of ordinary citizens as government and private industry have built databases and technological systems of awesome power. Through the use of these systems, the weakest members of society have been better protected and served. But the history of this century--and in particular the misuse of personal information in Nazi Germany and in the USA under President Nixon and J. Edgar Hoover of the FBI--has shown how these systems can be misused.

Modern computer systems and telecommunications networks threaten to unbalance the equilibrium between society and the privacy rights of citizens. Their uncontrolled deployment would lead rapidly through 'Nanny State' to a totalitarian society where 'the right to be let alone' is lost. But a rebalancing is possible through the sensible general deployment of encryption to protect the privacy of citizens.

Privacy is a legal right in most countries, but it is a right that requires technological support. A public agency, for instance, does not rely on law to keep visitors out during off-hours, but places a lock on the door. In the growing world of electronic transactions, encryption plays the role of the lock. Its availability is critical for ensuring the right to privacy in a digital age.

Privacy can also be misused. Effective limits have been placed upon privacy by most governments in order to permit bugging, wiretapping, the interception of mail, and other methods of investigating crimes. Generally, the governments have left it up to the judicial structure to determine when the right to privacy should give way to the need to enforce public order.

Digital networks do not easily allow this dual situation where privacy is protected in most cases but legally breached where the public interest calls for it. Encryption technology is an all-or-nothing proposition. Either communications of all sorts are protected by encryption keys that are unbreakable during any reasonable time period, or all communications are relatively open to snooping by governments and others who find the technical means to do so. In this paper, the authors examine encryption technologies and the various legal and technical proposals that have been floated in Britain, the United States, and elsewhere to resolve the problem of making privacy only partially vulnerable. We also cite some precedents for proposed laws regarding encryption. Our conclusions are that complete regulation and control of encryption is technically impossible, and that governments can only weaken the rights and safety of all participants in the digital infrastructure by trying to require keys that law enforcement can break. However we also believe that there is a role for government regulation to help ordinary citizens use encryption safely when they are engaged in electronic commerce with each other, as opposed to when they are merely communicating with each other.

2. A Basic Introduction to Cryptography and Encryption

2.1 What is Encryption?

Encryption is the use of some means to disguise or obscure the meaning of a message. This is normally achieved by transposing or substituting the characters which make up the message. Encryption is accomplished by a method prearranged by the sender and recipient so that the message can be read only by the intended recipient, who alone possesses the key to unravelling its true meaning. According to Lance Rose 'encryption is basically an indication of users' distrust of the security of the system, the owner or operator of the system, or law enforcement authorities.' (Rose, 1995, p 182)

2.2 History of Encryption

Ciphers and codes have probably been in use ever since the development of written language. They were certainly in use as early as pre-classical times: there are examples of simple substitution in the Old Testament (Jeremiah 25.26, for example, where 'Sheshech' is written instead of 'Babel', using the 2nd and 12th letters from the end of the Hebrew alphabet instead of from the beginning). Transpositional codes were used in classical Greece, and Julius Caesar used a simple rotation code, shifting each letter a fixed number of places along the alphabet. Ciphers have increased in complexity throughout history. During the Second World War, one of the critical developments was the cracking of the German Enigma cipher machine by the Allies' secret Ultra project. The Enigma machine was the forerunner of computer-aided encryption. In the 1970s, the highly effective Lucifer system was invented, using both substitution and transposition. In the late 1970s, the US government established DES (the Data Encryption Standard), which uses 56-bit binary keys (with some 72 quadrillion possible combinations); it remains in use despite criticism that it is now vulnerable to today's high-powered computers.

2.3 Kinds of Encryption

There are several different types of encryption. But only one of them, 'substitution,' is used in modern computerised encryption, so for the purposes of this article we will ignore techniques such as 'Transposition' and the 'One Time Pad.' (Kahn, 1972)

In substitution encryption, the message is encrypted by substituting one character for another. At its simplest, this might involve shifting each character a certain number of places along the alphabet according to a pre-agreed scheme. In more complex schemes the key is not a constant rotation, but differs with each character, and involves much more than simple rotation. In some instances, the letters of a keyword are used to indicate which of a series of different alphabets should be used to effect the substitution; this is called polyalphabetic encryption. In computerised encryption, operations are typically carried out on blocks of characters rather than individual characters.
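The two flavours of substitution described above can be sketched in a few lines of Python. This is purely an illustrative toy (the message, shift, and keyword are our own arbitrary choices), not an example of modern computerised encryption:

```python
import string

ALPHABET = string.ascii_uppercase

def caesar(message: str, shift: int) -> str:
    """Monoalphabetic substitution: rotate every letter by a fixed amount."""
    return "".join(
        ALPHABET[(ALPHABET.index(c) + shift) % 26] if c in ALPHABET else c
        for c in message.upper()
    )

def vigenere(message: str, keyword: str) -> str:
    """Polyalphabetic substitution: each keyword letter selects a different
    shifted alphabet, so the same plaintext letter encrypts differently."""
    shifts = [ALPHABET.index(k) for k in keyword.upper()]
    out, i = [], 0
    for c in message.upper():
        if c in ALPHABET:
            out.append(ALPHABET[(ALPHABET.index(c) + shifts[i % len(shifts)]) % 26])
            i += 1
        else:
            out.append(c)
    return "".join(out)

print(caesar("ATTACK AT DAWN", 3))       # DWWDFN DW GDZQ
print(vigenere("ATTACK AT DAWN", "KEY")) # KXRKGI KX BKAL
```

Note how the polyalphabetic version encrypts the repeated 'T's of the plaintext differently depending on their position, which is precisely what defeats simple frequency analysis.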

2.4 The main historical vulnerability of encryption

Historically, no matter how strong and complex the encryption scheme used, there was always a major problem with encryption – key management. The same key was used to encrypt and decrypt the message. This created a serious practical problem – how to exchange the key securely and in secrecy. It took until 1976 for a really elegant solution to this problem to be found, when Whitfield Diffie and Martin Hellman invented what is now referred to as Public Key Encryption using the Diffie-Hellman algorithm.
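The Diffie-Hellman trick can be demonstrated with deliberately tiny numbers. The Python sketch below uses toy values of our own choosing (real systems use primes hundreds of digits long); each side publishes one value, yet only the two of them can compute the shared secret:

```python
# Toy Diffie-Hellman exchange (illustrative numbers only, not secure).
p, g = 23, 5            # public modulus and generator, agreed in the open

a = 6                   # Alice's secret, never transmitted
b = 15                  # Bob's secret, never transmitted

A = pow(g, a, p)        # Alice sends g^a mod p over the insecure channel
B = pow(g, b, p)        # Bob sends g^b mod p

# Each side combines its own secret with the other's public value,
# and both arrive at the same number: g^(a*b) mod p.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob
print(shared_alice)     # 2
```

An eavesdropper sees p, g, A and B, but recovering the shared value from those alone requires solving the discrete logarithm problem, which is computationally infeasible at realistic key sizes.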

3. Public Key Encryption

Instead of employing a single key which is used both to encrypt and to decrypt the message, a Public Key system uses two keys. The first key, used to encrypt the message, is a public key, which is not secret. Indeed, its owner tries to publicise it as widely as possible. A second key, however, is kept completely private, and is the only means of decrypting a message encrypted with its public pair. It is axiomatic to the success of a public key system that the private key cannot be deduced or calculated from the public key. This method eliminates the need to have any secure channel by which to pass a key in the first place.

3.1 RSA

Developed in 1978, the most popular and formerly one of the strongest public key cryptosystems uses the RSA algorithm (RSA being the acronym of its inventors: Rivest, Shamir and Adleman). An RSA key pair is created by multiplying two randomly chosen and very large (100 digit) prime numbers to arrive at their product (the modulus) and working from there. The two prime numbers are used with the modulus to create the private and public keys. The key to RSA's strength is that it is far easier to multiply two numbers than it is to factor their product. For example, it is easy enough to multiply the primes 11 and 23 to arrive at 253. It would take far longer, starting from 253 alone, to work out that it is the product of those two numbers (and only those two, since both are prime). RSA Laboratories currently estimate that a 512-bit RSA key could possibly be broken by a large company in a reasonable time, and a 1024-bit key might be broken by government-sized resources in a few months, but a 2048-bit key is well beyond current computer technology.
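The arithmetic can be sketched in Python with deliberately tiny toy numbers of our own choosing (genuine RSA uses primes of hundreds of digits, with padding schemes layered on top of the raw arithmetic shown here):

```python
# Toy RSA key generation and encryption round trip (illustrative only).
p, q = 11, 23                 # two small primes; real keys use huge ones
n = p * q                     # the modulus (253), published openly
phi = (p - 1) * (q - 1)       # 220, derivable only if you can factor n

e = 3                         # public exponent, chosen coprime with phi
d = pow(e, -1, phi)           # private exponent (Python 3.8+): e*d = 1 mod phi

message = 42                  # a message encoded as a number smaller than n
ciphertext = pow(message, e, n)    # anyone can encrypt with (n, e)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt

assert recovered == message
print(n, e, d, ciphertext)    # 253 3 147 212
```

The whole security argument rests on the eavesdropper's inability to factor n back into p and q, and hence to compute phi and the private exponent d.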

3.2 Pretty Good Privacy ('PGP')

Developed by Philip Zimmermann and first released in 1991, PGP is the foremost amongst public key software systems, and is used all around the world, despite attempts by the US government to prevent its being accessible outside the USA. The attempt was made by classifying the RSA algorithm as munitions under ITAR (the International Traffic in Arms Regulations), which banned the export of cryptographic software or equipment generating keys more than 40 bits in length (22 C.F.R. ss 120-130). Zimmermann was under investigation by the US Government after one of his colleagues released PGP onto the Internet, but this action was discontinued by the FBI without any explanation in January 1996, after a 28-month Grand Jury investigation. (Wallace & Mangan, 1996, chp 2)

3.3 Trusted Third Parties ('TTPs')

Public key encryption, an elegant technology, leaves one major problem: how does one correspondent know whether he has the right key for the other? If two individuals have a secure channel over which they can pass a key (for instance, by sealing a piece of paper or diskette in an envelope and sending it through the mail), they can then communicate in confidence. But if they wish to rely simply on electronic media, they have no such secure channel. No one can trust an e-mail message saying, 'Here is my public key,' because the very message containing that key may have been sent by an eavesdropper. The problem arises whenever two people who do not previously know each other wish to communicate. It comes to the fore most often in online commerce, where a customer wants to know whether he can trust someone who is claiming to offer goods and is asking for payment.

Trusted Third Parties ('TTPs') may be the solution that allows an initial contact to be made. If you and your desired correspondent both know an intermediary, and entrust it with your keys, you can obtain each other's public key and start communication. For worldwide communication, the TTP will probably be a large organization with the same public visibility, quality controls, and sense of responsibility as a bank; and in the case of online commerce it may very well be a bank. The precise duties of a TTP are the crux of the debate between civil libertarians and law enforcement concerning encryption.

4. Why do we need Encryption?

Throughout history, encryption has mainly been the preserve of governments, spies, and the military. The technical means to achieve secure communication easily were beyond the ordinary citizen. Few private individuals would spend hours painstakingly coding or decoding an encrypted message by reference to a code book or phrase, and since most financial transactions were based on cash or barter, no real need for encryption had arisen. It is computer technology that has brought encryption within the reach of everyman, and which has also created the need for it, not just for private communication, but also to protect confidential sources of information, to facilitate electronic banking and commerce, and a wide variety of other applications.

4.1 For securing private communications

A few moments' thought shows that it is an obvious fallacy to suggest that only those with something to hide would want to use encryption, or that it is merely a refuge for criminals. Encryption is akin to putting a letter inside an envelope, or placing important papers in a safe and locking it. Privacy is an important component of civilised life in a free society. We each take advice and counsel from those around us. Society protects and needs the right to privacy in communication between lawyer and client, doctor and patient, and priest and parishioner. In the Information Society, privacy of communication is needed not only by those groups but also between lovers, family, friends and colleagues.

4.2 Validation of users and documents

Strong encryption software such as PGP can provide many other benefits in addition to personal privacy, among them:

Time Stamping, which assists authors of material who wish to establish their intellectual property rights. This can be achieved by 'time stamping' a copy of the material with their private key and then lodging it with a third party.

Authentication, or proof of identity. Encryption software such as PGP provides a means of verifying the identity of the person with whom we are communicating. This has numerous applications beyond simple messages (e.g.: on-line banking, on-line access to services, local and national government agencies, etc.).
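The principle behind such verification can be sketched with a toy RSA-style signature in Python. This is our own illustrative construction with tiny made-up parameters, not PGP's actual implementation (which uses large keys and standardised padding):

```python
import hashlib

# Toy RSA parameters: n = 11 * 23, e public, d private (illustrative only).
n, e, d = 253, 3, 147

def sign(message: bytes) -> int:
    """Hash the message, then apply the PRIVATE exponent to the digest."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone holding the public pair (n, e) can check the signature."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

sig = sign(b"I agree to these terms")
print(verify(b"I agree to these terms", sig))   # True
```

Because only the signer knows d, a matching signature both authenticates the sender and shows the message was not altered in transit; time stamping works the same way, with the signed material including a date.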

4.3 Political Speech and Dissident Movements

Cryptography may allow unprecedented anonymity both to groups who communicate in complete secrecy and to individuals who use anonymous e-mailers [5] over the Internet to hide all traces of their identity when they communicate (Froomkin 1995, p 818). In the last couple of years the Samaritans have been known to use anonymous systems of this kind for communications over the Internet. Even the UK West Mercia Police used such anonymous e-mailers as the basis of their 'Crimestoppers' scheme. In the UK, we are lucky enough to have a society which is generally tolerant of dissent and political opposition, but this is far from being universally true. In many countries, these basic freedoms are not tolerated. In such places, encryption (along with anonymity) is one of the few means of preserving freedom of speech, thought and association. Even in the UK, few people will 'blow the whistle' on malpractice or corruption for fear of their communication being intercepted by the perpetrators. Akdeniz states that:

Anonymity is important both to free speech and privacy on the Internet. [6] Key escrow and the clipper chip threatens this kind of anonymity on the Internet. The government agents will be able to identify the content of e-mails and the destination of the messages. (Akdeniz, 1997a)

4.4 For the Development of Online Commerce

With the move from physical assets to an information society in which intellectual property is the main asset of a company or a person, electronic networks are increasingly used to exchange sensitive business information (product details, business plans, etc.) internally and with partners. Competitors can gain a critical advantage if they illicitly intercept such information. Companies need to use encryption to prevent sensitive information supplied to teleworkers from leaking when the teleworkers change jobs or are careless.

On the Internet, secure communication is required to facilitate the transmission of financial details such as PINs, credit card details, and the electronic transfer of funds, the so-called e-cash. Communications between companies increasingly use this route, and these also need to be private and secure. Banks and multinational companies already use strong encryption to pass financial details and transactions, and to protect their clients' PINs at ATM (hole-in-the-wall) machines. Secure communications are a prerequisite for full commercial use of the Internet, and development of these systems is currently a major area of software development world-wide.

These are important issues, and commercial use of the Internet will not become commonplace until most of the security problems have been solved. Encryption is an invaluable asset to companies doing business on the Internet, and to people wishing to purchase goods and services via that medium. Internet commerce requires a secure means of passing credit card details and other private or sensitive materials.

A proposal by the UK Department of Trade and Industry ('DTI'), which will be discussed in depth later in this paper, implies that companies will not resort to 'best practice' unless forced to do so. It is clear, however, that developing secure methods of doing business over the Internet is a matter of commercial necessity for those companies, and it is by no means clear that what is good for the authorities is best practice for everyone else. These pressures are already driving the development of better security measures and safer means of completing financial transactions on the Internet.

4.5 Voice Communications

Mobile phones are continually increasing in popularity, but it is not difficult to listen in on these conversations with equipment purchased cheaply on the High Street. Strong encryption is coming into use over GSM mobile networks and will increasingly be used to make terrestrial voice networks more secure.

5. UK Encryption Policy

Before examining UK encryption policy, we would like to remind readers that encryption can develop with no policy at all (or rather a hands-off policy). Encryption has been made cheap and secure through technical advances, and may spread naturally through market forces because its value is clearly perceived by the public; this would, however, depend on the relaxation of strong export controls by various governments, since 'encryption tools' are considered to be 'arms' under various legislation. People who need to communicate with strangers will want Trusted Third Parties ('TTPs'), and these will therefore spring up quickly to meet the need. There is a quality problem with TTPs, but the same is true of financial institutions and many other organizations in society. So some form of regulation or licensing may be called for, as in these other industries, but that would be entirely separate from regulating the technology of encryption. It is not the everyday commercial use of encryption, but potential criminal use that creates concerns for governments and leads to proposed regulation of the technology.

The growth and increasing availability of strong encryption techniques inevitably means that those same methods are also available to further crime, terrorism and espionage. Governments have a duty to protect their national interest and to facilitate law-enforcement investigations. There is no doubt that the general deployment of strong encryption in society poses a threat in this respect – although whether it is a serious threat to law enforcement is questionable. Law enforcement uses informers, electromagnetic monitoring, physical eavesdropping, 'social engineering,' and 'dumpster diving' to combat illegal activities. These will remain their major weapons.

Another possible motive behind the proposals includes the prevention of large-scale tax evasion. This is not mentioned specifically in the DTI proposals (below), but must be at the forefront of any government's mind: electronic commerce is set to grow exponentially over the next decade, and encryption undoubtedly increases the scope for hiding taxable transactions from national tax authorities.

Where the Internet is concerned, the difficulty of cracking encrypted messages is compounded by the ease with which a criminal can disguise the fact that a file is encrypted in the first place. For example, the encrypted message can be compressed using a variety of utilities, or converted into an image. It would then be indistinguishable from any other compressed file or image. Thus automatic monitoring of Internet traffic would require vast resources in order to check each binary file for the possibility that it contains encrypted material in disguise. Manual monitoring and checking of anything more than a purely nominal portion of Internet traffic would be completely impossible. Given the levels of Internet traffic now, and projections of future trends, it is difficult to imagine any country having the necessary resources to monitor or analyse all Internet traffic in this way.

5.1 White Paper--Introduction of TTPs

In 1995 the UK Government's position was that they had no intention of legislating against data encryption. In 1996, the G7 Summit considered the threat posed by criminal and terrorist use of strong encryption. Following the G7 communiqué, the UK government, in June 1996, announced support for key escrow (recovery), in the form of a system of Trusted Third Parties ('TTPs'). This was ostensibly aimed at protecting the commercial sector, whilst giving the authorities some ability to obtain decryption where deemed necessary. The proposal mixed two separate justifications for trusted third parties, one appealing to private organizations and the other to law enforcement officials.

So far as private organizations are concerned, the concepts of key escrow and trusted third parties mean that certain organisations would be licensed to hold other individuals' or companies' decryption keys 'in escrow.' In a public key system, organizations may well choose to escrow only the public key. After all, the public key is enough to ensure secure communication. Why divulge a private key to a third party, where a break-in or unauthorized release could erase all security? However, it is conceivable that some companies and individuals would allow the TTP to hold the private key as a backup to their own key security, so that decryption would still be possible in the event of disaffection, death, or illness of an employee. The Trusted Third Party is valuable to users if someone loses all copies of his private key, dies with sole knowledge of a key that protects data needed by heirs, or learns that the key is stolen and wishes to quickly disseminate a new key.

For law enforcement officials, however, the advantage of TTPs would be a centralised key management system to which the law-enforcement agencies would have ready access. For this purpose, it is absolutely required that the TTP hold the private key. While the US and UK governments have liked to put forward both justifications in their defence of key escrow, the differences between the implementations of the two conceptions show that they are not addressing the same issues. First, many private organizations would never escrow their private keys. Second, the private sector's objectives in its use of trusted third parties would be best served by a diversity of small third parties, which present less attraction to malicious intruders than the few, large third parties that are the preferred implementation in the government system. Third, the government system explicitly requires the release of keys without the knowledge of the user, eliminating an important check on abuse of the system.

5.2 The DTI Consultation Paper

In March 1997, the Department of Trade & Industry released detailed proposals along the same lines, promoting key escrow in a consultation paper entitled 'Licensing of Trusted Third Parties for the Provision of Encryption Services.' These proposals would not make the use of TTPs compulsory, but the provision of 'encryption services' by unlicensed organisations or individuals would become illegal. The DTI sees this initiative as a means of promoting and enabling secure communications for commercial transactions on the Internet and elsewhere, as well as giving law-enforcement agencies a means of obtaining decryption via a warrant served on the licensed organisation.

The DTI proposal could easily leave readers confused about the far-reaching effects of legal restrictions on TTPs. Encryption of any type is permitted by the proposal, but only if the keys are not escrowed. In other words, if an organization wishes to be a trusted third party (and thus escrow keys), it must be willing to surrender keys to the government. As we have shown, trusted third parties are critical for general commerce and for any other communication between far-flung strangers. Therefore, most users will end up within the government system, and the DTI will have achieved its goal of eliminating secure encryption (secure in the sense that keys are not available to the government). Criminals, ironically, may go to the trouble of using secure encryption, while ordinary citizens would depend on insecure TTPs.

5.3 Replies and critique

The DTI's Consultation Paper invited views from all concerned parties, though the foreword written by the then Minister Ian Taylor clearly indicated that the then UK government was looking to industry to co-operate, rather than to the public at large. The consultation paper prompted immediate responses. The main criticisms were:

  • The proposals would not achieve anything for commerce that it could not develop for itself. The development of strong encryption techniques and appropriate security measures is an economic necessity for any commercial organisation using the Internet.
  • The licensed organisations acting as TTPs will become a prime target for intrusion, espionage, hackers and corruption, and the system will be far less secure than one where each person or company is responsible for their own security.
  • It is unlikely that law-enforcement and counter-terrorism will benefit significantly from the proposals, since criminals and terrorists are hardly likely to pass their private keys to anyone, and the penalties for illegal provision of encryption services will certainly be less severe than those for murder, causing explosions, major fraud, etc.

Key escrow can be a double-edged sword. During a trial, defence lawyers could well raise doubt in the jury's mind that the communications introduced as evidence were really sent by the defendants. The defence may claim, as they did in the cases of the Bridgewater Four and the Birmingham Six, that the government set up the defendants with forged evidence.

If these criticisms are soundly based, then the proposals would achieve little in any positive way, but would make private use of encryption far more difficult. It is the average citizen who would be denied privacy and inconvenienced rather than those to whom the proposed legislation is allegedly addressed: criminals and terrorists using encryption to hide their criminal activities.

Action-Global, a world-wide association of organisations active in the area of civil liberties on the Internet, issued a Global Alert following publication of these proposals, which concluded:

'Government attempts to impose key escrow are likely to eliminate privacy for the average citizen of the average country when communicating using telephones or computer-mediated networks. The rights of free speech, free association, personal privacy, financial privacy, private property and doctor-patient and lawyer-client privilege, would all be weakened or eliminated. The role of digital transactions in our future is too important to permit such risks.' (Oram, 1997)

The UK group CommUnity has issued a critique of these proposals which maintains that:

'They impose a centralised, insecure & vulnerable system instead of the secure, robust, distributed systems already in place or in the process of development... ...In any civilised society, the authorities have the right and the duty to act in defence of the national interest and to secure observance of the law. This power, however, must be balanced against fundamental rights, such as freedom of speech, the right to privacy, and freedom of association. In pursuing the interests of the authorities, the DTI's proposals go too far down the road towards a complete denial of those basic rights.' (CommUnity UK)

Ross Anderson, one of the leading cryptographers in the UK, believes the proposed system is too centralised, and will hinder rather than help law enforcement in combating crime. According to Dr. Anderson, the DTI initiatives will not achieve the wide international acceptance their authors hope for, will allow too much potential for fraud and corruption, and will wipe out the UK's native cryptography sector, which largely comprises small companies. (Daily Telegraph, April 1, 1997)

By attempting to deal with two different issues using a single solution, the DTI proposal ends up satisfying nobody. The centralised TTP with a government trap door (which is the implication of the proposal) threatens to undermine private sector security and sends the wrong signals regarding government support for privacy rights. It damages the sales prospects of UK software houses that are developing world-class encryption products, which under the DTI proposal could not be used for electronic commerce in the UK market, and which must compete with rivals abroad who will not have to labour under these restrictions.

6. US Government Policy

The Constitution of the United States seems at first glance to restrict the activities of US law-enforcement agencies more than their European counterparts. Constitutional challenges to legislation, such as the challenge to the indecency provisions of the Communications Decency Act of 1996 on the grounds of breach of the First Amendment, have often rendered such provisions ineffective (ACLU v Reno, 929 F. Supp. 824 (1996), 117 S Ct 554 (1996)). However, privacy is not an explicit right under the US Constitution. The Fourth Amendment guarantees 'the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.' In interpreting this amendment, the US Supreme Court has asserted a right to privacy, which also affects state laws. [7]

Past regulations and legislation relating to encryption have concentrated mainly on restricting the export of strong encryption methods and equipment, such as the International Traffic in Arms Regulation ('ITAR'--22 CFR ss 120-130) and the Arms Export Control Act ('AECA'--22 USC ss 2751-2796d), rather than imposing any sort of restrictions on domestic use of encryption. Control of domestic use of encryption, however, has clearly been an issue for the US Administration since the inception of Clipper four years ago. The impetus that led to Clipper harkens back even further to digital wiretapping proposals from the National Security Agency ('NSA'), the FBI, and the Bush Administration, which eventually became law in the 1994 Communications Assistance to Law Enforcement Act.

6.1 Clipper Chip

Denials from the DTI aside, what has caused the most concern in Britain and Europe is the overall similarity between the DTI consultation paper and a series of proposals put forward by the US government. The FBI maintains that effective law-enforcement is impossible without the authorities having the ability to wiretap at will, and strong encryption threatens that position. In April 1993, the US Government proposed the widespread commercial adoption of a form of 'hardware' encryption via a chip (called the Clipper Chip) to which the authorities would be able to obtain the key (and so could decrypt at will) [8] .

The original Clipper proposal, if passed into the statute book, would have meant that every single modem, telephone, fax machine or other piece of communications equipment manufactured or sold in the United States would have had to carry a Clipper chip, to which the US authorities would have had easy access via their 'back door key.'

Because of the storm of protest from US public interest groups and unease at the proposals from abroad, the Clipper proposals were amended in December 1995 and again in May 1996. The latest incarnation of these proposals now uses the term 'key recovery' rather than key escrow, although its intent is the same. The proposals would permit the development and export of other encryption methods (but with a low limit on key length) as long as the producers also commit to developing methods by which the US government can obtain a 'back door key.' Any systems which fail to deliver this would become illegal in due course. The Center for Democracy and Technology ('CDT') in the USA has, along with the Electronic Frontier Foundation ('EFF'), led the fight against Clipper in the US.

On May 21, 1997, a group of leading cryptographers and computer scientists [9] released a report in the USA which for the first time examined the risks and implications of government-designed key-recovery systems. The report entitled 'The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption' cautions that 'the deployment of a general key-recovery-based encryption infrastructure to meet law enforcement's stated requirements will result in substantial sacrifices in security and cost to the end user. Building a secure infrastructure of the breathtaking scale and complexity demanded by these requirements is far beyond the experience and current competency of the field.'

Drawing a sharp distinction between government requirements for key recovery and the types of recovery systems users want, the report found that government key recovery systems will produce:

1. New Vulnerabilities And Risks--Key recovery systems make encryption systems less secure by 'adding a new and vulnerable path to the unauthorized recovery of data' where none need ever exist. Such backdoor paths remove the guaranteed security of encryption systems and create new 'high-value targets' for attack in key recovery centers.

2. New Complexities--Key recovery will require a vast infrastructure of recovery agents and government oversight bodies to manage access to the billions of keys that must be recoverable. 'The field of cryptography has no experience in deploying secure systems of this scope and complexity.'

3. New Costs--Key recovery will cost 'billions of dollars' to deploy, making encryption security both expensive and inconvenient.

4. New Targets for Attack--Key recovery agents will maintain databases that hold, in centralised collections, the keys to the information and communications their customers most value. In many systems, the theft of a single private key (or small set of keys) could unlock much or all of the data of a company or individual.

All these critiques apply equally to the DTI proposals in the U.K. In fact, the time period allowed by the DTI for TTPs to reply to government demands for keys (one hour) is even shorter than the two-hour period proposed in the U.S., and would make it very hard to properly verify the demand and securely deliver the key.

6.2 The New Proposals for Legislation of Encryption Keys in the USA

More recently, however, the Clinton Administration has moved to try to gain a measure of control over domestic use as well: legislation has been drafted which would create a government-dominated 'key management infrastructure' which would be a prerequisite for anyone involved in electronic commerce. People would be compelled to use 'key recovery' as a condition of participating in the key management infrastructure, and the legislation would require the disclosure of private keys held by third parties. Unlike the UK DTI's proposals, however, such disclosure could be compelled without a court order and merely on the written request of any law-enforcement or security agency.

The DTI paper can be seen to be following the US lead in these matters. Many of the details differ, particularly the procedure for obtaining disclosure, but the basic concept remains the same.

6.3 Challenges to US Encryption Policy

It would be wrong to think that the US Administration's view is not being contested in the USA. There are Bills in the process of passing through both the Senate and the House of Representatives, which seek to relax the export controls on encryption and to promote the access to and use of strong encryption techniques. These are 'The Promotion of Commerce Online in the Digital Era (Pro-CODE) Act of 1997,' S.377, and 'The Security and Freedom through Encryption (SAFE) Act of 1997.' These bills may stand a better chance of reaching the statute book than the Administration's proposals, which as yet do not appear to have the support to push them through the Senate.

It is also important to note that the above-mentioned US export control regulations, such as AECA and ITAR, have been challenged in the US courts on the ground that they are unconstitutional and violate the First Amendment right to free speech. There are three similar cases currently in progress: Karn v. US Department of State and Thomas E. McNamara, Bernstein v. Department of State, and Junger v. US Department of State. In a move aimed at expanding the growth and spread of privacy and security technologies, the Electronic Frontier Foundation ('EFF') sponsored the Bernstein case, which seeks to bar the US government from restricting publication of cryptographic documents and software. EFF argued that the export-control laws, both on their face and as applied to users of cryptographic materials, are unconstitutional. EFF believes that cryptography is central to the preservation of privacy and security in an increasingly computerised and networked world.

7. OECD Proposals and Developments within the EU

7.1 OECD Guidelines

The Organisation for Economic Co-operation and Development ('OECD') issued revised guidelines on Control of Encryption in March 1997. Earlier drafts of these guidelines, prepared by that organisation's Council, had favoured 'key escrow' as a means of controlling encryption. The final draft was watered down considerably following pressure from some of its members, and direct references endorsing 'key escrow' were omitted in favour of more general recommendations. The guidelines accept that governments have a right to act in defence of their national interest, but two key paragraphs stand out here:

PRINCIPLE 2: Users should have a right to choose any cryptographic method, subject to applicable law

PRINCIPLE 5: The fundamental rights of individuals to privacy, including secrecy of communications and protection of personal data, should be respected in national cryptography policies and in the implementation and use of cryptographic methods.

Principle 2 is qualified with the following comment:

Government controls on cryptographic methods should be no more than are essential to the discharge of government responsibilities and should respect user choice to the greatest extent possible. This principle should not be interpreted as implying that governments should initiate legislation which limits users choice.

The OECD Guidelines endorse action by Governments to protect their national security and economic well-being, but these guidelines fall far short of endorsing what the UK and US governments are currently proposing. It is very possible that many governments are afraid of the situation that would obtain if 'key escrow' systems were implemented, particularly any variation of the Clipper proposals. Because of the pivotal position of the United States in international communications networks, particularly the Internet, Clipper would leave the US Government uniquely poised to intercept and decode communications worldwide. The caution shown by OECD members will undoubtedly have been exacerbated by the allegations last year that the US secret services hacked into the computer networks of the European Parliament and Commission during the GATT negotiations in order to obtain details of Europe's negotiating strategy (The Sunday Times, August 4, 1996).

7.2 Developments within the European Union

The European Commission has proposed a project to establish a European network of trusted third parties under the control of member nations, which is parallel to the UK proposals. The EC scheme does not suggest that key escrow should be mandatory.

In September 1995, the Council of Europe issued a Draft Recommendation No R(95): Concerning Problems of Criminal Procedural Law connected with Information Technology. This document goes further than the subsequent OECD Guidelines in that it implies that the interests of Law-Enforcement outweigh those of the private individual or the public generally and concedes that it may become necessary to place restrictions on the possession, distribution or use of cryptography. This is far closer to the position in France, where private use of encryption is completely outlawed, and encryption (including software) is second only to nuclear weapons in the list of dangerous munitions.

The Bangemann Report to the European Commission also makes a clear statement on the use of encryption tools. It states that a solution at a national level will inevitably prove to be insufficient, because communications reach beyond national frontiers and because the principles of the internal market prohibit measures such as import bans on decoding equipment. The Bangemann Report suggests that a solution at the European level is needed to provide a global answer to the problem of use of encryption, but the current debates on the use of encryption tools show that even a solution at the European level may well be insufficient for promoting government agendas.

8. Would prevention of crime justify access to keys?

8.1 An Analogy with the Interception of Communications Act 1985

Under section 1 of the 1985 Act:

' ..... a person who intentionally intercepts a communication in the course of its transmission by post or by means of a public telecommunication system shall be guilty of an offence ...'

Under section 1(3), a person shall not be guilty of an offence if:

'the communication is intercepted for purposes connected with the provision of postal or public telecommunication services or with the enforcement of any enactment relating to the use of those services.'

It is possible to make an interception when a warrant is issued by the Secretary of State, or one of the parties to the communication has consented to the interception (Interception of Communications Act 1985, section 2).

The decision in Malone v. Metropolitan Police Commissioner (No.2) [1979] 2 All ER 620, together with the privatization of telecommunications services in the Telecommunications Act 1984 [10] led to the enactment of the 1985 Act. [11]

The Secretary of State may issue a warrant which permits a named individual to intercept such communications as are specified in the warrant. The warrant may also specify the manner of disclosure of the contents of the intercepted communication and to whom it should be disclosed. Such warrants can only be issued on one of three grounds under section 2 of the 1985 Act which states that:

The Secretary of State shall not issue a warrant unless he considers that the warrant is necessary:

(a) in the interests of national security;

(b) for the purpose of preventing or detecting serious crime; or

(c) for the purpose of safeguarding the economic well-being of the United Kingdom;

The Act uses exactly the same criteria as article 8(2) of the European Convention on Human Rights, but such phrases as 'national security' and 'economic well-being' are not further defined in the Act, thereby giving it a potentially wide scope. Although any Secretary of State can issue such warrants, the task is normally left to the Home Secretary, the Secretary of State for Scotland, the Secretary of State for Northern Ireland, and the Foreign Secretary.

8.2 Examination of the 1985 Act and telephone tapping in the UK

The number of warrants issued under the 1985 Act is also an important factor to consider when determining whether such legislation should be introduced under the DTI proposals. According to Statewatch, 'the number of warrants issued in England and Wales for telephone-tapping and mail-opening reached its highest level for five years with 910 warrants issued in 1995 compared to 473 in 1990. The total number of warrants, covering phones and letter-opening, signed by the then Home Secretary Michael Howard was 997 in 1995. Each of the warrants issued can cover more than one phone line if they are issued to cover an organisation or group.' [12]

Whilst interception is increasing in the UK, it is clear from these figures that its use is hardly widespread. It is useful only where a specific target is known to the authorities. Nevertheless, the increase in mobile telephone communications, and the advent of encryption methods that make such communications more secure, is undoubtedly one of the reasons behind the Department of Trade & Industry's recent proposals. Because telecommunications is centralised, in the sense that the airtime provider or telephone company has a direct hand in providing the services, and will presumably have to be involved in the provision of secure communication via encryption, this area is relatively easy for the authorities to control. Large telecommunications companies can be relied on to co-operate with court warrants because of the economic pressures which could otherwise be applied to them. The system put in place by the Interception of Communications Act would be far less effective if that were not the case. There is no reason to suspect that those companies would be any less co-operative with law enforcement bodies where encrypted communications were concerned if they were in the position of trusted third parties.

In attempting to control encryption on the Internet, however, the authorities face a number of problems:

The Internet is inherently anarchistic, in the sense that there is no central control on which communication depends. The role of the telecommunications company is taken by a myriad of service providers, universities and companies who make up the infrastructure of the Net. The provision of secure communication is in the hands of the user alone, who can encrypt his data at source using software products such as PGP.

Encrypted data communications can be disguised in a variety of ways so that it is not apparent that the data is, in fact, encrypted. Techniques for hiding a message have a long and honourable history (including, for instance, a method invented by Francis Bacon based on the slight darkening of particular letters in an otherwise innocent message) and are known as steganography. [13] On computers, these methods include:

  • The use of any one of a huge number of file compression utilities. These effectively encode the data a second time whilst reducing the file size.
  • Making an 'image' of the data (either before or after encryption). This process increases the file size considerably, but also effectively applies a second level of encoding to disguise the use of encryption.

These methods do not encrypt the message as such, but they all (either individually or in combination) make it difficult or impossible to detect the transmission of encrypted data. Given the mind-boggling size of daily Internet traffic, it is almost inconceivable that manual monitoring of this mass of data is within the reach of any government, and the methods outlined above will, for the most part, defeat automatic monitoring.
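A minimal sketch of the steganographic idea (the embedding scheme and names here are our own illustrative assumptions, not taken from any real tool) is the classic least-significant-bit technique, which hides a message in cover data such as image pixel values:

```python
# LSB steganography sketch: store each bit of a secret in the least
# significant bit of a "cover" value (e.g. an image pixel).

def hide(cover, secret):
    """Embed the bytes of `secret` into the low bits of `cover`."""
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    assert len(cover) >= len(bits), "cover too small for secret"
    stego = [(c & ~1) | b for c, b in zip(cover, bits)]
    return stego + list(cover[len(bits):])

def reveal(cover, n_bytes):
    """Recover n_bytes of hidden data from the low bits of `cover`."""
    bits = [c & 1 for c in cover[:n_bytes * 8]]
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for j in range(8):
            byte |= bits[i * 8 + j] << j
        out.append(byte)
    return bytes(out)

pixels = list(range(200, 256)) * 4      # stand-in for image pixel data
stego = hide(pixels, b"key")
assert reveal(stego, 3) == b"key"
# Each cover value changes by at most 1 -- visually imperceptible,
# and nothing about the carrier announces that a payload is present.
assert all(abs(a - b) <= 1 for a, b in zip(pixels, stego))
```

The hidden payload could itself be ciphertext, which is precisely why a monitoring regime cannot reliably tell that encrypted data is being transmitted at all.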

As with voice communications, knowing who to listen to is a critical advantage for the law-enforcement community. Given the worldwide availability of products such as PGP, however, this advantage is nullified, since criminals and terrorists are hardly likely to use any encryption tool with a key escrow or trusted third party scheme.

More to the point, if private use of encryption were outlawed, the penalties would be trivial compared to those prescribed for the far more serious offences in which those individuals were engaged. The maximum penalty for illegal interception of communications, for example, is only two years' imprisonment on indictment, hardly a deterrent to someone engaged in large-scale fraud, robbery or terrorism, for which the penalties are far greater. It is unlikely that a non-violent, victimless offence such as the illegal use of encryption would attract a greater penalty than this.

Thus, the French attitude towards encryption can be seen to be unenforceable and likely to become increasingly untenable (Koops, 1997). The DTI's proposals reflect this to some extent, even though giving law-enforcement agencies some ability to access encrypted communications is one of the principal underlying motives behind the paper. As with other areas of legislation, the restrictions imposed will fall mainly on those law-abiding citizens in whom the authorities have no direct interest. Criminals, on the other hand, will easily evade or ignore these provisions. We should also remember that government access to encryption keys, like the use of Closed Circuit Television systems ('CCTV'), will not necessarily prevent premeditated brutal terrorist attacks such as the bombing of Pan Am flight 103 over Lockerbie, the Docklands bombing near Canary Wharf, and the bombing of Manchester's Arndale shopping centre. While CCTV may catch sudden minor 'street crimes', we believe it is mistaken to expect that CCTV will deter terrorists from placing bombs in watched places, or that terrorists will plan their bombings using encryption tools which may be accessed by law enforcement bodies. CCTV did not prevent the bombings in, for example, Manchester or, recently, at Leeds City Station. It takes an extraordinarily high level of constant surveillance and oversight to provide an effective deterrent through these means. More likely, terrorists will use encryption without detection, or detection will come later through other, more conventional means, by which time the refusal to provide the key will itself be incriminating evidence.

The Labour Party recently argued that:

'It is not necessary to criminalise a large section of the network-using public to control the activities of a very small minority of law-breakers.' (The Labour Party Policy on Information Superhighway)

It seems that the Labour Party intends to penalise a refusal to comply with a demand to decrypt under judicial warrant. [14] Even if this proposal is never enacted, the courts may draw inferences from the silence of defendants under the new sections 34-37 of the Criminal Justice and Public Order Act 1994 (Akdeniz, 1997a). [15] Lord Slynn in Murray v. DPP 97 Cr. App. R. 151 stated that:

'If aspects of the evidence taken alone or in combination with other facts clearly call for an explanation which the accused ought to be in a position to give, if an explanation exists, then a failure to give any explanation may as a matter of commonsense allow the drawing of an inference that there is no explanation and that the accused is guilty.' (at 160)

Failing to provide an encryption key may thus result in judges commenting on the accused's behaviour and juries drawing inferences under the controversial new 1994 Act. [16] The interception of messages is important, but it should be remembered that terrorists and organised criminals are detected through a variety of techniques, involving mainly informers and surveillance. [17]

9. A Compromise Proposal

The solution to the dilemma of key escrow, where one direction leads to the risk of unsolved crimes and the other to the abuse of surveillance capabilities, may lie in delineating a narrow range of government rights to demand keys. We would not grant access to keys to government officials who merely suspect a crime or wish to keep tabs on people whom they consider suspicious. But when there is enough evidence to prosecute, data kept by the user, as well as messages intercepted by the government, can be decrypted.

We do not recommend that Trusted Third Parties be required, but rest our compromise solution on the recognition that most citizens will archive keys with them for the purposes of exchanging data and online funds with strangers, and to protect themselves from their own folly. Key archiving is significantly different from key escrow or key recovery in that the archive copy is recoverable from the TTP only after the key has been invalidated against all subsequent use. In our system, a user and his correspondents would always know when a key has been revealed to the authorities; this may help to prevent abuse of the system by figures inside or outside government agencies who lack proper authority to demand keys.

We envisage the following procedures in a Registered TTP system:

A request for an archival copy of a private key could be made by the citizen who has lost his key, by law enforcement after arrest of the citizen for criminal activity, or by heirs after a grant of probate if the citizen is dead.

The TTP, on receipt of the request, publishes an electronic update invalidating the public key associated with the private key from all further use.

A 'notice of revocation' is generated by the electronic updating system confirming that the public key has been invalidated. This notice becomes a key to unlock the Registered TTP's key archive for the recovery of the associated private key.

The registered TTP recovers the private key from its archive using the 'notice of revocation.'
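The steps above can be sketched in code as follows (a minimal sketch: the class and method names are our own illustrative assumptions, with a hash standing in for the electronic updating system's 'notice of revocation'):

```python
import hashlib

class RegisteredTTP:
    """Sketch of a key-archiving TTP: a private key can be released
    only after its public key has been invalidated from further use."""

    def __init__(self):
        self._archive = {}      # public key id -> archived private key
        self._revoked = set()   # public keys invalidated from all use

    def deposit(self, public_key_id, private_key):
        """Citizen archives a private key with the TTP."""
        self._archive[public_key_id] = private_key

    def revoke(self, public_key_id):
        """Publish the update invalidating the public key, and return
        the 'notice of revocation' that unlocks the archive."""
        self._revoked.add(public_key_id)
        return hashlib.sha256(b"revoked:" + public_key_id).hexdigest()

    def recover(self, public_key_id, notice):
        """Release the archived private key only against a valid
        notice of revocation -- never for a still-live key."""
        expected = hashlib.sha256(b"revoked:" + public_key_id).hexdigest()
        if public_key_id not in self._revoked or notice != expected:
            raise PermissionError("key not revoked; recovery refused")
        return self._archive[public_key_id]

ttp = RegisteredTTP()
ttp.deposit(b"alice-pub", "alice-private-key")
notice = ttp.revoke(b"alice-pub")          # key now invalid for all future use
assert ttp.recover(b"alice-pub", notice) == "alice-private-key"
```

The design point the sketch captures is that revocation is a precondition of recovery, so a key can never be silently released while its owner and correspondents continue to rely on it.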

This system, coupled with an extensive network of TTPs working in accordance with the agreed standard, would appear to provide a sensible balance between the need for privacy in commercial affairs and the genuine requirement of law enforcement to be able to prosecute wrongdoing which is hidden behind encryption.

Archiving private keys with automatic revocation when they are recovered protects against the danger of 'spoofing' of transactions, which is a clear threat in all key escrow and key recovery systems. Key archiving means that law enforcement are not denied access to material which is reasonably necessary in obtaining convictions for fraud or tax evasion. However, citizens are protected in their electronic transactions from invisible attacks and surveillance by out-of-control security services.

Under our proposed regime citizens would still be able to communicate using secure encryption with each other without restriction, so long as the encryption was not disguising electronic commerce. Thus, Amnesty International would be able to continue using the strongest encryption systems available in its human rights work without having to escrow its communication keys with anyone. If, however, Amnesty International wished to sell publications or goods and services from its Web site, the private key it used for these purposes alone – an electronic commerce key – may well be archived with a registered TTP.

This system would additionally lead to a new offence – being engaged in secure electronic commerce using an unarchived private key. We envisage that the proof of this offence would be through good circumstantial evidence – e.g. unbreakable encrypted communications being exchanged at the same time as goods and services pass between the parties.

Recovery by law enforcers of archived private keys does not appear to fall foul of any of the restrictions associated with the common law privilege against self-incrimination, since recovery would only occur in cases of alleged tax and revenue evasion. Self-incrimination is not a defence to a charge of tax evasion, nor are documents ever excluded as evidence on these grounds in tax and revenue cases. The privilege is relevant to protect citizens' 'right of silence' in the face of bullying interviews, uncorroborated confessions, and entrapment – issues which do not arise in the recovery of archived keys.

Our system, as a compromise, will not satisfy the most demanding parties in the debate – including some civil libertarians on one side and some law enforcement advocates on the other. Civil libertarians may criticise the proposal on the grounds that keys are still centralised and therefore vulnerable to break-ins (although not to unauthorised releases through normal channels). There remains the risk that a government could obtain keys on flimsy pretexts and read messages intercepted beforehand, although the key would be useless for future messages. Such a service would also require considerable overhead to ensure that keys are released and invalidated at the proper time, which raises the cost of encryption. But the system would be much more open than the secretive one proposed by the DTI.

Law enforcement officials will criticise the proposal because it does not permit the police and intelligence community to intercept messages over a long period of time and read them without the users' knowledge, which they will claim is necessary for national security and the prevention of certain conspiracies. But we consider such a capability to be a major source of authoritarian infringement on the rights of citizens, and therefore have designed the system to rule it out. We further believe that conventional law enforcement investigation methods such as informers, electromagnetic monitoring, physical eavesdropping, 'social engineering,' and 'dumpster diving' give law enforcement sufficient tools to do their task without additional draconian powers regarding encryption.

10. Conclusion

'With the Internet we use the same technology at one point to achieve greater publicity and at other points to achieve greater privacy.' (Akdeniz, 1997a)

Clearly governments have the right and duty to safeguard their national interests, from security, economic, and law-enforcement viewpoints. It is also in everyone's interests to foster and promote 'best practice' in commerce and industry. The use of strong encryption presents a problem for law enforcement, while facilitating secure communication (private and commercial) and conferring many other spin-off benefits which are completely legitimate. Few would deny that there are times when law-enforcement agencies will need to be able to insist on the decryption of enciphered material, or to demand the production of private encryption keys. This need, however, has to be balanced against the right to freedom of speech and association, and the right to personal privacy.

The infrastructure that would be established by the DTI's current proposals would not achieve for commercial organisations anything they are not already in the process of achieving for themselves. It would discourage a distributed system of key security whose strength lies in the fact that individuals and companies are responsible for the inviolability of their own private encryption keys. The proposals instead promote a centralised system which would be an immediate and obvious target for intrusion and corruption, and whose failure would be catastrophic for the individuals or companies affected. They would also place at a competitive disadvantage UK companies who are developing encryption products.

Encryption will increasingly be used on voice telephone networks over the next few years, and because such networks are centrally controlled, legislation similar to the Interception of Communications Act 1985 should be as effective as the Trusted Third Party scheme proposed by the DTI. Where the Internet is concerned, the TTP scheme will be singularly ineffective both in controlling crime committed over the Net and in obtaining access to encrypted communications relating to criminal activity. Criminals are unlikely to surrender their private keys to any third parties even if some private individuals are persuaded to do so.

The Labour Party, as a part of their policy on Information Technology, stated that:

'Attempts to control the use of encryption technology are wrong in principle, unworkable in practice, and damaging to the long-term economic value of the information networks.' (Labour Party Policy on Information Superhighway)

Hopefully, sitting on the other side of the House will not alter their perspective too much. The measures required to protect the UK's commercial interests, and those required to facilitate law enforcement where strong encryption is in use, are completely different. Neither of those worthwhile aims necessitates the prevention of lawful use of encryption by private individuals. Yet the DTI proposals seem to achieve only that prevention, without providing any tangible benefit towards either of the worthwhile aims.

We believe that, for the preservation of human rights in the Information Society, it is important that strong encryption becomes widely used by the public. Consequently, genuine efforts by governments to support its widespread use should be encouraged. There is a clear role here for the DTI (and similar overseas agencies) in creating an environment in which citizens all over the world can safely use strong encryption in electronic commerce. Crime prevention is one requirement that should be considered in the encryption debate, but so should pressing issues of individual rights and privacy. We accept the need to break encryption at times in the pursuit of public safety, and we recognise the powerful influence of law enforcement and intelligence agencies in the setting of public policy. We have therefore offered a compromise proposal that avoids the worst dangers of both extremes in encryption policy and prevents the 'trusted third party' issue from being used as a Trojan horse through which governments can implement universal surveillance of non-commercial messages.

References

Akdeniz, Yaman (1997a), 'UK Government Encryption Policy', [1997] Web Journal of Current Legal Issues 1 (February) at <http://www.ncl.ac.uk/~nlawwww/1997/issue1/akdeniz1.html>.

Akdeniz, Yaman (1997b), 'Cyber-Rights & Cyber-Liberties (UK): First Report on UK Encryption Policy', May 1997, at <http://www.leeds.ac.uk/law/pgs/yaman/ukdtirep.htm>.

Bowden, Caspar & Akdeniz, Yaman, 'Cryptography and Democracy : Dilemmas of Freedom,' in a forthcoming book 'Civil Liberties and the Internet' by Liberty (Pluto Press - 1997).

Calcutt, David QC, Report of the Committee on Privacy and Related Matters, 1990, Cmnd. 1102, London: HMSO.

CommUnity UK Press Release, April 1997, at <http://www.pigpen.demon.co.uk/release.htm>.

'Cryptography Policy Guidelines', Organisation for Economic Co-operation and Development, March 27, 1997, <http://www.oecd.org/dsti/iccp/crypto_e.html>.

Council of Europe Recommendation, 'Concerning Problems of Criminal Procedure Law Connected with Information Technology', No. R (95) 13, Sept. 1995, Appendix. Paragraph 8 calls for the ability to intercept computer communications and telecommunications, <http://www.privacy.org/pi/intl_orgs/coe/info_tech_1995.html>

Froomkin, A. Michael, 'The Metaphor is the Key: Cryptography, the Clipper Chip and the Constitution' [1995] U. Penn. L. Rev. 143, 709-897.

Garfinkel, Simson, PGP: Pretty Good Privacy, O'Reilly & Associates, December 1994, 1565920988.

Gavison, Ruth, 'Privacy and the Limits of Law,' [1980] 89 Yale L.J. 421.

Harris D.J., O'Boyle M., Warbrick C., Law of the European Convention on Human Rights, Butterworths, 1995.

Kahn, David, The Codebreakers, Macmillan Company, New York: 1972.

Koops, Bert-Jaap, Crypto Law Survey, Version 8.2, May 1997, <http://cwis.kub.nl/~frw/people/koops/lawsurvy.htm>

Labour Party Policy on Information Superhighway, 'Communicating Britain's Future', 1995, available at <http://www.labour.org.uk/views/index.html>.

Oram, Andrew, 'British and Foreign Civil Rights Organizations Oppose Encryption', April 9, 1997, at <http://www.cpsr.org/cpsr/nii/cyber-rights/web/crypto_brit.html>.

Rose, Lance, Netlaw: Your Rights in the Online World, Osborne McGraw-Hill, 1995, 0078820774.

Schneier, Bruce, Applied Cryptography : Protocols, Algorithms, and Source Code in C, 2nd Edition, John Wiley & Sons, 1996, 0471128457.

Taylor, Nick and Walker, Clive, 'Bugs in the System' [1996] 1 Jo. Of Civ. Lib. 105.

Wallace, Jonathan & Mark Mangan: Sex, Laws, and Cyberspace: Freedom and Censorship on the Frontiers of the Online Revolution, Henry Holt: 1997, 0805052984.

Warren, D and Brandeis, L D, 'The Right to Privacy' (1890) Harv L Rev 193.

Links

Materials

Organisations

EndNotes

[1] The earliest and simplest definition of privacy came from Judge Cooley, who defined privacy as “the right to be let alone.” See Torts 2nd. ed. 1888, page 29.

[2] It is important to note that Warren and Brandeis based their thesis on the English cases of Prince Albert v. Strange (1849) 64 ER 293 and Pollard v. Photographic Co. (1888) 40 Ch. Div. 345, but no similar development has occurred in England. See Raymond Wacks, Personal Information: Privacy and the Law, 1989, Oxford: Clarendon Press, page 34.

[3] See Report of the Committee on Privacy, Chairman Kenneth Younger, Cmnd. 5012, London: HMSO, 1972, para.113, Report of the Committee on Privacy and Related Matters, Chairman David Calcutt QC, 1990, Cmnd. 1102, London: HMSO, page 7, Lord Chancellor’s Department on Infringement of Privacy: Consultation Paper, 1993, London: HMSO, para. 3.3 page 8 and the Government Response to the National Heritage Select Committee, Privacy and Media Intrusion, Cmnd. 2918, 1995, London: HMSO.

[4] The phrases ‘national security’ and ‘economic well-being’ are not defined further. But it is interesting to note that the UK Interception of Communications Act 1985 uses almost exactly the same phrases (see below).

[5] An anonymous remailer is simply a computer service that forwards e-mails or files to other addresses over the Internet. But the remailer also strips off the “header” part of the messages, which shows where they came from and who sent them. All the receiver can tell about a message’s origin is that it passed through the anonymous mailer.
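The header-stripping step described in this endnote can be sketched in Python. This is an illustrative simplification only: the remailer address and the choice of headers to preserve are our own assumptions, and real remailers of the period did considerably more (chaining, encryption, reordering).

```python
from email import message_from_string
from email.message import Message


def strip_identifying_headers(raw_message: str) -> str:
    """Forward a message with its identifying headers removed.

    Everything that reveals the origin (From, Received, Message-ID,
    and so on) is discarded; only the headers the recipient actually
    needs are copied across.
    """
    original = message_from_string(raw_message)

    forwarded = Message()
    # Copy only non-identifying headers; all others are dropped.
    for header in ("Subject", "Content-Type"):
        if original[header]:
            forwarded[header] = original[header]

    # Hypothetical remailer address: the receiver can only tell that
    # the message passed through the remailer, not who sent it.
    forwarded["From"] = "nobody@remailer.example"
    forwarded.set_payload(original.get_payload())
    return forwarded.as_string()
```

The essential point the endnote makes is visible in the sketch: the body reaches the recipient intact, but every trace of the sender's address is gone from the forwarded copy.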

[6] See the ACLU challenge to Georgia law restricting free speech on the Internet. ACLU and others stated that the law is unconstitutionally vague and overbroad because it bars online users from using pseudonyms or communicating anonymously over the Internet. The Act also unconstitutionally restricts the use of links on the World Wide Web, which allow users to connect to other sites. An ACLU press release dated 24 September 1996 is available at <http://www.aclu.org/news/n092496a.html>.

[7] Cases decided by the US Supreme Court such as Griswold v. Connecticut (1965) 381 US 479 and Roe v. Wade (1973) 410 US 113 show that privacy has been given constitutional status when the freedom of speech and the First Amendment is not in issue. This has been called a ‘penumbra right of the Constitution’. See also Katz v. United States (1967) 389 US 347 which is one of the most important cases establishing privacy for electronic communications.

[8] The above UK proposals use the term ‘key recovery’ rather than the US term ‘key escrow’. For practical purposes, there is no difference between the two terms.

[9] The credentials of the authors can be illustrated by a few examples: Whitfield Diffie, mentioned earlier in this paper, developed public key encryption. Ronald L. Rivest developed RSA, the leading form of encryption in commercial products. Bruce Schneier wrote the definitive book on computer cryptography. Peter G. Neumann has monitored a forum on risks for many years, promoted by the Association for Computing Machinery.

[10] The 1984 Act put control over lines into the hands of a private corporation, British Telecom.

[11] See Harris D.J., O’Boyle M., Warbrick C., Law of the European Convention on Human Rights, Butterworths, 1995, at page 339. The 1985 Act was upheld by the European Commission in Christie v. UK Appl. no. 21482/93, DR 78-A, 119 as in conformity with the ECHR. See Nick Taylor and Clive Walker, “Bugs in the System” [1996] 1 Jo. Of Civ. Lib. 105.

[12] According to Statewatch, these figures in the latest annual report from Lord Nolan only give - as usual - part of the picture. Under Section 2 of the Interception of Communications Act 1985, warrants to intercept communications are meant to be applied for by the Metropolitan Police Special Branch, the National Criminal Intelligence Service (NCIS), Customs and Excise, Government Communications Headquarters (GCHQ), the Security Service (MI5), the Secret Intelligence Service (MI6), the Royal Ulster Constabulary (RUC) and Scottish police forces. Total figures for warrants issued, England and Wales 1989-1995: 1989- 458, 1990 - 515, 1991 - 732, 1992 - 874, 1993 - 998, 1994 - 947, 1995 - 997. See ‘UK: Phone-tapping doubles in 5 years’, Statewatch Bulletin, Vol 6 no 3, May-June 1996, and also the Report of the Commissioner for 1995, Interception of Communications Act 1985. Cm 3254, HMSO, Report of the Commissioner for 1994, Security Service Act 1989, for 1995. Cm 3253, HMSO, Intelligence Services Act 1994, for 1995. Cm 3288, HMSO, MI5 The Security Service, 2nd edition, HMSO.

[13] A detailed explanation of Francis Bacon’s ‘bi-literal’ method of encryption together with the history of how it was used to allegedly prove that Bacon wrote Shakespeare is set out in The Codebreakers by David Kahn at pages 882 to 889.

[14] See the Prevention of Terrorism (Temporary Provisions) Act 1989, Schedule 7. Note also that the UK Police already had difficulties with encrypted files in the course of criminal investigations related to child pornography. See “Paedophiles use encoding devices to make secret use of Internet” The Times, Nov. 21, 1995.

[15] See also section 11 of the Criminal Procedure and Investigations Act 1996 which provides that the court (or with leave, any other party) may comment, and the court or jury may draw inferences, if the defendant fails to give a defence statement, gives a statement which contains inconsistent defences or, at trial, puts forward a defence which is different from that set out in the defence statement.

[16] See Cowan, Gayle, Ricciardi [1996] 1 Cr. App. R. 1. In R v. Cowan [1995] 3 WLR 818, Lord Taylor CJ stated that the phrase ‘such inferences as appear proper’ was doubtless intended to leave a broad discretion to a trial judge to decide in all the circumstances whether any proper inference was capable of being drawn (pp 823-824). See the annotations of Richard May in Current Law Statutes, ‘Criminal Procedure and Investigations Act 1996’ c.25, volume 2, 1997, Sweet & Maxwell. See also Anthony F. Jennings, “Resounding Silence”, [1996] New Law Journal 146, 6744 pages 725, 726, and 730.

[17] See for example, section 3 of the Security Service Act 1989.
