******************************************************************************
                     H A R V A R D   L A W   S C H O O L
               Seminar on Law, Information, and Technology

          ************ Dispatch five: Privacy and Protection ************

   This is a quasi-mailing list. One or two dispatches per week are sent
   to participants, with response encouraged; comments are integrated
   into followup mailings. The seminar has resumed after a spring break.

   THIS WEEK'S GUESTS: Hon. Jamie Gorelick, Deputy Attorney General of
   the United States; Daniel Weitzner, Center for Democracy and
   Technology; Eben Moglen, Columbia University; Stewart Baker, Steptoe
   & Johnson, former General Counsel of the National Security Agency.

******************************************************************************
 Send requests, comments, and new subscription info to cyber@law.harvard.edu
******************************************************************************


IT'S BEEN AWHILE

The seminar has returned from spring break with a major session on privacy. We make up for the lost weeks with this comprehensive dispatch covering that session--there is much of interest to report. Our next seminar is Monday, April 10, at 4:30 pm in Hauser 102, with another dispatch to follow. That session's topic is intellectual property.


CYBERSPACE AND PRIVACY: WHAT'S THE ISSUE?

Turns out that question has at least two answers. "Encryption"--loosely, the ability of individuals to encode their communications and data so that only they, or those they choose, can read them--is one answer. People are increasingly living their lives in cyberspace, and digital data on a network is an order of magnitude more accessible to others than papers stuffed in an old filing cabinet. If the digitization and networkization of our lives makes the prospect of others--including the government--spying upon us more likely (and the fruits of that spying more numerous and varied), then encryption offers a defense of personal privacy. Under this theory, the federal government might be declared an enemy of personal privacy, especially in light of recent White House initiatives to promote a special kind of encryption, one that ideally permits high security against all undesired readers except the properly-warranted government.

Some, however, consider encryption to be a red herring. They figure that the biggest threats to privacy come not from the government so much as from private industry (including "information brokers") seeking to compile dossiers on private citizens for nefarious purposes ranging from honing commercial pitches to giving employers, suitors, and universities better understandings of employees, lovers, and students. In this case, the government might be one's best friend: a source of legislation and regulatory enforcement that would limit the amount or kind of data companies could legally accumulate and share on the general citizenry.


ENCRYPTION IN THREE PARAGRAPHS OR LESS

Encryption comes in many flavors. Common to many is the use of a secret key: encoded information can't be restored to its original form without the key, which is usually an easily remembered (but hard-to-guess) password. Public key encryption helps people communicate securely even if they've never previously encountered each other and can't get a private moment together to swap secret keys.
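
To see concretely how two strangers can end up holding the same secret key while an eavesdropper watches every message, here is a toy sketch of a Diffie-Hellman-style exchange--one common public-key technique--written in Python purely for illustration. The numbers are deliberately tiny and the names are invented; nothing about this sketch is secure.

    # Toy Diffie-Hellman-style key agreement: two strangers derive the same
    # secret key even though an eavesdropper sees every number they exchange.
    # The values are deliberately tiny; real systems use numbers hundreds of
    # digits long.

    P, G = 23, 5                  # publicly known modulus and base (toy values)

    alice_private = 6             # chosen privately, never transmitted
    bob_private = 15              # chosen privately, never transmitted

    alice_public = pow(G, alice_private, P)   # 8, sent in the clear
    bob_public = pow(G, bob_private, P)       # 19, sent in the clear

    # Each side combines the other's public number with its own private one.
    alice_key = pow(bob_public, alice_private, P)
    bob_key = pow(alice_public, bob_private, P)

    assert alice_key == bob_key == 2          # the shared secret, never sent

With realistically large numbers, working backward from the public traffic to a private value would take an eavesdropper an impractically long time--the same time-based security described below.
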
Thanks to public key encryption, for example, two people with properly configured phones could call each other and talk secretly even if their lines were tapped from day one.

Encryption has traditionally been the realm of spy vs. spy, and advances in cryptography were historically paced by advances in cryptanalysis--the codebreakers could stay roughly even with the codemakers. Absolutely unbreakable codes exist, but they're unwieldy in practice, and really-really-hard-to-break codes are thought to suffice. They suffice because the difficulty in breaking them inheres in the amount of *time* it takes to decode an encrypted message without the proper key. Publicly available software today allows the general citizenry to encode messages that would reportedly take hundreds of years to crack even with the technical resources and prowess of the U.S. government.

The idea behind a recent White House initiative, the "Clipper Chip," was to support public key encryption but with a "trap door": law enforcement could, upon obtaining a warrant, circumvent the otherwise-unbreakable codes protecting data, email, phone calls, etc. (The particular incarnation of Clipper proposed by the Clinton Administration included a "key escrow" system whereby an independent agency would actually hold the keys needed by law enforcement to intercept otherwise-protected communications.)


IS UNBRIDLED ENCRYPTION A THREAT TO THE NATIONAL SECURITY?

Some give an emphatic yes. It's bad enough, they say, that our hardworking intelligence services find increasing numbers of other countries' diplomats starting to use unbreakable diplomatic codes. The big fear is that criminals will start using encrypted communications to plan drug deals, jewelry heists, and large-building bombings.

In the context of telephones, a wiretap (done in accord with legal process, including a warrant) has long been Constitutionally accepted and touted as a way of investigating criminal conduct. It's passive, painless, and in many cases enables the state to catch someone red-handed, planning or discussing a criminal act. Alternative methods of crimesolving are, like alternative methods of spying, thought to be less savory. Spying without eavesdropping often requires messier and more discoverable (and therefore politically embarrassing) bribery. Crimesolving without wiretapping leads law enforcement to rely more on confessions or co-conspirator testimony; a focus on the former might tempt police to coerce confessions, and co-conspirator testimony is notoriously unreliable and hard to present convincingly to a jury.

Widely available encryption, then, is threatening to the government because it disturbs today's equilibrium. The game of cat-and-mouse, at least as it applies to trying to hold a private conversation with one's confederates, is effectively ended for good when the government has no hope of cracking a scrambled phone call. Thus the spectre of far fewer solved crimes is raised.

This might help explain the seemingly awkward formal categorization of encryption software as a "munition"--a weapon of war--for export control purposes. If the government perceives encryption as a threat, then encryption tools--perhaps even encrypted data standing alone--are weapons. Such a metaphor (or reality, depending on one's view) might have other uses. Cypherpunks who stand uncompromisingly against government limitations on cryptography often don't simply sound the call of personal freedom.
They also tend to point out the potential for government abuse of power. Encryption is a way of defeating government snooping without relying on the government's own forbearance. We *could* listen in on your calls, but we *don't* because it's wrong and illegal, says the NSA to an American citizen. The passive nature of wiretaps suggests to some that the government's refraining from listening in without a warrant requires some good faith on its part. To those who distrust government, then, public support for building an encryption system with an explicit back door for government cracking rests on a naive premise of government adherence to its own professed standards. Devising checks and balances--say, locating the government keys within an agency independent of law enforcement, or in the courts, or even in private hands--doesn't do much to assuage the mistrust.

This may be why Stewart Baker likened those who solidly defend the ascendance of uncrackable encryption to hard-line gun rights advocates. They simply don't trust the government, he claims, and just as some gun owners want the best firepower in case the government comes knocking, some cypherpunks insist upon the best cryptography as another way to keep a potentially honorless government at bay. Deputy Attorney General Gorelick saw staunch defense of unlimited encryption as just one leaf on a larger tree of cynicism and mistrust about government.

A hearty defense of limited encryption in which the government retains a key may well depend on linking the abstract-sounding needs of "maintaining law enforcement" to the quite specific defense of law-abiding citizenry. Requiring buildings to have fire escapes is an infringement on the liberty of builders, but most of us can identify with tenants in a fire at least as easily as we can with contractors looking to pinch every penny. A crime in which encryption thwarted early intervention or later capture of the criminals might, some suggested, make the costs of unlimited encryption as tangible as the benefits. What if another World Trade Center-style bombing were in the works, but the site was unknown because the terrorists' communications were indecipherable?


WOULD BANNING UNBRIDLED ENCRYPTION BE UNCONSTITUTIONAL?

Good question, it seems, with uncertain answers, according to the constitutional mavens. One could imagine the government wanting to ban encryption tools--programs like Pretty Good Privacy, a free public-key encryption system. Possession of such a program might be bannable if, as with burglary tools, guns, drugs, or radar detectors, the government could make a case that there was some rational basis for the ban. (Interestingly, the White House "Clipper Chip" proposal never included a ban on non-Clipper encryption; it simply hoped that Clipper encryption would become a de facto standard.)

One could also imagine trying to ban the act of communicating with unbreakable encryption, or of possessing encrypted data that could not be broken by the government. Here analogies become important. If encrypted communication is seen as conversing in a language known only to sender and recipient, the First Amendment could be construed to apply. (Could the government ban Navajo or a special sign language if it thought they were being exploited by criminals who knew that the typical eavesdropping G-person knew English and perhaps a scrap of French, and that even the NSA didn't have a quick dictionary handy?) A restriction on encryption is thus possibly a restriction on free speech.
Of course there exist restrictions on speech thought to be consonant with the First Amendment, but they're required to pass a much more rigorous test than run-of-the-mill restrictions on non-speech. Government advocates suggested that the First Amendment really shouldn't apply to encryption, because encryption isn't a *language* per se. Its purpose isn't to facilitate speech but to restrict it--to the speakers alone. And perhaps a matter of degree becomes one of kind when the language in which two people converse is utterly unintelligible even to the speakers themselves without the aid of computers.

Government advocates thus invoke the Fourth Amendment instead of the First. They uphold a right to general possession of encrypted data, but they believe the government must be legally entitled to gain access to such data so long as it obeys the strictures of the Fourth Amendment--getting a warrant before seizing (and translating) it, and affording due process to the data's owner. Not simply legally entitled, in fact, but physically capable without the owner's cooperation. If a safe deposit box were made available that destroyed its contents upon any attempt to open it without the boxholder's key, the government might successfully claim a right to ban the use of such boxes, not wishing to rely upon legal process to gain the original key. What, we are asked to consider, would we do if a terrorist organization maintained on an encrypted hard drive a list of sites it had wired with bombs and refused to give up the key? If privacy advocates can only reply "Them's the breaks," that at the least again illustrates the costs of freedom and at most shows why some form of restriction on encryption might be Constitutional, if not advisable.

There may be compromises, however. One suggestion is to allow encryption but to create an independent crime, or a penalty enhancement, for the use of encryption to further a crime, just as using a gun to commit a felony toughens penalties in many jurisdictions. (Deputy AG Gorelick was unimpressed with that particular idea, believing that methods for solving crimes independent of wiretapping and data-grabbing are too weak to fill in the gaps.) Another suggestion was to permit data encryption but prosecute those who refuse to turn over keys when lawfully asked, pursuant to a warrant or subpoena. This could go some way toward permitting encryption without giving up the battle against crime, but some government advocates are unimpressed, exhibiting a mirror image of the mistrust identified earlier among the cypherpunks: how can a person be trusted to give up the key even when asked nicely and pursuant to legal process?


HOW ABOUT FORCING SOMEONE--WITH A WARRANT, OF COURSE--TO COUGH UP HER SECRET KEY?

There's an additional problem. If giving up the key could incriminate the person, then the Fifth Amendment protection against self-incrimination might deprive the government of coercive methods against a keyholder, such as jail time for contempt of court. A Fifth Amendment defense can't be used in the parallel case of the safe-deposit boxholder who doesn't want to turn over her key, because that key isn't *testimonial* information, any more than a blood, hair, or breath sample is. The interesting aspect of a crypto-password is that it (perhaps arbitrarily, from a legal standpoint) transforms the physical key not covered by the Fifth Amendment into an *informational* key that is.
Immunity from prosecution would eliminate someone's Fifth Amendment defense to a crypto-key request, but some argue that criminals shouldn't be able to hold such a strong bargaining chip--an otherwise unknowable key--in pursuing immunity. Others point out that the keyholder could be missing, dead, or otherwise beyond the reach of the government, and her data thus beyond reach along with her.

The safe-deposit box example represents physical contraband with a physical key. The encrypted hard drive is physical contraband with a mental key--which the government would like to see treated like the safe-deposit box. But why isn't it at least as comparable to a case of mental contraband with a mental key? People keep secrets in their heads all the time and refuse to divulge them, and there's no way to force them to divulge or to extract the secrets with a brain scan. The Republic still stands. Perhaps a further constructive argument needs to be made in order to distinguish a super-encrypted hard drive from a mental list. Conspiracy doctrine--under which mere agreement with another person to commit a crime is itself criminal conduct--rests upon a belief that people acting in concert are more dangerous than people acting alone. Similarly, one could claim that people who can augment their own mental recordkeeping with computers, without suffering the usual disadvantage that the information might be compromised if the computers are lawfully seized, are much more dangerous. Extra restrictions might then be called for.


BUT IS IT LOGISTICALLY POSSIBLE TO BAN OR LIMIT ENCRYPTION?

Why should Constitutional arguments matter when the game is over? Public key encryption programs are already out there, and export restrictions notwithstanding, there's good reason to think that everyone who wants them has them. One seminar participant presented an encryption program small enough to fit on a computer label (the font was about 5-point, but still...) with room enough for a WARNING: IT IS UNLAWFUL TO SHOW THIS LABEL TO FOREIGN CITIZENS banner. Today one might, through wiretaps, be able to discover who was using encryption and thus catch them if it were a crime, but tomorrow illegal encryption might be a rotten tree easily hidden within a healthy forest of government-sanctioned (perhaps because of an escrowed key) encryption. The government answer so far appears to be a call to arms against heavy odds. Even though crime can't be eliminated, it still might be worth attacking; similarly, the fostering of "compromise" escrowed-key encryption systems might prevent a middle zone of unskilled, amateur criminals from availing themselves of totally encrypted communications.


THE INTERNATIONAL QUESTION

A footnote to the encryption debate: if a key-escrow initiative like the Clipper Chip gets off the ground, what are the international implications? Will other countries adopt standards that include a U.S. government bypass to their data, without even the warrant and process protections of the Fourth Amendment if non-U.S. citizens are using the equipment? Under what circumstances would the U.S. divulge its citizens' keys to other countries? Perhaps reciprocal key-sharing arrangements would develop between the United States and other countries promoting similar key escrow encryption schemes. Without such arrangements, correspondents between countries could, say, double-encrypt their communications with both U.S. and French Clipper codes--requiring both the U.S. and French government keys to intercept them. Unless the U.S. were prepared to try to ban all American communications with a French government-approved encryption system, a way to obtain an international encrypted conversation's corresponding French key would be necessary if U.S. authorities were to be capable of listening in.
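
A minimal sketch of that layering idea, using a toy XOR construction in Python as a stand-in for the real ciphers (the message, keys, and function names are all invented for illustration):

    # A toy illustration of double encryption: wrap a message in two independent
    # layers under two different keys, so that someone holding only one key--
    # say, an escrowed U.S. key--recovers nothing readable.

    import hashlib
    from itertools import count

    def keystream(key: bytes):
        # Derive an endless pseudo-random byte stream from a key (toy construction).
        for block in count():
            yield from hashlib.sha256(key + block.to_bytes(8, "big")).digest()

    def xor_layer(data: bytes, key: bytes) -> bytes:
        # Applying the same layer twice with the same key removes it.
        return bytes(b ^ k for b, k in zip(data, keystream(key)))

    message = b"meet at the usual place"
    us_key = b"key escrowed with a U.S. agency"
    french_key = b"key escrowed with a French agency"

    ciphertext = xor_layer(xor_layer(message, us_key), french_key)

    assert xor_layer(ciphertext, us_key) != message                         # one key is not enough
    assert xor_layer(xor_layer(ciphertext, french_key), us_key) == message  # both keys recover it

The escrowed U.S. key alone strips only one layer; without the French key, or an arrangement for obtaining it, the conversation stays closed.
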
The international question is messy, and no one appears to have much of an answer for it in theory, much less in practice. Stewart Baker predicts standardized "spheres" of encryption, organized geopolitically, within which government-approved encrypted communication is possible but between which the links are unclear.

                                   * * *

PRIVACY AS ANONYMITY: KELLOGG'S SHOULDN'T KNOW YOU NEED HIGH FIBER

Encryption is perceived by some as only a minor sidebar within a general debate about privacy and cyberspace. Those defending the motives and prerogatives of government in the encryption debate claim that the true threats to privacy in the Information Age come from freelancing or conspiring information pirates who seek to know as much about a person as they can and then sell that information to the highest bidder. Our most sensitive records, they say, will be protected by a scheme of law instead of encryption, in which case the government is an ally rather than an enemy.

Our medical, financial, and personal records are becoming more cheaply and transparently distributable as they go online. Corporations, armed with a subtle and precise history of an individual's travels, groceries, and rental movie choices, can use that information for all sorts of objectionable purposes beyond the simple wrong of violating one's privacy to begin with. Advertising eerily tailored to a specific consumer profile, it was suggested, is only an early manifestation of burgeoning corporate power. What happens when people contemplating asking us out can first discern whether we have a weakness for greasy foods and smoking; when an employer can use a job interview to finally meet someone who regularly watches pornographic movies (and decline an interview to someone known to have three kids and be planning a fourth); when a health plan can find out how many times one has been AIDS tested (not to mention the results)?

Defenses to such breaches of privacy will again depend on a legal or discretionary framework rather than a technological one. The local supermarket may well be recording each purchase as it floats across the register's scanner, and one can't easily forbid the store from remembering who bought what any more than one can easily ban encryption.
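
A minimal sketch of how scattered records become a dossier once they share a common customer identifier (every name, merchant, and record below is invented for illustration):

    # Toy record linkage: separate streams of transactions, once keyed to the
    # same customer identifier, merge into a single profile a broker can sell.
    # All data here is invented.

    from collections import defaultdict

    grocery_scans = [
        ("cust-4411", "high-fiber cereal"),
        ("cust-4411", "cigarettes"),
        ("cust-9012", "prenatal vitamins"),
    ]
    video_rentals = [
        ("cust-4411", "late-night adult title"),
    ]
    pharmacy_claims = [
        ("cust-9012", "HIV test kit"),
    ]

    def compile_dossiers(*record_sets):
        # Merge any number of (customer id, item) record sets into per-person profiles.
        dossiers = defaultdict(list)
        for records in record_sets:
            for customer_id, item in records:
                dossiers[customer_id].append(item)
        return dict(dossiers)

    print(compile_dossiers(grocery_scans, video_rentals, pharmacy_claims))
    # {'cust-4411': ['high-fiber cereal', 'cigarettes', 'late-night adult title'],
    #  'cust-9012': ['prenatal vitamins', 'HIV test kit']}

Nothing in the merging step is exotic; the privacy question is entirely about who may lawfully hold, link, and sell the result.
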
                                   * * *

CONCLUSION: PRIVACY AND EXPECTATIONS OF SAME

One of the foundations of Constitutional interpretation of the Fourth Amendment is the "expectation of privacy." A warrant is required only for searches in contexts in which a person could reasonably expect to have privacy. Thus the authorities can browse a notebook left abandoned in a field, or spy upon one's living room through an open window, without a warrant. The Constitution's limits on listening in on a cellular or cordless phone's calls might differ from those on calls placed from a "landline" phone.

Eben Moglen has pointed out a certain circularity when the legal and technological boundaries of privacy are uncertain. The Constitution cannot easily offer answers when the basis for a Supreme Court interpretation of privacy is people's expectations--which are driven by what public and private authorities are legally and technically able to do.

For all the talk of transformations in our lives brought on by exponential change in technology, the changes are incremental. It has only gradually become commonplace for every Blockbuster Video to know where we live, if one does; for someone to "NEXIS" us to see if we've been featured in our hometown newspapers as criminals or prom kings; for our calls to merchants to be "monitored for quality"; for someone who used to write the occasional letter to the editor to now hold forth regularly over a computer network to thousands of people; for California to have enough information to refuse to grant a medical license to a doctor sent packing from Massachusetts. The equilibrium of public and private spheres is shifting, haphazardly affecting the powers--and expectations--of each of us as citizens, merchants, and governors. It seems as hard to anticipate and plan for the future as it is to change it once it's here.

------------------------------------------------------------------------------
This digest was prepared by Jonathan Zittrain (Harvard Law School), Virginia Heffernan (Faculty of Arts and Sciences), and Heidi Messer (HLS). Response is welcome to cyber@law.harvard.edu.