Here is the transcript of the debate on the Clipper Chip held Jan. 19, 1995 at the Association of the Bar of the City of New York. The speakers were Stewart Baker (formerly of the NSA), James Kallstrom (FBI), Michael Nelson (White House), Daniel Weitzner (Center for Democracy and Technology) and William Whitehurst (Director of Security, IBM). The moderator was Bert Wells of Debevoise & Plimpton. -- Ron Abramson, Chair Committee on Computer Law Association of the Bar of the City of New York e-mail: dilute@panix.com --------------------------- THE ASSOCIATION OF THE BAR OF THE CITY OF NEW YORK 42 West 44th Street, New York, N.Y. 10036-6690 P A N E L D I S C U S S I O N O N T H E C L I P P E R C H I P - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - THE CLIPPER CHIP: SHOULD THE GOVERNMENT CONTROL THE MASTER KEYS TO ELECTRONIC COMMERCE? A policy debate on the "Clipper Chip" initiative. Is it necessary for law enforcement or will it unacceptably compromise personal privacy (or both)? Thursday, January 19, 1995, 7:00 p.m. House of the Association, 42 West 44th Street, New York, N.Y. Speakers: STEWART A. BAKER Steptoe & Johnson; Former General Counsel, National Security Agency JAMES KALLSTROM, Special Agent in Charge of the Special Operations Division of the New York Division of the Federal Bureau of Investigation. MICHAEL R. NELSON Special Assistant for Information Technology, White House Office of Science and Technology Policy DANIEL WEITZNER Deputy Director, Center For Democracy and Technology WILLIAM WHITEHURST Director of Data Security Systems, International Business Machines Corporation Moderator: ALBERT L. WELLS Debevoise & Plimpton Sponsoring Committees: Committee on Computer Law, Ronald Abramson, Chair Council on Criminal Justice, Barbara S. Jones, Chair Committee on Science and Law, Stephen H. Weiner, Chair Committee on Criminal Law, John J. Kenny, Chair Committee on Lectures and Continuing Education, Norman L. 
Greene, Chair - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - TRANSCRIPT OF PROCEEDINGS - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - NOTE - This is an informal transcript made from copies of the audio tapes that were made of this program by the Association. There may well be (and probably are) errors in this transcript as to the words that were spoken and the identification of the speakers who spoke them. The panelists have NOT been given copies of the transcript for purposes of correction. The transcript should be read with these qualifications firmly in mind. Hello, I'm Ron Abramson, Chair of the Computer Law Committee. I'd like to just ask a really tough question before we start. How many of you saw our posting on the Internet for this event? Very good. I'm very impressed. We've got a surprise tonight. We have an additional panelist -- somebody we've added in addition to the speakers that were listed in the announcement. Now before I get to that, I would like to thank a few people. First of all I'd like to thank Janet Steinman of the Law Department of the New York City Housing Authority who chaired the subcommittee in charge of this program and really did an awful lot of work in getting this together. Secondly, Bert Wells who is sitting to my left who is our moderator tonight, who is also a member of our committee, who worked very hard to get everyone here. I'd also like to thank our co-sponsors, Barbara Jones, First Assistant DA of New York County, who is Chair of the Council on Criminal Justice of this Association, who was extremely helpful in filling out our panel and with publicity for the event along with Sheena Neiman of the U.S.
Attorney's Office; Steve Weiner, Chair of the Committee on Science and Law, who came through in a big way in terms of getting publicity arrangements for this event; John Kenny, Chair of the Committee on Criminal Law; Norman Greene of the Committee on Lectures and Continuing Education; and Carla Albergo of the staff of this Association, who did her usual great job in organizing everything, and the rest of the staff. I'll briefly introduce our panelists this evening. I'll start from my left. We have Bill Whitehurst who is the Director of Data Security Programs at IBM. Mr. Whitehurst is responsible for ensuring that IBM's worldwide activities in data security and privacy are responsive to external requirements. He's a non-lawyer. Bill was trained in economics. He chairs the Information Industry Association Privacy and Information Regulation Committee -- that's a mouthful -- and he's a member of a number of advisory boards and committees relating to security and privacy, including the Federal Computer System Security and Privacy Advisory Board, which reports to the directors of a number of U.S. agencies, including the National Security Agency, and also to the U.S. Congress. Next is Dan Weitzner, Deputy Director of the Center for Democracy and Technology. Dan is a lawyer. He was formerly Deputy Policy Director of the Electronic Frontier Foundation, "EFF". The Center's mission is to develop public policy solutions that advance constitutional civil liberties and democratic values in new computer and communications media. Mr. Weitzner's responsibilities in his new position include privacy concerns raised by new digital technologies. To my right is Mike Nelson, Special Assistant for Information Technology, White House Office of Science and Technology Policy. Mike is a non-lawyer, a Cal Tech and MIT-educated scientist, who works closely with the President's science adviser and with Vice-President Al Gore on a range of issues relating to the national information infrastructure, the "NII".
To the right of Mike we have Jim Kallstrom, Special Agent in Charge of the Special Operations Division of the New York Division of the FBI. The Special Operations Division is responsible for all technical and surveillance operations in support of all FBI investigative programs. Jim is a former Marine and has a Master's in Public Administration. He worked directly with Judge Freeh of the FBI in drafting the proposed Clipper Chip directive and we're very happy that he could make it on such short notice. Jim is the recent addition to our panel. Sitting to the right of Jim is Stewart Baker, a partner in the Washington firm of Steptoe & Johnson. He's a former Supreme Court Clerk, who has been in and out of government service, most recently with a two-year involvement as general counsel to the National Security Agency. While at the NSA, Mr. Baker was highly involved in national security issues relating to cryptography. This was of course during the period that the Clipper Chip initiative was proposed and he is now a well-known speaker on the subject. Finally, let me turn to our moderator, Bert Wells. Bert is an attorney with Debevoise & Plimpton here in New York and he's a hard-working member of our committee. He has probably done more work than anybody in terms of putting this panel together tonight. Prior to receiving his law degree from Yale in 1987, Bert was an Assistant Professor of Mathematics at LSU, and he has math degrees from both Oxford and Cal Tech. Relevant to tonight's debate, part of Bert's research efforts as a mathematician involved mathematical cryptography. I couldn't think of a more informed person to give an overview of the area and to moderate the debate, and with that I'll turn it over to you, Bert. Bert Wells: Thanks Ron. Are these microphones working? All right. Thank you. They're actually not audible from the front which is probably a sign of a well-designed PA system. Thanks Ron for that introduction.
I'm here not only to introduce cryptography, but to be kind of the jargon police tonight. I'm going to try to keep our speakers in line. But in case they use terms that sound unfamiliar to you, there's a handout, a single-page handout at the front -- those of you who haven't gotten it yet might wish to pick up a copy -- that defines a number of the terms that are used constantly by people involved in this policy debate. I'm going to start with a term though -- cryptography -- and try to just explain it to you, because cryptography is the starting point for the Clipper Chip and this national debate over security and development of new means of law enforcement wire-tapping. Cryptography is a method of scrambling communications and information so that the scrambled information can be reconstituted and the original obtained by parties who are authorized to have access to that information. Typically, cryptography uses an "algorithm," which is a general method or technique for doing this scrambling of information, and a "key" or a set of keys. Now the key is the particular version of the algorithm that will be used to scramble or unscramble the information. An algorithm can only successfully encrypt and decrypt when the key to that algorithm is known to both parties to the communication or, that is, to the intended users of a communication. Now, historically, algorithms have been something that have been the purview of government and have been kept secret. Indeed to this day government encryption is generally conducted with secret algorithms, the details of which are not known to private individuals or academics, much less information regarding the keys that are utilized in connection with those algorithms. 
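The algorithm/key distinction described above can be made concrete with a deliberately weak toy cipher in Python. This is an illustration only, with none of the strength of DES or Skipjack: the scrambling method (the algorithm) is public, and secrecy rests entirely on the shared key.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte of the message with the
    repeating key. The algorithm (XOR) is public; only the key is
    secret. Applying it twice with the same key recovers the original,
    so one function both encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"attack at dawn"
key = b"\x5f\x13\xa7"  # the shared secret known to both parties

ciphertext = xor_cipher(message, key)
assert ciphertext != message                    # scrambled on the wire
assert xor_cipher(ciphertext, key) == message   # same key unscrambles it
```

Both parties must hold the same key in advance, which is exactly the point made above: an algorithm can only encrypt and decrypt successfully when the key is known to the intended users of the communication.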
In the 1970s, however, there was a burgeoning interest in the private sector and academia in cryptography that could be strong and utilized by private citizens who didn't have the resources of government to build specialized machines or to come up with unusual algorithms that could be tested thoroughly. Indeed, in the late 1970s, there were mathematical advances that I think were probably the foundation for the now very active area of mathematical cryptography. Prior to those discoveries, private individuals were not able to utilize or create cryptography that would withstand government scrutiny. Perhaps in rare instances individuals could escape the government's intrusion into their affairs with cryptography, but by and large the strongest cryptography was always available to governments. And the cryptographic rivalries were government to government. For example, the breaking of the Enigma cipher in World War II was an enormous advance for the Allied effort and represented a genuine and enormous technological struggle between nations. With the mathematical cryptography advances of the 70s and subsequently, however, a strong type of cryptography has emerged that can be put in the hands of commercial institutions or private individuals that, it is believed, can withstand even the strongest attack by well-financed and determined government institutions. By the way, some of the names that you will hear discussed tonight are typical names for cryptographic algorithms -- DES, usually pronounced "DEZ," the data encryption standard; RSA, which is an important kind of public key cryptography that I won't try to explain, but which is mentioned briefly in your glossary. Another algorithm that you'll hear, or may hear, discussed, is Skipjack, an algorithm that is still secret, and that was designed by the government for use in the Clipper Chip. I should add one other thing since I drew attention to the secrecy of government-developed algorithms.
The RSA algorithm, the DES algorithm, and others of these significant advances in mathematical cryptography are algorithms that are not secret. Their operation is known to academics and to the private community. The security of those systems can be scrutinized and has been heavily scrutinized by private individuals. How's cryptography used? I mean what's the point? Is this just a matter of keeping State Department secrets secret? Well, it's much more than that. Of course, cryptography is useful for cloaking communications in the commercial setting, most obviously in banking, where interbank transfers are occurring. Those kinds of transactions are typically protected with DES. But it's also useful for scrambling stored information that's kept in a fixed place that might otherwise be accessible to theft. Scrambling your files on your computer is a good way of making them unusable by other people. Furthermore, cryptography is a method of preventing, or at least detecting, tampering with information. A good example of that is the Metrocard that some of us use in the subway system, which carries magnetically encoded, cryptographically protected information about the amount of fare remaining on your card and which, if tampered with, will no longer provide access to the subway. Cryptography also in special forms can be used very effectively to identify people, to authenticate documents -- in a version called the digital signature protocol, it's possible to verify that a person has created or certified the accuracy of an entire document -- and, finally, one of the more exotic applications of cryptography is the development known as digital cash, in which it's possible electronically to transfer value through cryptographically created data. Clipper is a particular chip that implements a cryptographic algorithm.
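The tamper-detection idea behind the Metrocard example can be sketched with a keyed message authentication code. The key name and fare format below are invented for illustration; this is not the transit system's actual scheme, just the general technique of stamping data so that later alteration is detectable.

```python
import hashlib
import hmac

# Hypothetical secret held only by the fare system, not by cardholders.
SECRET = b"transit-authority-key"

def stamp(card_data: bytes) -> bytes:
    """Append a keyed MAC so any later tampering with the data is detectable."""
    return card_data + hmac.new(SECRET, card_data, hashlib.sha256).digest()

def verify(stamped: bytes) -> bool:
    """Recompute the MAC over the data portion and compare with the stored tag."""
    data, tag = stamped[:-32], stamped[-32:]
    expected = hmac.new(SECRET, data, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

card = stamp(b"fare_remaining=1.25")
assert verify(card)  # an untouched card passes

# An attacker rewrites the fare but cannot forge the tag without SECRET.
forged = b"fare_remaining=9.99" + card[len(b"fare_remaining=1.25"):]
assert not verify(forged)  # the altered card is rejected
```

Note that this illustrates tamper detection with a shared secret; the digital signature protocol mentioned above achieves a similar guarantee without the verifier needing to hold the signing key.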
It was introduced officially almost a year ago, February 4, 1994, when the Administration announced a new administrative ruling in which it adopted a federal information processing standard for the encryption of voice communications, that is known as the Escrowed Encryption Standard or the Key Escrow Initiative, the Clipper Chip being a key part of that. In the Clipper system, uniquely manufactured chips are utilized in telephones to encrypt voice communications. The Clipper Chip acts in a remarkable way to both protect the voice communications of the people using the telephone and yet still to leave open an opportunity for government, if properly authorized by the courts, to tap into that encrypted conversation and nevertheless understand the scrambled data scream -- stream. It may sound like a scream actually. The actual technique that is employed by Clipper is described in your glossary. I don't want to take the time now because we have speakers with urgent things to tell us, but those of you who'd like to know exactly how Clipper works will find it set out correctly in the glossary. Finally, I should say that an important element of Clipper -- in fact, an essential element of Clipper -- is that there is a special key associated with each chip that the government gets to retain. When the chip is manufactured this key -- a sequence of 80 ones and zeros or bits -- is escrowed and stored by the government. It is split in a technical way so that two separate agencies, NIST and Treasury, each have half the key, and when authorized surveillance is being conducted that requires decryption of Clipper, those two escrow agents, as they are called, provide the two halves of the key to law enforcement officers who are entitled to conduct the surveillance and they then have the key. In fact, it's a device, a black box, as I understand it, that would allow them for a limited period to carry out the wiretapping and the decryption of the voice conversations. 
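The splitting of each chip's 80-bit key between the two escrow agents can be illustrated with a simple XOR secret split. This sketches the general technique only, not the exact classified mechanism used in the Clipper escrow program: neither share alone reveals anything about the key, but combining both recovers it.

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares. One share is uniformly random, the
    other is the XOR of the key with that randomness, so each share on
    its own is statistically independent of the key."""
    share_a = secrets.token_bytes(len(key))                   # e.g. held by NIST
    share_b = bytes(k ^ a for k, a in zip(key, share_a))      # e.g. held by Treasury
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares back together to reconstruct the key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

unit_key = secrets.token_bytes(10)        # 80 bits, as in Clipper
s1, s2 = split_key(unit_key)
assert recombine(s1, s2) == unit_key      # both agents together recover the key
```

The design intent is that a single agency, or anyone who compromises a single agency, learns nothing; only when both escrow agents release their halves under an authorized surveillance request can the key be reconstructed.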
The last factual bit of background I'd like to give you before we turn to the speakers is just to mention that export controls are applied to virtually all cryptographic software and hardware. Our free rights to exercise our cryptographic talents at this point are limited to the borders of the United States and, except for very special applications such as cable tv boxes and other kinds of single-purpose applications, a particular license, a special license is needed from the State Department to export cryptographic tools or techniques of any kind. And as a result, these advances that industry and academics seem to think are wonderful in the cryptography area, such as RSA and indeed even DES, are very difficult to export. These export controls are alleged to be clamping down on an industry that would otherwise have explosive growth. So, with that introduction, let me turn to Stewart Baker and ask Stewart, why are Clipper and the key escrow initiative needed? Stewart Baker: Well I think Clipper and the key escrow initiative have kind of brought home for the American public a tension that has existed within the National Security Agency for years between the people who have tried to make the best possible encryption to protect U.S. communications and people who have tried to break the communications of our adversaries. And that duality, the recognition that encryption can be very valuable and that it also can be misused by people who are fundamentally hostile to American interests, is a concern that -- now that cryptography has gotten so cheap -- we all have to face as citizens. We all would like to have the privacy that cryptography can provide, but I think all of us have to feel a little uneasy about the ways in which it would be misused by crooks. 
Clipper was an effort to get the good parts of cryptography -- that is to say, the parts that provide good privacy to Americans -- without giving up the ability of law enforcement to conduct wiretaps, to prevent Clipper encryption from becoming a tool that could be misused by criminals. And I think at least as a way of raising the debate, there's no doubt it's a success. Bert Wells: Well, Danny Weitzner, is Clipper actually needed to fight criminals or terrorists? Dan Weitzner: Well, it's, I believe at some level, a moot question. I'm not going to deny that the advent, as you noted, of strong cryptography implementable in personal computers, in small hand-held devices, etc., poses a serious challenge to law enforcement and national security. The world is changing with regard to the way that people communicate and the way that people move information around, and this creates genuine problems. However, I believe it has now become crystal clear that Clipper is an absolute failure as a solution to that problem. There was a kind of a funny semantic debate in the press in the fall of '94: some people said Clipper's dead, other people said no it's not, and yes it is, no it's not. It is my belief -- and I don't mean to pre-empt the whole debate here -- but it's my belief that Clipper is an absolute dead end, that the ultimate test of Clipper was in the marketplace, the ultimate effort by the administration was to get the market, to get commercial users, to get individual users, to use this device called the Clipper Chip. And they voted with their checkbooks and their wallets and they said "No. We're not interested in it for a whole variety of reasons." And I think Bill Whitehurst may be able to talk a little bit more about that.
That said, though, I do think that we still have some very, very real privacy issues here that need to be addressed, if nothing else because national security and law enforcement feel that they have some very real issues that need to be addressed. The first issue is a very practical issue. That is that there are a series of government policies, namely the export controls that Bert mentioned, which are having a very tangible and practical impact on people's ability to use cryptography to protect their privacy. I'm not talking here about a Fourth Amendment right or a penumbral privacy right. I'm talking about whether people have access to basic privacy-protecting tools or not. And the export controls that have been in place for quite some time, and are still in place, create market conditions such that it has been very difficult for the market, both the domestic and the international communications market, to settle on cryptographic standards. And as you can imagine, it is relatively impossible to communicate using cryptography if there aren't standards. It means that you have to make individual, essentially bilateral, agreements with everyone whom you might ever want to communicate with in advance. It's like having to design a telephone system every time you want to make a telephone call. It doesn't work well. So the current state of U.S. policy creates a very practical problem for both individuals and businesses who are trying to use these new digital media to communicate, to do business, to do commerce, to do politics, to do a whole variety of activities which the Clinton administration claims to be greatly in favor of: there's an enormous practical privacy problem. Bert Wells: Well Danny, what about Stewart's point, though, that the danger of cryptography is so great that we need to find a way to protect ourselves from that danger?
Dan Weitzner: Well, as this debate has proceeded over the last few years, there have been numerous calls from privacy advocates, such as the organizations I've worked with and others, for the government to illustrate in some kind of concrete fashion the real threats that are out there. The security agencies have by and large refused to do this. They've alluded to terrorist problems and other sorts of drug dealing problems that would be harder to monitor and control with strong cryptography. I guess all I can say is that I think the cat is out of the bag. Those who, whether for illicit or legal purposes, want badly to use cryptography will find a way to do it. I have a joke with some of my friends who are cryptographers that in the same way that there's a whole kind of sub-specialty of being a mob lawyer, there's going to be a sub-specialty called mob cryptographer. And I don't mean this as a joke at all. If you were moving hundreds of millions of dollars of drug money around and wanted to communicate in a sophisticated way about setting up international criminal conspiracies, I would think, sooner or later you're going to get the idea that there's a way to do this with near-perfect security both from law enforcement surveillance and probably from the other criminals you're competing against. So, I guess I would agree with Stewart's point that there is perhaps a... an urgency to protect against this problem, but the control of cryptography I do not believe will ever lead us towards any real solution. Bert Wells: Jim Kallstrom, let me ask you: Are there any concrete examples you can give us of what's at stake here about the government being able to retain the power to wiretap, either in the cryptography area or the pre-cryptography area? Jim Kallstrom: The FBI's in the information business.
As opposed to the information of private citizens talking in the course of their daily business or a business community talking about their individual companies' processes or products, we're in the information business of collecting criminal information. Now who are these criminals? You all know who they are, you read the papers. They're the people that blew up the World Trade Center -- the cost of which is estimated by the Port Authority of New York at somewhere around five billion dollars to the GNP of this area. They're the terrorists. They're the drug dealers. And the kidnappers, the extortionists. They're the child pornographers. [applause] That must be one of the child pornographers. [laughter] You know they're all these people that we are chartered to put out of business for the good of all of us. For the good of society. One way we do this is with electronic surveillance. A larger and larger part of the information that goes on the table in these crimes is obtained through electronic surveillance. So the notion that we're going to allow criminality to be beyond the scope of the law, beyond the scope of a search warrant; the notion that they're going to have a reservoir immune from the judicial process; the idea that we are going to give up on the notion of collecting the critical information about the kidnapping of a young child that might save that child's life -- we're not yet ready to say the genie's out of the bottle. Not only does law enforcement have a large number of issues here, but the private sector does also. Would you buy a house, would you buy a car, if someone said to you, here's the key, or here's ten keys? And by the way, if you lose those keys, you're never going to get back in that house again. Or you're never going to get back in that car again.
Or is a company going to operate in the business sector where a disgruntled employee could take all the trade secrets of that corporation, put them on a floppy disk or some other computer medium, encrypt them, and send them to India, or someplace else? There are major ramifications to this issue that affect not only law enforcement, but all of us... Bert Wells: Okay. Jim Kallstrom: The notion that technology is just some sort of snake that's slithering along, and that it should go wherever it likes, wherever the brain trust is going to let it go, without a realization that there are public policy issues here -- public safety issues here -- is I think a very naive approach. Bert Wells: All right. Well I think we have a strong statement there of what the government feels it needs at this point and what it's accomplishing by conducting this surveillance. Bill Whitehurst, the Clipper Chip was developed at great expense, the key escrow initiative was tested by the NSA, it's the kind of technology that's been designed and created by the government and is being offered relatively cheaply -- indeed it's a voluntary technology -- why should anyone turn down this gift? Bill Whitehurst: Well, I think first we must recognize that there are very legitimate business uses of cryptography. It most notably came to the front with electronic funds transfer by our international banking system. But I would submit to you, the more we read and hear about the global information infrastructure, all the information going between businesses across borders, I think you'll agree that in order for us to build trust in such networks, we've got to be able to secure such networks. There are international standards that have been agreed to. DES, one that you mentioned, is the standard in many nations around the world for encrypting information. It's also the standard that is used in the banking industry for funds transfer. And you also mentioned RSA, one that's used for digital signatures.
Many firms have invested hundreds of thousands if not millions of dollars in computers and related equipment using those two standards. And, unfortunately, the Clipper standard or the Skipjack algorithm is not compatible with any of that installed base that's out there worldwide. First of all. Number two, it has been difficult for non-U.S. firms, and indeed in some cases government agencies that are not U.S. government agencies, to embrace an approach where the United States government holds the keys for all of these encryption chips. So although there are, you know, several other issues that I could bring up, I would highlight those two: not compatible with the installed base, and a concern on the part of non-U.S. countries and users for an approach where the U.S. government holds the keys. These are the two reasons why this, as you posed it, so-called free gift -- by the way one would still have to buy these Clipper Chips, it's not a free gift -- might not be very appealing to many worldwide. Bert Wells: Well Mike Nelson, is Bill right that the Clipper is not compatible with an installed base, that that's a real hurdle for industry, and that internationally Clipper and the key escrow initiative just don't seem to address the issue? Mike Nelson: Well, we have to remember why Clipper was created. It was not created as a global standard and there are reasons why other countries would not want to adopt this technology. We should look back, though, at why the Administration decided to go with the Clipper Chip. For our own practical purposes the Administration -- the federal government -- needs to have good cryptography. As we move forward to build a national information infrastructure there's a whole host of applications that demand good reliable cryptography. Filing income tax forms, reporting data from businesses to government, just securing communications between government agencies. All that requires good reliable cryptography.
We set about developing Clipper because we wanted to develop strong cryptography that would not undermine the ability of law enforcement to do its job. We were really faced with three choices. First choice was to adopt relatively weak cryptography, use that throughout the government, knowing that if we needed to we could break it and we could do a wiretap. Second choice was to adopt very strong cryptography, use that and just give up on the ability to do wiretaps. Clipper was the third choice. It was a technology that gave people very strong cryptography that would protect communications and files against unauthorized access, but in the event that law enforcement needed to do a wiretap, that would be possible, and that's why we chose to go that route. It is a voluntary technology: we're not forcing anybody to use it; we developed it for our own use; we think it meets a need; and we've seen some adoption in the marketplace, but not much. Partly because it's only solving one particular problem: it's providing cryptography and encryption for telephone conversations. It's not solving the problem of encrypting data over computer networks; it's a hardware solution, so it's relatively expensive -- many people in the marketplace would rather have a software solution -- although it's not as reliable, it is cheaper. So again, Clipper was adopted for one purpose: to use domestically for federal government applications. If it spreads, we think that's great. On the other hand, we realize we've got to find some new technologies, and we are working closely with industry to find new ways to do encryption using software in a way that would provide very strong encryption without undermining the ability of Jim Kallstrom and his colleagues to do their job. So that's where we're moving. We hope we can find a global solution.
We're happy to admit that the Clipper Chip will not meet the needs of global businesses in every regard and we're willing to admit that there are many countries that will not adopt this technology. There have to be better solutions out there and we're looking for them. But as Version 1.0, we think Clipper does the job it was intended to do, provides government with what it needs, and we think it meets the needs of many people in industry. But we are working to find new solutions as well. And Bill and his company have been very helpful. Bert Wells: Well, Mike, is the Administration still seeking to implement key escrow in connection with data communications or is the focus of the initiative really voice at this time? Mike Nelson: Well, the standard, the Clipper Chip standard, was for voice encryption. We thought there was a need there, and we adopted the standard for that purpose. We are using a related technology called Capstone in the Defense Department for its own internal defense messaging system. But at this point we haven't made that a standard, and we are looking for other solutions before we move ahead with that technology. As I say, we're looking at the different applications, looking at the different needs, working with industry to see how we can find solutions. Bert Wells: Danny, in your view, why should someone turn down Clipper? What's at risk here? I mean, aren't there tremendous safeguards on the government's use of the keys that are associated with Clipper? Dan Weitzner: Uh, no [laughter]. There aren't. I'll tell you one reason why I think users should turn down Clipper and why I would expect the government would even turn down Clipper: Sitting in this room is Matt Blaze, who's a cryptographer at AT&T Bell Labs, who, something like six months ago or so, published a paper which noted that there are several -- one significant flaw in the Clipper Chip; like any complicated computer system, it's not surprising that these devices have bugs in them.
As Mike said, it's release 1.0, and I don't think any of us should be holding the government to a higher standard of software development than we hold Lotus, Microsoft, IBM, and all the others. But there is that very practical experience: you know, in April '93 the government announced what was supposed to be advanced secure technology, and something like 18 months later someone working at Bell Labs, in fact working for the company that initially agreed to market the Clipper Chip, discovered a significant problem. And I'm not going to try to explain the problem, I'll let you or maybe Matt explain it later. But I think what this shows... Bert Wells: In fairness, I think the problem was one that affected the Capstone chip rather than Clipper itself. Dan Weitzner: That is, that is right. But I think it casts doubt on the whole program... Mike Nelson: Can I -- I'd like to respond to this. Dan Weitzner: Okay. I think what it really shows, though, is that it's a tremendous mistake to adopt a secret government-designed algorithm as anything that any large community of users would rely on. One thing that the computer marketplace, I think, has been relatively good at is gradually, and sometimes painstakingly, ironing problems out of systems. And the way that this has happened is that systems are available to the public to be tested, to be modified, to be redesigned if necessary. The -- I think the biggest reason not to go with Clipper is that it's a secret algorithm, and there is really no confidence, I think, in the traditional measures of confidence in computer and communications security, that the algorithm is really secure. Bert Wells: Well, what specifically do you think could happen? I mean, you think there might be design flaws, but what are the outcomes of that?
Dan Weitzner: The outcomes of design flaws are that someone discovers them, doesn't announce publicly that they've discovered them, and everyone who's using the device is in fact communicating insecurely where they thought they were communicating securely, that they're using these devices to speak on cellular phones, or on regular phones, discussing sensitive information, and in fact it's being compromised. You raised... Bert Wells: So, a criminal genius basically would be ... Dan Weitzner: Well, I don't know if you'd have to be a criminal genius. I suppose it would be criminal, but it's not clear you have to be a genius, you just have to know who to buy the solution from... Bert Wells: Well, let's turn to Mike Nelson... Dan Weitzner: I just want to address the other issue that you raised, as to government safeguards. As you said in your introduction, the keys to the communication encrypted with Clipper chips are held by two federal agencies. One piece of the key is held by the National Institute of Standards and Technology in the Department of Commerce and the other piece in the Department of Treasury. Now, I think that that has caused those of us in the privacy community enormous concern. Certainly, I think in the day-to-day course of events it's unlikely that there would be widespread abuse of that sort of system, but it seems like a system that is certainly susceptible to abuse when things become a little pressured for one reason or another in the country. The keys under this scenario are held by two federal agencies both of whom are responsible to the President of the United States. Under this scenario the President has no legal impediment to going into those agencies and requesting any keys that he or she thinks there is a need to have. There is no legal impediment whatsoever. The Administration specifically decided not to enact along with Clipper any statutes that would have governed the access to those keys.
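[Editor's note: The two-agency arrangement Weitzner describes is a split-key escrow: each chip's unit key is divided into two components, one held by each agency, and both components are needed to reconstruct the key. The sketch below shows one standard way such a split can work (XOR secret sharing); the function names and details are illustrative, not drawn from the actual escrow procedures.]

```python
import secrets

def split_key(unit_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two escrow components; either one alone reveals nothing."""
    share1 = secrets.token_bytes(len(unit_key))              # random pad (one agency's share)
    share2 = bytes(a ^ b for a, b in zip(unit_key, share1))  # XOR complement (other agency's share)
    return share1, share2

def recombine(share1: bytes, share2: bytes) -> bytes:
    """Both escrow components are required to recover the key."""
    return bytes(a ^ b for a, b in zip(share1, share2))

# Clipper unit keys were 80 bits (10 bytes).
key = secrets.token_bytes(10)
s1, s2 = split_key(key)
assert recombine(s1, s2) == key
```

Because share1 is uniformly random, each component on its own is statistically independent of the key, which is the point of requiring two agencies.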
They adopted some regulations of their own initiative, but the regulations, I don't believe, and they've said are not binding. I will just add also that the Vice President, in fact, has said subsequent to these regulations and to the choice of these two escrow agents, that he thinks it's a bad idea to put the keys in escrow with executive branch agencies. So if he thinks it's a bad idea, I have to agree, and we should all agree. Bert Wells: Mike, I know you want to address the issue of poor design, but let me turn, since we just heard about government safeguards, to Stewart Baker, would you like to... Stewart Baker: He touched on both of those issues. Bert Wells: All right, Mike go ahead... Mike Nelson: You've got two red herrings here, perhaps even a whole can full of them. First off, the safeguards issue. This is something we've heard a great deal about. But the important thing to remember is that in order to get the data stream in the first place, the government has to do a wiretap. In order to intercept this encrypted communication that we're worried about, a wiretap has to be done. So all of the present safeguards that are there today with regard to wiretaps will still apply. Then after the law enforcement has gotten the intercepted message, then they have to go through the process of getting the keys decrypted, going through a process which is auditable, which is open, which requires that there be just cause. I don't know exactly how to make this clear, but we need to compare Clipper to the way the world is today. Today we make conversations which are unencrypted, in the clear, not only law enforcement, but the fellow down the street who has a cellular phone scanner, people who might do their own unauthorized wiretaps all have access to your conversations. With the Clipper Chip you have the ability to ensure that unauthorized wiretaps don't occur, that nobody can listen in on your conversation, without you knowing about it. 
So you'll have two layers of protection in this future world where you're using the Clipper. You'll have the wiretap statutes that require just cause for a wiretap in the first place and you'll have the additional protection provided by encryption. That's the comparison we should be making. On the second point, about the Blaze attack, this is really, this is a total red herring. The Clipper Chip algorithm works. The Blaze attack doesn't indicate that it doesn't work, your communication is still encrypted. Mr. Blaze found that if you want [inaudible comment from audience]. Dr. Blaze, excuse me...found that if you spend, if you're willing to spend half an hour when you place a telephone call, you could use the Clipper Chip in a way that would not have a law enforcement access field. So you could undermine the ability of law enforcement to do its job. You'd still be able to use the technology... The technology is secure. The only thing is that you could undermine the ability of law enforcement to do its job. But you'd have to wait, it would take half an hour and a lot of computer processing to make that happen. There are lots of other ways to do that, it's not a real problem. It was something we knew when we designed the chip. It's not a problem. Bert Wells: Well, Stewart Baker, just about the specific point that the President can obtain the keys whenever he wants and actually there may be special rules, perhaps you can tell us, that apply to foreign surveillance in this area. Is that a correct point that the President can get the keys whenever he wishes? Stewart Baker: The rules that govern the escrow authority require that anyone who wants to have access to them [the keys] have lawful authority to conduct the underlying wiretap. In the rules relating to the requirement of a warrant there are provisions for emergencies and the like.
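[Editor's note: The attack Nelson describes above: Blaze observed that the LEAF (law enforcement access field) is validated only by a 16-bit checksum, so a sender can generate random candidate LEAFs until one happens to pass the check -- about 2^16 attempts on average, hence the half hour at the chip's rate. The toy model below uses a truncated hash as a stand-in for the chip's classified checksum; all names and details are illustrative.]

```python
import os
import hashlib

CHECKSUM_BITS = 16  # the LEAF check Blaze exploited was only 16 bits wide

def checksum(leaf: bytes) -> int:
    # Stand-in for the chip's classified checksum: a hash truncated to 16 bits.
    return int.from_bytes(hashlib.sha256(leaf).digest()[:2], "big")

def forge_leaf(target: int) -> tuple[bytes, int]:
    """Generate random LEAFs until one happens to carry the expected checksum.

    Expected work is about 2**CHECKSUM_BITS tries -- feasible for a patient
    sender. The bogus LEAF defeats the law enforcement access field while the
    underlying encryption keeps working.
    """
    tries = 0
    while True:
        tries += 1
        candidate = os.urandom(16)
        if checksum(candidate) == target:
            return candidate, tries
```

On average forge_leaf succeeds after roughly 65,000 attempts; at the real chip's LEAF-generation rate that worked out to the "half an hour" of processing Nelson mentions.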
I think what Danny Weitzner is suggesting is that those rules could be rewritten in an emergency and I suppose you can't guarantee against the rules being rewritten in an emergency. And that is a risk that we incur by having encryption that has a government control element to it. On the other hand, the only alternative that anyone has come up with that provides security at this stage is encryption in which that decision is left to every individual including every crook who wants to use this. And so that in a genuine sense we have a question of "who would you rather trust?" Would you rather trust this to the marketplace in which people will make their own judgments, and you won't be able to conduct wiretaps against people who misuse it, or are you prepared to trust the democratic institutions and the checks and balances on power that have worked for our country by and large over the years? I guess I think that this debate in the end is between the people who would rather have some kind of an automatic technological guarantee against the government misusing their authority and people who are prepared to trust our institutions to prevent abuse. Seems to me that the choice of guaranteeing against government ability to interfere in, to conduct wiretaps is a choice that uh, probably is a choice for anarchy, a choice for more authority and more opportunity for criminals, and I don't think that on the whole, looking around at our society, that we need too many choices of that kind. Dan Weitzner: I want to just have a chance to expose my own little red herring here, to follow up on Stewart Baker's characterization of the choices that we face here. This seems to be a choice between anarchy and order or apple pie and the devil.
I want to just say again that I really do believe that this new technology creates new problems and new challenges; but to pretend that we can somehow make a choice to put all of those new challenges back in the bottle, that we can somehow erase from the Internet, erase from computer science textbooks that are all over the world, and erase from people's brains the knowledge of how to do this kind of encryption is fantasy. And I think it is irresponsible policy-making to turn a blind eye to all the problems that Jim Kallstrom has raised. I would submit to you that if, as I said before, if someone is planning a serious criminal conspiracy, and is going to go to the trouble of using all kinds of fancy communications, the likelihood that they are going to go down to Radio Shack and buy the modem or buy the telephone that has stamped on it "approved by the NSA" is very slim. And what we are in fact trading off is the privacy rights of the vast majority of the population who are law abiding citizens who have no reason to, to submit themselves to a search in advance, which I think you might be able to see a government escrow agent situation as. We're trading off the privacy and the security of all these people in society. We are threatening the trust level that Bill Whitehurst mentioned, in order to adopt a solution that I don't think gets us anywhere. So, let's stop pretending that we have a choice, which really advances either interest here. What we have is a new problem. Bert Wells: Let's follow up on that trust issue. Bill in your view, are there adequate reasons to trust the federal government that the clipper and key escrow will be adequately administered, or, do you have a concern with that side? Bill Whitehurst: Well uh, actually I would like to avoid your question entirely, because [laughter] for two very good reasons . . . [laughter] Bert Wells: And you said you're a non-lawyer. Bill Whitehurst: Well for two very good reasons. 
First I don't want to criticize our government in this forum when I don't in fact know exactly how everything would be handled. But more importantly, this is not a U.S.-only problem. The Clipper initiative appears to me to be a U.S. initiative, but the world of electronic commerce is quickly becoming borderless. And so now the issue becomes -- does a particular company, wherever they may be situated, trust some other government. And I would answer that question affirmatively for certain governments, and I would say no for other governments. There are certain governments that I don't trust. And if there's going to be an international solution to this problem, everyone that's part of that network is going to have to trust that implementation. And so I think that, that by framing the question of trusting only the U.S. government, makes the whole debate a national debate and I would suggest that's quite a different debate than the debate that should really be held. Bert Wells: Well Bill, let me follow up with another question. Bill Whitehurst: Sure. Bert Wells: Are there alternatives to Clipper that satisfy the governments' law enforcement concerns, but are not so invasive on business concerns, or indeed on individual privacy rights? Bill Whitehurst: I would like to answer that in two ways, as well. The first is, is that as a result of a lot of negative comments about the whole Clipper and key escrow approach, the government did form a work group called the "key escrow alternatives working group." It met June 10, 1994 for the first time, and met the second time in August. This group is made up of, as I recollect, about fifty to sixty people representing large firms, some individuals as well, looking at the question of: are there alternatives that would still maintain the balance that the government was trying to strike with its Clipper initiative. The concern that I have is that this group has met only twice. 
This is where we find ourselves almost two years from the point when Clipper was first announced, and a year after it was adopted as a federal standard. The group asked the government four to five rather basic questions of "would this be acceptable to you or not?" And so far the reason the group has not met again, since August, is the government has informed us that these are really difficult questions and they're not prepared to answer them yet. So, I would submit that the private sector is trying to work with government and at the same time I would express a frustration at the rate of trying to reach agreement. Bert Wells: Mike Nelson, do you care to comment on that? Mike Nelson: Well I've been very involved in creating this alternative working group and I think there has been some real progress made. The questions that were asked by industry were very good ones, but they are ones that have a major impact on our national security policy. And so we are taking our time to make sure we come up with answers that we can live with. Bert Wells: Well, what specifically are these questions? Mike Nelson: Well the main question is whether the U.S. government would allow the export of DES and other advanced cryptography, if there were a way to make sure that it were escrowed. This seems like a very reasonable request, and it's one that we have looked at and we'll have an answer for industry in about a week or two. And, we plan to have this meeting as soon as possible. So we're moving ahead on this, I'm very happy to say. Bill Whitehurst: You heard it here first, I guess. Mike Nelson: You did. [Group Laughter] Mike Nelson: The point is that we can all imagine what the ideal answer to this problem will be.
The ideal answer is: an encryption product which can be implemented in hardware and software, one that relies on an unclassified algorithm, one that is inexpensive and easy to implement, one that can be used globally, and one that does provide strong encryption while not undermining the ability of law enforcement to do its job. That's our goal, and the Vice President in a letter to Maria Cantwell, representative from Washington State, laid that out, said that's where we're headed, that's what we want to accomplish. And we're working with industry to do that. Now the Clipper Chip meets several of those criteria. It doesn't meet other ones. Particularly, the problem with the classified algorithms -- something we recognize. A classified algorithm makes it harder to use it in many different applications. On the other hand, the only way we could accomplish what we needed to accomplish, the only way we could make it law enforcement friendly, was to use a classified algorithm. We're looking for new ways to do it without that, but so far the only answer we've got is the Clipper Chip, and that requires use of the classified algorithm. So, we are moving towards this ideal, we think we can maybe get there, but in the meantime we're not about to say "we don't care about law enforcement, we don't care about national security." And I do think this group will be very helpful, I do think the dialogue that's been going on has been very productive. I wish it had started long ago, but we've only been in this place for two years and in that time we have had a lot of useful discussions. Bert Wells: Well there are several points of departure there, but one thing that I want to try to get clear at first is, are there actually any concrete existing proposals that do something pretty much like what Clipper does, but that satisfy business needs as well as law enforcement needs?
Mike Nelson: There are some very interesting proposals from a couple of companies that would develop a product using DES as the algorithm and using software techniques that would allow you to couple that algorithm with the key escrow feature. It would work in much the same way as Clipper, in that you would be able to intercept a message, determine what product or what edition of the product was used to do the encryption and then you would be able to get the key. And we are talking to people about these possibilities. Now, having the concept, having the block diagram is one thing. Having the implementation is another. And, it's going to be a few months before we can actually see if it works and we can take the best cryptographers in the world and pound on it a little bit. It took awhile for us to pound on Clipper to make sure that we were convinced that it worked. And we did have people evaluate it outside of government. It's classified, but a number of the world's top cryptographers had a chance to look at it and see what they could do. So it has been somewhat tested. It hasn't been tested by thousands of people, but some very good people have looked at it. It takes time to evaluate these products. And, again, I'm glad we're engaged with industry in finding this ideal solution which we all agree would be better than Clipper is today. Bert Wells: Dan is there any concrete proposal that you're aware of that addresses both law enforcement needs and the privacy concerns that you represent? Dan Weitzner: I'm going to take a little risk here and talk about this proposal that, to be honest, I'm not sure how I feel about. But I'm really glad to hear Mike say that the Administration is still moving down the road that was articulated in the letter to Cantwell, even though she wasn't lucky enough to be re-elected. I'm glad to know the letter is still good. Mike Nelson: Policy is policy. Dan Weitzner: I want to say two things about that structure. 
First of all, I really do think that one of the lessons of Clipper is that Government should not be in the business of trying to design software systems and security systems either for the whole country, or certainly not for the world. I think what the government can do is to lay out a legal framework -- that means an export control structure, that means issues about the liability of escrow agents -- that is somehow reflective of the various interests that I understand government is trying to balance. So, I think that one of the things that we found encouraging about the Cantwell letter, uh, and I'll just kind of tick off the principles that the letter articulated as policy goals, as Mike said, it should be software implementable and software, the algorithm, should be unclassified, the use should be voluntary, it should be exportable, there should be strict liability for improper disclosure by escrow agents, there should be auditing procedures and of course the escrow agent should be able to be private. Now, probably the most contentious issues, I would say from all sides, really have to do with voluntariness and exportability. The simple fact of the matter on exportability is the National Security Agency does not want to allow any cryptography to be exported. In their honest moments, people from the National Security Agency will say to you very directly that they recognize that widespread use of cryptography poses an enormous problem for them, and all they really can think to do now is just to delay that circumstance as long as possible. So, I think there is an uphill political battle on export controls, but I think it is a critical component of this for the reason that, unless these sorts of systems are exportable, I think it's going to be very difficult to have them widely available even in the United States. 
When Microsoft or Apple or IBM or anyone else goes to design operating systems and other computer communications tools, uh, they are reluctant to design one domestic version and another version for export. And as I said, it is very hard to agree on standards when you can't do the same thing on two sides of the border. Exportability is going to be a critical issue and I think Mike has recognized that as well. The second is voluntariness. There is a concept floating around called software binding. That would essentially say that cryptographic software may be exportable provided it has written along with it an escrow system that forces escrowing. That would say that anyone who uses that off-the-shelf software package or piece of hardware would be essentially forced into an escrow relationship of some sort or another. It does not attempt, though, to preclude the possibility of unescrowed communication. I think that where the line gets drawn on this issue of voluntariness, on the question of how far the government thinks it's going to go in forcing all of us to escrow our keys is an absolutely critical issue from a privacy perspective. I'm going to, just for a moment, talk about the Fourth Amendment, because I think it is actually relevant here. If you think back to the enactment of the 1968 Federal Wiretapping Law, one of the critical objections from the civil liberties community, which was a widely shared objection, was that wiretaps constituted secret searches. Going back even before American law, back to British Common Law, secret searches have been regarded with great, great suspicion. And they have been treated by courts in our tradition for many hundreds of years as something to avoid at all costs. There was an argument made in the context of wiretapping that there was reason to create a small exception to the so-called "secret search" rule because telephone conversations were somehow regarded as special things, as unique things. 
I think that our underlying Fourth Amendment concern here -- which I think has to be reflected in this debate, and has to be reflected in the policy options that are ultimately chosen -- is that we keep in mind that the Fourth Amendment guarantees citizens' rights against unreasonable search and seizure. The Fourth Amendment does not guarantee, and nothing in the Constitution guarantees to government, the right to conduct surveillance at all times and in all places. I certainly recognize the public policy imperative to be able to do surveillance in certain cases, but we should really remember where the legal tradition is here. We have a wiretapping law which was a narrow exception to the secret search rule, and we have hundreds and hundreds of years of common law that says secret searches are dangerous to democracy. Bert Wells: Is it your view that Clipper constitutes persistent, pervasive, continuous surveillance? Dan Weitzner: Well I must say that, if it came down to it, I think you could construct such an argument. I guess my view is that there is certainly an argument that it is forcing one to consent to a search in advance of any criminal behavior. I don't want to get lost in that kind of argument. I think the lesson that we should draw from our Fourth Amendment tradition is that there is not an absolute right on the part of the government to be able to conduct surveillance in every place. We should really remember in this whole context that we're talking about the ability, or the lack of ability, to decrypt a communication stream. That has no bearing at all on law enforcement's access to information once it's through being communicated. Unlike telephone conversations, most other electronic communications such as electronic mail, once they are communicated, have to be deposited somewhere and -- to be read or used -- have to be decrypted.
It seems to me there are many opportunities, and we know from law enforcement activities that there are many other opportunities, for law enforcement to have access to what was communicated. It's on a hard disk, it's on a tape, it's on any number of other kind of media, it may even be printed out on a piece of paper. Jim Kallstrom: If in fact the victim is dead by the time you get it, does that have any bearing on your view? Dan Weitzner: Absolutely, and I'm really not trying to play the game of denying legitimate interests here Jim, but I think that when you talk about criminal conspiracies... Jim Kallstrom: You know, the Government also has a basic obligation. That's to protect the citizens. You know, it all depends who you talk to. If you talk to people that have been on the fringe of being victims of crime, if you talk about the families of the 275 people that were on an airliner that was about to get shot out of the air, lifting out of O'Hare airport, they have a very different opinion about what the capabilities of the FBI should be, what the capabilities of law enforcement should be. You know, here we have a huge problem that maybe you should be looking at, where there's hundreds and hundreds of databases in this country that track everything you buy, everywhere you go, every time you go to the grocery store, and to single out, you know, the law enforcement agencies to be a big bugaboo when there has been twenty years of basically unfettered compliance with the wiretapping laws, I think it's really, gets us off on just an absolute wild goose chase. Bert Wells: Jim, how much wiretapping is going on, and how many taps per year? Jim Kallstrom: Well, they average about 900 taps a year by the federal, state and local law enforcement authorities. Thirty-seven states have adopted the wiretapping law through their state legislatures. 
About sixty percent of those 900-1,000 taps are done by the state and locals and the other 40 percent done by the federal authorities. The FBI does the preponderance of those, and DEA would probably be the second. Bert Wells: What percentage of criminal investigations would that represent? Jim Kallstrom: It would represent a very high percentage of the major program investigations. When you also throw in pen registers, dial number recognition devices, and things like that. Almost all of them. Bert Wells: I want to follow up with Stewart Baker on a point that you raised, Dan, which is the issue of whether a private party can be the escrow agent here. Stewart, do you necessarily believe that key escrow agents ought to be government institutions, which you know, by their nature can attract distrust, particularly from foreign entities, as opposed to private but very heavily regulated entities that are obligated to respond to legitimate wiretap orders? Stewart Baker: I don't think anybody would rule out the idea of trustworthy private organizations doing this. You have to remember that when Clipper was put forward it was done fairly quickly against a background that required quick action. The likelihood that a private entity would step forward and volunteer to be an escrow agent without any idea of the kinds of liability or litigation that might attract, let alone the publicity, was virtually zero. I think there are still probably . . . Dan Weitzner: The Electronic Frontier Foundation offered. (laughter).
Stewart Baker: I think that as people get more comfortable with the idea and as people realize that there are going to be a lot of corporations who are going to want to escrow their keys, without regard to the question of Clipper or law enforcement, just so that they can conduct their own kinds of security procedures and guarantee themselves against misuse by their employees in encryption, you're going to see companies look very closely at the idea of escrowing keys privately. Mike Nelson: We're actively exploring that. Stewart is right, and the Vice President has given us instructions to go out and find other escrow agents. The fact is that we needed escrow agents quickly and we needed escrow agents to hold the keys for government use, so we picked two government agencies. But we are exploring other options, just as we are exploring other technologies. Bert Wells: And would those private agents be ones that would post a bond or put their assets at risk for an improper disclosure of keys or losses that result? Stewart Baker: It's hard to know how you would organize that, and where you are doing it for private companies, it's easy enough to regulate the liabilities and responsibilities of each side by contract. It's harder vis-a-vis the government, although I guess I would note that the government already has a lot of subpoena authority, and so if they're confident that people will respond promptly and appropriately to subpoenas, a lot of the legal framework for obtaining keys in the event of necessity is already there. Bert Wells: Stewart, let me also ask you a question that I think you have some direct experience with. There have been several references here to the bottling up of the technology created by export controls and the intrinsically international aspect of the information infrastructure. But we do have these fairly rigid export controls it seems to me that make it hard to export strong cryptography.
And yet systems such as RSA and DES, which is perhaps not as strong, are completely disclosed publicly. In fact, I think DES is fully defined as a government standard. So the construction of interoperable systems by anyone overseas who simply has a copy of the government publication is achievable. And as far as RSA and other public key cryptographic systems go, those algorithms are known publicly. We can't keep books from leaving the country. We can't keep articles that are posted on the Internet from being picked up in other countries, so that other people can indeed not only develop techniques using these encryption systems, but build products that they can sell back into the United States or sell abroad, that incorporate these things. So, in light of that reality, and the publicly disclosed nature of these encryption systems, why do we need these export controls, why do they make sense? Stewart Baker: I think that sometime in the early 80's the government stopped trying, what it had tried up to then, which is to prevent even academic and political discussion of encryption principles from becoming public. There was a time when cryptography was viewed like nuclear bomb-making technology, as something that ought to be born classified. It was too sensitive, too important to have it become something that is discussed academically. Bert Wells: In fact it is on what is called the munitions list, isn't it? Stewart Baker: Yes. Well, and I think at the time it was put on the munitions list there wasn't any doubt, it was an implement of war, it was a critical tool for winning wars, both making and breaking codes were critical parts of the battle in World War II. So it was appropriately viewed as a munition. However, I think that in the last ten years it has become increasingly attractive as a commercial technology as well. And, just as Jeeps used to be munitions and became dual purpose technology, I suppose, you can see that evolution in the cryptographic field.
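[Editor's note: Wells' premise above -- that these algorithms are fully specified in the open literature -- is easy to see for RSA, which can be implemented from a textbook in a few lines. The example below uses deliberately tiny, insecure parameters for illustration only.]

```python
# Textbook RSA with toy parameters (insecure; illustration only).
p, q = 61, 53
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e mod phi

m = 65                       # message, encoded as an integer < n
c = pow(m, e, n)             # encrypt: c = m^e mod n
assert pow(c, d, n) == m     # decrypt: m = c^d mod n
```

Nothing here is secret: the algorithm's security rests entirely on the difficulty of factoring n, not on concealing the method, which is the crux of Wells' question about what export controls actually control.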
What the government's export control policy has tried to do over the last few years has been to permit First Amendment academic discussion of algorithms and cryptographic techniques while trying to be pretty careful about controlling commercialization of those techniques on an international scale. The argument tends to be that the export controls have failed but there hasn't been commercialization. It's sort of hard to have it both ways. I think controls have in fact prevented commercialization of cryptography in a lot of fields where you might otherwise have expected to see it. What is happening now, and as the barriers have broken down, and they have broken down to a degree, I think, permitting the export of the RC4 algorithm was a step forward for the software industry. As soon as that happened however, I think people who were in the business of exporting cryptography discovered that importing cryptography wasn't a picnic. And, in fact, I think that the French are making life very difficult for people who want to export cryptography from the United States to France. Not because of U.S. export controls but because of French import controls. I suspect that what has happened, one of the reasons for the moderation of this debate, is that business interests have begun to see that even if they win a fight with the United States government over export controls, they're going to have another fight, and maybe one in which they don't have as many weapons, over what will be allowed to be imported into a lot of countries. And, however we may feel about the balance between individual rights and law enforcement's authority, wherever we put that balance, I think it is fair to say that practically every other government in the world puts it a little closer to government authority than we would. Bert Wells: Well Bill Whitehurst I would like to put those points to you. Have the export controls been successful in preventing commercialization of cryptography? 
Are the codes that can get through, such as RC4, adequate for business use, and what about the battles that you may have to fight with foreign nations over the use of cryptography? Bill Whitehurst: Well certainly I think it's appropriate that we understand a bit of history here, and that is that IBM invented the basic algorithm that became known as the DES, in one of our research labs in the state of New York, the Watson Lab. This was as the result of a call by government that the private sector come up with some kind of an algorithm to help protect not only the private sector's information but government's information; an algorithm that could be used in the non-defense area of government. And, we were quite proud of the development, and then subsequently after it was made the data encryption standard in the U.S. as well as other countries, then export controls were slapped on implementations of that standard. But by that time, the standard itself was well known, it was well published, it was the subject of a patent that was issued to IBM. So we have a "Miracle on 34th Street" issue here where one arm of the United States government publicly acknowledges that this is in existence, while another part denies it. Now, that's a little historical perspective on the DES. We've had discussions over the years on these export controls, trying to understand what the problems were, et cetera, in light of the fact -- that has now been brought to light through various investigations -- that there is substantial implementation of the DES by non-U.S. manufacturers. And, indeed, I'll just state specifically as a vendor ourselves, IBM, we get very unhappy when we are in a sales situation in a foreign country and find out that we lose that sales situation because we can't export our DES device, but someone else sells the DES device to them. So, are export controls working? In that case that customer who wanted a DES device got a DES device.
Export controls did nothing but hamper the ability of U.S. firms to sell solutions to that customer. Now these customers, by the way, are in many cases large multinational customers themselves, not home-based in the United States but home-based elsewhere, and they would like to go to one vendor and get one solution that would be interoperable in their networks worldwide. And they get very frustrated that they can't be provided that solution by some of the leading technology firms in the world, those that we have right here in the United States. The second point that I would like to make is that I get very concerned when I hear the situation in France described the way it was described earlier here on this panel. France does not have an import law. France has a registration scheme requiring users to fill out an application describing the algorithm that the user is using, and provide that to the government. The government will then issue what we might think of as a license to use that cryptographic device. We're situated in France. We have cryptographic devices installed in France. We're licensed by the government of France. We comply with all their laws. It is not an import control law, but I will acknowledge that it is one of the few countries worldwide that has any particular usage restriction on crypto. So the main problem here is the export control laws of the United States, not any foreign laws regarding import or usage. Dan Weitzner: I just want to come back to a point that I think Stewart raised. I'm sorry -- it was Bill. There are some paradoxes here in the state of public policy coming from different parts of the government, and in some cases even the same parts of the government. Stewart did mention the book exception and some of the free speech and academic freedom issues that have caused the government to allow at least international discussion of cryptography. 
I think there's an interesting case now that is working its way through the administrative process of the State Department and the National Security Agency; someone has requested a license to export a certain kind of cryptography, both in book form and on magnetic media -- I think it is just a piece of software -- that is written to implement what is in a book. The book is allowed out of the country. The software is not. Now, I don't raise this issue to make fun of the government, but I do raise the issue because what I think it illustrates is that we really are at somewhat of a crisis point and a point of enormous contradiction in federal policy on this matter. Something is going to have to give somewhere. You can ascribe the cause and source of this crisis to all sorts of factors. That's not really relevant, but I think you have enormous gaps between the different views of the same reality, and I think that really indicates that it is time to sit down and try to push this whole thing to a resolution. Bert Wells: Mike, is Bill right that sales are being lost by American corporations without any advance in U.S. policy goals? Mike Nelson: Well, I think that there probably are cases where U.S. companies are unable to sell crypto products and they lose sales for that reason. On the other hand, we are accomplishing the goals of our policy. As Stewart said, we are avoiding the rapid spread of standardized crypto overseas. And that's what we're out to do. The fact is, our policy is not designed to ensure that one terrorist somewhere cannot talk to another terrorist using crypto. That's going to happen. We are trying to keep this technology out of the hands of Moammar Khaddafi or the Hezbollah terrorist gangs in Lebanon, and keep them from using it in a widespread way. We don't want this everywhere. We don't have controls domestically, but we have a lot of serious threats overseas and we don't want this technology going everywhere. And that's what we're out to do. 
Danny has criticized us for not keeping this out of the hands of everyone. We don't think we can do that, but we can ensure that overseas this technology does not appear in every telephone in Libya. And that's what we're trying to do. Bert Wells: Well, I think on that note... Bill Whitehurst: Can I respond to that briefly? Bert Wells: Well, would you like to do that in your closing statement? Bill Whitehurst: Sure, I'd be happy to, thank you. Bert Wells: Thank you, because I'd like to move to the closing statements and then we will have questions from the audience. Since I have an odd number of people on that side, three instead of two, let me start on that side with Jim. Jim, could you make a closing statement? Jim Kallstrom: Sure, it'll be brief. I would just simply say that as this new technology advances, there is a different situation now facing the legal system, facing law enforcement, facing the business sector. There is the potential to have the home I talked about with the lock, where if you lose the key you cannot ever get in. I would say that has ramifications for all of us. There's no easy solution. Clipper Chip was never designed to be a catch-all, fix-everything-for-the-rest-of-our-lives solution. It was meant to deal with a particular issue in the federal government. It has certainly got us talking about this. I think there is a lot of public policy that needs to come forth from this. The notion that technology can just go unfettered -- I think we see the results of that in things like video conferencing, picturephone -- picturephone is great until some pervert puts X-rated pictures into your house and your young children see that. Then you as a parent, and you as a society, have a problem with that. Dial number recognition -- many problems with that throughout the United States, where people do not want, necessarily, other people knowing who they are calling. So these are not easy issues. Law enforcement has an equity here. 
We would not have spent the time we have, our director would not have spent the time on this issue, if it were not a meaningful tool that we use quite successfully to solve major crime. If we can offer strong encryption to the citizens, and at the same time escrow keys through Clipper or any other generation of systems that come down the road, the only people that have to fear us are the criminals. And I think that's a good system. Bert Wells: Danny? Dan Weitzner: I do want to come back to my assertion that Clipper as a policy solution for all practical purposes is dead, and we now have to move on and try to find some kind of policy framework that does meet all the various needs that we've discussed here. I think that the worst thing that can happen here is nothing. Because I think that while nothing is happening, the interests of law enforcement and national security are being unfairly and unjustifiably advantaged. And while nothing is happening, people who use all sorts of electronic communications are having their privacy threatened in a very practical way. The Clinton Administration, IBM, all sorts of people in between all agree that we are moving rapidly into an arena where more and more personal information, more and more sensitive business and financial information, health records, political debates, romance and, yes, criminal activity are all happening on-line. I must say that the crime problem that I worry about most here is all of the fraud, all of the abuse, all of the harassment, all of the invasion of privacy that is allowed to go on now in the absence of good security. That is the criminal problem that is being created today by what I think is a muddle of federal policies which deny people access to technology that can help them protect their privacy. Bert Wells: Stewart Baker? Stewart Baker: Well, let me try to play Pollyanna here, because I think in fact we have seen this debate evolve, in an odd way, in a favorable way. 
When Clipper was announced, I think there was a kind of gasp across the country among people who are in the computer business or on the Internet, a kind of belief that this was absolutely an unacceptable approach to privacy and that only a guarantee of complete cryptographic security was the right social solution. I think one ought to give credit to the Clinton Administration which, while it's shown some flexibility, has basically stuck to its guns on this issue to say "we're not going to accept a solution that simply creates a data highway that is utterly secure for any crook that wants to use it." They've stood by that position, and I think for those of you who hear occasionally that the Clinton Administration doesn't stand fast, it's worth noting that this is an issue on which the Administration has stood quite fast. There was a lot of talk at the beginning of this debate about whether there really was a problem, whether crooks would use cyberspace in an inappropriate way, why was everybody worried. I think we have seen that people have gotten over their first romance with the Net. Don't forget US News and World Report: "Is Anything Safe In Cyberspace?", raising concerns about how people use computer networks. And, of course, as somebody who was accused once of lowering the tone of the debate, I'm pleased to see that the New York Post has joined this debate. (laughter) Nothing I say can lower the tone of the debate now. Well, I think also the idea of escrow and the concerns about the effects of encryption have slowly sunk in among the people who are going to buy encryption in the end, which is businesses. Businesses have exactly the same intention in their approach to encryption as the government has. They want good security but they don't want it misused. 
So I suspect that as this goes forward, as the Administration sticks to the principle of key escrow but is willing to adjust the details, and as businesses begin to think about what kind of encryption they want, we're likely to see a certain amount of convergence on this issue, and I suspect it will make a difference in the export control laws over the next year or two. Bert Wells: Bill Whitehurst. Bill Whitehurst: Thank you. Let me first make the point I wished to make earlier in response to Mike Nelson's point. Right now there are export controls on cryptographic products wherein the manufacturer has to go and actually get a license, in one case through the State Department, to get permission to export a device. It is extremely easy for the United States Government to relax the export control restrictions while still not allowing exports to Moammar Khadafi or the Hezbollah or any others that they might cite. This kind of regime has been in place for a long time, with selective ability to export to countries that are deemed friendly countries and companies or industries in those friendly countries. The situation we have right now is that we can export to Ford of Germany but not Mercedes-Benz. Mercedes-Benz can buy crypto devices from any number of German manufacturers, but they can't buy them from American manufacturers. So I think there's some definite unevenness in the implementation of export controls, and correcting that could greatly reduce the severity of the problem. That said, there have been many international groups that have looked at this whole problem of security in general and cryptography as a particular tool in that security, and have written many papers, which I'll just summarize to say that users would like freedom of choice to implement the best techniques that they can to protect their very valuable information. 
So flexibility of implementation, freedom of choice; but there is an overarching call that has come recently from many trade associations -- and I say international because, for example, the International Chamber of Commerce is a public sector/private sector working relationship to try to solve this problem. I believe the private sector understands the impact of an insecure GII, which would have severe financial and social consequences. So the call has gone out from the private sector to governments: let's cooperate, let's work together to try to solve this problem. And the frustration is that cooperation has yet to begin. I'm hopeful that it will soon, but it has yet to begin, and the highest thing on our agenda is to try to get that cooperation begun. Thank you. Bert Wells: Mike Nelson. Mike Nelson: I'd just like to pick up where Stewart left off, and to point out that we have seen a convergence in the last two years, and I think you've seen the convergence tonight in that all the people on this panel would like to see the ideal solution that I outlined before: a solution that is exportable, a solution that can be implemented in hardware and software, one that doesn't rely on a classified algorithm, one that can support the needs of law enforcement, one that's easy and inexpensive to implement. We can all agree that that's where we want to go. With the Clipper Chip the Administration is taking a step towards that goal. Those who oppose the Clipper Chip have said, well, we've got a solution that meets some of those criteria, too bad about law enforcement. We think it's better to take a step-by-step approach. Clipper Chip demonstrates that you can develop a key escrow product. Now we'll look and see if we can develop a better key escrow product, one that doesn't require a classified algorithm, one that might be easily used in any country around the world. So we're moving step-by-step. We agree on where we want to go. 
In a way the debate is somewhat like that of arms control. We all agree that we'd like to see a world without nukes. That doesn't mean that we just unilaterally dispose of all U.S. nuclear weapons. We have to move in a step-by-step process where we can maintain the ability we have today to do wire taps, maintain all the privacy controls we have today to ensure unauthorized wire taps don't occur and move towards the situation I've described. I think we're going to see some progress this year. This administration has engaged industry. We've made a lot of progress in defining what needs to be done, what's possible. We've stimulated a lot of new thinking, and the number of patents that have appeared in the last couple of years in this area is an indication that people are looking at new approaches. In a way we've been very stupid to even engage in this issue. The Clinton Administration from Day One realized politically that this was a lose-lose issue. Either you're on the side of child pornographers or you're on the side of terrorists -- what do you want to be? But I think it's essential that we have a secure, global information infrastructure, and I think the debate has been very healthy. If you have policy being made without debate, one of two things is happening: either decisions are being made in private (that's very bad) or else no real decisions are being made, and no progress is being made. We're having an open debate, we're making progress, and I suspect if we have this meeting in another year or two, we'll be able to report that we've taken the next step, we'll have a lot more convergence even than we've seen tonight, and I think the Clinton Administration will take a lot of credit for making that happen. Bert Wells: Well, thank you very much, to all of our speakers. APPLAUSE. Moderator: I realize that it's late, but I think a couple of questions would be appropriate. If you line up behind the microphone you will have an opportunity. 
Could you identify yourself, please, and then state your question? Matt Blaze: Matt Blaze. I am a cryptographer. I have a question for Mike Nelson. You referred to a tradeoff and a balance, that you're seeking this ideal, that Clipper is a first step toward a situation in which the needs of industry and the needs of the global information infrastructure, whatever that is, and the needs of law enforcement all somehow come to this happy balance, and if we just think about it long enough, eventually we will find this solution. What if we don't? What if the needs of industry ultimately just are incompatible with the needs of law enforcement? Where does that leave us? Mike Nelson: I'm not a lawyer, but I was taught early on not to talk about hypotheticals. It's possible we won't find this magic solution. It's possible ... I think it's more likely that we'll find a solution that makes everybody equally unhappy, and I think that's ... Matt Blaze: I think that's the status quo. Mike Nelson: Well, no, we've got people that are really unhappy now. I think we can find one where everybody's moderately unhappy. I can't go further than that. I can't really say what happens next. I mean, we may find in a year a solution that meets all the needs, we might only go halfway, and then we might have to do something else. I don't know. I really can't speculate. Would anybody else like to .....? Moderator: All right, next question? Perry Metzger: My name is Perry Metzger. I'm a computer security expert, and I find it interesting that the panel speaks quite a lot about securing the global information infrastructure, since I'm the author of the, well, the co-author, of the latest IETF draft on cryptographic security on the Internet, which incidentally uses currently non-escrowed encryption, and I have no plans of changing the document any time soon. My question is this. There are somewhere on the order of 1,000 wire taps conducted each year. 
It is estimated at the moment that, because the National Security Agency blocked the use of cryptography in the signalling mechanisms in the cellular phone network, the cost of cellular phone fraud in this country is around $2 billion per year. The cost of credit card fraud in this country, because cryptographic mechanisms are not used to protect credit card transactions, is on the order of many more billions of dollars a year. I unfortunately did not bring the latest figures with me, but I remember quite recently doing a back-of-the-envelope calculation and figuring out that we have a cost in the tens of millions of dollars for each and every one of your wire taps in terms of lost economic ... in terms of fraud committed in this country. As a result of the fact that the gentlemen on the left side of the table have been consistently blocking the implementation of good cryptography for many years now, I find it ludicrous that we are being told that this is all in the interest of law enforcement, when in fact the costs to society are so much more demonstrable from the current situation than they would be from the projected situation. And as for Moammar Khadafi and his friends, they can all download extremely good cryptographic tools from the Internet today for free. I will give you the site you can download it from. Mr. Nelson is shaking his head. I can give you a list of dozens of sites where you can download PGP and other tools. Mike Nelson: I've downloaded it as well, but he has to implement it, he has to implement it in a very good way... Perry Metzger: He doesn't have to implement it, he has to boot his PC and type "run". I mean, well, not "run." Mike Nelson: And he has to ... Perry Metzger: You type "PGP" minus "E" and the name of the file you want to encrypt. Mike Nelson: And he has to get every banker, and every person anywhere else in the world he wants to talk to ... to do the same thing. 
Perry Metzger: No, no, but if he wants to communicate with terrorists ... Bert Wells: I think there should just be one question at a time, and if I may ... Perry Metzger: All right, I'm sorry. Bert Wells: If I may summarize your first question, it's really that by sacrificing the availability of good cryptography, in fact, there's a tremendous amount of crime that's going on nevertheless, perhaps of greater value, of greater law enforcement concern, than the amount of crime that's being stopped by wire tapping. Stewart Baker: I couldn't agree with you more. I think the RF link on cellular ought to be encrypted. I think all those things you said ought to be encrypted. I don't see the sense of plowing back into the rate base billions and billions of dollars a year. We're not behind the reasons why that isn't done. Perry Metzger: Yes, you are. Stewart Baker: I mean that's just absolute nonsense. Perry Metzger: No, no, no. I can point you at people at Qualcomm that will explain who it was exactly at the National Security Agency that came up to them and said, "By the way, if you actually implement the cryptographic signalling mechanisms you were talking about, and you implement the cryptographic security mechanisms you were talking about, you will not ever manage to export this, and as a result it will not become a TIA standard, and your company will go down the tubes." This is not theoretical. This is a fact. Bert Wells: Stewart Baker, do you know what this question is referring to? Stewart Baker: I don't know the details of that, but I guess I have to say that if people are losing more in the United States from not having encryption than from having encryption, it's easy enough to keep encryption under the export controls and not prevent you from using anything you want inside the United States. Jim Kallstrom: Absolutely. Stewart Baker: And real quickly, we're talking about 1,000 wire taps, but each of the cases, almost every one of those cases, was a very major crime. 
And we were talking about, in some cases, like the World Trade Center, billions of dollars of economic impact. In cases of organized crime, _____________ closed down ... Perry Metzger: No wire taps were conducted in the World Trade Center case, were they? Stewart Baker: ... closed down, you're talking tens, even hundreds of millions of dollars, it's not as unbalanced as you say. Jim Kallstrom: You know, everything doesn't add up that way, I mean talk to the parents of the child in Richmond that was going to be killed in the snuff murder film. I'm going to go to their office and say, .... Perry Metzger: Was cryptography being used in that case? Jim Kallstrom: ... and you're going to go to them and say, "Look, this costs too much to save your child." I mean that's a nonsensical argument. Bert Wells: I appreciate all the things you're bringing to that, but we should move to the next questioner. Steve Miller: My name is Steve Miller. I used to be a computer scientist, now I'm a lawyer. I'm one of the people who's concerned about the reliability of trusting the government with the escrowed keys. Mr. Kallstrom made the excellent point that which way you view this has a lot to do with whose ox is getting gored. If you've been a victim you tend to favor the rights of law enforcement to do its job, if not you tend to be a little more circumspect about it. You brought our attention to some extreme cases: The World Trade Center, child pornography, people being kidnapped, and I think that clearly reveals that there's a legitimate law enforcement interest for you to be pursuing. Mr. Baker says that we should trust our government, and that we should put our faith in our democratic institutions. The trouble I have with that model is that he sees government as trustworthy but not the people, and I wish he would trust me rather than asking me to trust him. 
The reason I feel that way is that the government has a few extreme cases of its own to answer for: John Walker, Aldrich Ames, the 30th Precinct. Mr. Kallstrom, you mentioned it might be a dumb idea for me to buy a house and not trust someone else with the key; you're right, that might be stupid. Isn't it my right to decide who gets the key to my house? Jim Kallstrom: That wasn't the point I was making. The point I was making is that it might be that the key is lost, either by yourself or someone you trust it with, but you're not going to get back into the house. You can trust whoever you like. That wasn't the issue. Steve Miller: But if I haven't got the right to decide who will have that key in trust, then I've lost the right to decide whether or not anyone should have it. Maybe I would like my mother to have it, but I'm sorry, I won't give it to you. Mike Nelson: But you do have the right. I mean no one's asking you to buy the key escrow technology. And the fact is we are -- Steve Miller: But Mr. Weitzner, is that disingenuous? Mike Nelson: . . . we are exploring, we are talking with other vendors about who might want to escrow keys. But there are lots of very difficult legal issues that stand in the way. Liability being the biggest one. [stichomythia] Steve Miller: . . . whether or not that's disingenuous to suggest that I have an alternative. Dan Weitzner: Well, I think -- I'm not going to try to address whether it's disingenuous as to some new future envisioned system. Certainly I think when Clipper was first announced there was a suggestion that at least the escrow agents would be -- there would be one private escrow agent and one government escrow agent -- and there was also the promise that none of the escrow agents would be associated with law enforcement agencies, with one being in the Department of the Treasury, which also houses the Secret Service and the Internal Revenue Service. 
I think that there's some reason to be concerned about the location of those agents. But I think you're exactly right; I don't think there is any legal precedent, any law enforcement policy precedent, for forcing people in advance to take these kinds of steps, just so that in the event that they may in the future commit a crime, they can be tracked down. There are people who have pulled out the -- you know, when it was fashionable to talk about this in highway metaphors, people on the government side would say "Well, you have to have a license to drive on the highway," etc., etc. And I think what we are in fact discovering now is that there are enormous privacy problems about the existence of Department of Motor Vehicle records, which are being sold and which create very serious privacy threats. So I think that it's essential that people be able to choose who their escrow agent is. I would say to Mike Nelson that we have in our legal structure many, many kinds of escrow relationships that exist already and many models that can be looked to. What is clearly not an example of a true escrow relationship, however, is the arrangement set up under Clipper. If you pull out your Black's Law Dictionary you will find that an escrow agency relationship is defined as one of mutual obligation. The agent has obligations to the depositor, and the depositor has an obligation to, in fact, make the deposit. The problem about Clipper all along, and what very frankly has raised concerns on our part about the trustworthiness of the government, is that the government has never been willing to submit itself to any statutory framework for the management of these keys. I think in the absence of that there is absolutely no reason for anyone to put their keys in that sort of database. If they choose to, fine; I can't say I would recommend it. 
And I think there's a legitimate question to be asked about why the government would not, when it created these escrow agents who were supposed to be trustworthy, why it would not also create a right of action against those escrow agents in the event of improper disclosure. Mike Nelson: Twelve-second rebuttal. We already have very good wiretap laws that control the ability of the government to do wiretaps. The escrow provides an extra layer of security on top of what's already there. The idea that somehow this chip is going to magically allow the government to do wiretaps whenever it pleases is at least confusing and probably disingenuous. Bert Wells: I'm sorry I may have to disappoint the last three people in line, we're just going to take two more questions because it's already after nine o'clock. So, just the next two questions please. Unidentified Audience Member: Hi. My question's short: Surely we're not the only country who is technologically advanced dealing with these issues, and I'm wondering what the European Community or Japan is using -- what policies their societies are choosing -- whether they're choosing to follow a path similar to the Clipper solution or perhaps other solutions. Bert Wells: Bill, do you think you can answer that question? Bill Whitehurst: Sure. I am not aware of any initiatives similar to Clipper (meaning the chip with the escrow keys and the whole thing) in any other country to date. There are initiatives in Japan, they have a different algorithm that their industry has been supporting. They don't have the same export control regime. They also make DES devices and sell them in Europe far more freely than United States firms. 
And I'll just make a note that from the European standpoint, while I mentioned France, and there are some other differences and laws in Europe, the European Commission has taken up the issue of cryptography in what they call a "green paper," where their notion was to have private sector firms -- be they banks or some other trustworthy sorts already established in the private sector -- be a "trust center," as they call it, and a keyholder, as well as a place that could issue a certificate. Think of it as a certificate of authenticity or a certificate that the electronic signature was indeed proper. And so there have been those initiatives outside the United States. Mike Nelson: But it is fair to say the U.S. is far advanced both in technology development and policy development in this area. And we have certainly engaged the public, having an open debate, as opposed to other countries where it's just, "We don't want to talk about it." Bill Whitehurst: I would disagree with part of that answer, Mike. There are a lot of developments in other countries and I wouldn't necessarily say that we're more advanced. We're different, definitely. Not necessarily more advanced. Mike Nelson: There are more solutions on the table, I guess, is a way to put it. Bill Whitehurst: Yes. Arthur Link: My name is Arthur Link. I'm a lawyer here in New York. My question goes really to the legal issues involved in this Clipper Chip initiative, as to which I'm a little bit unclear, and perhaps someone could elucidate them. What I've heard tonight is that Clipper Chip is voluntary and that there are export controls on cryptography. If it's voluntary, it implies to me nothing more than that it's a product I can choose to buy or use, or choose not to buy or use. 
What I'd really like to know is whether, as part of this initiative, or as part of any other governmental regulation, whether it be a statute or some sort of regulation, there's a prohibition domestically on people using cryptography that is not subject to some sort of escrow scheme -- or on developing it, manufacturing it, licensing it. You know, if there isn't, then it would seem that the Clipper Chip is voluntary and would just compete in the marketplace. But if it's the only option if you want this, then I question whether it's voluntary. I just would like someone to elucidate what the prohibitions are domestically. Stewart Baker: There are no restrictions domestically, legal restrictions, on the kind of cryptography you can use. That's why all of this very strong unreadable encryption is on the Internet and the like. But I think, to be fair to the discussion, you have to understand the enormous role government has played in cryptography in the past. The story of DES is a good example: that was developed by IBM and adopted by the government as a standard, and then subjected to export controls. I think IBM feels understandably aggrieved with the inconsistencies, or apparent inconsistencies, in policy. The reason the government's role is so significant here is that everyone knows that the government isn't going to accept anything for its own communications and its own standards that's not secure, so that the private sector can to a degree ride piggyback on the government's research and testing of these algorithms. Second, in communications (which is where cryptography's important), standards are critically important for getting things widely adopted. If I have the world's best fax machine, it doesn't do me any good if everybody else is using a different standard. So having a standard tends to drive the market in communications more than in other things. 
So the concern, I think, expressed on the civil liberties side of this, is that the government is using its significant authority over standards and its market significance to get people to adopt this, and then it will become only quasi-voluntary for, say, government employees and others. I think that's the nub of the disagreement. Dan Weitzner: I actually would agree with that assessment. And I would add that rather than looking at whether the sum of the statutes and regulations says it's voluntary or not, I really do think you have to look at the impact, as Stewart said, of all of these policies on the market and on the degree to which citizens -- businesses, individuals, whomever -- have access to this technology. That is the critical issue. That is the critical policy issue. In my view, the complex of all of the regulations -- including the export control laws -- combines to make it difficult to use encryption and difficult for the market to settle on standards that would make this sort of technology truly available. Stewart Baker: But if I can just add to that, you have to see now what the real dilemma for the government is. They know that what they buy will have a major impact on what the world buys. And if they buy unreadable encryption, they are with their own purchases helping to put the FBI out of business. And that explains to a very substantial degree why they've decided to buy something that wouldn't put the FBI out of business. Dan Weitzner: I just really have to take issue with that (I'm sorry to keep dragging this out). I think that it is true that the government is having a significant impact on this market. I think it is demonstrably false that what the government chooses to buy necessarily has an impact on the market. The government, especially in the computer arena, has tried to buy a whole number of bad technologies.
And it may have caused some -- I mean, I can mention Ada, I can mention POSIX, I can go -- you all could probably go on at greater length. The government, I think, has rarely been successful -- I can't think of a single case where the government has been single-handedly successful -- at forcing the market towards a bad technology. What the government can do is create enough confusion in the market that no one can get to a good technology. And that's what's happening here. Mike Nelson: Well, but you just made the point, and that answers your first question: this is voluntary. The fact that government can't force people to adopt this standard means that it's all the more voluntary. And as long as this Administration is in power, it's going to be voluntary. Which is one very good reason in 1996 to vote for Bill Clinton! [laughter] Ron Abramson: I want to thank the panel and all of you for the extraordinarily high level of this debate. Thanks once again, Bert. [Applause.] And thank you all for coming. Goodnight. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - Attachment: Glossary Of Cryptographic Terms (Handout To Which Transcript Refers) THE CLIPPER CHIP: Should the Government Control the Master Keys to Electronic Commerce? Glossary Algorithm -- A cryptographic algorithm is a general method for scrambling communications or information for the purpose of preserving secrecy except for the intended recipient or user. Algorithms typically utilize one or more keys, which must be kept secret in order to preserve the confidentiality of encrypted information. Typical algorithms have names such as DES, RSA, RC2, and many others. Brute-Force Attack -- One form of cryptanalysis (applicable only when the algorithm is known) is called the brute-force attack, in which every possible key is applied until the information is decrypted.
The difficulty of a brute-force attack depends on the number of possible keys (a number which grows exponentially with key length) and the length of time required to compute the result of decrypting with each possible key. Capstone -- Capstone is a microcircuit (chip) akin to the Clipper Chip in that it is based on the secret Skipjack algorithm (and also designed by the National Security Agency) but which is intended principally for use in secure data transmissions and for the secure storage of data rather than for voice communications. Capstone is also tamper-resistant. Clipper Chip -- The Clipper Chip is a microcircuit (chip) designed by the National Security Agency to implement the Skipjack algorithm to encrypt telephone communications. Voice and other sounds are first converted to digital format, the Skipjack algorithm is then applied, the resulting scrambled data stream is sent over the telephone line to another Clipper-equipped telephone, the receiving unit decrypts using Skipjack, and the resulting data is finally converted into voice or other sounds in the receiver. In order to protect the secrecy of the Skipjack algorithm itself, as well as certain other elements of the operation of the Clipper Chip, each chip is tamper-resistant. Cryptanalysis -- The technique of decrypting encrypted messages without use of the key. Historically, when governments were preeminent in cryptography design and usage, private parties were generally unable to design or implement cryptographic systems that could withstand governments' powers of cryptanalysis. With the advent of mathematical cryptography in the 1970's and the personal computer in the 1980's, private parties gained the ability to implement cryptography that is now generally thought to be beyond governments' powers to cryptanalyze.
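[Editor's illustration] The Brute-Force Attack entry above can be made concrete with a toy sketch. A single-byte XOR stands in for a real algorithm, so the keyspace is only 256 keys; every name and value here is illustrative and has nothing to do with Clipper or DES beyond the general idea:

```python
# Toy brute-force attack: the "algorithm" is single-byte XOR, so the
# keyspace is 2**8 = 256 keys. Real ciphers such as DES (2**56 keys)
# or Skipjack (2**80 keys) grow the keyspace exponentially with key
# length, which is what makes brute force impractical against them.

def xor_encrypt(plaintext: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in plaintext)

def brute_force(ciphertext: bytes, known_word: bytes) -> int:
    """Apply every possible key until the result 'decrypts'
    (here: contains a word we expect to see in the plaintext)."""
    for key in range(256):
        candidate = xor_encrypt(ciphertext, key)  # XOR is its own inverse
        if known_word in candidate:
            return key
    raise ValueError("no key found")

secret_key = 0x5A
ciphertext = xor_encrypt(b"attack at dawn", secret_key)
recovered = brute_force(ciphertext, b"dawn")
assert recovered == secret_key
```

With 256 keys this loop is instantaneous; doubling the key length squares the work, which is the exponential growth the entry describes.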
Data Encryption Standard (DES) -- A computer-based standard for the encryption of data developed in the 1970's by IBM and adopted as a cryptographic standard by the Federal Government. Widely used since that time in numerous high-security commercial applications, notably banking, where DES is the standard. Although DES was considered to be highly secure and strong when introduced, many private and academic cryptography experts believe that special-purpose computers can be built (and probably have been built) to cryptanalyze DES-encrypted information rapidly based on a brute-force attack. Federal Information Processing Standard (FIPS) 185 -- Also known as the "escrowed encryption standard," FIPS 185 is the formal administrative action by the Department of Commerce that established Clipper as a standard for use by federal agencies. Key -- A key is a piece of information, such as a password or a sequence of 0's and 1's, that determines precisely how an algorithm encrypts or decrypts information. Knowledge of a key and the algorithm being employed permits decryption of encrypted information. Key Escrow -- A proposed form of management of keys in which certain keys would be held by either government or private institutions. Law Enforcement Access Field (LEAF) -- A portion of the communications generated by the Clipper Chip that is essential to enabling law enforcement to decrypt intercepted Clipper communications. Telephone conversations on Clipper phones always consist of three elements (the first two of which are not audible or otherwise accessible to the speakers): the "key exchange protocol," transmission of the LEAF, and the voice conversation itself. The key exchange protocol is a super-secure communication (i.e., one which would probably resist government cryptanalysis) in which the two telephones agree on the choice of a new key (sometimes known as the session key) for use during the current conversation.
Following the key exchange protocol, the two Clipper telephones each transmit a LEAF, which contains critically important information for law enforcement decryption. The two LEAFs contain the serial numbers of the Clipper Chips in use and encrypted versions of the session key that was identified for use during the key exchange protocol. Each individual Clipper Chip always encrypts the session key in the same way (using the "device unique key" with the Skipjack algorithm). Thus, knowing one of the serial numbers, authorized law enforcement personnel may obtain from government escrow agents a device unique key that permits decryption of the session key. Having thus obtained the session key, law enforcement personnel may subsequently decrypt a recorded, wiretapped conversation. National Institute of Standards and Technology (NIST) -- The arm of the Department of Commerce responsible for the establishment and administration of FIPS 185 and the Clipper Chip program. Together with the Automated Systems Division of the U.S. Treasury, NIST serves as escrow agent for the escrowed keys for the Clipper Chip. RSA -- RSA is a cryptographic algorithm patented by its inventors and developed as a technology by a number of commercial producers of cryptographic software and hardware, including RSA Data Security, Inc. RSA is perhaps the most prominent of the systems known as "public key cryptosystems," which, among other things, permit secure encrypted communications between two parties who do not have a secure communications channel, and who do not otherwise have a reliable means of secretly communicating a key to each other. Skipjack -- A secret encryption algorithm designed by the National Security Agency for use in Clipper and other key escrow encryption products. Tessera (now known as Fortezza) -- Tessera is a circuit board intended for insertion in personal computers that implements the Capstone chip and its key escrow system for secure transmission and storage of data.
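[Editor's illustration] The escrow flow described in the LEAF entry can be sketched as a toy model. Here single-byte XOR stands in for the secret Skipjack algorithm, keys are single bytes, the device key is split into two XOR shares standing in for the two escrow agents, and real details such as the family key and the LEAF checksum are omitted; every name and value is illustrative:

```python
# Toy model of the Clipper key-escrow flow (NOT real Skipjack):
# XOR with a one-byte key stands in for encryption throughout.

def enc(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)   # XOR: encrypt == decrypt

# Each chip has a serial number and a device unique key. The device
# key is split between two escrow agents (here, simple XOR shares).
serial = 12345
device_key = 0x3C
share_agent_1 = 0x91
share_agent_2 = device_key ^ share_agent_1  # combining shares recovers the key

# --- A Clipper call ---
session_key = 0x7E                          # agreed via the key exchange protocol
leaf = (serial, enc(bytes([session_key]), device_key))  # simplified LEAF
ciphertext = enc(b"hello", session_key)     # the encrypted conversation

# --- Authorized law enforcement decryption ---
leaf_serial, wrapped_session = leaf         # serial identifies which escrowed key to request
recovered_device_key = share_agent_1 ^ share_agent_2
recovered_session = enc(wrapped_session, recovered_device_key)[0]
plaintext = enc(ciphertext, recovered_session)
assert plaintext == b"hello"
```

The point the glossary makes survives the simplification: the conversation key changes every call, but the LEAF always wraps it under the same device unique key, so one escrow request per chip unlocks any of its recorded conversations.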