

Encryption, key recovery, and commercial trade secret assets: a proposed legislative model.

Encryption is not for e-mail alone, but also for digital assets, which comprise a growing portion of any company's mission critical data, such as trade secrets and intellectual property. However, numerous problems can arise when an employee encrypts mission critical data. What happens if that employee is no longer able to provide the key to decrypt the data? What if the employee resigns in anger and hides or destroys the keys? What if the employee dies before telling anyone where the key is? What if the employee engages in industrial espionage and provides the key to competitors? This article reviews the current technical, business, and legal concerns surrounding these issues and ends with an outline of proposed legislation to address these complex issues.

I. INTRODUCTION

Encryption and Commercial Imperatives

Key recovery is a necessary component of effective trade secret and corporate communication protection. Simply stated, key recovery is any system that allows a party other than the initial user to access the encryption key.(1) In the scenarios above, the third party is the corporation, the employee's boss, which needs access to the keys to safeguard against the loss of critical information.

Initially, this scenario leads to the conclusion that companies should seek key escrow systems to avoid devastating losses of information protected by strong encryption and long-gone keys. The same scenario, however, has sparked a public debate regarding the nation's encryption, key escrow, and encryption export policies.(2) The debate generally focuses on export controls restricting the overseas availability of strong encryption, as well as law enforcement interests in access to encrypted data both to thwart crime and terrorism and to successfully prosecute crimes.(3)

A Proposal

Private and public sector interests are divided in the key recovery debate. This Article proposes that commercial interest in key escrow has common ground with law enforcement and national interests. The public and private sector interests can be satisfied by a statutory and regulatory regime that: (1) establishes key recovery as a required feature of encryption systems used domestically in interstate commerce and encryption systems designed for export; (2) allows unlimited encryption strength; (3) allows the private sector to test, set and update the recovery technology as needed; and (4) allows on-site or third-party escrow, but does not mandate government-sanctioned off-site key escrow.

A growing sector of the business community recognizes that key recovery is essential to rational business practices. In October 1996, a small group of companies formed the Key Recovery Alliance ("KRA").(4) KRA now boasts over thirty members.(5) KRA provides detailed analyses of the business community's need for reliable key recovery systems, including scenarios for the recovery of stored data and communicated data, and scenarios that focus on the fluid nature of data, such as portability and interoperability among network or associated companies.(6) For example, KRA identifies the simple scenario of a user who has encrypted a group of files and then loses the key.(7) There are also complicated scenarios that arise from transferring data among different encryption systems, different companies, and locales with differing legal regimes.(8) Regardless of the scenario, the message of KRA's member organizations is that, "[i]nformation is a vital corporate asset, and cryptology ... has emerged as the most effective means of securing information in transmission and storage."(9) Furthermore, "key management systems ... will be essential in facilitating the growth of Global Electronic Commerce."(10)

Development of Cryptology Techniques and Laws

Once the domain of diplomats, armies, and spies seeking to protect state secrets, codes and code-breaking evoked secrecy and intrigue. Now, homemakers and entrepreneurs use encrypted messages to protect everyday gossip and common commercial missives. Since personal computers of average capability can now encrypt with nearly the assurance of a national security agency, the veneer of intrigue concealing a world of well-protected secrets and not-so-secret secrets has dissipated.

Before the development of today's code systems, a nation's best codes were routinely broken. Germany's "Enigma" code and Japan's "JN-25" code, both capable of producing millions of possible decoding solutions, quickly became obsolete. The Allies succeeded in breaking these codes, helping to defeat the Axis powers in the Atlantic and the Pacific. Today, code systems can produce millions of trillions of possible decoding solutions, or more solutions than there are particles in the known universe.(11)

The worldwide distribution of this extremely strong encryption, based on 1970's research,(12) and available on the Internet since 1991 as freeware,(13) creates a dilemma of awesome dimensions. Strong encryption enables private and commercial users to pursue transactions with confidence in the security of their communications. However, the possibility of its use by criminals, terrorists, and rogue nations may force the law enforcement and national security communities to restrict its use, possibly affecting the significant confidence it now engenders.

The question of whether strong encryption should be unfettered or controlled generates heated debate. Some believe that cryptography allows for the exercise of a right to privacy that is "at the core of American life."(14) "[A] powerhouse of economic activity and opportunity can be unleashed" with legislation that encourages the use of strong encryption techniques.(15) Others believe that "the widespread use of robust unbreakable encryption ultimately will devastate our ability to fight crime and prevent terrorism."(16)

This article examines the conflicting opinions of this debate. Part II discusses the development of strong encryption techniques and the current statutory and regulatory environment governing cryptosystems. Part III explores the competing perspectives of national security, privacy, and commerce. Part IV examines policy choices and assumptions upon which the proposal to require key recovery is based. Finally, Part V offers legislative and regulatory key recovery proposals, and addresses current legislative proposals that affect the use and distribution of strong encryption.

II. LEGAL AND REGULATORY ENVIRONMENT

A. Development of Encryption Technology

A brief examination of history and terminology is necessary to understand the recent developments in cryptology.

Terminology

"Cryptology" is the technique of concealing the contents of a message by a code.(17) "Encryption" is the use of a mathematical algorithm to transform a message into a form that is unreadable unless a decryption key is used to decode the message.(18) The recipient of the message holds the "key" which is the formula used to decode the message.(19) A "cryptosystem" is a technique to code and decode messages.(20)

Current cryptosystems come in two basic configurations. The first is "shared single key cryptology," which requires the same key to encrypt and decrypt the message.(21) In this type of system, the security of the key is critical since the key must travel in some manner from the sender to the recipient.(22) The second is "public key cryptography," which uses two keys: one to encrypt the message and one to decrypt it.(23) The advantage of this system is that the asymmetry of the keys prevents a person who obtains only the encrypting key from decoding the message.(24) A person's "public key" is published on the Internet or otherwise made available: anyone wishing to send that person an encrypted message uses the public key to encode the message.(25) Then the recipient uses a "private key" to decode the encrypted message.(26) For purposes of this article, "public key cryptography" and "strong encryption" will be used interchangeably.
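The two-key asymmetry can be made concrete with the standard textbook RSA example. The primes, exponents, and message below are toy illustrative values only; real keys run to hundreds of digits.

```python
# Toy RSA illustration of public key cryptography (textbook-sized
# numbers only; not secure at this scale).
p, q = 61, 53            # two secret primes
n = p * q                # modulus, shared by both keys
phi = (p - 1) * (q - 1)  # Euler's totient of n
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # anyone holding (e, n) can encrypt
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
```

The public pair (e, n) encrypts; decryption requires the private exponent d, which is why publishing the public key compromises nothing.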

The terms "key escrow" and "key recovery" refer to the processes of securing the private key portion of a public key system in a location that can be accessed by third parties, or having a system in place that recovers the private key.(27) The third parties are commonly either business organizations that need access to an employee's files if that employee's private key becomes unavailable, or law enforcement agencies acting under court order to obtain encrypted messages believed to be connected to criminal activity.(28)
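One escrow design can be sketched in a few lines. The split-key scheme below is assumed purely for illustration and is not drawn from any statute discussed in this article: the private key is divided into two random shares so that neither escrow agent alone can reconstruct it.

```python
# Illustrative split-key escrow: two agents each hold one share of a
# private key; recovery requires both to cooperate.
import secrets

private_key = secrets.token_bytes(16)            # stand-in for a real key
share_a = secrets.token_bytes(len(private_key))  # agent A's share: pure randomness
share_b = bytes(k ^ a for k, a in zip(private_key, share_a))  # agent B's share

# Either share alone is statistically indistinguishable from noise;
# XOR-ing the two shares together recovers the original key.
recovered = bytes(a ^ b for a, b in zip(share_a, share_b))
assert recovered == private_key
```

Designs like this address the concern, raised later in the debate, that a single escrow agent becomes a single point of compromise.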

Finally, the quality of a public key cryptosystem depends on two factors: the mathematical algorithm and the length of the key.(29) The development of secure algorithms will be discussed shortly. The length of the key is expressed in terms of "bits."(30) Typical bit lengths discussed in the current debates include 40-bit, 56-bit, 128-bit, and 256-bit schemes.(31) When discussing key lengths, it is important to note that a 128-bit scheme, for example, is not merely three times more powerful than a 40-bit scheme. In fact, it is many times more powerful: because each additional bit doubles the number of possible keys, the scale is exponential, not arithmetic.(32) A United States House of Representatives report noted this difference:
 To give some practical sense of the difference, one researcher estimated
 that a relatively inexpensive computer attempting a "brute force" effort to
 decode -- i.e. simply trying all the mathematical possibilities -- could on
 average decode a 40-bit scheme in a few seconds, whereas a 128-bit scheme
 would on average take millions of years.(33)


Thus, it is clear that policy choices that would fix a certain bit length for regulatory purposes have either no impact or a profound impact depending on the length of the key chosen.
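The report's comparison follows from simple arithmetic. The five-second baseline below is an assumed round number standing in for the report's "few seconds"; the point is the ratio, not the absolute figures.

```python
# The exponential scale in arithmetic form: a brute-force search
# space doubles with every added bit, so a 128-bit key admits 2**88
# times as many possibilities as a 40-bit key, not 128/40 = 3.2 times.
ratio = 2 ** (128 - 40)
assert ratio == 2 ** 128 // 2 ** 40

seconds_for_40_bit = 5                    # assumed stand-in for "a few seconds"
seconds_for_128_bit = seconds_for_40_bit * ratio
years_for_128_bit = seconds_for_128_bit / (60 * 60 * 24 * 365)
print(f"{years_for_128_bit:.1e} years")   # an astronomically large figure
```

On these assumptions the same machine that exhausts a 40-bit space in seconds would need well over a billion billion years for 128 bits, which is why fixing a bit length for regulatory purposes has either no impact or a profound one.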

History

The history of "public key cryptology" reveals that technical innovation led to governmental angst over public access to strong encryption.(34) In the mid-1970s, two Stanford researchers, Whitfield Diffie and Martin Hellman, addressed key management questions and developed public key cryptography.(35) The system that grew from their work, known as RSA (the Rivest-Shamir-Adleman algorithm developed at MIT based on Diffie and Hellman's ideas),(36) paralleled development of a governmental cryptosystem known as the Data Encryption Standard ("DES").(37) DES is a shared single key system developed by IBM in 1975 for the National Security Agency ("NSA").(38) DES has proven remarkably resistant to code-breakers and still exists as a federal information-processing standard.(39)

From the beginning, academics' pursuit of strong encryption technologies made the American national security community uneasy. For example, when an MIT professor planned to present the RSA algorithm at an academic conference, he received a letter -- signed by "J.A. Meyer," later discovered to be an NSA employee -- stating that since foreign nationals would attend the conference, disclosure of RSA would violate the International Traffic in Arms Regulations.(40) Although the professor presented his paper despite the warning, and no trouble ensued, the NSA's concerns led to a system of voluntary NSA review of cryptography research.(41) This review process began in 1979, when NSA director Bobby Inman warned that publication of strong encryption techniques harmed national security, and it continues to this day.(42)

In 1984, President Reagan issued a National Security Decision Directive placing sensitive, but unclassified, information held in federal computer systems under the control of the NSA, the National Security Council, and the Department of Defense.(43) Responding to public concern over military control of federal civilian computer systems, Congress enacted the Computer Security Act of 1987 ("CSA") placing a civilian agency, the National Institute of Standards and Technology ("NIST," formerly the National Bureau of Standards), in control of civilian computing standards with the NSA in an advisory role.(44)

B. Statutory and Regulatory Environment

There are no restrictions on the sale and use of strong encryption technologies within the United States. Rather, the principal regulatory control over distribution of cryptosystems exists in the export arena.(45) Two statutes previously governed this area, but recent executive branch decisions have centralized export controls of encryption technologies under one statute.(46)

Previous Regime: State Department Enforcement Under the Arms Export Control Act

Until recently, the Arms Export Control Act of 1968(47) controlled the export of encryption algorithms and related cryptosystems. Regulations promulgated under this Act are known as the International Traffic in Arms Regulations ("ITAR").(48) The State Department enforces ITAR,(49) and articles determined to threaten the national security interests of the United States are placed on the restricted-export Munitions List.(50) Until November 1996, the State Department regulated encryption technologies as munitions.(51)

Under ITAR, the State Department required a company to obtain a license to export "[c]ryptographic software with the capability of maintaining secrecy or confidentiality of information or information systems."(52) Under this broad authority, the State Department prohibited export of most forms of strong encryption.(53) ITAR permitted nine exceptions that allowed for the export of specified cryptology technologies.(54) The most commonly used exceptions applied to cryptology used for money transactions,(55) data compression techniques,(56) and broadcast encryption used for scrambling signals.(57) These exceptions were "algorithm-neutral," which meant that products falling within the exceptions could be exported regardless of the strength of the encryption program.(58) Additionally, the exceptions did not incorporate a "foreign availability" test, under which export licenses would issue if the encryption technology were readily available overseas or on the Internet.(59)

Current Regime: Commerce Department Enforcement Under the Export Administration Act

Recently, the regulations governing encryption technologies have undergone significant revisions. By an Executive Order issued November 15, 1996,(60) President Clinton transferred authority for controlling the export of encryption technologies from the State Department to the Commerce Department.(61) Accordingly, statutory authority for export controls moved from the Arms Export Control Act(62) to the Export Administration Act of 1979 ("EAA").(63)

Interim regulations, promulgated to enforce the Executive Order, move non-military encryption technologies from the Munitions List to the Commerce Control List ("CCL").(64) These regulations amend existing EAA regulations known as the Export Administration Regulations ("EAR").(65) The Commerce Department summarized the interim regulations as permitting the "export and re-export of 56-bit key length DES or equivalent strength items under the authority of a License Exception, if an exporter makes satisfactory commitments to build and/or market recoverable encryption items and to help build the supporting international infrastructure."(66)

The amended EAR regulations allow licenses for: (1) mass market distribution after a one-time review if the technology incorporates 40-bit or shorter key lengths;(67) (2) products that incorporate key escrow and key recovery processes, provided the Bureau of Export Administration ("BXA") approves the process;(68) and (3) products that do not incorporate key escrow or key recovery processes, provided (i) the key length is no greater than 56 bits, and (ii) the company submits a business plan detailing the steps that will be taken within two years to incorporate key recovery into all future exported products.(69) Review of all other encryption products continues under a regime similar to ITAR, in which the BXA conducts a case-by-case review, and no products are ever approved for certain destinations, including Cuba, Iran, Iraq, North Korea, Syria, and Sudan.(70)
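The license tiers just described can be sketched as a decision function. This is a simplification for illustration only, not a statement of the regulations; the tier names are this sketch's own labels.

```python
# Schematic of the amended EAR license tiers described above
# (an illustrative simplification, not legal advice).
def ear_license_path(key_bits, has_key_recovery, has_two_year_plan):
    if key_bits <= 40:
        return "mass market after one-time review"
    if has_key_recovery:
        return "exportable with BXA-approved recovery process"
    if key_bits <= 56 and has_two_year_plan:
        return "exportable under two-year key recovery commitment"
    return "case-by-case BXA review"

# A 56-bit product with no recovery features but a committed plan
# falls into the two-year-commitment tier.
print(ear_license_path(56, False, True))
```

Framing the tiers this way makes the administration's leverage visible: every path to export above 40 bits runs through key recovery in some form.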

The amended regulations expressly rejected a foreign availability test. In promulgating the amended EAR regulations, the BXA quoted President Clinton's Executive Order:
 I have determined that the export of encryption products described in this
 section could harm national security and foreign policy interests even
 where comparable products are or appear to be available from sources
 outside the United States, and that facts and questions concerning the
 foreign availability of such encryption products cannot be made subject to
 public disclosure or judicial review without revealing or implicating
 classified information that could harm United States national security and
 foreign policy interests. Accordingly, ... the regulations in EAR ... shall
 not be applicable with respect to export controls on such encryption
 products. Notwithstanding this, the Secretary of Commerce ... may, in his
 discretion, consider the foreign availability of comparable encryption
 products in determining whether to issue a license in a particular case or
 to remove controls on particular products, but it is not required to issue
 licenses in particular cases or to remove controls on particular products
 based on such consideration....(71)


Amended EAR Regulations

In sum, the Clinton Administration requires development of a key recovery infrastructure in order to meet its national security interests. Export licenses for 56-bit systems are predicated on developing key recovery systems. This approach seeks a middle ground between an outright export prohibition of all strong encryption products and no restriction of such exports at all. By allowing export of strong encryption products if a company commits to developing key escrow systems, the Clinton Administration hopes to enlist market forces: to export into the enormous international market, a software or hardware developer must create key escrow systems that will likely be incorporated into domestic products as well.

In mid-1996, when the administration proposed the amended regulations, some movement in the industry may have encouraged pursuit of this middle-of-the-road approach. A coalition of IBM, Apple, Digital Equipment, Hewlett-Packard, and Sun joined to create a key recovery system that would comply with the administration's proposed regulations.(72) The coalition has since grown to include such companies as Mitsubishi, Boeing, Novell, Unisys, and others, known as the Key Recovery Alliance ("KRA").(73)

Nonetheless, a common industry response is concern over the competitive implications of the amended EAR regulations.(74) The Association for Computing Machinery ("ACM"), the oldest and largest professional computer association, stated its concerns with a requirement to build a key recovery infrastructure:
 The [United States Public Policy Committee for ACM] recognizes that there
 is a real market demand for key recovery products from business and
 government employers. However, the viability of a [key recovery
 infrastructure] has not yet been determined. It has not yet been subject to
 the vigorous testing necessary for a proposed standard. There is little
 understanding of how such a system would operate and what controls would be
 needed to ensure that it remained secure. Part 740 [of the amended EAR
 regulations] describes the development of a Key Recovery Infrastructure
 within two years. We believe it is unwise for the United States to insist
 on the development of a [sic] untested, unproved technology for a worldwide
 infrastructure.... While key recovery tools may be appropriate in some
 settings, we believe it would be wrong to impose such restrictions on users
 or businesses and the Interim Rule should not dictate that businesses limit
 their research to a potentially unworkable system.(75)


Clearly, the administration's new regulations have sparked heated debate. Congress is apparently convinced that the new regulations are far from adequate.(76)

C. Congress' Proposed Reforms

In the 104th Congress, three bills addressed the removal of export controls on strong encryption technologies and products.(77) Although none of the three bills was enacted, awareness of the policy issues was raised in a number of hearings.(78) The 105th Congress continued to direct attention to the nation's encryption policy.

105th Congress and Legislative Policy Choices

Congress is concerned about encryption technologies. During the 105th Congress, for example, five bills focused on key escrow and encryption export controls.(79) In addition, many bills proposed in the 105th Congress regarding telecommunications and other communications issues also addressed encryption.(80) Table 1 compares the principal provisions of the five bills that focused on encryption policies.
Table 1. Estimated Time Needed to Recover a Single Key
Using the 250 Workstations Used by the Berkeley Student
Who Solved RSA's 40-Bit Challenge(202)

 Number of Bits   Average Time            Time if Key is Found One Third
                                          of the Way Through the Full
                                          Exhaust(203)

       40         5.5 hours               3.6 hours
       56         41 years                27 years
       64         11,000 years            7,000 years
       80         690 million years       455 million years
      128         13 trillion times the   9 trillion times the
                  age of the universe     age of the universe

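The table's "average time" column is internally consistent with doubling per added bit. A quick check from the table's own 40-bit baseline of 5.5 hours on the same 250 workstations (the baseline and the rounding tolerances are taken from the table itself):

```python
# Recompute the table's "average time" column by scaling the 40-bit
# baseline (5.5 hours) by a factor of 2 for each additional bit.
HOURS_PER_YEAR = 24 * 365
BASELINE_BITS, BASELINE_HOURS = 40, 5.5

def average_years(bits):
    """Average search time in years at the table's 40-bit rate."""
    return BASELINE_HOURS * 2 ** (bits - BASELINE_BITS) / HOURS_PER_YEAR

print(round(average_years(56)))    # 41 years, matching the table
print(round(average_years(64)))    # roughly 10,500 years (table rounds to 11,000)
print(f"{average_years(80):.1e}")  # roughly 6.9e8, i.e. 690 million years
```

Each sixteen-bit step multiplies the search time by 65,536, which is why the jump from 56 to 128 bits crosses from decades into multiples of the age of the universe.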

Study the Problem

One bill, House Bill 1903, passed the House on September 16, 1997.(81) House Bill 1903, the Computer Security Enhancement Act of 1997, does little more than provide for studies of encryption policies,(82) foreign encryption capabilities,(83) and public key systems.(84) Congress rarely meets a study it does not like, and this is true of House Bill 1903,(85) which also presents several policy statements beyond the commissioning of studies. The bill would allow the National Institute of Standards and Technology ("NIST") to assist the private sector in establishing voluntary key escrow systems, although it prohibits NIST from setting encryption standards for non-federal systems.(86) With regard to encryption exports, the bill sets no new specific policy pending the outcome of the studies, but does state a congressional finding that "[f]ederal policy for control of the export of encryption technologies should be determined in light of the public availability of comparable encryption technologies outside of the United States in order to avoid harming the competitiveness of United States computer hardware and software companies."(87) Finally, the bill states that market forces, and not government regulation, should drive the development and use of encryption technologies.(88) The House Report accompanying the bill stated that House Bill 1903 "emphasizes the need for strong encryption, [that] [t]he widespread use of strong encryption will promote safety, security, and privacy."(89)

The Congressional support for House Bill 1903 demonstrated a political acceptance of an expansive use of encryption technologies. While House Bill 1903 mostly instituted studies of encryption technologies, it also presented some policies. The remaining bills contain substantive policy choices regarding key escrow and encryption export policies.

"General Availability" License Standard

All four substantive bills support relaxed export controls, albeit in varying degrees. One bill, introduced by Senator McCain, mirrors the Clinton administration's concern over exports and requires exports of 56-bit cryptosystems to embody key recovery mechanisms.(90) The three remaining bills adopt the "general availability" test in determining whether a license is required to export strong encryption systems. No export license is required if the software is: "(i) generally available, as is, and is designed for installation by the purchaser; or (ii) in the public domain for which copyright or other protection is not available, or is available to the public because it is generally accessible to the interested public in any form."(91) This test provides an expansive view of exports by assuming the futility of licensing encryption products that are already available to the general public.

"Similar Capability" Standard for Export Licenses

Furthermore, two of these same "general availability" bills -- the companion pieces House Bill 695, Security and Freedom Through Encryption,(92) and Senate Bill 377, Promotion of Commerce On-Line in the Digital Era ("Pro-Code") Act,(93) introduced by Representative Goodlatte and Senator Burns, respectively -- adopt a common test to determine if a grant of an export license is appropriate, assuming that the general availability test was not met. An export license is granted if the software is intended for: "(i) non-military use; [or] (ii) for use in a country where exports of software of similar capability are already allowed to financial institutions in the country."(94) A hardware license is granted if: "a product offering comparable security is commercially available outside the United States from a foreign supplier, without effective restrictions."(95) Again, the bills embody an expansive view of exports, which are allowed if software and hardware of similar capability already exist in the foreign market.

Greater Diversity in Key Escrow Proposals

In the area of key escrow, the various bills in Congress diverge from the common approaches shown with export policies. Senator McCain's bill and Senator Leahy's bill each establish a system of key recovery for law enforcement purposes.(96) While the details of each bill's mechanism vary -- McCain creates government access to all keys held by "key recovery agents"(97) while Leahy creates a system of certified "key holders"(98) -- neither bill forces this solution onto the private sector. The bills both prohibit federal or state requirements forcing a person to escrow a key with a third party.(99) Two other bills, companion bills introduced by Representative Goodlatte and Senator Burns, do not create any voluntary key escrow system, but instead simply prohibit the mandatory use of key escrow systems.(100)

Summary of Legislative Policy Choices

All of the substantive bills introduced into Congress allow for aggressive domestic use of strong encryption regardless of encryption algorithm or key length.(101) Additionally, most of the bills allow for exports under the general availability and similar capability tests.(102) While two of these bills encourage voluntary key escrow systems with systems allowing law enforcement access, two other bills simply prohibit mandatory key escrow.(103)

Congress is not, however, walking in the near lock-step that these approaches suggest. Amendments that would weaken House Bill 695, a bill that aggressively broadens strong encryption use and relaxes export controls, were narrowly averted in House debates.(104) One amendment, pursued by Representative Michael Oxley, sought not only to criminalize the use of cryptography not approved by the Commerce Department, but also to create mandatory domestic key recovery systems.(105) A letter written to the House committee reviewing House Bill 695 and signed by numerous academic and computer trade associations decried amendments weakening the legislative proposals relaxing encryption controls.(106) The letter urged Congress "to eliminate current policies that stifle the ability of researchers and implementers to study and build cryptographic algorithms, secure information systems, and secure network protocols. Otherwise, U.S. leadership in many areas of science and technology is likely to be jeopardized with no discernible benefits to our National Interests."(107) Representative Oxley's attempt to amend House Bill 695 clearly indicates that political interests opposing the relaxation of encryption controls exist.

Judicial Scrutiny

U.S. courts have recently examined the constitutionality of U.S. export controls.(108) The two cases that follow are currently on appeal, so this remains an unsettled area of law. These cases demonstrate that encryption technologies implicate First Amendment issues and, in doing so, reveal the profound domestic (as compared to international and national security) interests at stake that transcend commercial interests.

Bernstein v. United States Department of State(109)

In a case currently under appeal, the United States District Court for the Northern District of California held that the Export Administration Regulations ("EAR") constitute an unconstitutional prior restraint on speech.(110) As such, the Commerce Department may not apply the EAR to prohibit the publication or distribution of scientific papers, algorithms, or computer programs.(111) Daniel Bernstein filed the action in this case while a Ph.D. student in cryptography at the University of California at Berkeley.(112) Bernstein originally sought determinations from the State Department regarding his plans to publish source files containing a strong encryption system and whether a related academic paper came under the jurisdiction of ITAR.(113) When the State Department determined that his cryptosystem was regulated by ITAR,(114) Bernstein initiated this action challenging the constitutionality of this regulation of his speech.(115) Bernstein argued that since ITAR makes his disclosure of certain information to "a foreign person, whether in the United States or abroad," a crime, he was unable to teach or publish information about his cryptosystem without a license.(116)

The government first asserted that federal courts had no jurisdiction to hear the case.(117) Under the Arms Export Control Act ("AECA"), under which ITAR was promulgated, "the designation by the President [or others duly authorized] of items as defense articles or defense services for purposes of this section shall not be subject to judicial review."(118) The court dismissed this defense, stating that Bernstein did not seek a review of the determination that his articles were defense-related, but rather that the AECA and its regulations were unconstitutional.(119) The court then went on to examine the "speech" at issue,(120) and found that Bernstein's "paper, [is] an academic writing explaining plaintiff's scientific work in the field of cryptography, [and] is speech of the most protected kind."(121)

The source code for Bernstein's cryptosystem presented a more problematic analysis for the court. Applying the Supreme Court's test for protected speech found in Texas v. Johnson, the government asserted that the source code constituted conduct, not speech.(122) The District Court rejected this argument, however, stating that under Johnson, "a court need only assess the expressiveness of conduct in the absence of the `spoken or written word.'"(123) The court concluded that source code is written and does not fall under the conduct rubric, since the written word is clearly speech for First and Fourteenth Amendment purposes.(124) Additionally, the court reasoned that copyright protection extends to source code since it contains original expression, thus supporting the conclusion that source code is speech.(125)

After finding that Bernstein's articles and source code were protected forms of speech, the court examined whether ITAR constituted a prior restraint on that speech.(126) The court stated that "[g]overnmental licensing schemes, such as the AECA and ITAR, come with a heavy presumption against their validity when they act as a prior restraint on speech."(127) Since national security concerns cannot always protect prior restraints,(128) the court concluded that Bernstein had a colorable prior restraint claim.(129) After denying the government's motion to dismiss the prior restraint claim, the District Court, in a subsequent proceeding, determined that ITAR constituted unlawful prior restraint on speech.(130) At the time of this decision, the State Department was still enforcing ITAR with respect to non-military encryption technologies.(131) Since the court's decision predated, by a matter of days, the transfer of enforcement authority from the State Department to the Commerce Department and EAR, the court revisited the questions of constitutionality under EAR.(132)

The final round of Bernstein's action to date involved transferring the previous analysis to the amended EAR.(133) The court once again found the source code to be protected speech for First Amendment purposes.(134) Since the government did not argue that ITAR was notably different from EAR, the court applied a similar analysis to that of the previous case.(135) The court applied the following test in its determination of whether a licensing scheme is constitutional:
 1) the licensor must make the licensing decision within a specific and
 reasonable period of time; 2) there must be prompt judicial review; and 3)
 the censor must bear the burden of going to court to uphold a licensing
 denial and once there, bears the burden of justifying denial.(136)


The court then examined EAR and found that the lack of standards by which to judge a license application and lack of judicial review made EAR "woefully inadequate."(137) Thus, the licensing program under EAR, just as the ITAR, violated Bernstein's free speech and the court consequently declared that EAR could not be used to prevent Bernstein's publication of his cryptosystem or related articles.(138)

Karn v. United States Department of State

Another case regarding the transfer of encryption control enforcement from the State Department to the Commerce Department is Karn v. United States Department of State.(139) The District Court for the District of Columbia held that the decision to designate a computer disk as controlled by ITAR was not open to judicial review and that the export limitation did not violate Karn's free speech rights.(140) Like Bernstein, this case involved diskettes containing strong encryption source code.(141) Philip Karn submitted a request to the State Department to determine if a book and associated disks fell within ITAR jurisdiction.(142) The agency determined that the book did not fall within its jurisdiction, but that the disks did.(143) Karn then exhausted his administrative appeals and eventually filed an action in district court.(144) Examining the AECA's language regarding judicial review, the court found that Congress clearly intended that designation decisions -- decisions regarding whether ITAR applies -- are not reviewable.(145) The Karn decision is consistent with Bernstein because the Bernstein court determined that the issue involved the constitutionality of ITAR, not the designation of Bernstein's source code.(146)

Next, the court examined Karn's First and Fifth Amendment claims.(147) The court applied a different test than the Bernstein court because it found that the government's interest was not related to the content of any protected speech on the disk.(148) The court utilized the O'Brien test:
 ... [I]f the regulation is content-based, the regulation will be
 presumptively invalid, whereas if the regulation is content-neutral, then
 the government may justify the regulation if certain other criteria are
 met.... These additional criteria [are] whether the regulation is (1)
 within the constitutional power of the government, (2) furthers an
 important or substantial governmental interest, and (3) is narrowly
 tailored to the governmental interest.(149)


The court concluded that the ITAR regulation was content-neutral.(150) Rather than regulating the expressive content of the disk, the court found that the regulations govern the combination of encryption source code with a machine-readable format.(151)

Thus, the government's interest in preventing "the proliferation of [cryptographic hardware and software that] will make it easier for foreign intelligence targets to deny the United States Government access to information vital to national security interests" allows government regulation of the disk.(152) Although Karn appealed this summary judgment in the government's favor, the Court of Appeals for the District of Columbia Circuit remanded the decision because of the changes in ITAR and EAR.(153)

III. COMPETING PERSPECTIVES

Statutory and regulatory authorities for encryption export controls reveal several competing perspectives that are seemingly at loggerheads. The Clinton Administration's regulations seek to balance national security interests with the need for a productive domestic computer industry capable of competing internationally.(154) There is concern that these regulations will harm that industry.(155) Underlying all of these concerns is the concept of privacy and its central place in American society.(156)

National Security and Law Enforcement

The Clinton Administration's main theme regarding the need for key recovery systems and export controls involves national security and law enforcement.(157) FBI Director Louis J. Freeh raised the issue of crime resulting from widespread use of strong encryption:
 Unbreakable encryption will allow drug lords, spies, terrorists and even
 violent gangs to communicate about their crimes and their conspiracies with
 impunity. We will lose one of the few remaining vulnerabilities of the
 worst criminal and terrorists upon which law enforcement depends to
 successfully investigate and often prevent the worst crimes.(158)


Congressman Weldon recognized that encryption serves legitimate purposes for the American people, but he could not support:
 ... a total wiping out of any export control on technology that a cartel, a
 drug cartel, or an adversary nation has been using and could be using to
 prevent our law enforcement, intelligence, and defense resources from
 protecting the American people from the threats of drug dealing, from the
 threats of intimidation, terrorist activities, or other activities of that
 type.(159)


Counter-terrorism concerns likewise implicate law enforcement's interest in encryption controls. FBI Director Freeh used statistics to bolster his call for such controls.(160) He stated that "from 1995 to 1996, there was a two-fold increase (from 5 to 12) in the number of instances where the FBI's court-authorized electronic efforts were frustrated by the use of encryption products that did not allow for lawful law enforcement decryption."(161) Additionally, Freeh asserted that the number of investigations involving 56-bit DES and 128-bit PGP encryption jumped from two percent of cases involving electronically stored information to seven percent between 1994 and 1997.(162) Freeh also offered several concrete examples, such as the case of a child pornography subject using strong encryption when transferring obscene images of children over the Internet, and the case of an international terrorist using strong encryption to hide a plot to blow up eleven commercial airliners in the Far East.(163)

Academics also raise concerns over strong encryption's impact on law enforcement.(164) Dorothy Denning, Professor of Computer Science at Georgetown University, stated that a study of law enforcement professionals revealed that "we're at the leading edge of what could become a serious threat to law enforcement.... We received reports of a few cases where encryption had derailed an investigation, including two cases of intellectual property theft and one of counterfeiting."(165)

Privacy

The Office of Technology Assessment, in its congressionally mandated report entitled "Information Security and Privacy in Network Environments," defines "privacy" as the "balance struck by society between an individual's right to keep information confidential and the societal benefit derived from sharing the information, and how that balance is codified into legislation giving individuals the means to control information about themselves."(166) For others, "privacy" can mean "the right to be left alone."(167) While the Supreme Court has recognized a right to privacy in decisions involving such issues as contraception,(168) it is unclear how far this "right to be left alone" extends.

In the context of the encryption debate, congressional studies,(169) commentators,(170) and members of Congress make privacy a central element of the debate. For instance, Senator John Kerry stated that "[e]ncryption protects privacy and security."(171) A House of Representatives report elaborated on this connection between encryption and privacy:
 The computer industry, the American business community, and privacy groups
 vehemently oppose any mandatory key escrow system. They argue that a
 mandatory system would unnecessarily invade the privacy of users and that
 the market should develop any voluntary key escrow system. They believe
 that law enforcement can gain access to keys through traditional means for
 obtaining evidence and that those with criminal intent will not use key
 escrow systems, thus defeating the purpose of the Administration's policy.
 They argue that our law and tradition do not require private citizens to
 take positive action to assist the government in surveilling them in any
 other instance.

 Moreover, they contend that private citizens should not be required to
 give access to their most precious assets to anyone else regardless of
 whether it is the government or a third party.... They further argue that
 the good that widespread use of encryption can do in preventing crime far
 outweighs the harm done by a relatively few instances in which the use of
 encryption hampers law enforcement.(172)


Commerce and "Commercial Imperative"

In 1991, a National Research Council report identified a "commercial imperative"(173) for reducing export controls on encryption technologies.(174) The report identified the computer industry's concerns about its diminishing competitiveness in overseas markets under the current regime of export restrictions.(175) With export restrictions on DES and RSA technologies, the report stated that "foreign consumers and, more importantly, large multinational consumers will simply purchase equivalent systems from foreign manufacturers."(176) This shift to foreign manufacturers is already taking place: foreign companies and foreign subsidiaries of United States-based companies are manufacturing strong encryption products overseas to avoid United States export controls. For example, RSA Data Security, Inc., the holder of the United States patent on RSA technology,(177) is producing strong encryption products in China,(178) and the Apache Group in England and Nippon Telephone and Telegraph in Japan are both exporting 128-bit cryptosystems.(179)

The potential economic impact is considerable. Congress estimates that if the current U.S. export policy is pursued through the year 2000, it will cost the U.S. economy 200,000 high-paying jobs and $60 billion each year.(180) Not only is the encryption market threatened, but American CEOs also warn that banking, telecommunications, pharmaceutical, and other sectors that are heavily dependent on data security systems will also be affected.(181) A recent National Research Council report(182) attempted to dispel the notion that industry's interests are irreconcilably at odds with law enforcement interests.(183) The co-author of the report, and chair of the NRC commission on cryptography, Kenneth Dam, stated that:
 ... we came to the conclusion that this picture of law enforcement and
 national security competing against privacy and business needs for
 confidentiality was an incomplete picture. After all, protecting a
 company's proprietary information against industrial spies is very much a
 part of law enforcement.

 Protecting critical national information systems and networks against
 unauthorized intruders is a key responsibility of national security....
 Thus -- and this is a very important point, and I think it was borne out by
 the testimony [before the Senate Judiciary Committee] earlier this morning
 -- the use of cryptology can actually help law enforcement and national
 security, as well, of course, as hindering them.(184)


Dam concluded that a "national policy on cryptology that runs counter to user needs and against market forces is unlikely to be successful over the long term."(185)

Market forces are at odds with current encryption policy. As the following legislative proposals illustrate, market forces compel most of the proposed changes in national encryption policy.

Encryption Software

The current debate is controversial, in part, because strong encryption has been available as freeware and shareware over the Internet since 1991.(186) In 1983, the Massachusetts Institute of Technology ("MIT") received a patent on RSA technology that expires in the year 2000.(187) MIT gave an exclusive commercial license to a group called Public Key Partners, now RSA Data Security, Inc., to sell and sublicense the RSA cryptosystem.(188) Philip Zimmermann, chairman of PGP, Inc. (Pretty Good Privacy), took the RSA technology and developed PGP software that he (or fellow encryption colleagues) distributed over the World Wide Web as freeware.(189) PGP software has since spread throughout the world and, as Zimmermann states, has become the "de facto worldwide standard for encryption of e-mail."(190) Now, PGP markets its products for profit. For example, "PGP for Personal Privacy" and "PGP for Business Security" which both contain export-controlled strong encryption are sold on-line for fifty-nine dollars and one hundred and nineteen dollars respectively.(191) The products offer to "integrate seamlessly"(192) into popular e-mail systems and provide "the ultimate protection" for business data files and electronic correspondence.(193)

PGP has not monopolized the market. Competitors such as Netscape Messenger(194) and Microsoft Outlook(195) provide strong encryption packages for e-mail and data files.(196) The emergence of these "point-and-click" encryption programs fuels the debate because they make encryption a matter of a single click of the mouse (similar to a "spell-checker"), thus encouraging its use for every electronic transaction or correspondence.(197)

Code-breaking

The encryption debate also poses the question of whether strong encryption applications such as RSA and DES can be broken. The extent of trust placed in encryption will directly reflect the business and government communities' evaluation of a program's ability to withstand attack. RSA Data Security, Inc. recently challenged hackers and others to break a 40-bit PGP-encrypted message.(198) A University of California at Berkeley student broke the key and deciphered the message using 250 workstations tied together for a brute force attack.(199) The 250 computers broke the code in 3.6 hours.(200) The National Security Agency ("NSA") used this information to explain that, in comparison to the 40-bit technology, the 56-bit technology was virtually unbreakable.(201) NSA Deputy Director William Crowell presented a table of estimated code-breaking times at a Senate hearing on encryption technology.

Although these are prodigious figures and would convince most people that 56-bit technology (the level allowed for export under EAR, if a key escrow plan is in place) is sufficient for most purposes, this confidence may be misplaced.

Philip Zimmermann testified that Northern Telecom of Canada engineers developed a special chip to crack 56-bit DES codes.(204) These chips, if linked with 50,000 similar chips at a cost of $1 million, could try every 56-bit DES key in seven hours.(205) For a $10 million investment that time could be reduced to twenty-one minutes, and for $100 million, just two minutes.(206) Furthermore, Zimmermann made the point that NSA resources could probably reduce that time to a few seconds.(207)
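The competing claims above come down to simple arithmetic: each additional bit of key length doubles the keyspace, so search time scales exponentially while faster hardware only shifts the curve. The sketch below is an illustrative back-of-the-envelope calculation (not a cryptanalytic model) that takes the article's figures for the 40-bit attack, 250 workstations finishing in 3.6 hours, and extrapolates the worst-case exhaustive-search time to longer key lengths at the same aggregate rate:

```python
# Back-of-the-envelope brute-force scaling, using the figures reported in
# the article: a 40-bit keyspace searched in 3.6 hours by 250 workstations.
# Assumes a worst-case exhaustive search at a constant aggregate rate.

def keyspace(bits: int) -> int:
    """Number of possible keys for a given key length."""
    return 2 ** bits

SECONDS_40_BIT = 3.6 * 3600            # 3.6 hours, per the Berkeley attack
rate = keyspace(40) / SECONDS_40_BIT   # implied keys tested per second

def worst_case_seconds(bits: int, keys_per_second: float) -> float:
    """Time to try every key of the given length at the given rate."""
    return keyspace(bits) / keys_per_second

SECONDS_PER_YEAR = 365.25 * 24 * 3600

for bits in (40, 56, 64, 128):
    secs = worst_case_seconds(bits, rate)
    print(f"{bits:>3}-bit: {secs:,.0f} s  (~{secs / SECONDS_PER_YEAR:,.1f} years)")
```

At the 1997 attack's rate, a 56-bit search takes 65,536 times longer than a 40-bit one (roughly 27 years), which illustrates both the NSA's point and Zimmermann's: the same exponential factor that makes 56 bits look safe against 250 workstations evaporates against hardware a few orders of magnitude faster.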

IV. POLICY PARAMETERS

Concerns regarding the vulnerability of the nation's commercial digital assets have prompted congressional action. In 1996, Congress enacted the Economic Espionage Act making it a federal crime to steal or misappropriate trade secrets.(208) Upon signing the Act, President Clinton stated that "[t]rade secrets are an integral part of virtually every sector of our economy and are essential to maintaining the health and competitiveness of critical industries operating in the United States. Economic espionage and trade secret theft threaten our Nation's national security and economic well-being."(209)

Congress has thus addressed trade secrets in a generic format by increasing penalties for certain trade secret crimes. The subset of digital assets has yet to attract a consensus approach. Although the 105th Congress is the third Congress to address encryption policy, as of yet no bills have been enacted into law. The issue's complexity and the pressures of those who seek amendments weakening the use of strong encryption have so far frustrated efforts to enact reform legislation.

Competing Interests

Which interests have the upper hand? The current encryption export regulations indicate that the specter of drug lords and international terrorists running amok with impervious cryptosystems holds sway with the FBI and the Commerce Department. The similarities among the current legislative proposals indicate, however, that national security interests are losing out to commercial and privacy interests in the halls of Congress.

Thus, from one end of Pennsylvania Avenue to the other, encryption policies are worlds apart. Congressional pragmatists recognize that the general availability of similarly capable cryptosystems overseas nullifies the effectiveness of unilateral attempts to impose key recovery on strong encryption. Likewise, the pragmatists recognize that mandatory key escrow systems, or even the shadow of voluntary systems regulated by the federal government, inhibit commercial enterprise by significantly lowering the confidence that exists in any given security arrangement, whether for banking, trade secret protection, or other routine commercial communications.

Regardless of whether the FBI is correct in believing that terrorists and others will use strong encryption, the proverbial genie is out of the bottle. If the NSA's contention that key lengths of 64 bits or longer can take tens of thousands or millions of years to decode is to be believed,(210) then the FBI's vision of terrors cannot be mitigated. If, instead, the FBI and NSA know that decoding machines may decrypt 64-bit messages in a matter of seconds or minutes, as Philip Zimmermann purports,(211) then not only are the FBI and NSA stirring up tales of rampant criminal acts, but they are also being remarkably disingenuous.

A Marketplace for Encryption

Commercial interests are divided on the effectiveness of key recovery systems and export controls. On the one hand, with billions of dollars and hundreds of thousands of jobs at stake, business interests have a compelling argument for relaxing export controls and avoiding mandatory key escrow systems. Without these concessions, current administration policies may create a "digital Berlin wall" that could seriously impact the nation's robust computer software and hardware industries.(212) Yet it is not only the high technology industries that suffer, for a weakened technology sector affects all industries, since each firm must begin (or continue to enhance) its security to protect digital assets. Encryption technology with appropriate key recovery systems is the cornerstone of the protection of commercial digital assets.

Representatives of major U.S. computer companies have spoken about the international scope of the business perspective:
 The [global information infrastructure ("GII")] will not flourish without
 effective security mechanisms to protect commercial transactions. Consumers
 and providers of products and services, particularly those involving health
 care and international commerce, will not use GII applications unless they
 are confident that electronic communications and transactions will be
 confidential, that the origin of the messages can be verified, that
 personal privacy can be protected, and that security mechanisms will not
 impede the transnational flow of electronic data.(213)


The "commercial imperative"(214) for strong encryption thus attracts global interests. However, commercial needs may drive a market for voluntary key recovery products and services. When a company loses an employee to a job transfer, retirement, or less benign circumstances, a company may lose mission critical information if it is secured with a key that becomes lost or inaccessible. A private marketplace may develop that would allow employers to retrieve keys. These keys would then be accessible to law enforcement officers under standard subpoena and warrant methods. This market will develop only if concerns about security can be addressed regarding access to these escrowed keys, but this potential market does suggest that some of the administration's concerns regarding key escrow may be addressed through market forces.
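A private key recovery marketplace of the sort described above could take many technical shapes. The toy Python sketch below illustrates one in-house arrangement: a single data key protects the asset and is "wrapped" twice, once for the employee and once for a corporate recovery agent, so the employer can restore access when the employee's key is lost and can produce the key under ordinary subpoena or warrant process. All names here are illustrative assumptions, and the XOR wrapping is a stand-in for a real key-wrap algorithm, not secure cryptography or any actual product's design:

```python
# Toy sketch of in-house key recovery (escrow): one data key, two wrappings.
# The XOR "wrap" is a deliberately simple stand-in for a real key-wrap
# scheme; do not use this construction for actual security.
import hashlib
import secrets

def derive_key(passphrase: str) -> bytes:
    # A real system would use a salted KDF such as PBKDF2 or scrypt.
    return hashlib.sha256(passphrase.encode()).digest()

def wrap(data_key: bytes, wrapping_key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data_key, wrapping_key))

unwrap = wrap  # XOR is its own inverse

# One random 256-bit data key protects the trade-secret files.
data_key = secrets.token_bytes(32)

# Escrow record: the same data key, wrapped for two independent parties.
escrow = {
    "employee": wrap(data_key, derive_key("employee passphrase")),
    "recovery_agent": wrap(data_key, derive_key("corporate recovery passphrase")),
}

# If the employee departs, dies, or withholds the key, the recovery agent
# can still restore it from the escrow record.
recovered = unwrap(escrow["recovery_agent"],
                   derive_key("corporate recovery passphrase"))
assert recovered == data_key
```

The design point is that recovery capability lives with the data owner (or a chosen third party), not the government: law enforcement reaches the key the same way it reaches any other corporate record, by warrant or subpoena served on the escrow holder.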

Policy Choices

The Key Recovery Alliance ("KRA") strongly believes that "government-designed or imposed encryption technology solutions will inevitably fail to win marketplace acceptance."(215) The KRA is apparently reacting to the policy spectrum of FBI-supported mandated weak encryption and mandatory key escrow, failed attempts at Clipper chips, and legislative proposals. There is, however, a middle ground that belies the false dichotomy of government versus business-driven legislative proposals.

First, the "doomsday" scenarios presented by the FBI, evoking images of terrorists running amok with impenetrably-encrypted plans, may only obscure the legitimate needs for law enforcement access to encrypted data. If law enforcement agencies obtain keys through a judicially-supervised process with appropriate Fourth Amendment safeguards, including probable cause and due process, keys would receive the same protection that currently applies to any commercial document. The law enforcement hurdle is having a key which is amenable to direct warrant or subpoena process. Some bills, like those proposed by Senators McCain and Leahy,(216) require keys to be held by government-sanctioned organizations. However, government-authorized key escrows raise issues of "trusted third part[ies]."(217) Civil libertarians and the business community have argued that the government (or government-sanctioned entities) are not necessarily appropriate trusted third parties.(218) Privacy concerns may be mitigated, however, if legislation encourages private sector key recovery mechanisms, including voluntary trusted third party and in-house mechanisms that involve no outside parties. Second, the growing recognition that key recovery is essential to healthy commerce supports a public policy that allows key recovery systems to flourish. Numerous scenarios of lost mission critical information support an aggressive marketplace for key recovery mechanisms.

Finally, it is possible to balance the legitimate "what ifs" of national security and law enforcement -- including the high-stakes scenarios of terrorist actions -- with the business sector's desire for reliable key recovery. A public policy that requires a key recovery system, but allows the user to select on-site or trusted third party storage, meets the needs of key recovery while also allowing for Fourth Amendment-restrained access to keys when criminal activity is afoot. While on-site escrow may be more difficult for law enforcement to access, this protection of private records provides an appropriate counterweight to an otherwise unhindered government ability to access the critical documents of commerce and private sources.

V. PROPOSAL FOR KEY RECOVERY

This proposal for legislative action relies upon the premise that technological innovations will continue at a steady pace. However, the future design of encryption key recovery systems is highly speculative, if not unknowable. Furthermore, current strong encryption technologies are available worldwide through easy Internet access. As a result, current administration policy restricting 56-bit and greater encryption places the domestic computer industry at a competitive disadvantage vis-a-vis unrestrained foreign competitors (or unrestrained foreign subsidiaries). Underlying these assumptions is the convergence of commercial, law enforcement, and national security interests that recognize the utility of key recovery systems. While those concerned with privacy interests are understandably wary of government-controlled mandatory key escrow systems, a voluntary and private sector driven key recovery mechanism balances privacy interests with the legitimate requirements of commercial enterprises and law enforcement.

Legislative Content

Commercial, law enforcement, and privacy interests can be properly balanced by the enactment of legislation requiring the computer industry to imbed or integrate key recovery mechanisms in all encryption systems developed for use in interstate commerce and for export. Specifically, key recovery legislation should address two broad issues: (1) requirements for key recovery mechanisms within the domestic market; and (2) requirements for encryption products designed for export.

First, any domestic strong encryption product must contain a key recovery system. This requirement must be based on a generic reading of what constitutes an encryption product such that both imbedded (over-the-counter products) and integrated (custom designed systems) encryption mechanisms are covered. Additionally, the requirement must cover both the transmission and storage of data because the line dividing these two states of information is blurred by such innovations as intranets and off-site secure storage facilities. Second, keys may be held by the encryption user or any third party, since government mandated key agents would interfere with the public's trust of the key recovery system. Moreover, no specific technologies should be mandated, as this mandate would hinder the innovations that will surely continue to shape encryption and decryption capabilities. Third, neither the federal nor any state government should set encryption or key recovery standards to be used by the private sector. If the private sector is allowed to set standards and innovate freely, there will be confidence in encryption and key recovery mechanisms, thereby allowing these technologies to keep pace with the general technology sector.

The export of strong encryption requires a regime similar to the Clinton administration's current policies: key recovery must be an integral element of any strong encryption product destined for export. As in the domestic marketplace, the private sector would develop the key recovery standards and technologies free from government mandates. The American computer industry can benefit by having an algorithm-neutral regime in which key length is not regulated, thus leveling the international playing field with those companies that currently provide greater-than-56-bit key lengths. Finally, countries that support international terrorism or threaten the national interests of the United States should be denied exports.

Executive Actions

Executive actions required to implement these legislative policies fall into two broad categories. First, the Commerce Department should propose standard definitions to enable the private sector to clearly articulate the standards it sets for encryption and key recovery products. A de minimis amount of encryption strength would, by administrative decree, be completely unregulated, such as allowing non-key-recoverable products with less than 40-bit key lengths. Second, the National Institute of Standards and Technology ("NIST") should develop standards for federal government cryptosystems and key recovery mechanisms. As part of this effort, NIST could encourage a voluntary public-private association to test key recovery tools and to develop and disseminate knowledge regarding key recovery systems.

VI. CONCLUSION

Mixed signals are the order of the day in the encryption policy debate. Congress and the Commerce Department choose different interests; while Congress listens to the pragmatists who desire relaxed controls, the administration fears barbarians at the gate. Since the private sector wants strong encryption to secure its commercial and privacy interests, Congress must take action amending the law to require domestic key recovery systems. Commercial imperatives to both protect and recover mission critical information thrive in an already competitive marketplace for key recovery mechanisms. National security interests and law enforcement realities require the ability to decrypt messages thought to contain criminal or adverse national security content. However, this ability should extend no further than the current search and seizure scope allowed under the Fourth Amendment, thus protecting privacy interests. The balance of interests, therefore, weighs heavily in favor of the mandatory inclusion of key recovery mechanisms in both domestic and export cryptosystems, with relatively unrestricted export of cryptosystems that contain imbedded key recovery systems.

(1.) See discussion infra Part II(A)(1).

(2.) See OFFICE OF TECHNOLOGY ASSESSMENT, U.S. CONGRESS, ISSUE UPDATE ON INFORMATION SECURITY AND PRIVACY IN NETWORK ENVIRONMENTS 84-88 (1995) (describing the U.S. business communities' efforts to influence administrative and legislative policies toward encryption) [hereinafter ISSUE UPDATE].

(3.) See discussion infra Part III(A).

(4.) See James B. Altman & William McGlone, Demystifying U.S. Encryption Export Controls, 46 AM. U. L. REV. 493, 502 (1996); Key Recovery Alliance, Frequently Asked Questions About the Key Recovery Alliance (updated June 1998) <http://www.kra.org/KRAFAQ-1209.html> [hereinafter KRA].

(5.) See KRA supra note 4. Among the members are Apple Computer, Boeing Co., Digital Equipment Corp., Fujitsu Corp., Hewlett-Packard Co., Hitachi, IBM Corp., Mitsubishi Electric Corp., Motorola, Inc., NCR Corp., NEC Corp., and Toshiba Corp. See Key Recovery Alliance, Membership Roster (visited Apr. 5, 1998) <http://www.kra.org/roster3.html>.

(6.) See Key Recovery Alliance, Business Requirements for Key Recovery (visited Jan. 21, 1998) <http://www.kra.org/whitepapers/SCEN3-0_DOC_1.pdf>.

(7.) See id. at 5.

(8.) See id. at 10-11, 15.

(9.) Key Recovery Alliance, Cryptographic Information Recovery Using Key Recovery 1 (visited Mar. 27, 1998) <http://www.kra.org/whitepapers/CryptoV1_2.pdf>.

(10.) Key Recovery Alliance, Key Recovery and Electronic Commerce: Industry's Efforts to Develop New Tools to Support Strong Encryption 6 (visited Mar. 27, 1998) <http://www.kra.org/whitepapers/KRACommerce.pdf>.

(11.) See SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. REP. NO. 105-108, pt. 1, at 28 (1997).

(12.) See ASSOCIATION FOR COMPUTING MACHINERY, CODES, KEYS AND CONFLICTS: ISSUES IN U.S. CRYPTO POLICY ch. 6 (1994) (visited Oct. 9, 1998) <http://info.acm.org/reports/acm_crypto_study.html> [hereinafter ACM, CODES].

(13.) See S. 1726, Promotion of Commerce Online in the Digital Era Act of 1996, or "Pro-Code" Act: Hearing Before the Subcomm. on Science, Tech., and Space of the Senate Comm. on Commerce, Science, and Transp., 104th Cong. (1996) (testimony of Philip R. Zimmermann, Chairman and Chief Technology Officer, Pretty Good Privacy, Inc.) [hereinafter S. 1726, "Pro-Code" Act].

(14.) ACM, CODES, supra note 12, at ch. 5.

(15.) 143 CONG. REC. S4684-02 (daily ed. May 19, 1997) (statement of Sen. John Kerry).

(16.) The Encryption Debate: Criminals, Terrorists, and the Sec. Needs of Business and Indus., Hearing Before the Subcomm. on Tech., Terrorism and Gov't Info. of the Senate Comm. on the Judiciary, 105th Cong. (1997) (testimony of Louis J. Freeh, director, Federal Bureau of Investigation).

(17.) See ISSUE UPDATE, supra note 2, at 113.

(18.) See William A. Tanenbaum, Computer Security and Encryption FAQ, 14 THE COMPUTER LAWYER 19 (1997).

(19.) See SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. REP. NO. 105-108, at 5 (1997).

(20.) See Tanenbaum, supra note 18, at 19.

(21.) See id. at 20.

(22.) See id.

(23.) See id.

(24.) See id.

(25.) See id.

(26.) See id.

(27.) See Key Recovery Alliance, Cryptographic Information Recovery Using Key Recovery 2-3 (visited Mar. 27, 1998) <http://www.kra.org/whitepapers/CryptoV1_2.pdf>.

(28.) See id. at 3.

(29.) See SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. REP. NO. 105-108, at 5 (1997).

(30.) See id.

(31.) See id. at 5, 26.

(32.) See id. at 5.

(33.) Id.

(34.) See ACM, CODES, supra note 12, at ch. 6.

(35.) See id.

(36.) See id. at ch. 1.

(37.) See id.

(38.) See id.

(39.) See id.

(40.) See id. at ch. 6.

(41.) See id.

(42.) See id.

(43.) See id.

(44.) See id.

(45.) See 15 C.F.R. §§ 740, 742 (1998).

(46.) See 22 U.S.C. §§ 2751-99 (1994); see also 22 C.F.R. §§ 120.1-130 (1996).

(47.) 22 U.S.C. §§ 2751-99 (1994). The statute states:
 In furtherance of world peace and the security and foreign policy of the
 United States, the President is authorized to control the import and the
 export of defense articles and defense services and to provide foreign
 policy guidance to persons of the United States involved in the export and
 import of such articles and services. The President is authorized to
 designate those items which shall be considered as defense articles and
 defense services for the purposes of this section and to promulgate
 regulations for the import and export of such articles and services. The
 items so designated shall constitute the United States Munitions List.


Id. at § 2778(a)(1).

(48.) See 22 C.F.R. §§ 120.1-130.

(49.) See id. at § 120.1.

(50.) See id. at § 120.2.

(51.) See Exec. Order No. 13,026, 61 Fed. Reg. 58767 (1996).

(52.) 22 C.F.R. § 121.1 cat. XIII(b), repealed by Removal of Commercial Communications Satellites and Hot Section Technology from State's USML for Transfer to Commerce's CCL, 61 Fed. Reg. 56,895 (1996).

(53.) See Nicholas W. Allard & David A. Kass, Law and Order in Cyberspace: Washington Report, 19 HASTINGS COMM. & ENT. L. J. 563, 575 (1997).

(54.) See 22 C.F.R. § 121.1 cat. XIII(b)(i)-(ix), repealed by Removal of Commercial Communications Satellites and Hot Section Technology from State's USML for Transfer to Commerce's CCL, 61 Fed. Reg. 56,895 (1996).

(55.) See id. at § 121.1 cat. XIII(b)(i)-(ii).

(56.) See id. at § 121.1 cat. XIII(b)(vi).

(57.) See id. at § 121.1 cat. XIII(b)(viii). For more information on these exceptions, see Altman & McGlone, supra note 4, at 499-500.

(58.) See Altman & McGlone, supra note 4, at 501.

(59.) See id.

(60.) See Exec. Order No. 13,026, 61 Fed. Reg. 58,767 (1996).

(61.) See id.

(62.) See 22 U.S.C. § 2795 et seq.

(63.) See 50 U.S.C. § 2406 (1994).

(64.) See Licensing of Key Escrow Encryption Equipment and Software, 61 Fed. Reg. 65,462 (1996) (to be codified at scattered portions of 15 C.F.R. pts. 740, 742, 834).

(65.) See 15 C.F.R. §§ 730-774 (1996).

(66.) Encryption Items Transferred from the U.S. Munitions List to the Commerce Control List, 61 Fed. Reg. 68,572-573 (1996).

(67.) See 15 C.F.R. § 742.15(b)(1) (1998).

(68.) See id. at § 742.15(b)(2).

(69.) See id. at § 742.15(b)(3).

(70.) See id. at § 742.15(b)(4).

(71.) Exec. Order No. 13,026, 61 Fed. Reg. 58,767 (1996).

(72.) See Altman & McGlone, supra note 4, at 503.

(73.) See Hearing on S.377 "Pro-Code" Before the Senate Comm. on Commerce (Mar. 19, 1997) (testimony of William P. Crowell, deputy director, National Security Agency).

(74.) See, e.g., Letter from the United States Public Policy Committee for ACM, to Nancy Crowe, Regulatory Policy Division, Bureau of Export Administration, Department of Commerce (Feb. 12, 1997).

(75.) Id.

(76.) See discussion infra Part II(C).

(77.) See Altman & McGlone, supra note 4, at 506.

(78.) See, e.g., S. 1726, "Pro-Code" Act, supra note 13.

(79.) See COMPUTER SECURITY ENHANCEMENT ACT OF 1997, H.R. 1903, 105th Cong. (1997); SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. 695, 105th Cong. (1997); PROMOTION OF COMMERCE ON-LINE IN THE DIGITAL ERA (PRO-CODE) ACT OF 1997, S.377, 105th Cong. (1997); ENCRYPTED COMMUNICATIONS PRIVACY ACT OF 1997, S.376, 105th Cong. (1997); SECURE PUBLIC NETWORKS ACT, S.909, 105th Cong. (1997).

(80.) See, e.g., COMMUNICATIONS PRIVACY AND CONSUMER EMPOWERMENT ACT, H.R. 1964, 105th Cong. § 203 (1997) (prohibiting the federal government or a state government from restricting or regulating the interstate commerce of encryption products or establishing an authentication system for key escrow).

(81.) See 143 CONG. REC. H7293-03 (daily ed. Sept. 16, 1997).

(82.) See H.R. 1903, at §§ 6, 11.

(83.) See id. at § 7.

(84.) See id. at § 11.

(85.) See id.

(86.) See id. at §§ 3, 7.

(87.) Id. at § 2(a)(5).

(88.) See id. at § 2(a)(4).

(89.) H.R. REP. No. 105-243, at 6 (1997).

(90.) See SECURE PUBLIC NETWORKS ACT, S.909, 105th Cong. § 302 (1997).

(91.) SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. 695, 105th Cong. (1997); PROMOTION OF COMMERCE ON-LINE IN THE DIGITAL ERA (PROCODE) ACT OF 1997, S.377, 105th Cong. (1997); ENCRYPTED COMMUNICATIONS PRIVACY ACT OF 1997, S.376, 105th Cong. (1997).

(92.) See H.R. 695.

(93.) See S. 377.

(94.) H.R. 695; S. 377.

(95.) H.R. 695; S. 377.

(96.) See SECURE PUBLIC NETWORKS ACT, S. 909, 105th Cong. (1997); ENCRYPTED COMMUNICATIONS PRIVACY ACT OF 1997, S. 376, 105th Cong. (1997).

(97.) See S. 909.

(98.) See S. 376.

(99.) See S. 909; S. 376.

(100.) See SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. 695, 105th Cong. (1997); PROMOTION OF COMMERCE ON-LINE IN THE DIGITAL ERA ("PRO-CODE") ACT OF 1997, S. 377, 105th Cong. (1997).

(101.) See supra, note 79 and accompanying text.

(102.) See id.

(103.) See supra, notes 93-96 and accompanying text.

(104.) See Letter to Chairman Bliley, U.S. House of Representatives, from the American Association for the Advancement of Science and 13 other associations and individuals, dated Sept. 24, 1997 (visited Nov. 28, 1997) <http://www.acm.org/usacm/crypto/societies_crypto_letter1997.html>.

(105.) See id.

(106.) See id.

(107.) Id.

(108.) Beyond the First Amendment issues implicated by encryption technologies, the technology itself is subject to litigation to determine its rightful ownership. The dispute is between RSA Data Security, Inc., (RSADS) and PGP, Inc. RSADS filed suit in California Superior Court on May 5, 1997, alleging that PGP violated its licensing agreement for its RSA encryption algorithms. See PGP Wins Motion to Arbitrate, RSA Insists License is Violated, ELECTRONIC MESSAGING NEWS, Oct. 15, 1997. On October 8th, the court granted PGP's motion to send the dispute to arbitration. See id. Despite the likelihood of this dispute's settlement behind closed doors, the lawsuit reveals the tensions among competitors in the marketplace. RSADS's patent on RSA technology expires in the year 2000, and it clearly seeks to gain as much royalty revenue from PGP as possible before the technology enters the public domain. See id.

(109.) 974 F. Supp. 1288 (N.D. Cal. 1997).

(110.) See id. at 1310.

(111.) See id.

(112.) See id. at 1292.

(113.) See Bernstein v. United States Dep't of State, 922 F. Supp. 1426, 1429 (N.D. Cal. 1996).

(114.) See id.

(115.) See id.

(116.) Id. (quoting language from ITAR which has now been superseded by the new EAR regulations).

(117.) See Bernstein, 922 F. Supp. at 1431.

(118.) Id. (citing 22 U.S.C. [sections] 2778(h)).

(119.) See Bernstein, 922 F. Supp. at 1431.

(120.) See id.

(121.) Bernstein, 922 F. Supp. at 1434 (citing Sweezy v. New Hampshire, 354 U.S. 234, 249-50 (1957) (stating the importance of the protection of academic works)).

(122.) See Bernstein, 922 F. Supp. at 1434 (citing Texas v. Johnson, 491 U.S. 397 (1989)).

(123.) Bernstein, 922 F. Supp. at 1434 (quoting Johnson, 491 U.S. at 404).

(124.) See Bernstein, 922 F. Supp. at 1434-35. The fact that source code is essentially unreadable did not persuade the court that it was not speech. See id. at 1435. Citing the Ninth Circuit's decision in Yniguez v. Arizonans for Official English, the court stated that "the choice to use any given language may often simply be based on a pragmatic desire to convey information to someone so that they may understand it." Id. (quoting Yniguez v. Arizonans for Official English, 69 F.3d 920, 935 (9th Cir. 1995)).

(125.) See Bernstein, 922 F. Supp. at 1436.

(126.) See id. at 1437-38.

(127.) Id. at 1438 (citing Nebraska Press Ass'n v. Stuart, 427 U.S. 539 (1976)).

(128.) See Bernstein, 922 F. Supp. at 1438 (citing New York Times Co. v. United States, 403 U.S. 713 (1971), in which publication of the classified "Pentagon papers" was upheld).

(129.) See Bernstein, 922 F. Supp. at 1438.

(130.) See Bernstein v. United States Dep't of State, 945 F. Supp. 1279, 1291-92 (N.D. Cal. 1996).

(131.) See Bernstein v. United States Dep't of State, 974 F. Supp. 1288, 1288-89 (N.D. Cal. 1997) [hereinafter Bernstein II].

(132.) See id.

(133.) See id. at 1293-96.

(134.) See id. at 1308.

(135.) See id. at 1304.

(136.) Id. at 1305 (citing FW/PBS, Inc. v. Dallas, 493 U.S. 215, 227-28 (1990)).

(137.) Bernstein II, 974 F. Supp. at 1308.

(138.) See id. at 1309-10.

(139.) See 925 F. Supp. 1 (D.D.C. 1996). The Court of Appeals for the District of Columbia Circuit remanded the case to the District Court as a result of the transfer of authority from ITAR to EAR. See Karn v. United States Dep't of State, 107 F.3d 923 (D.C. Cir. 1997).

(140.) See Karn, 925 F. Supp. at 6-7, 10-11.

(141.) See id. at 2-3.

(142.) See id. at 3.

(143.) See id.

(144.) See id. at 4.

(145.) See id. at 6-8.

(146.) See Bernstein v. United States Dep't of State, 922 F. Supp. 1426, 1431 (N.D. Cal. 1996).

(147.) See Karn, 925 F. Supp. at 8-13.

(148.) See id. at 9.

(149.) Id. at 10 (citing United States v. O'Brien, 391 U.S. 367 (1968)).

(150.) See Karn, 925 F. Supp. at 10.

(151.) See id.

(152.) Id. at 10-11.

(153.) See Karn v. United States Dep't of State, 107 F.3d 923 (D.C. Cir. 1997).

(154.) See discussion infra Part III(A).

(155.) See infra note 157 and accompanying text.

(156.) See discussion infra Part III(B).

(157.) See The Encryption Debate, supra note 16 (testimony of Louis J. Freeh).

(158.) Id.

(159.) 143 CONG. REC. H6909-01 (daily ed. Sept. 4, 1997) (statement of Congressman Weldon regarding H.R. 695).

(160.) See The Encryption Debate, supra note 16 (testimony of Louis J. Freeh).

(161.) Id.

(162.) See id.

(163.) See id.

(164.) See, e.g., The Encryption Debate: Criminals, Terrorists, and the Sec. Needs of Bus. and Indus., Hearing Before the Subcomm. on Tech., Terrorism, and Gov't Info. of the Senate Comm. on the Judiciary, 105th Cong. (1997) (testimony of Dorothy E. Denning, Professor of Computer Science, Georgetown University).

(165.) Id.

(166.) ISSUE UPDATE, supra note 2, at 82. The report further defines "confidentiality" as referring to "how data collected for approved purposes will be maintained and used by the organization that collected it, what further uses will be made of it, and when individuals will be required to consent to such uses." Id.

(167.) Id. (stating that the phrase "right to be let alone" was coined by Judge Cooley in 1888 in THOMAS M. COOLEY, LAW OF TORTS 29 (2d ed. 1888)). Two years after Cooley coined the phrase, it was picked up and made the fodder of legal scholarship. See Louis Brandeis & Samuel Warren, The Right of Privacy, 4 HARV. L. REV. 193 (1890).

(168.) See Carey v. Population Servs. Int'l, 431 U.S. 678 (1977); Eisenstadt v. Baird, 405 U.S. 438 (1972); Griswold v. Connecticut, 381 U.S. 479 (1965).

(169.) See, e.g., ISSUE UPDATE, supra note 2.

(170.) See, e.g., ACM, CODES, supra note 12, at ch. 5 (stating chapter title as "The Privacy View: The Importance of Encryption").

(171.) 143 CONG. REC. S4684-02 (daily ed. May 19, 1997) (statement of Sen. John Kerry).

(172.) SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. REP. No. 105-108, at 6 (1997).

(173.) NATIONAL RESEARCH COUNCIL, COMPUTERS AT RISK: SAFE COMPUTING IN THE INFORMATION AGE 157 (1991) [hereinafter "Safe Computing"]. This National Research Council report addressed computer security and trustworthiness issues. See id. at vii. Though already out-of-date given the changes in regulations and the explosive growth of the encryption marketplace, the report, nonetheless, identified key industry concerns that are still relevant today. See id. at 158. The report quoted congressional testimony of Digital Equipment Corporation:
 The real difficulty arises if a vendor considers building security into a
 "mainstream" commercial product. In that event, the system's level of
 security, rather than its processing power, becomes its dominant attribute
 for determining exportability. A computer system that would export under a
 Commerce Department license with no delay or advance processing would
 become subject to the full State Department munitions licensing process. No
 vendor will consider subjecting a mainstream commercial product to such
 restrictions.


Id. (quoting testimony of Digital Equipment Corporation before the House Subcomm. on Transp., Aviation, and Materials (July 1990)).

(174.) See Safe Computing, supra note 173, at 152-57.

(175.) See id. at 156.

(176.) Id. The report may have understated the importance of individual consumers in this growing marketplace.

(177.) See Tanenbaum, supra note 18, at 21.

(178.) See Altman & McGlone, supra note 4, at 505.

(179.) See id.

(180.) See SECURITY AND FREEDOM THROUGH ENCRYPTION (SAFE) ACT, H.R. REP. No. 105-108, at 29 (1997).

(181.) See Altman & McGlone, supra note 4, at 505 & nn. 58-59 (citing a report of the Computer Systems Policy Project, an organization comprised of CEOs of Apple, Compaq, Data General, Digital Equipment, Hewlett-Packard, IBM, NCR, Silicon Graphics, Stratus Computer, Sun Microsystems, Tandem, and Unisys).

(182.) See NATIONAL RESEARCH COUNCIL, CRYPTOGRAPHY'S ROLE IN SECURING THE INFORMATION SOCIETY (1996).

(183.) See Encryption, Key Recovery, and Privacy Protection in the Info. Age: Hearings on S. 376 Before the Senate Comm. on the Judiciary, 105th Cong. 68 (1997) (testimony of Kenneth W. Dam, chair, Committee to Study National Cryptography Policy, National Research Council, and professor of law, University of Chicago Law School).

(184.) Id. at 69.

(185.) Id. at 8.

(186.) See S. 1726, "Pro-Code" Act, supra note 13.

(187.) See Tanenbaum, supra note 18, at 21.

(188.) See PHILIP ZIMMERMANN, 2 PGP USER'S GUIDE (1994).

(189.) See S. 1726, "Pro-Code" Act, supra note 13, at 8. See also supra note 108 for a discussion of the patent infringement lawsuit between RSA and PGP.

(190.) S. 1726, "Pro-Code" Act, supra note 13, at 8.

(191.) See PGP, Inc., PGP for Personal Privacy (visited Nov. 25, 1997) <http://www.pgp.com/products/pgp-personal-55.cgi>; PGP, Inc., PGP Business Products (visited Nov. 25, 1997) <http://www.pgp.com/products/business/pgp-biz-55.cgi>.

(192.) Id.

(193.) PGP, Inc., PGP Business Products (visited Nov. 25, 1997) <http://www.pgp.com/products/business/pgp-biz-55.cgi>.

(194.) See Netscape Communicator 4 and Navigator 4, Frequently Asked Questions (visited Oct. 18, 1997) <http://home.netscape.com/communicator/V4.0/Faq/index.html#secureEmail>.

(195.) See Microsoft Outlook Features (visited Dec. 19, 1997) <http://www.microsoft.com/products/prodref/578_newf.html>.

(196.) See Netscape Communicator, supra note 194; Microsoft Outlook, supra note 195.

(197.) See id.

(198.) See Security and Freedom Through Encryption (SAFE) Act, Hearing Before the Subcomm. on Courts and Intellectual Property of the House Comm. on the Judiciary, 105th Cong. 45 (1997) (testimony of William P. Crowell, deputy director, National Security Agency).

(199.) See id.

(200.) See id.

(201.) See id.

(202.) See id.

(203.) Id. (stating that the Berkeley student recovered the 40-bit key one-third of the way through the search).

(204.) See S. 1726, "Pro-Code" Act, supra note 13, at 11.

(205.) See id.

(206.) See id.

(207.) See id.

(208.) See Economic Espionage Act of 1996, Pub. L. No. 104-294, 110 Stat. 3488 (1996) (codified as amended at 18 U.S.C. §§ 1831-39 (1994 & Supp. II 1996)).

(209.) Statement of President William J. Clinton upon Signing H.R. 3723, 1996 U.S.C.C.A.N. 4034.

(210.) See discussion supra Part III(C).

(211.) See supra notes 182-84 and accompanying text.

(212.) John Seiler, The Government Wants Oversight. Here's Why it is a Bad Idea, ORANGE COUNTY REG., Sept. 21, 1997, at 5.

(213.) ISSUE UPDATE, supra note 2, at 84.

(214.) See supra note 165 and accompanying text.

(215.) Key Recovery Alliance, Public Policy Requirements for a Global Key Recovery Infrastructure 1 (visited Mar. 27, 1998) <http://www.kra.org/whitepapers/Policies.pdf>.

(216.) See SECURE PUBLIC NETWORKS ACT, S.909, 105th Cong. (1997); ENCRYPTED COMMUNICATIONS PRIVACY ACT OF 1997, S.376, 105th Cong. (1997), respectively.

(217.) Key Recovery Alliance, Cryptographic Information Recovery Using Key Recovery 3 (visited Mar. 27, 1998) <http://www.kra.org/whitepapers/CryptoV1_2.pdf> (defining "trusted third party" as "an entity in a trust relationship with various primary entities").

(218.) See id.

JOHN T. SOMA, Professor of Law, University of Denver College of Law. B.A., Augustana College, 1970; J.D., University of Illinois, 1973; Ph.D. (Economics), University of Illinois, 1975. The authors wish to express their appreciation to the Hughes Research Fund of the University of Denver College of Law for its assistance in this matter.

CHARLES P. HENDERSON, Associate, Holland & Hart, Denver, Colo.; B.A., Princeton University, 1988; J.D., University of Denver College of Law, 1998.
COPYRIGHT 1999 Rutgers University School of Law - Newark
No portion of this article can be reproduced without the express written permission from the copyright holder.


Article Details
Author: Soma, John T.; Henderson, Charles P.
Publication: Rutgers Computer & Technology Law Journal
Date: Mar 22, 1999

