
Mainstream Loudoun and the future of internet filtering for America's public libraries.

I. INTRODUCTION

The First Amendment guarantee of free speech is under heavy fire today. Everyone, from Congress to local librarians, is arguing over how best to protect young people from potentially harmful online information. Our public libraries face the difficult decision of whether to install filters to block Internet sites that the libraries, and/or the communities in which they are situated, deem inappropriate for young persons. Supporters of free speech oppose the use of filters because, sometimes accidentally and sometimes intentionally, filters block many sites containing constitutionally protected speech. Additionally, opponents of filters in public libraries lament the fact that filters take the traditionally parental role of monitoring what children see and hear and relinquish that control to librarians and lawmakers.

Since the Internet's creation there has been an explosive increase in the number and type of Internet sites.(1) Concomitantly, concern has grown about how best to shield young people from inappropriate information available on the Internet. On the federal level, Congress has demonstrated its concern by proposing rights-constricting legislation that would require schools and libraries receiving federal funding for Internet access to install filtering software.(2) Many states have already passed similar legislation. On the local level, some libraries have taken the initiative and installed filters. In some cases this has led to public outcry.

This note will explore Mainstream Loudoun v. Board of Trustees of Loudoun County Library, a case of first impression in the Eastern District of Virginia, in which concerned citizens successfully challenged their library's decision to install Internet filters.(3) The main issue in Mainstream Loudoun was whether a public library violated the First Amendment rights of its patrons by enacting a policy that prevented patrons' access "to certain content-based categories of Internet publications" through the installation of blocking software on library computers.(4) This case, and others like it, may have far-reaching effects on the ability of libraries, schools, state universities and government offices to install blocking software on their computers and thus limit and control what their patrons, students and employees can access on the Internet.

Part II will provide background for Mainstream Loudoun, present the procedural history and analyze the court's holding. Part III will survey recent federal and state Internet filtering legislation and discuss how groups, including the American Library Association and the American Civil Liberties Union ("ACLU"), have spoken out against it. This legislation may dim the victory of Mainstream Loudoun by requiring public libraries to install Internet filters in exchange for federal funding. The libraries would be able to choose what type of software to install, but no filter is completely effective.

Part IV will introduce the Clinton Administration's views on the regulation of Internet content. Part V will explain how filters work and how they often fail. Part VI will address the dilemma of the public librarians who must decide whether to install these inadequate filters or expose themselves to possible liability at the hands of angry parents. Librarians facing this difficult decision can look to recent U.S. Supreme Court rulings, discussed in Part VII, on related constitutional issues that may help predict how the Internet filtering issue will be resolved.

There are other possible solutions, as seen in Part VIII, ones that may take a little more work than hastily-passed legislation or easily-installed filters, but that have greater hope of protecting minors from possibly harmful material while securing the rights of adults. These causes are worth the effort because:
 [T]he peculiar evil of silencing the expression of an opinion is, that it
 is robbing the human race ... those who dissent from the opinion, still
 more than those who hold it. If the opinion is right, they are deprived of
 the opportunity of exchanging error for truth: if wrong, they lose ... the
 clearer perception and livelier impression of truth, produced by its
 collision with error.(5)


II. THE MAINSTREAM LOUDOUN CASE

A. Background

Loudoun County is situated, rather ironically, in northern Virginia, home to many computer and Internet companies, including America Online.(6) On October 20, 1997, the Loudoun County Library Board of Trustees adopted an Internet filtering policy that has been called the "most restrictive" in the United States.(7) The Loudoun County Library "Policy on Internet Sexual Harassment" ("the Policy") requires that "[s]ite-blocking software ... be installed on all [library] computers" to "block child pornography and obscene material (hard core pornography)" and "material deemed Harmful to Juveniles ... (soft core pornography)."(8) The Board had the library edition of Log-On Data Corporation's X-Stop blocking software installed on all of its Internet-connected computers,(9) citing as a rationale the need to protect library users and employees from sexual harassment.(10)

An organization of local residents called "Mainstream Loudoun" filed a complaint against the library board in the Eastern District of Virginia on December 22, 1997, under 42 U.S.C. [sections] 1983.(11) The group was soon joined in the suit by seven intervening website operators whose Web pages were blocked by the library's filtering software.(12) The plaintiffs sought a permanent injunction prohibiting the library from using such software on its public access computers.(13)

The Mainstream Loudoun plaintiffs contended that the software blocked their access to protected speech.(14) They argued that the X-Stop software that the library had installed is typical of all Internet filters in that it is clumsy and ineffective, often blocking innocuous sites, such as the Quaker Home Page and the Zero Population Growth web site, while letting through ones with obscene content.(15)

The complaint further alleged that filter blocking is not content-neutral, that the library's Policy lacked clear criteria for blocking decisions, and that the library's unblocking policy "unconstitutionally chills plaintiffs' receipt of constitutionally protected materials."(16) The complaint stated that "[defendants] are in effect `removing books from the shelves' of the Internet by blocking many Internet sites with valuable educational, political, literary, artistic, social, and religious speech that would otherwise be available to library patrons."(17) Plaintiffs therefore claimed a violation of their First Amendment right of free speech.(18)

B. Defendants' Immunity Claims

On February 2, 1998, the defendants filed a Motion to Dismiss and a Motion for Summary Judgment, claiming alternatively legislative immunity, immunity under the Communications Decency Act, and qualified immunity.(19) Defendants based their claim to legislative immunity on Bogan v. Scott-Harris in which the Supreme Court explicitly extended [sections] 1983 absolute immunity to local government officials for their legislative activities.(20) Plaintiffs argued that the library board members were not entitled to legislative immunity because they were appointed rather than elected and thus there is no direct electoral check on the board members' actions.(21)

The presiding United States District Court Judge, Leonie Brinkema, found that the library board and its members were entitled to absolute immunity for their decision to adopt the Policy, as "a discretionary exercise of rulemaking authority."(22) However, under the Virginia Code, the library board is charged with the "management and control of [the] free public library system," and therefore the board's choice of the filtering software was an act of enforcement, not legislation or administration.(23) Based on Virginia case law, the judge held that the library board was not entitled to legislative immunity in its enforcement role.(24)

Defendants also claimed immunity under section 509 of the Telecommunications Act of 1996, now codified at 47 U.S.C. [sections] 230, "Protection for private blocking and screening of offensive material."(25) Judge Brinkema ruled that the defendants' reliance on this was not supported by the section's legislative history or relevant case law.(26) The Judge pointed to a Fourth Circuit opinion stating that Congress enacted [sections] 230 to minimize state regulation of Internet speech, not to shield government regulation of Internet speech from judicial review.(27)

Judge Brinkema denied the defendants' immunity claims in her April 7, 1998 Memorandum Opinion and Order. She also made a ruling that would come to be one of the most significant results of the case: that the constitutional standard of strict scrutiny would be used to evaluate the Board's decision to install filters on library computers.(28) On October 2, 1998, the judge announced that she would decide the case on summary judgment pleadings and evidence submitted by both sides, thus canceling the trial which had been scheduled to begin on October 14, 1998.(29)

C. The First Amendment Question and the Pico Precedent

Defendants conceded that the Board's Policy prohibited access to speech on the basis of its content.(30) They argued, however, that the First Amendment does not limit the decisions of a public library regarding the provision of public access to information on the Internet.(31) Thus, the question before the court was "whether a public library may, without violating the First Amendment, enforce content-based restrictions on access to Internet speech."(32)

The court stated that there were no cases directly on point, but all parties agreed that the most analogous authority was Board of Education v. Pico, 457 U.S. 853 (1982).(33) In Pico, the Supreme Court reviewed the case of a local board of education that removed books from a high school library because the board believed them to be "anti-American, anti-Christian, anti-Sem[i]tic, and just plain filthy."(34) The Court affirmed the Second Circuit's remand of the case for a determination of the board's motives.(35) Justice Brennan wrote the Pico plurality opinion, stating that the First Amendment limits the government's right to remove materials from a high school library on the basis of their content.(36) He reasoned that the right to receive information is inherent in the right to speak and that the State cannot "contract the spectrum of available knowledge."(37) Justice Brennan emphasized that this principle is critically important because of the school library's role as a site "for free and independent inquiry."(38) Unfortunately, Brennan and the other plurality justices were not able to achieve a majority.(39)

Justice Blackmun wrote a concurring opinion in Pico in which he focused on the school board's discrimination against unpopular ideas, not on an individual's right to receive information.(40) He stated that Pico concerned two competing interests: public high schools' responsibility to teach young people, and the First Amendment's prohibition of content-based speech regulation.(41) Justice Blackmun noted that the State must normally show a compelling reason for content-based regulation.(42) He felt, however, that in the context of public high schools, less protection is merited.(43) He agreed with the plurality that a school board could not remove books from shelves merely because the board disapproved of their content, but stated that a board could limit a library's collection based on educational needs or budgetary limitations.(44)

Joined by three other Justices, Chief Justice Burger dissented.(45) He stated that while the First Amendment guaranteed free speech, it did not go so far as to obligate the government to provide such speech in high school libraries.(46) Burger argued that forcing public schools to do so would be inconsistent with their duty to establish an appropriate curriculum.(47) This would undoubtedly be one of the arguments made by Congressional supporters of filtering legislation.

In Mainstream Loudoun, the defendants contended that the Pico plurality opinion had no application to this case because it addressed only decisions to remove materials from libraries and specifically declined to address library decisions to acquire materials.(48) Defendants described the Internet as being like a "vast Interlibrary Loan system," and argued that restricting Internet access to selected materials is really a decision not to acquire certain books, not a decision to remove already purchased books from a library's shelves.(49) As such, defendants argued, the Mainstream Loudoun case was outside the scope of Pico.(50)

In response, plaintiffs described the Internet as a "single, integrated system."(51) Plaintiffs analogized the Internet to a set of encyclopedias, arguing the Board purchased the whole set and cannot "black out" certain articles it deems inappropriate.(52) Plaintiffs also contended that Pico established a rule that removal decisions by libraries may not be resolved on summary judgment.(53)

The Mainstream Loudoun court agreed with the plaintiffs' analogy of Internet filtering to removing books from library shelves. Judge Brinkema interpreted Pico to stand for the "proposition that the First Amendment applies to, and limits, the discretion of a public library to place content-based restrictions on access to constitutionally protected materials within its collection."(54) A public library, "like other enterprises operated by the State, may not be run in such a manner as to prescribe what shall be orthodox in politics, nationalism, religion, or other matters."(55) The judge concluded that the defendants had misconstrued the nature of the Internet, and because they chose to provide Internet access to patrons, the First Amendment restricted the limitations they could place on patron access.(56)

D. The Strict Scrutiny Standard and The November 23, 1998 Decision

In addition to Judge Brinkema's landmark ruling on the nature of the Internet, her memorandum is significant because in it she held that a public library must satisfy strict scrutiny before engaging in content-based regulation of protected speech.(57) In order to survive strict scrutiny, the law or policy in question must be "narrowly tailored" to serve a "compelling government interest."(58) Few government actions restricting speech pass this test.(59) In denying summary judgment for defendants and applying the strict scrutiny test, Brinkema's ruling was consistent with the Pico decision.(60)

After Judge Brinkema's denial of the defendants' motion for summary judgment, the parties conducted extensive pretrial discovery.(61) In September of 1998, the plaintiffs and intervenors filed their motions for summary judgment and the defendants filed their opposition brief.(62) After hearing oral argument on the motions, Judge Brinkema issued a statement on October 2 that the case would be decided on the cross motions and supporting evidence and pleadings, thus canceling the October 14, 1998 trial date.(63) On November 23, 1998, the judge issued her Order, granting the plaintiffs' and intervenors' motions for summary judgment, along with a memorandum opinion explaining her decision.(64)

The library had argued that its Policy satisfied the strict scrutiny criteria. The court agreed that "minimizing access to illegal pornography" and "avoidance of creation of a sexually hostile environment" are compelling government interests.(65) However, the court ruled that the Policy could not survive strict scrutiny because it was not narrowly tailored to further those interests.(66) The Policy prohibited access to three types of speech: obscenity, child pornography and materials deemed harmful to juveniles.(67) The first two types of speech are not protected, and federal law makes transmitting them on the Internet illegal.(68) Thus, the library was perfectly within its rights in restricting its patrons' access to those types of speech. The problem with the X-Stop filter used by the library, however, is that it restricts adults' access to the third type of speech--non-pornographic, constitutionally protected speech that the software manufacturer has deemed harmful to minors.(69) This content-based Internet regulation limits adults to children's standards in contravention of the Supreme Court's ruling in Reno v. ACLU that "[t]he Government may not `reduc[e] the adult population ... to ... only what is fit for children.'"(70)

The Loudoun County Library Board could have prevailed by convincing the court that its Policy was absolutely necessary to protect children from obscene and pornographic material.(71) The Board, however, failed to prove that it had tried to implement a less restrictive alternative before installing the filters.(72) Such an alternative would be one that did not limit websites with content otherwise freely accessible to adults through other media.(73) The judge therefore found the Loudoun Policy to be overinclusive.(74)

At oral argument, defendants had also argued that a public library is not a public forum, and therefore not deserving of strict scrutiny.(75) In her November 23, 1998 memorandum opinion, Judge Brinkema held the library to be a limited public forum because one of its express missions is the "receipt and communication of information through the Internet," which it explicitly offers to the public.(76) She held that the library cannot then limit the receipt and communication of information based on the information's content.(77)

Finally, Judge Brinkema held that the library's unblocking policy was an unconstitutional "prior restraint."(78) The unblocking policy stated that library patrons who had been denied access to a site could submit a written request to have it unblocked.(79) The request had to contain the patron's name, telephone number and a detailed reason why he or she wanted to see the site. The library staff would then decide whether or not to unblock the site.(80) The First Amendment is violated when "the government (here in the form of the public library) makes the enjoyment of protected speech contingent upon obtaining permission from government officials to engage in its exercise under circumstances that permit government officials unfettered discretion to grant or deny the permission."(81) Prior restraints are disfavored by the First Amendment, even when used against illegal speech.(82)

Judge Brinkema stated that it was wrong of the library to completely entrust all blocking decisions to the filter manufacturer, Log-On Data Corporation.(83) Monitoring what children see and hear is traditionally a parental role and should not be so easily delegated, if at all, to a commercial entity that is primarily concerned with profit.

The Loudoun County Library Board filed an appeal with the Fourth Circuit on December 23, 1998.(84) However, on April 19, 1999, the board voted seven to two to drop the appeal.(85) Instead, the county libraries briefly suspended all Internet access while reformulating their policy.(86) The new policy makes filtering optional for adults and allows parents to decide whether to require filtering for their children or to let the children decide for themselves.(87) The policy clearly points out that the filters are "by no means foolproof" and warns that the library cannot control Internet content.(88) The policy also states that its mission is to "offer the widest possible diversity, views and expressions, insuring access to all avenues of ideas, to as many library customers as it can."(89)

The Mainstream Loudoun case is a significant victory for free speech on the Internet and the continuing integrity of the First Amendment. An appeal, however, might have resulted in an affirmance by a higher court, which would have been even more persuasive authority. Without a Supreme Court ruling upholding the strict scrutiny test for library actions limiting access to the Internet and stressing the importance of parental and community-level participation in protecting youth online, we remain vulnerable to a Congressional quick-fix solution. There are several recently introduced bills that, if passed, may render the free speech victory in Loudoun County moot.

III. INTERNET FILTERING LEGISLATION

A. The Communications Decency Act Held Unconstitutional

The Communications Decency Act ("CDA"), adopted in 1996, made it a felony to knowingly send obscene or indecent materials to minors.(90) While failing to define "indecent," the CDA was designed to restrict minors' access to material depicting or describing, in terms legislators considered offensive, sexual acts or the body's excretory functions.(91) Under the CDA, adults would have had to refrain from Internet speech that was unsuitable for minors whenever there was a possibility that minors could access the speech.(92)

In June 1997, in Reno v. ACLU, the Supreme Court held unconstitutional the portion of the CDA that prohibited "indecent" and "patently offensive" materials on the Internet.(93) The Court allowed the sections prohibiting obscene material to remain in effect because obscene materials do not enjoy First Amendment protection.(94) In striking down the CDA, the Supreme Court emphasized that a statute that burdens adult speech is unacceptable if a less restrictive alternative would be at least as effective, a criterion later applied by the Mainstream Loudoun court.(95) The Court noted the availability of "reasonably effective" blocking software as an alternative suitable for use by parents.(96)

B. New Internet Filtering Bills

The CDA may be gone, but three hundred and twenty-four bills with the word "Internet" in them were filed in the 105th Congress, as opposed to seventy-five the year before.(97) So far in the 106th Congress (1999-2000), 453 such bills have been introduced.(98) Ever since the Court overturned the CDA, several members of Congress, Republicans and Democrats alike, have proposed more narrowly tailored legislation that would accomplish the same purpose: to protect children from pornography on the Internet.(99) The names of these bills are confusingly similar, but all attempt to tie funding to filters.

One tenacious bill is the "Children's Internet Protection Act."(100) It was first introduced in the 105th Congress by Senator (and one-time presidential hopeful) John McCain, Republican of Arizona, as the "Safe Schools Internet Act."(101) The bill was sponsored in the House by Representative Bob Franks, Republican of New Jersey.(102) This bipartisan bill sought to require schools and libraries receiving federal Internet access subsidies to buy filters for their computers.(103) The bill stated that "the determination of what material is to be deemed harmful to minors shall be made by the ... library ..." and "[n]o agency or instrumentality of the United States Government may ... establish criteria ... (or) review the determination."(104) Although this bill was passed by the Senate on July 23, 1998, at the close of the 105th Congress it had not been enacted into law.(105)

The sponsors reintroduced the bill in the 106th Congress as the "Children's Internet Protection Act," which was passed as an amendment to the juvenile justice bill.(106) It requires schools receiving e-rate subsidies to use filters to "protect children from Internet pornography."(107) The Act adds a new subsection to Section 254 of the Communications Act of 1934, 47 U.S.C. [sections] 254, which is the statute upon which the Federal Communications Commission's schools and libraries program is based.(108) The program is popularly known as the "e-rate" or the "Gore tax," and subsidizes computer networking, Internet access and telephone service for schools and libraries.(109) To qualify for the program, a school or library must install technology to block material deemed harmful to minors.(110) The new version of the bill is stronger than the old one because it now contains language that specifically requires that the school or library receiving a subsidy use the software.(111) Under this Act, the FCC has rulemaking authority.(112)

There are several other similar bills pending. On July 20, 1999, Representative Ernest Istook introduced H.R. 2560, the "Child Protection Act of 1999," which would require public schools and libraries that receive federal funding for acquiring computers and online services to install protective software.(113) On August 6, 1999, the bill was referred to the House Subcommittee on Early Childhood, Youth and Families. There is also H.R. 896, the "Children's Internet Protection Act," introduced by Representative Franks on March 2, 1999.(114) It would require schools and libraries to use filters on computers with Internet access in order to receive universal service assistance.(115) The bill passed the House as part of the bloated H.R. 1501, the "Consequences of Juvenile Offenders Act."(116)

Finally, there is S. 1545, introduced by Senator Santorum. The "Neighborhood Children's Internet Protection Act," as it is called, would require schools and libraries receiving universal service assistance to install systems or implement acceptable policies for blocking or filtering Internet access to matter inappropriate for minors.(117) It also would require the study of available Internet blocking or filtering software.(118)

One bill that has already become law is the Child Online Protection Act ("COPA"),(119) also known as "CDA II" or "son of CDA."(120) It was signed into law by President Clinton in October 1998.(121) COPA prohibits commercial distribution of pornography to minors on the Web.(122) The law carries penalties of up to six months in prison and $50,000 in fines for commercial sites that make sexually explicit material available to minors.(123) It requires businesses to provide electronic age verification systems to check the age of online visitors.(124)

Lawrence Lessig, a Harvard Law School expert on cyberlaw, said that COPA, more narrowly tailored than its predecessor, could be judged constitutional.(125) Nevertheless, he urged Congress not to pass it, not only because of the privacy issues it raises, but also because the Internet is a rapidly developing phenomenon, and until we know more about how it will develop we should not pass laws that entrench technologies that may soon become unnecessary or ineffective.(126) Indeed, studies show that current filter technology is ineffective, and there are no indications that it will improve any time soon.(127)

States have also become engaged in the effort to protect children from obscene content on the Internet. This is evidenced by the number of states considering Internet censorship laws in recent years: at least twenty-five states between 1993 and 1998.(128) In 1998, ten states considered bills requiring libraries and/or schools to install filtering software.(129) Many states already have laws regulating Internet speech. For example, Connecticut and Florida passed laws making ISPs responsible for permitting their subscribers to violate obscenity laws.(130) In 1999, Arizona adopted a statute(131) requiring public schools and libraries that provide public access computers to install blocking software "that will limit minors' ability to gain access to material that is harmful to minors" or to use an ISP that provides filtering service.(132)

State laws such as these, however, have not held up to constitutional scrutiny.(133) Federal courts in Virginia, Georgia, New York and New Mexico have found these censorship laws unconstitutional in First Amendment challenges brought by the ACLU.(134) Recently, in ACLU v. Johnson, the Tenth Circuit rejected a New Mexico state law that banned Internet speech deemed "harmful to minors," on the basis that such a law censored valuable, protected speech for adults.(135) Ironically, just a few days after that law was struck down, the New York Times reported that New York City schools were blocking access to many valuable sites using I-Gear censoring software.(136) This blocking led students at Benjamin Cardozo High School to compile a list of wrongly banned sites and give it to a local ACLU affiliate in the hope that the organization would take legal action to end the Internet blocking.(137)

IV. THE CLINTON ADMINISTRATION'S STANCE AND PUBLIC RESPONSES TO INTERNET FILTERING LEGISLATION

It is not surprising that the spirit of the pending federal legislation appears to be in direct opposition to the Clinton Administration's views on the Internet. On July 1, 1997, the Administration issued a report titled "A Framework for Global Electronic Commerce," detailing its agenda on trade and technology and its aim to spur the growth of global commerce across the Internet.(138) The Framework is built around five central principles, the first two of which are salient here.(139) The first principle is that the private sector should lead this growth through industry self-regulation and limited government regulation.(140) The second is that the government should avoid imposing unnecessary restrictions on electronic commerce because such regulation may thwart development and may not be able to keep pace with rapid technological developments in the 21st century.(141)

The Framework names three areas in which international agreements are necessary to preserve the Internet as a "nonregulatory medium": financial issues such as customs and taxation, legal issues such as privacy and security, and market access issues such as pricing.(142) Also, according to the Clinton policy, the United States supports the free flow of information across borders.(143) The Report addresses regulation of adult content, taking the position that parents should be able to block their children's access to inappropriate material, but stating that traditional governmental content regulation applied to the broadcast media will not be applied to the Internet.(144)

Regardless of the Executive Branch's expression of support for free speech on the Internet, the passage of COPA and the recent rash of proposed legislation mandating filtering have caused a backlash from groups dedicated to protecting free speech.(145) The ACLU has been the most vocal and, in conjunction with other civil liberties groups, it has vigorously opposed mandatory filtering.(146) This is because many filters inadvertently block harmless, protected speech, such as websites on breast cancer and AIDS awareness, a page devoted to Liza Minnelli, and the site of a small New Jersey greengrocer.(147) Many also intentionally block sites relating to controversial topics such as homosexuality.(148)

The ACLU joined with the Electronic Frontier Foundation in June 1998 in warning that forcing schools and libraries to install blocking software would cost some communities hundreds of thousands of dollars in software and staffing requirements.(149) The ACLU also joined with several other activist groups to form the Internet Free Expression Alliance.(150) The group issued the "Joint Statement for the Record on Legislative Proposals to Protect Children from Inappropriate Materials on the Internet."(151) The Joint Statement was submitted to the House Commerce Committee's Subcommittee on Telecommunications, Trade and Consumer Protection in September 1998.(152) It expressed the concerns of library, academic, free speech and journalistic organizations about pending legislation that could have a negative impact on freedom of speech on the Internet.(153) The group quoted the Supreme Court's observation that the Internet is a venue where "any person can become a town crier with a voice that resonates farther than it could from any soapbox."(154)

In drafting this much-opposed legislation, Congress has also been criticized for acting with too much haste and too little deliberation on the matter of Internet filtering.(155) The Center for Democracy and Technology has criticized Congress for not holding appropriate hearings on these bills.(156) The original version of Senator McCain's bill and the COPA were both approved by the Senate during the 105th Congress as amendments to appropriations bill S. 2260, and neither received any floor debate.(157) Just as it fought the CDA, the Center has vowed to fight mandates that would restrict free speech on the Internet through the use of notoriously clumsy software.(158)

In November 1999, the ACLU succeeded in having overturned a New Mexico state law banning Internet speech deemed "harmful to minors."(159) The ACLU argued that such laws censor valuable speech for adults, including discussions of women's health, gay and lesbian issues, censorship and civil liberty issues.(160) The ACLU then moved from the Tenth Circuit to the federal Court of Appeals in Philadelphia to argue that COPA, with its similar "harmful to minors" language, is equally unconstitutional.(161)

Finally, various studies, such as one released in June 1998 by The Censorware Project, an Internet watchdog organization, claimed that makers of filtering software often mislabel free speech and advocacy sites as sexually explicit.(162) These manufacturers build what are called "proprietary site lists" containing the categories of information their software will block.(163) The problem with this is that it privatizes control of Internet content and places control and accountability solely in the hands of companies whose greatest concern is to make a profit.(164) The ACLU and other groups have been quick to seize upon information regarding filters' ineffectiveness and the shifting of content control to private companies, and they use it as ammunition in their fight against the proposed Internet filtering legislation.

V. HOW FILTERS WORK

This debate is centered on filters that attempt to hide "offensive" sites on the Internet. The best-known Internet platform is the World Wide Web, which contains in excess of 100 million documents and increases by the thousands daily.(165) To gain access to the Web, a person uses a web "browser" to display, print and download documents.(166) To find documents with an unknown address, the user needs a "search engine," like Yahoo! or Lycos.(167) The user types in one or more words as a search request and receives a list of matching sites.(168)

With blocking software, certain sites can be hidden or omitted.(169) Depending on the vendor, the software may block as few as six categories of information, or many more.(170) Some of the blocked categories are (1) criminal or violent activity; (2) "adult" material; (3) hate speech; and even (4) sports and entertainment.(171) There are several different types of filtering methods, such as keyword blocking and Web rating systems.(172) Keyword-blocking filters use a list of terms the manufacturer thinks consumers will find objectionable.(173) These terms relate to sexuality, human biology and sexual orientation; examples include the words "queer" and "penis."(174) One filter, Cybersitter, also blocks terms like "death," and Cyber Patrol blocks words as seemingly innocuous as "pain."(175)

There are two main problems with most software configured to block words and letters. The first is that such software is innately clumsy. For example, software configured to block "xxx" filters out such things as Super Bowl XXX web sites.(176) Blocking the word "breast" means breast cancer awareness sites are blocked. The consecutive letters "s," "e," and "x" block sites dedicated to such topics as the Mars Explorer. The second problem is that keyword blocking relies on the assumption that words have only one meaning.(177) This is especially problematic given the international character of the Internet. One example of this problem is the word "Roger": it is never blocked, but in Australia it is slang for "penis." Another is the word "cock": slang for "penis," but often blocked even when it is being used in relation to guns or birds.(178)
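
To make the first problem concrete, the sketch below shows the essence of keyword blocking in Python. The keyword list and the addresses are hypothetical stand-ins rather than any vendor's actual code, but the substring matching is representative of the technique.

    # A naive keyword filter: block any address or page text that
    # contains a listed term. The list and addresses are hypothetical.
    BLOCKED_KEYWORDS = ["xxx", "sex", "breast"]

    def is_blocked(address_or_text: str) -> bool:
        s = address_or_text.lower()
        return any(keyword in s for keyword in BLOCKED_KEYWORDS)

    # Each of these innocuous addresses is wrongly blocked:
    print(is_blocked("superbowl-xxx-recap.example.com"))  # True: "xxx"
    print(is_blocked("breastcancerinfo.example.org"))     # True: "breast"
    print(is_blocked("marsexplorer.example.net"))         # True: "sex" hides in "marsexplorer"

Because the filter sees only letter sequences, never meanings, refining the list merely trades one kind of error for another.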

Site blocking is another filtering method. With it, people identify objectionable Internet sites according to certain criteria and place them on either access or denial lists.(179) Site blocking has two major flaws: first, files may get through if their content does not closely resemble what the reviewers are flagging (and even then, the method is only reasonably accurate with pornography); and second, many innocuous sites are blocked.(180)
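
A similarly minimal sketch, again with invented hostnames, suggests how both flaws follow from the fact that the denial list is compiled and maintained by hand:

    # List-based site blocking: consult a manually compiled denial list.
    # All hostnames here are hypothetical.
    DENIAL_LIST = {"hardcoresite.example.com", "xxxfetish.example.com"}

    def site_allowed(hostname: str) -> bool:
        return hostname.lower() not in DENIAL_LIST

    # Flaw 1: a brand-new or renamed site passes until a reviewer lists it.
    print(site_allowed("new-porn-site.example.net"))    # True, wrongly allowed

    # Flaw 2: one mistaken entry silently blocks an innocuous site.
    DENIAL_LIST.add("quaker-homepage.example.org")
    print(site_allowed("quaker-homepage.example.org"))  # False, wrongly blocked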

There are four other major blocking methods.(181) Protocol blocking denies access to all the resources of a particular type of Internet service, such as telnet, gopher, and Usenet, where certain chat rooms are located.(182) Time blocking limits user access by time of day.(183) Client blocking assigns access levels to specific locations, such as the children's room in a library.(184) User blocking is similar in that a person can only log in at his or her assigned user level, "child" or "adult."(185) Finally, some support the institution of Web rating systems, which would warn users of objectionable content, much like the ratings currently used on television and on music packaging.(186)

As previously discussed, keyword blocking is inefficient.(187) The main reason why site blocking and rating technologies do not work is the sheer size and rapid rate of growth of the Web.(188) The Web contains over one billion pages and twenty trillion (20,000,000,000,000) characters of text.(189) Over two million pages are added daily, twenty-five new pages per second.(190) Internet search engines index the Web and put sites in categories such as science and education, government, health, personal, community, religion and pornography.(191) Filter manufacturers determine which sites to block by checking the search engines' lists of pornographic sites.(192) Given these huge numbers, it can take several months for such sites to be listed by the search engines, and together the search engines cover less than fifty percent of the total number of Web pages.(193) Thus, even if filters worked perfectly, they would still leave a tremendous amount of the Web unfiltered.
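
A back-of-envelope calculation with the figures just cited makes the point concrete. This is only a sketch: the ninety-day lag is an assumption at the low end of "several months."

    # Figures taken from the text above; the 90-day lag is assumed.
    pages_total   = 1_000_000_000   # "over one billion pages"
    added_per_day = 2_000_000       # "over two million pages" added daily
    coverage      = 0.50            # engines index less than fifty percent

    never_indexed = int(pages_total * (1 - coverage))
    lag_backlog   = added_per_day * 90
    print(f"{never_indexed:,} pages invisible to filter makers")  # 500,000,000
    print(f"{lag_backlog:,} new pages during a three-month lag")  # 180,000,000

On these generous assumptions, half a billion pages never pass before a list-builder's eyes at all.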

Those who attempt to rate web sites instead of blocking them have an equally difficult task. They would need to employ an army of staff members to search the Web around the clock for new sites.(194) Furthermore, once a site has been rated, its content may change.

With just this little bit of knowledge, one wonders why Congress and some libraries and schools blindly embrace filtering technology. A July 1999 study by the Censorware Project shows why these parties should be concerned.(195) Censorware conducted a study of Bess, a filter made by N2H2 that is used in schools in Oklahoma, Tennessee, Wisconsin, Maine, California, Massachusetts, Washington, Florida and New York.(196) The study showed that many sites were inappropriately blocked, while many sites containing hard-core pornography were not, even though all the pornography categories in the software were activated in each test.(197) For example, Bess allowed access to sites such as: <asianslut.com>, <eliteporn.com>, <orgasm.com>, and <xxxfetish.com>.(198) With these results, it becomes difficult to argue that some filtering is better than none at all.

VI. LIBRARIES HAVE A TOUGH CHOICE TO MAKE: INADEQUATE FILTERS OR EXPOSURE TO LIABILITY

In Alameda County, California, a twelve-year-old boy used his local library's computer to access the Internet and download pornographic pictures, which he later printed at a relative's home.(199) On May 28, 1998, his mother filed suit against the library for failing to limit children's access to the Internet.(200) In the complaint, she alleged that the library's provision of unfiltered Internet access wastes public funds, creates a public nuisance and makes the library unsafe for children.(201) The boy's mother sought to compel the library to install filtering software.(202) The library had considered installing filtering software but decided against it.(203) Fortunately, the case was dismissed.(204)

As no federal Internet filtering mandate has yet become law, and there is no dispositive Supreme Court case on the constitutionality of filtering Internet content, public librarians are left without a clear understanding of their legal rights and responsibilities to their patrons. Those who decide not to install filters face legal liability, while those who opt for filters have the difficult task of selecting a filter that will fit their goals and budget.(205)

A Practical Guide to Internet Filters by Karen G. Schneider is a book by a librarian written for librarians deciding whether or not to install a filter, and if so, which one.(206) Schneider describes the Internet as a great benefit to libraries, expanding services in even the smallest libraries in unheralded ways.(207) The Internet allows libraries to offer "government, educational, not-for-profit and self-published resources" to which libraries otherwise might not have access.(208)

Schneider explains the traditional way in which libraries select what they are going to offer their patrons, acting in what she calls "in loco community."(209) Librarians select books, CD-ROMs and databases from vendors' catalogs, a process Schneider describes as "a level of culling."(210) These catalogs eliminate works such as bomb-making guides and pornography.(211) The Internet for the most part eliminates this phase of content control.(212)

Schneider points out that for some users, the library is the only place they can get Internet access.(213) That is especially true of children in low-income rural and inner-city areas,(214) adding another facet to the Internet filtering debate: whether it is a violation of equal protection to let affluent children, who likely have computers at home and in school, search the entire Internet while less fortunate children can only access the (poorly) filtered version in libraries.

In the time since the Supreme Court struck down the CDA there has been tremendous growth in the blocking software market.(215) Analysts at an international technology consulting firm reported that software makers sold $14 million in blocking software in 1997 and predicted that total sales would increase to over $75 million by the end of 2000.(216) There are various types of software available: LAN-based software for networks; stand-alone proxy-server software that acts as an "Internet gateway" for the server; proxy-server "plug-ins" that work in conjunction with existing proxy-server software; firewall software that sits on a Unix framework; and client software for individual workstations.(217) Client software products like Cybersitter and Cyber Patrol are very popular with families because they can be easily installed on home computers.(218) Filtering can also be obtained online through an Internet service provider (ISP) for a few extra dollars per month.(219)

Many library boards have forced the libraries to install some type of filtering software despite the objections of the American Library Association ("ALA").(220) In its 1998 statement at the Senate Commerce, Science and Transportation Committee's hearing on indecency on the Internet, the ALA stated:
 [W]e must balance the extraordinary value [new technologies] bring to
 communications and learning with responsible use and careful guidance.
 However, we are concerned that a federal mandate to filter intrudes
 unnecessarily into the prerogatives of local community-based
 institutions--our public schools and public libraries--as well as into the
 professional expertise and judgement of public and school librarians.(221)


The ALA is concerned that Congress is ignoring the central holding of the decision in which the Court overturned the Communications Decency Act.(222) In its statement, the ALA reminded Congress that in Reno v. ACLU the Court unanimously struck down the CDA, which attempted to restrict minors' access to "indecent" and "patently offensive" material on the Internet.(223) That decision provided the Internet with First Amendment protection, equal to that of books, newspapers and public speakers.(224) The Court also recognized that "both children and adults have a constitutional right to access the vast majority of information on the Internet," and "while children's rights to information are not coextensive with adults [sic], broad measures to keep them away from all `indecent' material are both unconstitutionally vague and over broad."(225)

VII. THE 1990'S SUPREME COURT AND FREE SPEECH

The Supreme Court of the early 1990s cut back on First Amendment rights.(226) In the first half of the decade, the Court underwent significant changes: William Brennan was replaced by David Souter in 1990; Thurgood Marshall by Clarence Thomas in 1991; Byron White by Ruth Bader Ginsburg in 1993; and Harry Blackmun by Stephen Breyer in 1994.(227) Except for the replacement of Justice White by Justice Ginsburg, "each Justice was at least `perceived' to have been succeeded by a Justice less sympathetic to First Amendment values."(228) In the 1990s, Supreme Court decisions on First Amendment cases have been anything but predictable.

The Court has upheld several important free speech claims, such as in Reno v. ACLU in 1997.(229) In that case, Justice Stevens wrote in his majority opinion, "[t]he interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship."(230) Five years earlier, in R.A.V. v. City of St. Paul,(231) the Court invalidated a St. Paul, Minnesota ordinance that criminalized symbolic expression that "arouses anger, alarm or resentment in others on the basis of race, color, creed, religion or gender."(232) In the majority opinion written by Justice Scalia, the Court "struck down the law precisely because it singled out ... certain disfavored viewpoints."(233) Advocates of free speech view this case as a significant victory.(234)

Previously, however, the Rehnquist Court had chipped away at the principle of viewpoint or content neutrality with its decision in Rust v. Sullivan.(235) In Rust, the Court upheld the "gag rule" promulgated by the Department of Health and Human Services under President Reagan.(236) The rule prohibits all healthcare workers in family planning clinics that receive federal funding from giving abortion information, even in situations where terminating the pregnancy is in the mother's best medical interest.(237) If a woman asks about abortion the staff must respond: "This clinic believes that abortion is not an appropriate method of family planning."(238) This is not merely a blow to reproductive and privacy rights, but to free speech as well.(239)

In Rust, the Court reasoned that if the government provides funds to an individual or institution, the government may attach strings to those funds.(240) A string could even comprise "a waiver of constitutional rights, including the right to be free from governmental prohibition of speech based upon its viewpoint."(241) As previously discussed, the various bills working their way through Congress, such as The Children's Internet Protection Act, would require libraries receiving federal funding for Internet access to install blocking software on all of their computers.(242) Based on the Rust precedent, the Court would likely uphold the constitutionality of such a law, regardless of the restraints it would impose on the First Amendment rights of Internet users.

In the 1990s, the Court also discarded a long-standing principle of free speech: the "clear and present danger" requirement for limiting speech.(243) According to this principle, only "actual or imminent harm" justifies the restraint of expression.(244) However, in Barnes v. Glen Theatre, Inc.,(245) the Court upheld an Indiana public indecency statute, as applied to proscribe nude dancing in South Bend establishments.(246) Notably, the Court upheld the statute's constitutionality even though "nude dancing is prima facie protected expression"(247) that causes no actual or imminent harm and results in no harmful secondary effects.(248)

In Barnes, the Court acknowledged that "nude dancing is prima facie protected expression," yet held that it could be prohibited without any proof of "actual or imminent harm."(249) With no constitutional violation to point to, the Court could only base its ruling on a "generalized community interest in `morality.'"(250) As such, the Court now has precedent to bolster future decisions in which the Constitution is not violated but the Court's sense of morality is offended. Quite possibly, the Court will soon have the opportunity to moralize about other types of protected speech, such as web sites dealing with AIDS awareness or gay rights.(251)

Armed with the Barnes precedent, the Court has a strong foundation for upholding the first narrowly-tailored Internet filtering law to reach it. As Justice O'Connor pointed out in Reno v. ACLU, there is also "no question that Congress would have enacted a narrower version of these provisions had it known a broader version would be declared unconstitutional."(252) That seems obvious. What is also obvious, based on the Court's treatment of free speech in non-Internet related cases,(253) is that adults' free speech rights to communicate on the Internet are not as secure as the Mainstream Loudoun and Reno v. ACLU decisions may have led us to believe.

VIII. ALTERNATIVES TO FILTERING

The United States is not the only nation facing the difficult task of keeping pace legislatively with the development of the Internet. Other nations' responses can shed some light on the dimensions of the problem of controlling and monitoring Internet content for users of all ages.

A. Foreign Responses

1. China

Internet use in China is heavily regulated,(254) as the government attempts to monitor every site and every user's communications in the name of national security.(255) All users (almost 7 million in 1999 and growing rapidly)(256) must be registered license-holders, and all Internet traffic is routed through a number of firewalls to screen out material the government considers "politically harmful."(257) China presently has two major government-authorized ISPs, JiTong and ChinaNet, under the jurisdiction of the new Ministry of Information Industry (the "MII").(258) These ISPs get competition from approximately 150 independent ISPs, all of which supply monitored Internet access.(259)

The Chinese government never intended to permit full freedom of information via the Internet.(260) Today, however, the Chinese government faces a conflict: it seeks to control the flow of information into and out of the country while at the same time fostering the commercial potential of the Internet.(261) China is beginning to recognize the Internet's usefulness as both an electronic exchange and a channel for essential business and legal information.(262) China must increase foreign investment in order to create jobs to replace those lost to the "streamlining" of state-owned enterprises. To do this, the country must create a stable, open commercial environment.(263) One way to achieve this goal is with the help of the Internet, through which foreign investors can learn about Chinese trade practices and laws.(264) Global publication of these laws will help ensure stability and consistency.(265)

Many Chinese Internet companies are looking to the West for the financial support they need in order to grow.(266) For instance, sina.com and sohu.com have raised millions of dollars from investors such as Intel and Dow Jones.(267) Both companies plan to sell shares on NASDAQ.(268) However, while Beijing has recognized the need for openness in the Internet business arena, the government does not intend for that openness to carry over into freedom of expression on the Internet. In November 1999, China signed a trade agreement with the United States which, by 2002, could open the door to foreign ownership of up to fifty percent of Chinese Internet companies. The trade agreement may also result in foreign ownership of up to forty-nine percent of Chinese ISPs by 2006.(269) There is still the question, however, of how much freedom the Beijing government will give the Internet. Sina.com and Sohu.com have partnerships with Western media companies like Dow Jones and Reuters, but the government pressures them to provide only financial news.(270) Beijing has also warned them not to provide links to the Web sites of foreign news organizations.(271)

2. The European Community

The European Community provides a view of the Internet at a different stage of development than in China. Because Internet use is growing at a tremendous rate, the European Community must now respond to the problem of regulating Internet content. It is doing so in several ways, predominantly by promoting self-regulation among Internet content providers.(272) Under the European Community's plan, ISPs would voluntarily subject themselves to a ratings scheme similar to that used for television and music in the United States.(273)

Announced in Great Britain in September 1996, the Internet Watch Foundation ("IWF") established e-mail, telephone, and fax hotlines for reporting obscene content on the Internet.(274) Once the IWF informs British ISPs that such content has been found, the ISPs must remove the content.(275) This type of system can set a dangerous precedent for private censorship. In addition, the IWF's method fails to reach the root of the problem. First, the ISPs themselves do not produce the obscene content. Second, the ISPs could hardly monitor every Web site on a daily basis, given the Internet's tremendous and ever-changing content. Thus, although the European Community's method of protecting people from obscene online information differs from the United States' approach, it does not appear to promise any more success.

B. Single-Nation Legislation Is Not the Answer

It is likely that the United States Supreme Court will soon be called upon to rule on the constitutionality of a quick-fix Internet filtering law, drafted and passed by a Congress that takes an excessively narrow view of the Internet's nature. Critics argue that laws passed by any single nation will be insufficient to regulate Internet content. This view is rooted in two facts concerning the Internet's history. First, since the Internet's inception, its users "have considered it a law-free zone, or at most, subject not to national laws but only to a sort of ill-defined global or international law of cyberspace."(276) This is so because until the early 1990s, the majority of Internet users were students, academics and individuals in the computer industry.(277) Since the Internet was mainly used for non-commercial purposes during this time, an atmosphere of "anarchy, libertarianism and free speech" was created.(278) Today, Internet users represent all age groups and socioeconomic and cultural backgrounds. Second, material on the Internet routinely crosses national boundaries unmonitored and in high volume.(279) Consequently, no single state can effectively regulate it.(280)

If a multinational approach becomes necessary, who should decide what content is inappropriate for viewers of various ages? Which country's ethical norms should control the Internet's content? The beauty of the Internet is that it provides anyone with access a nearly boundary-free forum for the open exchange of ideas and opinions with people throughout the world. Considering the vastly diverse notions of morality held by people within the United States, how does Congress think it will single-handedly regulate a medium such as the Internet, which possesses content as varied as its users? Obviously, unilateral American legislation will not translate into an effective or desirable solution.

Single-nation legislation is not, however, the only conceivable means of policing the Internet. Certain alternatives, such as nongovernmental regulation, have proven very influential in protecting intellectual property rights. For instance, the World Intellectual Property Organization ("WIPO"), a United Nations body, has helped formulate global policy and coordinate multilateral conventions on copyright and trademark issues.(281) An international convention, where countries could formulate content regulations and work out issues of liability regarding obscene and offensive material on the Internet, could provide a truly effective means of addressing the current problem. Acting together, the concerned nations could reach an international agreement on the definition of obscenity, and thus develop a unified approach toward regulating Internet content.(282)

As an additional means of implementing a workable solution to the existing problem, a multi-layered system of Internet governance has been proposed. Such a system would be composed of national and international legislation, as well as self-regulation by ISPs and online users.(283) In addition to legislation, such systems could include filters used by parents and teachers, hotlines and organizations established to report obscene content, and codes of conduct drawn up and followed by ISPs.(284)

One regulatory method that is attracting significant attention involves a global system designed to rate and filter Internet content. This scheme was described in a recent memorandum by the Bertelsmann Foundation of Germany,(285) a non-profit social policy organization associated with Bertelsmann A.G.(286) In September 1999, the Foundation held a three-day conference in Munich on Internet self-regulation, during which it presented its memorandum.(287) The plan is a product of the Foundation's staff but was greatly influenced by the work of Jack M. Balkin, a Yale Law School professor and strong advocate of First Amendment freedoms.(288) Balkin told the New York Times that, like it or not, "we are going to have filters."(289) The professor added that "filtering will be an inevitable feature of the Internet, given the glut of information available and the need to protect children from potentially harmful content."(290) Thus, according to Balkin, the task at hand is to determine the best design for a filter, one that simultaneously preserves civil liberties.(291)

Balkin co-developed a three-layer filtering scheme. In the first layer, web site operators around the world would voluntarily describe their content using a standard set of "descriptors."(292) Balkin "hopes" web site operators in different countries will apply these descriptors uniformly, thus creating a reliable form of self-rating.(293) At the second layer, groups ranging from the National Rifle Association to the ACLU would voluntarily create templates.(294) These templates would combine and rank combinations of the content descriptors, thereby producing a filter that reflects the group's ideology.(295) Internet users would then choose a template, and their browsers would use the descriptors and templates to determine which sites to block.(296) At the third layer, groups could release "white lists" of approved sites that the second layer blocked inappropriately.(297) For example, a white list of news sites would override a filter that blocks news sites reporting violent content.(298)
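To make the mechanics of the three layers concrete, the following is a minimal sketch, in Python, of how a browser might combine them. The descriptor names, the template format (here, simply the set of descriptors a group chooses to block), and the sample sites are all invented for illustration; neither Balkin nor the Bertelsmann memorandum prescribes any particular data format.

 # Hypothetical sketch of the three-layer scheme; all names and data
 # formats below are invented for illustration.

 # Layer 1: sites voluntarily self-rate with standard descriptors.
 SITE_DESCRIPTORS = {
     "news.example.com": {"violence"},              # e.g., war reporting
     "adult.example.com": {"nudity", "explicit"},
     "museum.example.org": {"nudity"},              # e.g., classical art
 }

 # Layer 2: a group's template blocks certain descriptors.
 TEMPLATE_BLOCKS = {"explicit", "violence"}

 # Layer 3: a white list of approved sites overrides the template.
 WHITE_LIST = {"news.example.com"}

 def is_blocked(url):
     """Apply the three layers: white list first, then the template
     against the site's self-rated descriptors."""
     if url in WHITE_LIST:
         return False                                # layer 3 wins
     descriptors = SITE_DESCRIPTORS.get(url, set())  # layer 1
     return bool(descriptors & TEMPLATE_BLOCKS)      # layer 2

 for site in sorted(SITE_DESCRIPTORS):
     print(site, "blocked" if is_blocked(site) else "allowed")

In this sketch, the white list allows news.example.com despite its "violence" descriptor, while adult.example.com stays blocked; the third layer thus corrects the second layer's overbreadth without disabling it generally.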

Balkin believes this scheme would preserve civil liberties because it would produce filters from a wide variety of groups.(299) Under this method, the government could not attempt to require the use of a particular filter, say in a public library, "because there is no standard template."(300) Moreover, such governmental action would amount to "thought control."(301)

Many free speech advocates do not share Balkin's faith in this system.(302) They feel it would have the opposite effect, making it easy for governments to require every web site in their jurisdiction to self-rate.(303) For instance, John Perry Barlow, co-founder of the Electronic Frontier Foundation, believes the scheme would facilitate government-mandated filtering.(304) Additionally, the ACLU's Barry Steinhardt states that "[s]elf rating schemes will punish those sites who choose not to self-rate" and that such a scheme "creates a risk of rendering a variety of speakers invisible."(305) Furthermore, Marc Rotenberg, executive director of the Electronic Privacy Information Center, believes "[a] diversity of filters is not a diversity of viewpoints. It is a collection of fears and prejudices."(306) Professor Balkin responds that the risk of the government taking advantage of this scheme should not prevent the design of effective filters that protect freedom of speech while also protecting young Internet users. According to Balkin, a fight against governmental filtering misuse is "best carried out through political and legal pressure, rather than by simply opposing technological development."(307)

The Bertelsmann Foundation's scheme closely resembles Balkin's. The focal point of the Foundation's proposal is an international system under which Web publishers would voluntarily rate their sites so that the ratings could be read by blocking software.(308) The proposal encourages the emergence of numerous different filters, reflecting a range of cultural values and political ideologies.(309) Parents could then choose whichever filter suits their beliefs.(310) According to the Foundation, both the ratings initiative and the conference were intended to "stave off government regulation of Internet content, while striking an appropriate balance between free speech concerns and the need to protect children."(311) A spokesman told the New York Times, "We want to enable parents, and no one else, to make choices about the content their children see."(312) It does not appear that the Foundation considered how teachers and librarians would use the filters.

Free speech advocates voice the same concerns about the Bertelsmann proposal as they do about Balkin's system. Specifically, they fear that an international rating and filtering system will make it easier for governments to require that web sites self-rate and that computers in their jurisdiction use filters.(313) Moreover, free speech advocates worry that the fear of being filtered out will prompt Web publishers to make their content more mainstream in order to avoid government censure.(314) Such self-censorship would kill the uniqueness of the Internet as a medium of communication.

Considering the number of web sites on the Internet, their ever-changing content, the speed with which new sites are added, and Congress' zeal in requiring filtering, it becomes clear that existing filters are not up to the task. A new filter such as the one described by Professor Balkin may therefore be necessary. Parental options would then not only increase, they would also improve. A filtering problem would still exist in schools and libraries, however. While new technology and a global approach are steps in the right direction, more is needed.

C. Libraries' Proactive Responses

On the local level, we face daily the problem of using existing technology to protect children from indecent material on the Internet. While we wait for nations to join together and address the problem at its roots, there are solutions we can employ in our own communities. Several American public libraries have instituted programs geared toward protecting their young patrons. For instance, the public library in Canton, Michigan implemented its Cyber Kids Program ("CKP") in 1996.(315) Among its four components, the CKP includes an awareness and approval program. Initially, parents and children sign a "Cyber Kids Consent Form,"(316) whereby the library warns parents about Internet material they may not want their children to see.(317) The form also allows the parents to acknowledge that the library cannot and will not act in loco parentis.(318) The form requires that the child agree to follow the library's policy. Children who sign the form receive a Cyber Kids sticker.(319)

Thereafter, the library staff conducts an introduction to the Internet for the parents and children. The staff offers safety guidelines and rules for using the library's computers. For instance, children are instructed always to wear headphones and to display their Cyber Kids sticker in a pocket on the computer.(320)

The Canton Library also has a detailed public relations plan, including city-wide mailings and local media coverage. This plan not only advertises the program, but also reinforces membership rules and guidelines.(321) Finally, the library has paid careful attention to the physical layout of the Cyber Kids room. It is near the reference desk and easy to locate.(322)

If a library perceives a need to filter, it can still ensure adults' unrestricted Internet access by installing and using filters selectively. Users of adult terminals must be warned that no filter is completely accurate in what it excludes; a patron who uses a filtered terminal cannot know what information is being withheld. Parents also need to know whether their children use filtered terminals, because such terminals often fail to block sites that parents deem inappropriate for their children.(323)

Whether or not library plans include filtering, they must focus on awareness, parental involvement, and user responsibility. Carefully planned and rigorously maintained programs can help public libraries avoid litigation over a child's access to inappropriate or illegal material. Although these programs may not eliminate the problem of unmonitored Internet content, they can go a long way toward protecting children and securing adults' rights.

IX. CONCLUSION: THE FUTURE OF INTERNET FILTERING

Despite the recent victory in Mainstream Loudoun, it appears quite likely that the federal government will soon require public libraries that receive federal funding to install filters on their computers. Notwithstanding some positive signs on the state level, recent Supreme Court decisions suggest that the Court would find such a law constitutional, as long as it is narrowly tailored to the goal of protecting our youth. With greater awareness of the problem, Congress may follow commentators in recognizing the importance of pursuing Internet regulation on a multinational level. Such an approach would provide the only means of attaining effective and fair regulation of Internet content without delegating the task to ISPs or software manufacturers.

With or without a more effective Internet regulatory scheme, parents need to become more involved with their children's experiences on the Internet--at home, at school and in the library. Parents cannot expect others to regulate what their children are exposed to, nor should they want those who may not have their children's best interests in mind to do so. Public libraries, too, must act rather than remain idle until Congress or their state legislature tells them whether or not to filter.

There are alternatives to the pending legislation and to the ineffective filters on the market. For Congress, these options include drafting well-researched and well-informed Internet laws that will not constrict the medium's development. This choice may require more time and effort than pushing an Internet filtering bill through Congress with little or no floor debate. In the end, however, such laws would be more enduring, more effective and more respectful of everyone's rights. For the rest of us, we must safeguard and cherish what the Internet allows us to exercise on a global basis--our right to free speech.

(1.) See Reno v. ACLU, 521 U.S. 844, 850 (1997). In this landmark case, the Supreme Court states that the number of Internet "`host' computers--those that store information and relay communications--increased from about 300 in 1981 to approximately 9,400,000" by 1996, with roughly sixty percent of those in the United States. Id. About forty million people used the Internet in 1996 and that number was expected to grow to 200 million by 1999. See id.

(2.) See Blocking Software Bills, TECH L.J. (visited Nov. 21, 1998) <http://www.techlawjournal.com/congress/blocking/Default.html>. See also The Children's Internet Protection Act, H.R. 896, 106th Cong. (1999); The Children's Internet Protection Act, S. 97, 106th Cong. (1999); The Child Protection Act of 1999, H.R. 2560, 106th Cong. (1999); The Neighborhood Children's Internet Protection Act, S. 1545, 106th Cong. (1999), infra Section III B.

(3.) See Mainstream Loudoun v. Board of Trustees of the Loudoun County Library, 24 F. Supp. 2d 552 (E.D. Va., 1998). See also Mainstream Loudoun v. Loudoun County Library (Blocking Software Case), TECH L.J. (visited Mar. 1, 1999) <http://www.techlawjournal.com/courts/loudon/Default.html> [hereinafter Blocking Software Case].

(4.) Mainstream Loudoun, 24 F. Supp. 2d at 556.

(5.) John Stuart Mill, Of the Liberty of Thought and Discussion (1859), reprinted in ETHICS, INFORMATION AND TECHNOLOGY READINGS 7, 8 (Richard N. Stichler & Robert Hauptman eds., 1998).

(6.) See Blocking Software Case, supra note 3.

(7.) The Censorware Project, Loudoun County, VA Censorware Lawsuit, (last modified Jan. 12, 1999) <http://censorware.org/legal/loudoun> [hereinafter Censorware].

(8.) Loudoun County Public Library, "Policy on Internet Sexual Harassment," (visited Nov. 21, 1998) [hereinafter Original Policy] <http://www.lcpl.lib.va.us./wwwpol.htm>.

(9.) See Blocking Software Case, supra note 3.

(10.) See Original Policy, supra note 8.

(11.) See Mainstream Loudoun v. Board of Trustees of the Loudoun County Library, 2 F. Supp. 2d 783, 787 (1998). 42 U.S.C. § 1983, modeled on section two of the Civil Rights Act of 1866, reads in relevant part:
 Every person who, under color of any statute, ordinance, regulation,
 custom, or usage, of any State or Territory or the District of Columbia,
 subjects, or causes to be subjected, any citizen of the United States or
 other person within the jurisdiction thereof to the deprivation of any
 rights, privileges, or immunities secured by the Constitution and laws,
 shall be liable to the party injured in an action at law, suit in equity,
 or other proper proceeding for redress ...


42 U.S.C. § 1983 (1998). In Mitchum v. Foster, the Supreme Court wrote, "The very purpose of section 1983 was to interpose the federal courts between the States and the people, as guardians of the people's federal rights--to protect the people from unconstitutional action under color of state law ..." 407 U.S. 225, 242 (1972).

(12.) See Mainstream Loudoun, 2 F. Supp. 2d at 787.

(13.) See id.

(14.) See id.

(15.) See id.

(16.) Id. The library's policy for unblocking a site at a patron's request will be discussed below. See infra Part II section B.

(17.) ACLU Sues Library Over Use of Internet Filtering Software, 15 No. 10 Andrews Comp. & Online Indus. Litig. Rep. 8 (West, Feb. 17, 1998).

(18.) See id.

(19.) See Mainstream Loudoun, 2 F. Supp. 2d at 787; see also Blocking Software Case, supra note 3.

(20.) See Mainstream Loudoun, 2 F. Supp. 2d at 788; see also Bogan v. Scott-Harris, 118 S. Ct. 966, 969 (1998). There the Court held that city council members acted in a legislative capacity when they voted to adopt an ordinance eliminating the respondent's department, and were therefore entitled to absolute immunity. See id.

(21.) See Mainstream Loudoun, 2 F. Supp. 2d at 788.

(22.) Id. at 789.

(23.) Id.; see also Va. Code Ann. § 42.1-35 (Michie 1999).

(24.) See Mainstream Loudoun, 2 F. Supp. 2d at 789.

(25.) Id. 47 U.S.C. § 230(c)(2) states that:
 [n]o provider or user of an interactive computer service shall be held
 liable on account of ... any action voluntarily taken in good faith to
 restrict access to ... material that the provider or user considers to be
 obscene ... or otherwise objectionable, whether or not such material is
 constitutionally protected.


Id.

(26.) See Mainstream Loudoun, 2 F. Supp. 2d at 789.

(27.) See id. at 790. See also Zeran v. America Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997). In Zeran, the Fourth Circuit stated that it is "another form of intrusive government regulation of speech" to impose tort liability on Internet service providers for the communications of others. Mainstream Loudoun, 2 F. Supp. 2d at 790 (quoting Zeran, 129 F.3d at 330). The court said Congress, with § 230, sought to maintain the "robust nature of Internet communication." Id.

(28.) See Mainstream Loudoun, 2 F. Supp. 2d at 795. See also Censorware, supra note 7.

(29.) See Censorware, supra note 7.

(30.) See Mainstream Loudoun, 2 F. Supp. 2d at 792.

(31.) See id.

(32.) Id.

(33.) See id.

(34.) Id. (quoting Bd. of Ed. v. Pico, 457 U.S. 853, 856 (1982)).

(35.) See id. at 792.

(36.) See id. (citing Pico, 457 U.S. at 864-69).

(37.) Id. (citing Pico, 457 U.S. at 866 (quoting Griswold v. Connecticut, 381 U.S. 479, 482 (1965))).

(38.) Id. (citing Pico, 457 U.S. at 869).

(39.) See id.

(40.) See id. at 793 (citing Pico, 457 U.S. at 876-79 (Blackmun, J., concurring)).

(41.) See id.

(42.) See id. (citing Pico, 457 U.S. at 877-78 (Blackmun, J., concurring)).

(43.) See id.

(44.) See id. (citing Pico, 457 U.S. at 879 (Blackmun, J., concurring)).

(45.) See id.

(46.) See id. (citing Pico, 457 U.S. at 888 (Burger, J., dissenting)).

(47.) See id. (citing Pico, 457 U.S. at 889 (Burger, J., dissenting)).

(48.) See id.

(49.) See id.

(50.) See id.

(51.) See id.

(52.) See id.

(53.) See id. at 794 (citing Pico, 457 U.S. at 876-79 (Blackmun, J., concurring)).

(54.) See id.

(55.) See id. at 795 (citing Pico, 457 U.S. at 876 (Blackmun, J., concurring) (quoting Bd. of Ed. v. Barnette, 319 U.S. 624, 642 (1943))).

(56.) See id. at 795-96.

(57.) See id. at 795.

(58.) See id. "The First Amendment's central tenet [is] that content-based restrictions on speech must be justified by a compelling governmental interest and must be narrowly tailored to achieve that end." Id. See also Simon & Schuster, Inc. v. Members of the N.Y. State Crime Victims Bd., 502 U.S. 105, 118 (1991).

(59.) See Censorware, supra note 7.

(60.) See id.

(61.) See Blocking Software Case, supra note 3.

(62.) See id.

(63.) See id.

(64.) See id.

(65.) Mainstream Loudoun, 24 F. Supp. 2d at 565.

(66.) See id. at 567.

(67.) See Mainstream Loudoun, 2 F. Supp. 2d at 796.

(68.) See id. (citing Reno v. ACLU, 521 U.S. 844 (1997)). The Child Online Protection Act, enacted in October 1998, prohibits the commercial distribution of pornographic material to minors on the Web. See 47 U.S.C. § 231 (1999).

(69.) See Mainstream Loudoun, 2 F. Supp. 2d at 796.

(70.) See id. (quoting Reno v. ACLU, 521 U.S. 844 (1997)) (quoting Denver Area Telecomm. Consortium v. FCC, 518 U.S. 727 (1996)).

(71.) See id. The Board had to prove that a real problem existed, that there was a real need in Loudoun County for limiting minors' access to online pornography, and that blocking software, however inaccurate, was the only means to do so. See id.

(72.) See id.

(73.) See id.

(74.) See id. The policy was overinclusive because the filters blocked adults from reading protected speech on the grounds that it was unfit for minors. See id.

(75.) See Mainstream Loudoun, 2 F. Supp. 2d at 796.

(76.) See id.

(77.) See id.

(78.) See id.

(79.) See Mainstream Loudoun, 2 F. Supp. 2d at 797.

(80.) See id.

(81.) See Mainstream Loudoun, 24 F. Supp. 2d at 568. See also Censorware, supra note 7.

(82.) See id. The underlying idea is that only judges are to determine the protected status of speech, not administrators such as members of the local library board. See id.

(83.) See id.

(84.) See Notice of Appeal of the Loudoun County Library, copied from the case file at the U.S. District Court, E.D. Va., Alexandria.

(85.) See Mainstream Loudoun, A Voice for Moderation, Internet Policy Lawsuit, (visited Jan. 20, 2000) <http://loudoun.net/mainstream/ Library/Internet.htm>.

(86.) See id.

(87.) See id.

(88.) Loudoun County Public Library Internet Use Policy (adopted Dec. 1, 1998) <http://censorware.org/legal/loudoun/981201_internetpol.lcpl.htm>.

(89.) Id.

(90.) See Jeri Clausing, Committee Adds Internet Filtering Amendment to Budget Bill (June 26, 1998), <http://www.nytimes.com/library/tech/ 98/06/cyber/articles/26filter.html>.

(91.) See Praveen Goyal, Congress Fumbles with the Internet: Reno v. ACLU, 521 U.S. 844 (1997), 21 HARV. J.L. & PUB. POL'Y 637, 638-41 (1998).

(92.) See id. at 645.

(93.) See Reno v. ACLU, 521 U.S. at 844.

(94.) See id. at 882.

(95.) See id. at 878; see also Censorware, supra note 7.

(96.) See Reno v. ACLU, 521 U.S. at 844.

(97.) See Jeri Clausing, Congress Aims to Wrap Up Internet Issues (last modified Sept. 1, 1998), <http://www.nytimes.com/library/tech/ yr/mo/cyber/articles/01congress.html>.

(98.) An online search for bills in the 106th Congress, conducted on March 29, 2000 on Thomas (legislative information on the Internet provided by the Library of Congress, <http://thomas.loc.gov>), returned 453 bills containing the word "Internet."

(99.) See id.

(100.) See S. 97, 106th Cong. (1999).

(101.) See Blocking Software Bills, supra note 2.

(102.) See S. 97, 106th Cong. (1999).

(103.) See id. As originally introduced, the bill required the purchase but not necessarily the installation of a filter. It has since been amended.

(104.) Id.

(105.) See id.

(106.) See S. 97, 106th Cong. (1999). See also 4 CYBER LAW 23 (Jul./Aug. 1999) [hereinafter Summary].

(107.) Id. The bill was amended on August 5, 1999, and reads, "[a bill] [t]o require the installation and use by schools and libraries of a technology for filtering or blocking material on the Internet on computers with Internet access to be eligible to receive or retain universal service assistance." S. 97, 106th Cong. (1999).

(108.) See Summary, supra note 106.

(109.) See id.

(110.) See id.

(111.) See id.

(112.) See id.

(113.) See H.R. 2560, 106th Cong. (1999).

(114.) See H.R. 896, 106th Cong. (1999).

(115.) See id.

(116.) See H.R. 1501, 106th Cong. (1999).

(117.) See S. 1545, 106th Cong. (1999).

(118.) See id.

(119.) See Child Online Protection Act of 1998, 47 U.S.C. § 231 (1999).

(120.) See Jeri Clausing, Meanwhile, Next Door, a Debate Over Internet Censorship (Sept. 12, 1998) <http://www.nytimes.com/library/tech/98/09/ cyber.../12decency.html>.

(121.) See David Morgan, Internet Exec Sees Anti-Porn Law As An Opportunity, (visited Jan. 24, 1999) <http://www.dailynews. yahoo.com/headlines/wr/story/html?s=v/nm/19990124/wr/porn_7.html>.

(122.) See id.

(123.) See id.

(124.) See id. In November 1998, the ACLU led seventeen plaintiffs, including booksellers, gay rights groups, medical professionals and the media, into federal court and secured a preliminary injunction that blocked enforcement of the act until February 1, 1999. This injunction has since been extended pending the outcome of a judicial hearing. See 27 MEDIA L. REP. 1449 (1999).

(125.) See Clausing, supra note 120; see also Lawrence Lessig, Reading the Constitution in Cyberspace, 45 EMORY L.J. 869 (1996) (discussing how the CDA, with current Internet traits, would have silenced much speech among adults).

(126.) See Clausing, supra note 120.

(127.) See KAREN G. SCHNEIDER, A PRACTICAL GUIDE TO INTERNET FILTERS (1997).

(128.) See Emily Whitfield & Ann Beeson, Censorship in a Box: Blocking Software is Wrong for Libraries, 16 No. 7 CABLE TV & NEW MEDIA L. & FIN. 1 (Sept. 1998).

(129.) See ACLU, Online Censorship in the States, (visited Jan. 8, 2000) <http://www.aclu.org/issues/cyber/censor/stbills.html> [hereinafter Online Censorship]. The states were California, Illinois, Kansas, Tennessee, Missouri, Kentucky, New York, Ohio, Rhode Island and Virginia. See id.

(130.) See id.

(131.) See ARIZ. REV. STAT. ANN. § 34-502 (West 1999).

(132.) Id. at § 34-502(B)(1).

(133.) See Whitfield & Beeson, supra note 128, at 1.

(134.) See id. For example, in ALA v. Pataki, 969 F. Supp. 160 (S.D.N.Y. 1997), the court struck down a New York online indecency law because it violated the Commerce Clause which prohibits states from regulating speech wholly outside their own borders and from imposing inconsistent state burdens on speakers. See Online Censorship, supra note 129. In ACLU v. Miller the court struck down on free speech grounds a Georgia law that made it a crime to communicate anonymously or with a pseudonym on the Internet. See 977 F. Supp. 1228 (N.D. Ga., 1997).

(135.) See ACLU Press Release, In Legal First, Appeals Court Strikes Down State's "Harmful to Minors" Ban on Internet Speech (Nov. 3, 1999) <http://www.aclu.org/news/1999/n110399b.html> [hereinafter Legal First].

(136.) See The Censorware Project, NYC Schools Censored, (visited Jan. 5, 2000) <http://www.censorware.org>.

(137.) See id.

(138.) See Clinton Administration Outlines Its Electronic Commerce Policy: White House Responds to Internet's Uniqueness, 14 NO. 3 COMPUTER L. STRATEGIST 1 (July 1997).

(139.) See id. The other three principles of the Administration's framework are: (1) "any government intervention should be limited to ensuring competition, protecting intellectual property and privacy, preventing fraud, fostering market transparency, supporting commercial transactions and facilitating dispute resolution;" (2) the Internet is unique because of its characteristic decentralization and "bottom-up leadership;" and (3) "the Framework adopts the principle of facilitating global commerce." Id.

(140.) See id.

(141.) See id.

(142.) See id.

(143.) Id. at 6.

(144.) See id.

(145.) See id.

(146.) See id.

(147.) See Matt Richtel, Filter Used by Courts Blocks Innocuous Sites, N.Y. TIMES ON THE WEB (June 23, 1998) <www.nytimes.com/library/tech/ 98/06/cyber/articles/23filter.html>.

(148.) See Clausing, supra note 90.

(149.) See id. Even conservative groups are concerned about Internet filtering. On October 18, 1999 several groups sent a joint letter to Thomas J. Bliley, Chair of the House Commerce Committee, complaining that filters block many sites devoted to constitutional rights, such as the site of Gun Owners of America. The groups also expressed concern about teachers and librarians being able to set up software to block material that conflicts with their own personal political agenda. They also argued that passing filtering legislation would set a dangerous precedent for federal government mandates to any group receiving federal funds to censor such things as pro-Second Amendment sites, sites belonging to tobacco companies, or religious groups' sites. See Electronic Frontier Foundation, Even Conservative Groups Oppose Mandatory Content Filtering, (last modified Oct. 18, 1999) <http://www.eff.com/more.html>.

(150.) See Internet Free Expression Alliance, Joint Statement for the Record on Legislative Proposals to Protect Children from Inappropriate Materials on the Internet (visited Sep. 12, 1998) <http://www.ifea.net/joint_statement _9_98.html.>. The IFEA is a coalition of more than 20 national free speech, journalism, art and computer industry organizations formed in the fall of 1997 "to insure that the Internet remains the uniquely powerful public forum lauded by the Supreme Court in last summer's decision striking down the Communications Decency Act." See id. Along with the ACLU, the group includes the American Library Association, the Association of American Publishers, Feminists for Free Expression and the Society of Professional Journalists. See id.

(151.) See id.

(152.) See id.

(153.) See id.

(154.) See id.

(155.) See Clausing, supra note 120.

(156.) See id.

(157.) See id.

(158.) See id.

(159.) See Legal First, supra note 135.

(160.) See id.

(161.) See id.

(162.) See Clausing, supra note 90.

(163.) See id.

(164.) See SCHNEIDER, supra note 127, at xv.

(165.) See Whitfield & Beeson, supra note 128.

(166.) See id. at 2.

(167.) See id.

(168.) See id.

(169.) See id.

(170.) See id.

(171.) See id.

(172.) See SCHNEIDER, supra note 127, at 3-12.

(173.) See id. at 3. "Keyword blocking" is also called content identification or analysis. See id.

(174.) See id.

(175.) See id.

(176.) See Whitfield & Beeson, supra note 128, at 3.

(177.) See SCHNEIDER, supra note 127, at 4.

(178.) See id. Schneider's research team discovered that when a filter encounters a programmed keyword in the body of a poem, it does one of four things, depending on the brand of filter: (1) stops the file in transit; (2) displays the file but obscures the target term; (3) delivers some but not all of the file; or (4) shuts down the browser or even the computer. See id.

(179.) See id. at 6. Most are denial lists. See id.

(180.) See id. at 7-8.

(181.) See id. at 9-12.

(182.) See id. at 9.

(183.) See id. at 9-10.

(184.) See id. at 10.

(185.) See id. at 10-12.

(186.) See id. at 12.

(187.) See supra, Part V.

(188.) See The Censorware Project, Passing Porn, Banning the Bible: N2H2's Bess in Public Schools (visited Jan. 5, 2000) <http://www.censorware.org> [hereinafter Passing Porn].

(189.) See id.

(190.) See id.

(191.) See Steve Lawrence & C. Lee Giles, Accessibility of Information on the Web, NATURE, July 8, 1999, at 107.

(192.) See id.

(193.) See Passing Porn, supra note 188, at 7.

(194.) See id.

(195.) See id.

(196.) See id.

(197.) See id.

(198.) See id.

(199.) See Library Sued for Not Limiting Net Access, 15 NO. 2 COMPUTER L. STRATEGIST 2 (June 1998).

(200.) See id.

(201.) See id.

(202.) See The Censorware Project (last modified Jan. 14, 1999) <http:// censorware.org>.

(203.) See Library Sued for Not Limiting Net Access, supra note 199.

(204.) See The Censorware Project, supra note 202.

(205.) See SCHNEIDER, supra note 127, at xi.

(206.) See id. The book also has a website with up-to-date information on filtering software, <http://www.bluehighways.com/filters/>. See id. at xii.

(207.) See id. at xii.

(208.) Id. Schneider describes the power and significance of the Internet as a tool for communication and self-expression when she says, "In no other medium except, perhaps, a New York subway, do we see so many examples of human strengths and frailties jostling side by side." Id. at xiii.

(209.) Id.

(210.) Id.

(211.) See SCHNEIDER, supra note 127, at xiii.

(212.) See id.

(213.) See id. at xvi.

(214.) See id.

(215.) See Whitfield & Beeson, supra note 128.

(216.) See id.

(217.) See SCHNEIDER, supra note 127, at 13.

(218.) See id. at 14.

(219.) See id.

(220.) See Statement of the American Library Association to the Senate Commerce, Science and Transportation Committee on Indecency on the Internet for the Hearing Record (Feb. 10, 1998) <http://www.ala.org/ washoff/mccain.html> [hereinafter ALA Statement]. The ALA is the oldest and largest association of librarians in the United States. See id.

(221.) Id.

(222.) See id.

(223.) See id. (citing Reno v. ACLU, 521 U.S. 844 (1997)).

(224.) See ALA Statement, supra note 220.

(225.) Id. In the spirit of John Stuart Mill, the ALA statement went on to note that:
 Today's children are growing up in a global information society. It is
 imperative that they learn critical viewing and information skills that
 will help them make good judgments about the information they encounter ...
 Simply blocking offensive and unwanted content will not teach students
 those critical skills.


Id.

(226.) See Nadine Strossen, Academic and Artistic Freedom, in ETHICS, INFORMATION AND TECHNOLOGY READINGS 45, 50 (Richard N. Stichler & Robert Hauptman eds., 1998).

(227.) See Tim O'Brien, The Rehnquist Court: Holding Steady on Freedom of Speech, 22 NOVA L. REV. 711, 713-14 (Spring 1998).

(228.) Id. at 714-15.

(229.) See 521 U.S. 844, 849, 882-85 (1997). Reno v. ACLU, in which the Court found the Communications Decency Act to be unconstitutional, has been called the "legal birth certificate of the Internet." O'Brien, supra note 227, at 722 (quoting Edward Felsenthal & Jared Sandberg, High Court Strikes Down Internet Smut Law, WALL ST. J., June 27, 1997, at B1).

(230.) Reno v. ACLU, 521 U.S. at 885.

(231.) 505 U.S. 377 (1992).

(232.) O'Brien, supra note 227, at 724 (quoting R.A.V., 505 U.S. at 381-96).

(233.) Strossen, supra note 226, at 50.

(234.) Organizations such as the ACLU and the American Jewish Congress also opposed the St. Paul ordinance. See O'Brien, supra note 227, at 726 n. 103.

(235.) See Strossen, supra note 226, at 50. See also Rust v. Sullivan, 500 U.S. 173 (1991).

(236.) See Strossen, supra note 226, at 50.

(237.) See id. at 50-51.

(238.) Id. at 51.

(239.) See id.

(240.) See id.

(241.) Strossen, supra note 226, at 51.

(242.) See Summary, supra note 106.

(243.) See Strossen, supra note 226, at 53.

(244.) See id. Supreme Court Justice Oliver Wendell Holmes may have provided the best example of speech that would not be protected: falsely yelling "Fire!" in a crowded theatre, thereby creating a panic. See id.

(245.) 501 U.S. 560 (1991).

(246.) See id. at 572.

(247.) Strossen, supra note 226, at 53.

(248.) See Barnes, 501 U.S. at 570.

(249.) Strossen, supra note 226, at 53.

(250.) Id. at 54.

(251.) Strossen rightly calls censorship a "politically effective `quick fix' which creates the illusion that government leaders are addressing the problems," such as sexual assault, domestic violence or discrimination. Id. at 62.

(252.) Reno v. ACLU, 521 U.S. at 894 (O'Connor, J., concurring in part, dissenting in part).

(253.) See, e.g., F.C.C. v. Pacifica Foundation, 438 U.S. 726, 738-51 (1978).

(254.) See Henry H. Perritt, Jr. & Randolph R. Clarke, Chinese Economic Development, Rule of Law, and the Internet, 15 GOV'T INFO. Q. 393, 394 (1998).

(255.) See Chinese Govt Monitors Internet Activities of Dissidents, Yahoo!News (Asia), (Feb. 15, 1999) <http://asia.yahoo.com> [hereinafter Dissidents].

(256.) See Mark Landler, An Internet Vision in Millions: China Start-Ups Snare Capital as Auction Fever Boils Up, N.Y. TIMES, Dec. 23, 1999, at C1. Landler reports that a recent Gallup survey found that just 14% of China's 1.2 billion people had heard of the Internet. See id. at C4.

(257.) See Perritt & Clarke, supra note 254.

(258.) See id. at 406.

(259.) See id.

(260.) See id. at 407-08. In early 1999, the Chinese government set up a task force to monitor online subversive activities. See Dissidents, supra note 255. This special monitoring committee's job is to identify suspected dissidents and limit their use of the Internet in communicating with human rights activists overseas. See id.

(261.) See Perritt & Clarke, supra note 254, at 407.

(262.) See id. at 394.

(263.) See id. at 396. Microsoft, IBM and Lucent Technologies are currently developing Chinese language voice recognition and translation software that will greatly facilitate China's access to the rest of the world via the Internet. See id. at 406.

(264.) See id. at 395-96. The authors believe the Internet can help China establish the "rule of law" and that it is the "quickest and cheapest way to provide for the basic ingredients [of the rule of law] of process transparency and decisional rationality," which they see as necessary for China's economic growth. Id. at 400.

(265.) See id. at 408-10.

(266.) See Landler, supra note 256.

(267.) See id. Sina.com is a Beijing-based company with the most popular site in China. Sohu.com is a popular Chinese search engine. See id.

(268.) See id. at C4.

(269.) See id.

(270.) See id.

(271.) See id.

(272.) See Lilian Edwards, Defamation and the Internet, in LAW & THE INTERNET: REGULATING CYBERSPACE 183, 197-98 (Lilian Edwards & Charlotte Waelde eds., 1997).

(273.) See id. An example of such a ratings scheme is PICS (Platform for Internet Content Selection), which provides a "neutral labeling" system similar to that used to depict films in TV magazines. See id. In the U.S., parents can choose to have the Web rated. They can download ratings software, such as that available online at Adequate.com. Adequate.com's Parental Guidelines program rates websites, other than news, online research, or directory sites. There is a disclaimer that if no rating is displayed on a site, it has not yet been rated. The site also warns parents to be aware that Web sites may change their content after the rating has been added. See Adequate.com, Parental Guidelines (visited March 29, 2000) <http://www.adequate.com>.

(274.) See Yaman Akdeniz, Governance of Pornography and Child Pornography on the Global Internet: A Multi-layered Approach, in LAW & THE INTERNET: REGULATING CYBERSPACE 223, 235 (Lilian Edwards & Charlotte Waelde eds., 1997).

(275.) See id. After the IWF informs a British ISP that it has located undesirable content, "[t]he ISP concerned then has no excuse in law that it is unaware of the offending material." Id.

(276.) Lilian Edwards & Charlotte Waelde, Introduction, in LAW & THE INTERNET: REGULATING CYBERSPACE 4 (Lilian Edwards & Charlotte Waelde eds., 1997).

(277.) See Edwards, supra note 272, at 187. These individuals consisted of "technophiles, students, academics and workers ... principally in the U.S." Id.

(278.) Id. Simultaneously, there existed "a strong ... dislike of corporate, governmental or legal authority or control." Id.

(279.) See id. Because of the Web's tremendous volume and rapid growth, it is impossible to estimate its size. See id.

(280.) See id. This regulatory ineffectiveness permits "a plaintiff or pursuer to work out where he or she may, and perhaps may most advantageously, raise any action." Id.

(281.) See Edwards & Waelde, supra note 276, at 6-7.

(282.) See Edwards, supra note 272, at 197. This is not to suggest that it will be easy for an international group to formulate a common standard of "obscene" or "objectionable" material. Colleges in the United States are not even able to develop workable Internet use policies for their own students. See Pamela Mendels, Universities Grapple with Computer Use Policies, N.Y. TIMES ON THE WEB, at B10 (March 3, 1999) <http://www.nytimes.com/library/tech/99/03/cyber/education/03eduation.html>.

(283.) See Akdeniz, supra note 274, at 225.

(284.) See id.

(285.) See Carl S. Kaplan, Yale Law Professor Is Main Architect of Global Filtering Plan, N.Y. TIMES ON THE WEB (Sept. 10, 1999) <http://www.nytimes.com/library/tech/99/09/cyber/cyberlaw/10law.html>.

(286.) See id. Bertelsmann A.G. is one of the world's largest media conglomerates. See id.

(287.) See id.

(288.) See id. Professor Balkin is Knight Professor of Constitutional Law and the First Amendment at Yale Law School. See id. He also runs the Information Society Project at Yale. See id.

(289.) Id.

(290.) Kaplan, supra note 285.

(291.) See id. As stated by Jens Waltermann, deputy head of Bertelsmann A.G.'s media division, "[W]e are immensely grateful to him as a free speech advocate to be brave enough to think about empowering parents to make choices, while protecting free speech to the maximum extent possible." Id.

(292.) See id.

(293.) See id.

(294.) See id.

(295.) See Kaplan, supra note 285.

(296.) See id.

(297.) See id.

(298.) See id.

(299.) See Kaplan, supra note 285. According to Balkin, "[T]he system is set up to make a thousand flowers bloom." Id. (quoting Balkin).

(300.) Id. (quoting Balkin).

(301.) Id.

(302.) See id.

(303.) See id.

(304.) See Kaplan, supra note 285. Barlow believes that Balkin is "creating a dangerous tool ... [G]ive the government a tool and they will use it." Id. (quoting Barlow).

(305.) Id. Steinhardt is an associate director of the ACLU and also attended the Munich conference. See id.

(306.) Id. Rotenberg is "very uneasy about an architecture of the Internet that enables the widespread dissemination of those prejudices." Id. (quoting Rotenberg).

(307.) Id. (quoting Balkin).

(308.) See Pamela Mendels and Carl S. Kaplan, Summit to Discuss Global System for Rating Internet Content, N.Y. TIMES ON THE WEB (Sept. 9, 1999) <http://www.nytimes.com/library/tech/99/09/cyber/articles/09ratings.html>.

(309.) See id.

(310.) See id.

(311.) Id. The initiative seeks "a maximally open, maximally free speech protective system of rating and filtering." Id. (quoting Waltermann).

(312.) See id. (quoting Waltermann).

(313.) See Mendels and Kaplan, supra note 308.

(314.) See id. This scheme may prompt ISPs to "tone down their material in ways they might otherwise not feel compelled to do." Id. (quoting David Sobel, general counsel at the Electronic Privacy Information Center).

(315.) See SCHNEIDER, supra note 127, at 74-6.

(316.) See id.

(317.) See id.

(318.) See id.

(319.) See id.

(320.) See SCHNEIDER, supra note 127, at 74-6.

(321.) See id.

(322.) See id.

(323.) See id. at 77.

Geraldine P. Rosales(*)

(*) J.D. Candidate 2000, Rutgers School of Law-Newark. The author wishes to thank her husband, Mario, and her parents, Walter and Patricia Barcomb, for their unfailing love and support.