What are the major means of leverage?

The above discussion illustrates the depth of the challenge faced in developing effective cybersecurity. It also shows the diversity, ubiquity, and importance of cyberspace components and demonstrates that cyberspace includes important elements that might not at first glance be considered part of it. Given that diversity and complexity, one approach would be to restrict attention to those components associated with particularly high levels of risk.

Such an approach has two limitations. First, a focus solely on those components currently known to be at high risk could quickly become obsolete. While there are currently many known vulnerabilities that, if addressed, would substantially improve cybersecurity, future or currently undiscovered vulnerabilities may come from unexpected places. Cybercriminals and cyberterrorists would likely seek out new vulnerabilities as current ones are eliminated--writers of "nuisance" viruses have been doing so for several years. In many ways, cybersecurity involves a kind of arms race, with adversaries and defenders each adapting successively to the actions of the other. This arms race is likely to continue as long as information technology and cyberspace continue to evolve at current and expected rates.

Second, some would argue that such a focus would simply be an extension of the current fragmented approach, which is largely reactive--as each new vulnerability is discovered, a new fix is developed--and increasingly costly and ineffective. What is needed, they say, is a strategic approach that is more preventive or even preemptive in nature rather than largely reactive and defensive. (76) Some argue that the best approach is to reduce the incentives for catastrophic attack, (77) rather than focusing on preventing all attacks (if experience with cyberspace so far is any indication, this may be impossible or certainly impractical). Such an approach would suggest a focus on (1) limiting damage, and (2) recovery.

To be effective, any preventive approach should probably be broadly applicable to different organizations and systems. The interconnectedness of cyberspace gives it some of the characteristics of a commons--a kind of public resource for which, in the absence of appropriate controls, costs of use by any individual are distributed broadly to the community of users. Classically, using a limited resource--such as pastureland or a fishery--as a commons promotes overuse and degradation of the resource. It pays each individual to maximize his or her use of the resource--to graze as many cattle or catch as many fish as possible--no matter the consequences to the resource as a whole. This effect has been called the "tragedy of the commons." (78) In cyberspace, costs of poor security are often distributed, because compromised systems may be used in attacks on others, with little impact on the compromised system (see above). In addition, however, those costs may be amplified--a naive user may compromise the integrity of an entire network. (79)

There are several options for broadly addressing weaknesses in cybersecurity such as those discussed in the previous section. The following options will be discussed in this section:

* adopting standards and certification,

* promulgating best practices and guidelines,

* using benchmarks and checklists,

* using auditing,

* improving training and education,

* building security into enterprise architecture,

* using risk management, and

* using metrics.

This discussion is followed by a brief consideration of the role of economic incentives.

Standards

The broad adoption of established standards, or the development and adoption of new ones, could be one way to improve cybersecurity. One widely used definition of standards is "a prescribed set of rules, conditions, or requirements concerning definitions of terms; classification of components; specification of materials, performance, or operations; delineation of procedures; or measurement of quantity and quality in describing materials, products, systems, services, or practices." (80) As this rather eclectic definition illustrates, there are many different kinds of standards. (81) They may be classified according to purpose--e.g., product, process, testing, or interface standards. They can also be classified according to their focus--commonly, a distinction is made between performance standards, which focus on function, and design standards, which specify features, dimensions, or other such characteristics. A third classification is based on how standards are developed and implemented. They may be developed through consensus or some other process. They may be implemented voluntarily, or they may be imposed, for example by law, and therefore mandatory. Voluntary consensus standards are common, and federal law encourages their use by federal agencies, including DHS. (82) Standards may also be open or proprietary, although different observers define "open standard" somewhat differently. (83) Major standards organizations typically use some form of open standards.

Which kinds of standards to adopt will depend very much on the goals identified and the characteristics of specific elements. In general, design standards and detailed regulation usually cannot evolve readily in parallel with an evolving technology. Given the rapid evolution of information technology, there appears to be agreement that such standards should be avoided for elements that are not yet mature if appropriate results can be obtained through more flexible approaches, such as performance standards or best practices.

Several organizations are involved in the development of cybersecurity standards. NIST performs a wide array of standards-related activities, including promoting the global use of U.S. standards, providing information and technical support to industry and others, coordinating the development of national voluntary product standards, accrediting testing laboratories, and developing standards for use by federal agencies where no acceptable industry standards exist. (84) The American National Standards Institute (ANSI) is a private, nonprofit organization that administers and coordinates the U.S. voluntary private-sector standardization system. (85) ANSI and NIST coordinate activities through a memorandum of understanding. (86) Among ANSI's activities related to cybersecurity are its Information Systems Conference Committee, which provides a forum for communication among IT standards developers, and the Information Infrastructure Standards Panel, which identifies standards critical for global information infrastructure. While ANSI also has established a homeland security standards panel, cybersecurity is not among the panel's areas of focus. NIST activities include the Process Control Security Requirements Forum, which is developing security requirements for industrial process control systems. (87) Among other U.S. organizations engaged in standards activities related to cybersecurity are the InterNational Committee for Information Technology Standards (88) and the Institute of Electrical and Electronics Engineers. (89) The Trusted Computing Group (90) is a group of IT manufacturers, vendors, and others formed in April 2003 to develop open industry hardware and software standards for trusted computing, an important element of cybersecurity. The Internet Engineering Task Force (IETF) (91) is an international group of experts and others involved in the development and operation of the Internet; participation is open to any interested person.

The International Organization for Standardization (ISO), (92) a nonprofit network of national standards organizations from various countries, is the major international standards developer. The International Electrotechnical Commission (IEC) (93) develops standards relating to electronic technologies. Together they have established a Joint Technical Committee on Information Technology (JTC1), (94) with a subcommittee on security techniques (JTC1 SC27) (95) that develops generic standards relating to IT security.

Current Standards. Several sets of standards have been developed for use in cybersecurity. Three of the most widely cited are the Common Criteria for Information Technology Security Evaluation (usually called the Common Criteria, abbreviated CC); ISO/IEC 17799, an internationally recognized information security standard; and the Federal Information Processing Standards (FIPS), which were developed by NIST for use by federal systems. These are each discussed below. A wide range of international standards also exist for specific security techniques, such as encryption, authentication, nonrepudiation, and time stamping. (96)

Product Evaluation. The Common Criteria consist of a set of evaluation criteria for the security of information technology developed by U.S., Canadian, and some European government agencies. They resulted from a recognition of the need to harmonize separate evaluation criteria that had been developed by different countries. (97) They were also adopted as an international technical standard (ISO/IEC 15408) in 1999. The CC provide a framework for the development of standard sets of requirements, called profiles, to meet specific needs of consumers and developers of information technology products, depending on the assurance levels that they require. (98) A set of protection profiles may be developed for different kinds of products (such as a firewall) or general applications (such as electronic fund transfers) that may be evaluated. (99) The profiles lay out security objectives and requirements. For example, a profile developed for Department of Defense firewalls describes the security environment to which the profile applies, threats to be addressed, security objectives, functional and assurance requirements to meet those objectives, and the rationale for how the requirements meet the objectives and how the objectives address the threats. (100) Once developed, a profile may be evaluated by an accredited, independent laboratory. More than 40 profiles have been developed for a range of products and systems, and most have received evaluations.

For a specific application, (101) a set of security requirements and specifications (102) is developed, usually conforming to one or more relevant protection profiles if available. The application is then evaluated to determine if it meets those requirements and specifications, and if so, it may be certified for use in the specified environment. Products may be evaluated to any of seven hierarchical evaluation assurance levels (EALs), which reflect successively higher levels of security. Both software and hardware products have been certified under the CC. They include operating systems, databases, firewalls, computer chips, smartcards, and routers, among others. (103) More than 100 applications have been evaluated at EAL1 to EAL4+. (104)
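
The key property of the EALs is that they are strictly hierarchical: a product certified at one level satisfies any requirement at or below that level. The short sketch below illustrates that ordering; the function name and the procurement scenario in the comments are invented for illustration and are not part of the CC.

```python
from enum import IntEnum

class EAL(IntEnum):
    """The seven hierarchical CC evaluation assurance levels."""
    EAL1 = 1
    EAL2 = 2
    EAL3 = 3
    EAL4 = 4
    EAL5 = 5
    EAL6 = 6
    EAL7 = 7

def meets_requirement(certified: EAL, required: EAL) -> bool:
    """Because the levels are strictly ordered, a product certified
    at a given EAL satisfies any requirement at or below that level."""
    return certified >= required

# Hypothetical example: a firewall evaluated at EAL4 against a
# procurement rule requiring at least EAL3.
print(meets_requirement(EAL.EAL4, EAL.EAL3))  # True
print(meets_requirement(EAL.EAL2, EAL.EAL5))  # False
```
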

Although the CC are often referred to as standards, there are aspects of them that are not easily characterized as standards, at least according to some observers. The notion of criteria is broader than that of standards because it generally includes things, such as statements on how a system should be designed and operated, that cannot be directly assessed by examining the product. (105) Also, protection profiles are not written into the CC but are developed and updated as needed.

Code of Practice. Several standards have been developed relating to overall information security practices. They might be used in conjunction with other guides such as the CC as elements of an overall framework for cybersecurity. There appears to be at least some agreement that a good security management standard should cover all important security issues; be comprehensive and up-to-date; be clear, unambiguous, and easy to understand and use; be practical and achievable; be scalable to any organization; and provide a basis for measurement of performance. (106)

The most widely recognized code-of-practice standards are ISO/IEC 13335 and ISO/IEC 17799. The first provides broad guidelines for managing IT security (GMITS) in the context of an organization's overall management, and stresses challenges posed by the global nature of cyberspace. It addresses universal security concepts, management and planning, risk assessment, merits of alternative solutions, and external communications. It focuses on high-level concepts and general requirements and techniques, rather than specific controls. It describes IT security management as including a determination of objectives, strategies, policies, and organizational requirements; managing risks; planning implementation of adequate safeguards and follow-up programs for monitoring, reviewing, and maintaining security services; and developing a security-awareness program and plans for incident-handling. It was released in parts, including five technical reports, from 1996 to 2001. A revision was begun in 2000. (107)

ISO/IEC 17799 is described by JTC1 SC27 as giving "recommendations for information security management for use by those who are responsible for initiating, implementing or maintaining security in their organization. It is intended to provide a common basis for developing organizational security standards and effective security management practice and to provide confidence in inter-organizational dealings." (108) Topics covered include

* organizational policy and infrastructure;

* asset classification and control;

* personnel, physical, and environmental security;

* communications and operations management;

* access control;

* systems development and maintenance;

* business continuity; and

* compliance. (109)

ISO/IEC 17799 is more widely recognized internationally than any other cybersecurity management standard. (110) It is related to ISO/IEC 13335 in that "17799 focuses on issues to be considered for information security management and ... 13335 addresses how to achieve [it]." (111) The standard was issued in 2000, and revision began in 2001. It is based on and virtually identical to the 1999 update of the British Standard in Information Security, BS 7799 (Part 1), which was initially published in 1995. (112)

While called a standard, ISO/IEC 17799 has been described as more similar to a set of guidelines, in that it is not written in such a way that conformance can be certified. (113) The standard contains 127 major controls and thousands of bits of guidance, but they are not presented as imperatives. (114) Thus, organizations may adapt the standard to their needs, modifying the application of some sections to fit their management structure, or discarding sections that do not apply. (115) This flexibility has been both praised and criticized. On the one hand, it means that organizations can use the standard without compromising other key business requirements. On the other hand, it makes conformance more difficult to assess. (116)
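
Because controls can be adopted, tailored, or set aside, an organization's posture under such a standard amounts to a mapping from controls to dispositions. The sketch below illustrates that idea and why it complicates conformance assessment; the control identifiers and status labels are invented for illustration and are not drawn from the ISO/IEC 17799 text.

```python
# Hypothetical subset of code-of-practice controls. Each control is mapped
# to the organization's disposition: adopted as written, modified to fit
# local needs, or judged not applicable.
controls = {
    "access-control/user-registration": "adopted",
    "communications/network-segregation": "adopted",
    "compliance/records-retention": "modified",            # tailored to local law
    "systems-development/crypto-policy": "not-applicable", # no in-house development
}

def conformance_summary(controls):
    """Tally the dispositions. The tailoring the standard permits is what
    makes a simple adopted/not-adopted conformance judgment hard to state."""
    summary = {}
    for status in controls.values():
        summary[status] = summary.get(status, 0) + 1
    return summary

print(conformance_summary(controls))
```
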

While ISO/IEC 17799 does not itself include a certification scheme, some countries have developed such schemes. Perhaps most notable is BS 7799 Part 2, developed and used in Great Britain and also available in other countries, including the United States. (117) This standard specifies requirements and controls for an organization's information security management system (ISMS) in ways that can be assessed by an accredited certification body. It has been described as consisting of requirements for an ISMS plus ISO/IEC 17799 controls (118) "in imperative format." (119) The most recent version of BS 7799 Part 2 was published in 2002. There does not appear to be any equivalent under development for ISO/IEC 17799 itself.

The Information Security Forum (ISF) has developed a code of practice, The Standard of Good Practice for Information Security. (120) ISF updates the standard every two years. It was first released in 1996, with the most recent version released in March 2003. It is based on the experience and expertise of ISF members and staff, other standards such as ISO/IEC 17799, and the results of ISF surveys. Topics covered include security management, critical business applications, computer installations, networks, and systems development.

That set of topics appears somewhat more limited in scope than the set covered by ISO/IEC 17799, but a direct comparison was not possible for this report. Each topic is organized into several areas (30 altogether), which in turn contain several sections (132 altogether). Each section contains a principle, an objective, and several specific actions or controls. The ISF standard is publicly available without charge, unlike ISO/IEC 17799. (121) ISF provides members with a survey instrument they can use to compare their performance against the ISF standard and other benchmarks, but the organization does not appear to provide certification.

The IT Governance Institute (ITGI) (122) has developed Control Objectives for Information and related Technology (COBIT), a set of recommended practices in information technology governance, control, and assurance developed through a consensus process involving experts. COBIT was first released in 1996; the third edition was published in 2000. It provides a framework for IT governance, including metrics and other management tools as well as controls. ITGI does not describe COBIT as a standard but rather as a "framework for IT governance" (123) and a "generally accepted best practice." (124) Nevertheless, it is similar enough in both structure and method of development to the standards described above that it arguably should be considered a code-of-practice standard. Rather than focusing specifically on cybersecurity, it addresses security in the context of overall IT governance. Security is considered one of three sets of requirements, the other two being quality and fiduciary. COBIT is organized hierarchically into four domains, which are broad categories of activity such as planning, implementation, and monitoring; 34 processes; and specific activities or objectives under each process. (125) There is no certification program for COBIT, but audit and self-assessment guidelines are available. The framework has been criticized as being difficult to scale to small or medium-sized enterprises, but ITGI has developed a version aimed at such organizations. (126)

Federal Standards. NIST is responsible under federal law (127) for developing standards and guidelines for cybersecurity for federal information systems, except national security systems, which fall under the responsibility of the Committee on National Security Systems (CNSS) and the agencies that operate the systems. (128) The Federal Information Processing Standards (FIPS) are standards developed by NIST for requirements for federal systems not covered by available voluntary industry standards. (129) Some FIPS are mandatory for federal agencies, while others are not. FISMA requires NIST to "develop standards and guidelines, including minimum requirements, for providing adequate information security for all agency operations and assets," except for national security systems. (130) None of the FIPS publications to date specifically address governance issues.

FIPS are developed through rule-making procedures similar to those established by the Administrative Procedure Act. (131) Some FIPS are adopted by private sector entities. For example, the Data Encryption Standard (DES--FIPS 46), introduced in 1977, provides a method for cryptographic protection of information. It was widely adopted by industry, for example in the financial services sector. The newer, stronger Advanced Encryption Standard (AES--FIPS 197), adopted in 2001, is now replacing DES as applications are developed.
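
The practical difference between the two standards lies largely in key length: DES uses a 56-bit key, while AES keys are 128, 192, or 256 bits. A few lines of arithmetic show the scale of that gap (the trial rate below is an illustrative assumption, not a measured benchmark):

```python
# DES uses a 56-bit key; the smallest AES key is 128 bits.
des_keys = 2 ** 56
aes128_keys = 2 ** 128

# The AES-128 keyspace is larger by a factor of 2^72.
ratio = aes128_keys // des_keys
print(ratio == 2 ** 72)  # True

# At an assumed one trillion (10^12) trials per second, exhausting the
# DES keyspace takes under a day; the same rate against AES-128 would
# take on the order of 10^19 years.
seconds_to_search_des = des_keys / 1e12
print(round(seconds_to_search_des / 86400, 1), "days")
```
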

In its series of special publications on computer security, (132) NIST has published a set of generally accepted system security principles and practices (133) (sometimes called GAPP), discussed earlier in this report, that are similar in scope to ISO/IEC 17799, and the two are sometimes considered to be competing standards. No general certification scheme exists for this set of practices. There are also several other NIST publications on various aspects of cybersecurity, such as capital planning, system development, security awareness and training, and so forth. (134)

NSA has established an Information Assurance Technical Framework Forum (IATFF) (135) to develop a framework for solutions to information assurance problems encountered by federal agencies and industry. A framework document (136) available through the forum provides technical guidance for protecting information and information infrastructure using NSA's defense-in-depth strategy.

Strengths and Weaknesses of Standards. The widespread use of well-established and well-designed cybersecurity standards would have potential benefits. Such standards would provide a common language and criteria for determining how well organizations are adhering to recognized security needs and requirements. In addition, as the use of the standards increased, the overall level of security would arguably rise as well. Also, the standards would presumably provide a common baseline from which continuous improvement in cybersecurity could be implemented through the evolution of the standards.

However, the use of standards in cybersecurity has also been criticized by some. Some common criticisms are described below:

They are not sufficiently flexible and cannot track changes in the technology. International standards such as ISO/IEC 17799 are often updated on a three- to five-year cycle. Given the rate of evolution of cyberspace, some observers have complained that standards become outdated too quickly to be useful for cybersecurity. Proponents counter that properly developed standards are sufficiently flexible to accommodate the technological improvements that are likely to occur between revisions, and that shorter cycles are possible: both COBIT and the ISF standard are updated on a two-year cycle, and the Common Criteria Development Board is charged with issuing updates and corrections to the CC. (137)

They can be expensive to conform to. If certification is available, as with BS 7799 Part 2, the process of becoming certified may be expensive, especially for smaller enterprises. Even without certification, organizations adopting standards may find they need to significantly alter business practices, possibly at considerable expense and sometimes in ways that are not in keeping with the optimum business model for the particular enterprise. Proponents counter that, while return on investment may be difficult to measure directly, the process of coming into compliance can help organizations identify and correct serious cybersecurity deficiencies, protecting them from large expenditures to recover from a successful attack or from a loss of reputation that can be very difficult to regain.

They are too much like regulation. If adherence to a particular set of standards becomes expected, then certification bodies might take on some of the characteristics of regulators, with the attendant benefits and disadvantages. Proponents may counter that this need not be the case, especially if the standards and certification are well designed, there are sufficient alternative paths to certification to avoid the development of effective monopolies, and compliance is voluntary, as it is with most standards.

The mixed success of the Common Criteria illustrates some of these reported pitfalls. These include a lack of flexibility, despite attempts to build flexibility into the CC; the inability to keep pace with evolving technology; and cost and time required for certification. (138)

Measuring success may be difficult for code-of-practice standards. "High-level" code-of-practice standards such as ISO/IEC 17799 have been criticized for not being specific enough to provide adequate guidance or a common basis for measuring and comparing practices among different organizations. At the same time, BS 7799 Part 2 has been criticized for being too much of a checklist and insufficiently adaptable to different kinds of enterprises. Proponents counter that such critics misunderstand the application of the standards--that comparable metrics can be developed and that certification can readily be adapted to the requirements of a particular enterprise. The CC, COBIT, and other standards have been criticized for being difficult to scale, especially to the needs of smaller organizations that may not have a primary IT focus. Attempts have been made to compensate for this problem; for example, ITGI has developed a form of COBIT specifically designed for smaller enterprises. Despite such concerns, the advantages of code-of-practice and other cybersecurity standards appear to be sufficient that their use is increasing (see below).

The development process may be cumbersome. Some of the criticisms associated with standards result from the particular methods by which most standards are developed. For example, the ANSI process includes "consensus on a proposed standard by a group or 'consensus body' that includes representatives from materially affected and interested parties; broad-based public review and comment on draft standards; consideration of and response to comments submitted by voting members of the relevant consensus body and by public review commenters; incorporation of approved changes into a draft standard; and right to appeal by any participant that believes that due process principles were not sufficiently respected during the standards development in accordance with the ANSI-accredited procedures of the standards developer." (139) The designated "consensus body" is required to be balanced with regard to different interests. Consensus does not require unanimity but does require "substantial agreement ... by directly and materially affected interests ... [and] that all views and objections be considered, and that an effort be made toward their resolution." (140) This process, which may require several meetings, ensures that the interests of all involved parties are taken into account, but it can be slow and may require compromises that can lead to more complex standards.

In contrast, the Internet Engineering Task Force (IETF) develops standards through a process that is performed largely online. Interested parties form a working group to identify the scope of the standard and begin developing it. Participation in the working group is completely open to anyone interested, but there is no active attempt to guarantee a balance among different interests. Drafts of the standard are posted online and comments incorporated. Once the group reaches a "rough consensus," defined as agreement by a "very large majority" of the working group, (141) the draft is sent to the Internet Engineering Steering Group (IESG) for independent review by experts. After successfully passing review, the draft may become a standard through some additional steps. According to some observers, the use of a fully open, online process, rough consensus, and independent review results in "cleaner" standards and a more rapid process than the more traditional approach taken by most standards bodies.

Certification

Certification usually refers to a formal approval by some entity, such as a laboratory, that a product, process, or person meets a specified set of criteria. For example, an electrical product may be certified as meeting safety standards. A physician may be certified as meeting a particular level of competency in an area of specialization. The certifying entity may be accredited by a recognized authority such as a government agency or professional association. Accreditation may also refer to the approval of a certified product for use in a particular system (142) or it may refer to the authorization to use a particular information system and accept the attendant risks. (143)

Certification processes exist for both product evaluation and code-of-practice standards. For example, products can be certified under the CC, as discussed above. Other product evaluation certifications have also been developed. For instance, the Technology Group for The Financial Services Roundtable (BITS) runs a security-certification program for products used by the financial services industry. (144) The criteria used follow the general scheme laid out in the CC. For codes of practice, certification is available in many countries, including the United States, under BS 7799 Part 2. The number of those certifications has been increasing substantially, especially in Asia, (145) with more than 800 organizations certified worldwide, although only a few in the United States. (146)

Professional certification is also available from some organizations. For example, the Information Systems Audit and Control Association (ISACA) (147) offers certification for information security auditors and managers, and the International Information Systems Security Certification Consortium (148) offers certification for information security professionals. Such certifications usually require several years of relevant professional experience, successful completion of an examination process, adherence to a code of conduct, and continuing education in the field.

Strengths and Weaknesses of Certification. Certification can be an important component of any attempt to adhere to a set of established standards. That is because it provides a means of independent verification that criteria set by the standards have been met. Many of the criticisms of standards discussed above, and counters to them, can be applied to certification as well.

The strengths and weaknesses of certification can be illustrated by ISO/IEC 17799 and the CC. If a certification were available for ISO/IEC 17799, companies that claim to have adopted it could demonstrate that they have been assessed by an independent, accredited body as conforming to its requirements. However, they would not be free to adapt the standards however they wished to their particular operating situations and needs. A product certified under the CC can be used with confidence in the kinds of environments to which the certification applies. However, the certification process is expensive and time-consuming, increasing the costs of the products and potentially impeding the adoption of newer technologies.

There also does not appear to have been any systematic assessment of the effectiveness of certification under standards such as BS 7799 Part 2 with respect to improving cybersecurity. That may be in part because the certification has been available for only a few years. There are at least two ways that success could be measured and that different standards and methods of compliance could be compared. (149) First, the incidence of security problems (including but not limited to attacks) would be expected to be lower for organizations using the most effective standard and compliance method. That measure may be hard to use as long as organizations are reluctant to reveal security breaches or other problems, as has been reported. (150) Another, more indirect metric would be the relative success of different certifications. Presumably, an organization that finds a particular certification to be effective would be more likely to renew it--or to purchase additional products certified under it--than to switch to another or discontinue use. However, other factors, such as cost, can also influence the relative success of different certification regimes.

Best Practices

The term best practices often refers to strategies, policies, procedures, and other action-related elements of cybersecurity that are generally accepted as being the most successful or cost-effective. Such practices can be identified for virtually any of the elements of a cybersecurity framework, from goals to specific procedures or specifications.

Unfortunately, there does not appear to be any overall agreement on what constitutes a best practice. The term implies that the practice in question has been assessed as being superior to all others, but the basis of such assessments, if provided, usually appears, at best, to be a consensus among experts, rather than a rigorous empirical comparison of alternatives. In fact, it is not uncommon in the literature for a set of "best practices" to be asserted with no description of what criteria were used to identify them as best. Given the vagueness associated with the use of this term, it might be more appropriate to refer instead to commonly accepted or generally accepted practices, at least where there is evidence to that effect. (151)

What is called a set of best practices can vary greatly in content and method of development. At one extreme are standards developed through a well-established methodology, such as the code of practices contained in ISO/IEC 17799 or COBIT. At the other extreme, a set of "best practices" might simply be recommendations from one person published in a newsletter article. Best practices may be developed specifically for one sector or industry. For example, the Network Reliability and Interoperability Council (NRIC) has developed a set of more than 150 cybersecurity best practices for the communications industry. (152) Most of these are fairly general, such as "disable unnecessary services," but some are much more specific. However, they are intended to address classes of problems rather than provide "[d]etailed fixes to specific problems...." NRIC used an "industry consensus" approach to develop them, stressing that a practice is included only after "sufficient rigor and deliberation" including "[d]iscussions among experts and stakeholders" about whether the practice is implemented widely enough, its effectiveness and feasibility, the risk associated with failing to implement it, and alternatives. NRIC proposes that these practices be used as recommendations and not as requirements and that they be adapted to the individual needs of the organization using them. In another example, the ASP Industry Consortium produced a set of white papers, prepared by the consortium's security subcommittee, that include about 25 best practices for network and platform security. (153) The practices described are fairly general, such as "use remote access sparingly." The methodology by which they were developed is not described.

Another group of best practices with relevance to cybersecurity is known as capability maturity models (CMM). Essentially, these are practices, arranged along a hierarchy of maturity levels, designed to help organizations identify the level at which they operate processes for the development of software and other products and to improve those processes by advancing successively to higher levels of maturity. (154) The system has been developed as a joint public-private partnership initiated by the Department of Defense in the 1980s. One example is "cleanroom software engineering"--procedures based on mathematical verification of designs and statistical testing of systems that are designed to produce highly reliable software that has a minimum of errors and vulnerabilities. For applications where security considerations are a priority, techniques have been developed to engineer systems to the level of security corresponding to the specific needs of the application. Such systems are designed with carefully specified requirements and are thoroughly reviewed and tested before implementation. (155)

Best practices would not necessarily be associated with a certification or audit process, so it can be difficult to determine if an organization is in fact conforming to them effectively. However, they generally provide a degree of flexibility and adaptability that may not be available with more formal approaches. Furthermore, they may be easier to update in response to the rapid evolution of technology, cyberspace, and the threat environment.

Guidelines

Guidelines may be thought of as general recommendations relating to elements of cybersecurity. They are not necessarily associated with any particular methodology or criteria, in contrast to standards and (at least in theory) best practices, other than the authority of those making the recommendations. One commonly cited set of guidelines is the Guidelines for the Security of Information Systems and Networks of The Organization for Economic Cooperation and Development, first adopted in 1992 and most recently revised in 2002. The nine basic principles contained in the guidelines are intended to provide a foundation for the development of a "culture of security." The principles focus on the importance of awareness of and responsibility for security, the importance of timely responsiveness to security incidents, the role of ethical considerations and democratic values, the need for risk assessments, security as an essential design element for information systems, the need for comprehensive security management, and the importance of continual review and reassessment. Many of these principles are also reflected in other documents, including ISO/IEC 17799.

Generally Accepted Information Security Principles. GAISP is an attempt to draw together a hierarchical set of principles that have been reviewed by experts in information security and that meet specified criteria. The project was initiated by the Information Systems Security Association, an international, nonprofit association of information security professionals. GAISP consists of "principles, standards, conventions, and mechanisms that information security practitioners should employ, that information processing products should provide, and that information owners and organizational governance should acknowledge to ensure the security of information and information systems." (156) It is intended to provide a basis for self-regulation for the profession, analogous to the Generally Accepted Accounting Principles (GAAP) used by Certified Public Accountants. (157) The hierarchical approach aims to provide guidance that can be applied at various levels within an organization, from executive governance to daily management of security risks.

Basel Principles. The financial services sector has been among the leaders in developing and implementing components of a cybersecurity framework. The Basel Committee on Banking Supervision has released a set of guidelines called Risk Management Principles for Electronic Banking. (158) While seven of the fourteen principles and practices described in the document relate to security controls, (159) the Basel principles are particularly notable for the degree to which they stress the importance of institutional leadership and the management of legal and reputational risk in the context of cybersecurity. For example, the first three principles place responsibility for active oversight of cybersecurity management directly on boards of directors and senior management. The principles relating to legal and reputational risk focus on information disclosure, protection of customer data, including privacy, and continuity of service.

The difference between guidelines and best practices is perhaps not as distinct as the difference between either of those and standards. While guidelines may provide even greater flexibility and adaptability than best practices, their general lack of specificity may make effective implementation more challenging. As with best practices, guidelines would not necessarily be associated with a certification or audit process, so it might be difficult to determine if an organization is in fact conforming to them effectively.

Benchmarks and Checklists

Fundamentally, a benchmark is simply a reference point against which performance is measured. It might be used as a goal, or it might be considered a level of minimum acceptable performance. The latter might also be called a baseline. With respect to computers, a benchmark often refers to a test used to compare one or more aspects of performance of a system (such as processing speed) with other systems or with a specified level of function.

With respect to cybersecurity, the terms benchmarks and checklists are more often used to denote sets of security configurations and settings that are recommended to achieve a specified level of security. One well-known set provides minimum security configurations for the Microsoft Windows 2000 operating system. Developed through a consensus process involving federal agencies and private organizations, (160) it was released by the Center for Internet Security (CIS) in 2002. (161) Security configuration benchmarks have also been developed for other operating systems, application software, and some hardware. (162) NIST has developed a program to devise security checklists for software and hardware used by federal agencies. (163) The Defense Information Systems Agency (DISA) (164) and NSA also produce configuration guidance documents.
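As a simple illustration of how such a checklist can be applied mechanically, the sketch below compares hypothetical system configurations against a required baseline. All setting names, systems, and values are invented for illustration and are not drawn from the CIS, NIST, DISA, or NSA documents cited above.

```python
# Hypothetical checklist conformance check; all names and values
# are illustrative, not taken from any published benchmark.

required_baseline = {
    "password_min_length": 12,       # numeric settings: minimum value
    "unnecessary_services_disabled": True,   # boolean settings: exact match
    "audit_logging_enabled": True,
}

systems = {
    "web01":  {"password_min_length": 12,
               "unnecessary_services_disabled": True,
               "audit_logging_enabled": True},
    "db01":   {"password_min_length": 8,     # too short
               "unnecessary_services_disabled": True,
               "audit_logging_enabled": True},
    "mail01": {"password_min_length": 12,
               "unnecessary_services_disabled": False,  # violates baseline
               "audit_logging_enabled": True},
}

def conforms(config: dict) -> bool:
    """True if every baseline setting is met: booleans must match
    exactly; numeric settings must meet or exceed the minimum."""
    return all(config.get(k) == v if isinstance(v, bool)
               else config.get(k, 0) >= v
               for k, v in required_baseline.items())

compliant = [name for name, cfg in systems.items() if conforms(cfg)]
metric = 100 * len(compliant) / len(systems)
print(f"Baseline compliance: {metric:.0f}% ({compliant})")
```

A real checklist engine would read settings from the operating system rather than from a hand-built dictionary, but the comparison logic is the same: each setting is evaluated independently, and a system conforms only if every required setting is satisfied.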

Producing an effective set of code-of-practice benchmarks is arguably more difficult than producing technical configuration guidance. One example of a set of code-of-practice benchmarks was developed by the Human Firewall Council, a consortium of information security professionals. Called the Security Management Index, it is now managed by ISSA. (165) Based on ISO 17799, it permits organizations to perform self-assessments, via completion of a survey, to determine how well they conform to the objectives in the standard in comparison to other organizations that have participated.

Benchmarks and checklists can be an important element of a cybersecurity framework but are by their nature very specific and limited in scope. Also, some confusion may result from the occasional use of the term as a synonym for standards.

Audits

Auditing is often thought of as a formal examination of financial or accounting records, but it is also used in a broader sense, such as to denote independent examination of an organization's adherence to established controls, policies, or legal requirements. (166) An organization may undergo, for example, a security audit of its information systems. That may involve an examination of hardware, software, procedures, configurations, environment, and user practices. An audit may be performed by the organization itself, or it may be performed by an independent auditor, usually a firm that specializes in accounting and auditing. Audits usually follow a set of established practices and procedures, such as the Statement on Auditing Standards No. 70 (known as SAS-70) issued by the Auditing Standards Board of the American Institute of Certified Public Accountants (AICPA). (167) Information security audit guides have also been developed for government agencies. (168) An audit usually involves testing of controls and results in a report that includes the opinion of the auditor about the adequacy of the controls examined, with recommendations for improvements. It does not result in a certification of conformance to a standard. However, auditors may be expected to conform to established standards in the conduct of an audit. (169)

Auditing methods and requirements are most highly developed with respect to financial and accounting processes. As a result, some audits might tend to underemphasize aspects of cybersecurity that are not related to those processes. The results of audits might also vary significantly among different auditors. The Sarbanes-Oxley Act of 2002 (P.L. 107-204) requires audits of financial controls, including information security controls, for publicly traded companies.

Training and Education

If, as some observers believe, people are the most important element of effective cybersecurity, then training and education should be an important means of leverage to improve cybersecurity. Inadequate cybersecurity practices by users, IT personnel, and even corporate leadership have been widely cited as a major vulnerability. (170) The NSSC lists national cyberspace security awareness and training as one of its top five priorities. Elements include a comprehensive national awareness program and support for training, education, and professional certification. (171) The National Cyber Security Alliance (NCSA) has been established as a public-private partnership of government agencies, corporations, and nongovernmental organizations to promote cybersecurity education and awareness. (172)

Many factors can influence the effectiveness of training and education to enhance cybersecurity. For example, programs and materials vary in quality, and a poorly designed program is unlikely to provide significant improvements in cybersecurity. In addition, training may not be able to compensate sufficiently for a poor system design.

Enterprise Architecture

Effective cybersecurity needs to focus not only on the individual elements of an organization's information technology but also on how they interact. The term enterprise architecture (EA) is increasingly used to refer to the components of an organization and how they work together to achieve the organization's objectives. Specific definitions and usage vary. EA is often used specifically to refer to the information technology component of the architecture, and especially to the interoperability of those components. It is also used to denote a blueprint of an organization's business operations and the technology required to perform those operations. (173) The federal government is developing a "business-driven" EA to improve interoperability and services. (174)

An organization can characterize its EA to assist in planning and development of its information technology. Such a characterization can provide an opportunity to make security an integral part of EA. This component of EA is sometimes called the security architecture. (175) However, even the initial characterization of an organization's EA can be time-consuming and expensive, and the costs of reengineering to build in security may be prohibitive for many organizations. In addition, the need to build a business case to justify IT investments, which is often considered important to the EA approach, may create barriers to improving security, given the traditional difficulties of demonstrating a financial return on investments in security.

Risk Management

The approach embodied in defense-in-depth recognizes that security cannot be perfect but rather reduces the risk and impact of a successful attack or other breach. Such reduction can be captured through risk management, which involves identifying, controlling, and mitigating threats, vulnerabilities, and the impacts of security breaches. The steps in effective risk management include assessment of risks, steps to mitigate them, and continuous evaluation and adjustment. The approach often involves cost-benefit analysis to help determine optimal mitigation steps. Mitigation may involve accepting the risk as a cost of business; avoiding the risk associated with a particular activity, for example by not engaging in it; limiting the risk through effective use of controls; and transferring the risk, for example through insurance. (176) Some insurance companies have begun to offer cybersecurity policies, under which companies can transfer some of their risks in the event of a successful attack. Carriers may require clients to implement specified security practices to qualify for insurance. However, in the absence of reliable actuarial data about the risks and costs of cyberattacks, it may be difficult for carriers to set appropriate insurance rates.
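The cost-benefit analysis mentioned above is often formalized as annualized loss expectancy (ALE): the expected cost of a single incident multiplied by its expected annual frequency. The sketch below compares two hypothetical mitigation options--a control that reduces incident frequency and an insurance policy that transfers most of the loss. All dollar figures and rates are invented for illustration.

```python
# Illustrative annualized loss expectancy (ALE) comparison for
# candidate mitigations. All figures are hypothetical.

def ale(single_loss_expectancy: float, annual_rate: float) -> float:
    """ALE = SLE x ARO: expected annual loss from one threat."""
    return single_loss_expectancy * annual_rate

# Baseline: a breach costing $500,000, expected once every 2 years.
baseline = ale(500_000, 0.5)                 # $250,000 per year

# Option A: a $60,000/yr control cutting frequency to once in 10 years.
residual_a = ale(500_000, 0.1)               # $50,000 per year
net_benefit_a = baseline - residual_a - 60_000

# Option B: insurance at a $120,000/yr premium, leaving a
# $100,000 uninsured loss per incident (frequency unchanged).
residual_b = ale(100_000, 0.5)               # $50,000 per year
net_benefit_b = baseline - residual_b - 120_000

print(f"Baseline ALE:          ${baseline:,.0f}")
print(f"Option A net benefit:  ${net_benefit_a:,.0f}")
print(f"Option B net benefit:  ${net_benefit_b:,.0f}")
```

The calculation is only as good as its inputs, which illustrates the point made in the text: without reliable actuarial data on the frequency and cost of cyberattacks, both the organization's mitigation choice and the insurer's premium rest on uncertain estimates.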

To be effective, risk management requires accurate risk assessment. However, many cybersecurity risks may be difficult to assess, for reasons discussed earlier. In addition, a risk management approach may lead an organization to accept risks for which the potential impacts of security events are low, regardless of external impacts. (177) Thus, risk management is not likely to sufficiently address cybersecurity problems associated with the commons properties of cyberspace discussed earlier in this report.

Metrics

Whatever approaches are used to improve cybersecurity, measuring their success would appear to be essential to determining how effective they are and to making improvements. However, fundamental problems exist with measuring success in security. Seemingly, the most appropriate measure is the number of successful attacks, but in fact, attacks--especially the kind of major attack for which effective defense is critical--may be comparatively uncommon, so that absence of a successful attack may not indicate effective security. (178) In addition, attackers often take steps to avoid detection, so an absence of detected attacks may in fact be a measure of poor rather than good security. This conceptual problem might be addressed through the use of proxy measures, such as how well technology, policy, and activities conform to certain accepted benchmarks, as well as the use of proficiency testing, such as blind "red team" attacks or other penetration testing.

Not only is it difficult to identify appropriate metrics for cybersecurity, there are also risks of distortions that may be associated with any particular metric. Virtually any given metric will measure only one or a limited number of aspects of a goal. If, however, the limitations of the metric are not understood, attempts to use it to optimize security can lead to distortions. This appears to be a general concern. (179) However, some argue that using even distorted metrics can be beneficial if the process of measuring them focuses attention on problems or deficiencies and leads to correction.

Metrics relating to the effects of security events are called impact metrics; those relating to the delivery of security services are called effectiveness or efficiency metrics; and those relating to the execution of security policies are called implementation metrics. NIST has published guidelines on such metrics, to assist agencies in complying with federal requirements. (180) The document does not urge the adoption of any specific set of metrics, although it does provide examples. Instead, it recommends that the metrics chosen use data that can be realistically obtained, that measure existing, stable processes, and that facilitate the improvement of security implementation. The kinds of metrics that can be effectively gathered will depend on the level of maturity of the security program. Programs at low levels of maturity will of necessity be limited to using implementation metrics. Impact metrics can be effective for organizations that have mature security programs, with fully integrated procedures and controls. (181)
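The distinction among the three metric types can be illustrated with a hypothetical quarterly rollup. The metric names, targets, and values below are invented, and the assumption that implementation and effectiveness metrics are "higher is better" while impact metrics are "lower is better" is a simplification for the example.

```python
# Hypothetical quarterly security-metric rollup; all data invented.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    kind: str      # "implementation", "effectiveness", or "impact"
    value: float
    target: float

quarterly = [
    Metric("systems patched within 30 days (%)", "implementation", 82, 95),
    Metric("incidents contained within 24h (%)", "effectiveness", 90, 90),
    Metric("downtime from security events (hours)", "impact", 14, 8),
]

# For implementation and effectiveness metrics, higher values are
# better; for impact metrics, lower values are better.
better_high = {"implementation", "effectiveness"}

def target_met(m: Metric) -> bool:
    return m.value >= m.target if m.kind in better_high else m.value <= m.target

for m in quarterly:
    status = "met" if target_met(m) else "MISSED"
    print(f"{m.kind:15s} {m.name}: {status}")
```

Even a rollup this simple shows why the maturity of the program matters: an organization can count patched systems (an implementation metric) long before it can reliably attribute downtime to security events (an impact metric).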

Economic Incentives

Implementation of cybersecurity measures may involve substantial costs and is therefore sensitive to market forces and other economic factors. If sufficient economic incentives exist for improving cybersecurity, then organizations are likely to make the needed investments in the absence of government regulation or other drivers. A concern often raised is that economic incentives are frequently insufficient and that, in fact, significant counterincentives exist.

The perceived inadequacy of incentives for cybersecurity can be seen as a form of market failure--a kind of economic inefficiency. (182) There are several lines of evidence supporting this view. For example, it can be difficult for law enforcement officials to arrest and prosecute hackers if companies are unwilling to provide information on cyberattacks, yet a company risks suffering significant reputation costs if that information leads customers to conclude that the company's information systems are not sufficiently secure. In addition, investments in cybersecurity cannot easily be analyzed in terms of return on investment, since they do not contribute to income in a measurable way. (183) Therefore, companies may be reluctant to make the necessary investments. Also, impacts of compromised systems may reach far beyond the system where the compromise occurred (184)--the interconnectedness of cyberspace has made it to a significant extent a commons, with associated economic externalities.

The widespread adoption of the kinds of leverage to improve cybersecurity discussed above may be doubtful without changes in the current incentive structure. Such changes could arise from several sources. Among them are increases in public demand for cybersecurity, changes in expected behavior within a sector regarding investment in cybersecurity, (185) public-private partnerships, and regulation or other action by governments. While not all such factors are themselves economic in nature, they can clearly affect the economic incentive structure. For example, a company that does not respond to expectations from its peers for improved cybersecurity may suffer a significant reputation cost. Similarly, a company that is found to violate government requirements may suffer both reputation costs and direct punitive action or may be held financially liable for damages.

Eric A. Fischer

Senior Specialist in Science and Technology

Resources, Science, and Industry Division
COPYRIGHT 2005 Congressional Research Service (CRS) Reports and Issue Briefs

Article Details
Title Annotation:Creating a National Framework for Cybersecurity: An Analysis of Issues and Options
Author:Fischer, Eric A.
Publication:Congressional Research Service (CRS) Reports and Issue Briefs
Article Type:Column
Geographic Code:1USA
Date:Feb 1, 2005