
Locking up open systems.

Challenge is inherent in securing open systems. The term open system implies that information is easily accessible to users, while security implies that access should be limited and controlled. Can the two concepts work together? They have to, and the security part has to be relatively invisible until needed.

Security, privacy, and integrity are related terms that apply to the policies and mechanisms that protect us, our sources, and our assets from ourselves or from outside intruders. In this article, security refers to the protection of information assets to prevent exploitation through interception, unauthorized access, or other intelligence threats. Privacy refers to the nondisclosure of information to people without an authorized need to know, and integrity refers to the assurance that information has not been changed or modified.

The underlying meanings of the terms are the same for industry and government. The emphasis, however, is different. The Department of Defense (DoD) is more concerned with the security or privacy of information as it pertains to national security. Banks are more concerned with the integrity of the transfer of information than the privacy of that information. Intelligence sources are more concerned with the privacy of the source of the information as opposed to the security or privacy of the information itself.

Every organization, whether in government or industry, has valuable information that requires protection. Sensitive or proprietary information can be an organization's lifeblood. The loss of that information could hamper day-to-day operations, and if the information is stolen or compromised, the result could be disastrous. A prudent organization's defense is to identify sensitive or proprietary information, quantify the value, and develop an economical, enforceable security policy to protect it.

In today's high-tech society, massive amounts of sensitive or proprietary information are electronically stored in computers, making computer security a top priority. Networked computers add to the threat by creating the potential for unknown or unauthorized users to gain access to your sensitive or proprietary information. Since both computer literacy and worldwide networks are on the rise, the likelihood of unauthorized penetration, malicious modification or destruction of information, and virus generation are magnified.

The problem is that most general-purpose computer operating systems, particularly MS-DOS and UNIX, have inherent weaknesses that can be exploited by a user, programmer, criminal, or hacker.

Security means different things to different people. The level of security in a military environment may be too restrictive for a small commercial company. However, the small commercial company understands that competition and privacy create the need for security.

Security is viewed by some as a mechanism for defeating spies and competitors who would love to know and exploit what you or your organization are doing. But remember, a good security system also protects against accidental disclosures.

Computer security means denying unauthorized persons access to information. A total security policy matches the need-to-know requirements of a user to the sensitivity of the information he or she is allowed to access. A secure trusted computing base (TCB)--a system platform (hardware, software, firmware) that has been certified at some level of functionality and correctness of operation by the National Computer Security Center (NCSC)--addresses some security requirements, but it does not provide total security by itself.

The desire for rapid access to shared information has increased the need for secure and reliable communications using local area networks (LANs) and wide area networks (WANs).

In the past, multiple dedicated networks and systems were often necessary to protect sensitive information from unauthorized access. Purging systems of residual data before reuse was required whenever common equipment had to be used for different levels of security or data sensitivity. That purging was called periods processing.

Systems were operated in a protected mode in which all users accessing the system were cleared to the highest allowable security level no matter what their need-to-know requirements were. That method was called systems high mode. Those techniques are expensive, restrictive, and inefficient.

Multilevel security (MLS) systems, working within the framework of a total security policy, can provide more secure and cost-effective methods of managing information. The ability to use a single system to process information at more than one security- or data-sensitivity level by users with different need-to-know requirements is highly advantageous. That method of operation, known as multilevel security mode, greatly increases flexibility and offers distinct system cost benefits.

Security or data-sensitivity levels can be defined by DoD military standards, such as unclassified, confidential, secret, and top secret. The commercial sector, however, is probably more comfortable with such levels as company confidential, proprietary, sensitive, personnel, and payroll.

No matter what the name of the level, different levels need to be protected from access by unauthorized users. Individuals who work with payroll data, for example, have a clearance that permits access to payroll information. That clearance may or may not permit access to personnel information, however. Access to classes of information can be further controlled on most systems to include or exclude the ability to read, list, write, modify, create, and execute.

DoD has written exhaustive documentation to assist managers in determining the scope of their security problem and the criteria for resolving or minimizing risk. That documentation is called the rainbow series; each document is referred to by its color rather than its name.

The yellow book (CSC-STD-003/004-85) addresses computer security requirements and a formula for measuring users' clearance or need to know against the sensitivity of the data to be protected. The risk index derived from the formula determines the class of TCB system required to resolve the threat.

The DoD manager can then use the orange book (DoD 5200.28-STD) to determine the criteria for that class of TCB. The manager can use the red book (NCSC-TG-005, Version 1) to interpret the orange book's criteria for resolving networking problems or the light blue book if it is necessary to add a computer security subsystem to an existing computer.

DoD has also published documents on encryption for authenticating users and ensuring information privacy and integrity; on criteria for eliminating electronic emanations to prevent electronic eavesdropping; and on most other forms of security. Federal laws describe the penalties for disclosing sensitive information to people without an authorized need to know. The commercial world, however, does not have the same push to implement security.

NCSC's criteria for evaluating trusted computer systems are defined in the Trusted Computer System Evaluation Criteria (TCSEC), the orange book. Those criteria, shown in Exhibit 1, define six classes of trusted systems, from class C1, which offers minimal protection, to class A1, which is called verified design. In addition, class D contains systems with no protection.

Those criteria address the hardware, firmware, and software that make up the TCB, but they do not address applications on the TCB. The criteria evaluate four main aspects of TCBs: security policy, accountability, assurance, and documentation. Each TCB class builds on the evaluation criteria contained in the previous class. For example, Class C2 contains all the criteria of class C1 and introduces object reuse and audit.

The level of trust required for an application or system is a measure of risk or minimum user clearance weighed against the maximum sensitivity of data contained on the system. That measurement is illustrated in Exhibit 2.

In a closed-system environment (a system in which the applications are adequately protected against the insertion of malicious logic), an unclassified user accessing a system containing top secret data would require a TCB with a level of trust of A1 to ensure that the unclassified user did not view the top secret data.

If the minimum clearance of users on that same system was secret, then the level of trust required for the TCB would be B2 (since the risk of data compromise has been reduced). By applying guidance from the yellow book, it is possible to determine the class of system necessary to protect sensitive information from registered users who do not have a need to know the most sensitive information.
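The yellow-book reasoning above can be sketched in a few lines. The numeric ratings and the mapping from risk index to TCB class below are illustrative assumptions for a closed environment, not the official CSC-STD-003-85 tables:

```python
# Simplified sketch of the yellow-book idea: the gap between the maximum
# data sensitivity on the system and the minimum user clearance drives
# the class of TCB required.  Ratings and mapping are assumptions.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

# Assumed mapping from risk index to minimum TCB class (closed environment).
RISK_TO_CLASS = {0: "C2", 1: "B2", 2: "B3", 3: "A1"}

def required_tcb_class(min_user_clearance, max_data_sensitivity):
    """Return an illustrative minimum TCB class for the clearance gap."""
    risk_index = max(0, LEVELS[max_data_sensitivity] - LEVELS[min_user_clearance])
    return RISK_TO_CLASS[risk_index]

print(required_tcb_class("secret", "top secret"))        # B2, as in the text
print(required_tcb_class("unclassified", "top secret"))  # A1, as in the text
```

Both results match the examples in the text: reducing the clearance gap reduces the class of TCB, and therefore the cost, of the system required.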

To determine the appropriate security policy for your organization's information assets, ask yourself the following questions:

* What is your organization trying to protect? From what? From whom?

* How valuable are your information assets?

* Where is your system or network most vulnerable?

* What measures can be taken to protect it?

* Is the physical environment adequate?

* Are personnel and visitor screening practices adequate?

* Do visitors register at your facility?

* Does your organization perform character and background checks on applicants before hiring them?

* Can the cost of acting on security requirements be estimated?

* Can your organization live with the cost and scrutiny of a security breach?

Answering questions such as these helps you understand what the threats are, what you can do to protect your company, and what the costs of action and inaction are.

Before determining what capabilities are available or soon will be, make sure you are familiar with the following terms so you can better understand which of those capabilities may be of help.

Least privilege. This principle requires that each registered user be granted the most restrictive set of privileges needed to perform authorized tasks. Applying the principle limits the damage that can result from accident, error, or unauthorized use. In the UNIX world, for example, an employee responsible for system backups must have super user privileges. As a super user, however, the employee has access to all system resources, including the privilege to register new users and grant them privileges, which is far more than the backup task requires.

Discretionary access. This defines a method of determining who has access to the information. You can liken it to the "TO:" or "CC:" on a memo. As the owner of a file of information, you determine which persons and what groups have access to your information and what that access involves.

In the UNIX world, you can give people or groups read privileges, write privileges, and execute privileges. UNIX enforces this discretionary access policy. Remember, however, that a super user has all privileges, including the privilege to bypass discretionary access.
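The UNIX permission bits behind this discretionary policy can be examined directly. A minimal sketch, in which the mode value 0o640 is just an example:

```python
import stat

# UNIX expresses discretionary access as permission bits: the owner of a
# file chooses read/write/execute rights separately for the owner, the
# owner's group, and everyone else.
mode = 0o640  # owner: read+write, group: read, others: nothing

owner_can_write = bool(mode & stat.S_IWUSR)
group_can_read  = bool(mode & stat.S_IRGRP)
others_can_read = bool(mode & stat.S_IROTH)

print(owner_can_write, group_can_read, others_can_read)  # True True False
```

Note that these bits constrain ordinary users only; as the text warns, the super user bypasses them entirely.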

Labeling. This allows the system to tag all information assets and users with labels that describe the sensitivity of the information and the need-to-know permissions of users.

Mandatory access. This is the ability of a system to match user access rights with the sensitivity of the information. If the user's sensitivity does not equal or exceed the sensitivity of the information, the system will deny access even if the owner of the information has granted discretionary access permission to that user. Mandatory access depends on sensitivity labeling of users and data to determine access rights.
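A mandatory access check of this kind reduces to a label comparison. Here is a minimal sketch in the read-down style of the Bell-LaPadula model, with illustrative levels and without the need-to-know categories a real system would also compare:

```python
# Mandatory access: a user may read an object only if the user's
# sensitivity level dominates (equals or exceeds) the object's label.
# The system enforces this regardless of any discretionary permission
# the owner may have granted.  Levels are illustrative.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def may_read(user_level, object_level):
    return LEVELS[user_level] >= LEVELS[object_level]

print(may_read("secret", "confidential"))  # True: secret dominates confidential
print(may_read("secret", "top secret"))    # False: denied even with DAC permission
```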

Accountability. This describes the ability of the system to hold users accountable for their actions and usually involves the ability to audit system-unique events for real-time and later analysis. An audit provides an account of who is on the system, how long he or she has been on, what system resources he or she has used, and other similar information. It also provides an account of users who abuse or attempt to abuse their privileges on the system.

Accountability has been around in time-sharing environments for quite a while, as it provides the mechanism for billing clients for system services. Remember, if you don't have enough information about users on your system to bill them for the services the system provides, you probably don't have a very secure system.
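The kind of audit trail described above can be sketched as an append-only log of records; the field names here are assumptions, not any particular system's audit format:

```python
import time

# Sketch of an audit trail: who did what, when, and to which resource.
# Denied accesses are recorded too, so abuse attempts leave a trace.
audit_log = []

def audit(user, event, resource=None):
    audit_log.append({
        "time": time.time(),   # timestamp for real-time or later analysis
        "user": user,
        "event": event,
        "resource": resource,
    })

audit("pat", "logon")
audit("pat", "open", "/payroll/march.dat")
audit("pat", "access-denied", "/personnel/reviews.dat")

print(len(audit_log))  # 3
```

The same records that support billing in a time-sharing environment support the security analysis the text describes, which is why the two capabilities tend to travel together.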

Assurance. This term is used to describe how comfortable the organization is with the security mechanisms on its systems. Class B2 through class A1 in the orange book are oriented more toward assurance and documentation of security features than to additional security functions.

Integrity. This assures users that the information has not been modified or changed.

Three especially helpful tools for maintaining security are trusted computing bases, encryption, and smart cards.

Trusted computing bases. Activity in the TCB world is considerable. Many vendors are involved in certification efforts with NCSC and the National Institute of Standards and Technology to offer TCBs that provide various levels of protection. Many vendors have built TCBs or are currently in the evaluation process.

Encryption. Many encryption products and algorithms are available today that operate externally or internally and provide a multitude of security services. The most popular encryption algorithm in the commercial world is the Data Encryption Standard (DES). DES is the adopted standard for the banking industry and was approved for use in the commercial sector as a type II capability. Type I is reserved for classified government information.

Chip sets also have been developed under the Commercial COMSEC (communications security) Endorsement Program (CCEP). One chip set provides encryption that protects communications between peer entities on a LAN.

Encryption offers the following security benefits:

* Privacy of information. Information scrambled by encryption techniques is tough to unscramble unless you have the same algorithm and share the private key or a key that matches the key that was used to scramble the information. Sensitive information that has been encrypted can be communicated across public networks with the same degree of security you would have on a private, protected network.

* Integrity. This is a byproduct of encryption. When information is encrypted with your key and decrypted at the receiving end with the private version of your key, you are assured that the information has not changed because the unscrambling of the data would not work if the information had changed. This integrity is in addition to the integrity of the communications protocol used to transport the data.

* Digital signature. This is a technique used to compute a "checksum" of the information, encrypt the checksum, and attach the encrypted checksum to the information. A checksum is another integrity test; it is computed from the contents of the information itself, so any change to the information produces a different checksum.

So by comparing the original checksum with your computation, you can detect any changes in the information. This technique is useful, for example, to ensure that information stored under the control of an untrusted data base manager has not changed or that the sensitivity label attached to the information has not been changed.
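The checksum-and-encrypt idea can be sketched with modern library primitives. In this illustration an HMAC with a shared secret stands in for encrypting the checksum with a private key; the key and messages are made up for the example:

```python
import hashlib
import hmac

# Digital-signature sketch: compute a checksum (here a SHA-256 hash) of
# the information and bind it to a key, so only a key holder can produce
# a matching value.  Key and message are illustrative.
secret_key = b"shared-secret"            # assumed to be exchanged out of band
message = b"PAY $100 TO ALICE"

signature = hmac.new(secret_key, message, hashlib.sha256).hexdigest()

# The receiver recomputes the checksum over what actually arrived; any
# change to the information yields a different value, so tampering shows.
tampered = b"PAY $900 TO ALICE"
check = hmac.new(secret_key, tampered, hashlib.sha256).hexdigest()

print(hmac.compare_digest(signature, check))  # False: the change is detected
```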

Laws are being considered that would equate a digital signature with a personal signature. Digital signatures are required in DoD so that when a message arrives you can be assured that it came from the person it was supposed to come from and that the contents of the message have not been modified or changed.

Smart cards. When a user logs onto a system, the first step is to go through a log-on process that authenticates him or her as a legitimate user. This process usually involves a user ID and a password associated with that ID.

Smart cards take the process one step further. Some versions ask the user to furnish a fixed number available on the card and a random number on the card that changes every 30 to 60 seconds.

Inside the card is a microprocessor that generates the random number. The number is based on the same time algorithm used at the central system to validate the random number given by the user. The central system can usually validate up to 4,096 separate smart cards.
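The shared time algorithm can be sketched as follows. This is a TOTP-style illustration of the idea, not the algorithm of any particular card vendor, and the secret and step size are assumptions:

```python
import hashlib
import hmac
import struct
import time

# The card and the central system share a secret and a clock, so both
# can derive the same short-lived number independently.
def token(secret: bytes, at: float, step: int = 60) -> int:
    counter = int(at // step)  # advances every `step` seconds
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    return int.from_bytes(digest[:4], "big") % 1_000_000  # six-digit code

secret = b"card-1234-secret"
now = time.time()
print(token(secret, now) == token(secret, now))  # True: both sides agree
```

Within the same time step the card and the host compute identical codes; once the step rolls over, the code changes, which is what defeats replay of an overheard number.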

The user is thus validated by something in his or her possession, something in his or her head (the user ID and password), and a profile on the system. The profile details all of the user's access rights and privileges and controls the user's access to system resources. This process is extremely valuable for users who access the system by dialing up on public networking services.

Some smart card implementations are placed in the terminal during the connection. The algorithm on the card is used to encrypt and decrypt information exchanged between the user and the system, providing a degree of security and integrity. The technology can also be applied to parking privileges, physical access to specific facilities, and guard tours.

Also consider the following techniques for your computer security program:

* Limiting terminal access. If a super user can operate only at certain terminals inside the facility, both inside and outside unauthorized users have another hurdle to overcome in attempting to achieve super user status. Think about partitioning the super user into multiple functions so that administration, security, and operations are subsets.

* Auditing. Multiple unsuccessful log-on attempts could represent unauthorized access attempts. Knowledge that events are audited prevents some illegal access attempts.

* Limiting log-on attempts. Maybe your system should limit the number of successive unsuccessful log-on attempts. Some unauthorized users have determined valid IDs and passwords through repeated attempts on the system.
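Such a limit can be sketched as a per-user failure counter; the threshold of three attempts and the in-memory account store are illustrative assumptions:

```python
# Lock an account after too many successive failed log-on attempts,
# which defeats guessing passwords by repetition.
MAX_ATTEMPTS = 3
failures = {}   # user ID -> consecutive failed attempts
locked = set()

def try_logon(user, password_ok):
    if user in locked:
        return "locked out"
    if password_ok:
        failures[user] = 0          # success resets the counter
        return "logged on"
    failures[user] = failures.get(user, 0) + 1
    if failures[user] >= MAX_ATTEMPTS:
        locked.add(user)            # further attempts are refused
        return "locked out"
    return "try again"

for _ in range(3):
    print(try_logon("guest", password_ok=False))
print(try_logon("guest", password_ok=True))  # still locked out
```

A real system would pair the lockout with the auditing technique above, so the repeated failures are also recorded for review.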

Remember that a comprehensive security policy is multifaceted. It should cover all aspects of your operations--TEMPEST (Transient Electromagnetic Pulse Emanations Standard) requirements, physical security, protection and privacy of communications, locks, security officers, alarms, cameras, and personnel screening practices, as well as computer systems. And it should be tailored to provide the level of protection necessary at a reasonable cost.

Paul Crawford is director of new business applications for the federal systems operation of HFSI in McLean, VA.
COPYRIGHT 1992 American Society for Industrial Security
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Title Annotation: Computer Security
Author: Crawford, Paul
Publication: Security Management
Date: Feb 1, 1992
