Programming for crisis control.


CRISES OF ALL KINDS - PRODUCT tampering, environmental accidents, terrorism - are becoming more frequent and larger in scope. Large-scale crises are partly the result of unintended and misunderstood interactions between complex technologies, human limitations in thought and action, and limited organizational response mechanisms and planning. Control mechanisms (such as high-speed communications systems) are not currently up to the task of managing the complex technologies and systems they are supposed to oversee. In many cases, the potential for large-scale disaster is built into the very design of complex technologies (Bhopal, Chernobyl, Exxon Valdez, the 1987 stock market crash). (1)

Appropriate technologies help control other complex technologies, although nothing can ever substitute completely for proper human and organizational control mechanisms. In fact, technological controls must be integrated with human and organizational controls, or the control systems themselves can lead to crises. Crisis managers must become familiar with a whole new range of control systems and mechanisms if they are to keep pace with the new and expanding field of crisis management (CM).

The University of Southern California Center for Crisis Management (CCM), founded three years ago, conducts applied research that will lead to new practical tools, concepts, and frameworks for CM. The tools, concepts, and frameworks must be of direct relevance to practicing CM professionals. To attain its objectives, CCM has major corporate sponsors that underwrite its research programs and share directly in the tools that result from those programs. None of the results described in this article would have been possible without the research programs of CCM.

CCM is convinced that personal computers (PCs) have a fundamental role to play in CM. The modern PC can collect basic information pertaining to CM from many persons throughout an organization. It can speed up the computation and comparison of basic CM data between individuals, operating divisions, levels of management, and workers. This article describes a series of computer programs that the author wrote to aid CM professionals and develop the field of CM.

THUS FAR, THREE BASIC TYPES OF programs for CM have been developed: the onion model, the crisis portfolio model, and profile comparison. The first computer software program for CM was the onion model (OM). Essentially, the OM prints out a crisis profile (see Exhibit 1) across four key dimensions that CCM's research has determined to be important to a successful program of CM. (2) That is, the OM performs an assessment on each of the four levels or factors. For each factor, it shows whether the organization is in the safety, question mark, or danger zone. The profile is formed by connecting the scores on each factor with a straight line.

The onion model is so named because each of the levels is like a layer of an onion (see Exhibit 2). The outer layers are factors that influence effective CM and whose operation and existence can be readily observed. The inner layers are less observable. Exhibits 3, 4, 5, and 6 show the detailed items, issues, and concerns that make up each of the four factors.

Exhibit 3

Formal CM Actions and Policies

- Do formal crisis plans exist for at least seven crises?
- Are the crisis plans integrated?
- Do formal tracking and early warning systems for potential crises exist?
  - Do they exist only for a limited set of crises?
  - Do they exist for a broad, comprehensive set of crises?
- Do formal probing mechanisms for detecting weaknesses and breakdowns before they occur exist?
- Are well-tested containment mechanisms in place?
- Are short-term, well-tested business resumption mechanisms in place?
- Does the organization have clear plans and policies for short-term business resumption?
- Does the organization have clear plans and policies for long-term business resumption?
- Does the organization have mechanisms in place that will allow it to learn from its past crises and to redesign its organization if necessary?
- Does the organization engage in short-term versus long-term training and simulation exercises for CM?
- Does the organization have a formal crisis portfolio?
  - Are the crisis plans comprehensive, or do they merely cover a narrow range of crises?
  - Does the organization realize that crises can cascade, that is, set off a chain reaction of other crises?

Exhibit 4

Outcome and Organizational Structural Variables That Affect CM

- Organizational structure
  - Are the functions that must be represented on a crisis management team (CMT) clearly identified?
  - Have the roles that people will need to assume on a CMT or during a crisis been clearly identified and defined?
  - Have the CMT members practiced their roles and functions?
  - Have they worked out any overt sources of conflict between them?
  - Is there a clearly identified team leader or facilitator?
- Flexibility
  - Does the organization have the ability to vary from its standard operating procedures to those that will be required during the heat of a crisis?
- Work roles
  - How closely do the normal work roles match those that are required during a crisis?
- Resource sharing
  - Is the organization able to assemble quickly the resources it will need during a crisis?
  - Have the barriers to sharing resources across units been identified and overcome?
- Information sharing
  - Is the organization able to share needed information across groups, divisions, and teams?
- Organization legitimization
  - Does CM have clear, strong support from top management?
- Interorganization cohesion
  - Does the organization have a history of sharing personnel, commitment, support, and energy across divisions and boundaries?
- Opportune surveillance
  - Does the organization have early warning systems that warn of impending crises as well as potential opportunities?
  - Does the organization use information on potential problems and crises creatively to foster innovation?
  - Does the organization use consumer 800 numbers creatively to establish positive contact with consumers?

Exhibit 5

Faulty Organizational Assumptions and Beliefs

- The fallacy of size: Our size will protect us.
- The fallacy of protection and resource abundance: Another entity will come to our rescue or absorb our losses.
- The fallacy of excellence: Excellent, well-managed companies do not have crises.
- The fallacy of location: We don't have to worry about terrorism in the United States.
- The fallacy of immunity or limited vulnerability: Certain crises only happen to others.
- The fallacy of misplaced social responsibility: Crisis management is someone else's responsibility.
- The fallacy of unpredictability: It is not possible to prepare for crises since they are unpredictable.
- The fallacy of cost: Crisis management is not warranted since it costs too much.
- The fallacy of negativism: Crises are solely negative in their impacts on an organization.
- The fallacy that the end justifies the means: Business ends justify high risks.
- The fallacy of discouraging bad news: Employees who bring bad news deserve to be punished.
- The fallacy of luxury: Crisis management is a luxury.
- The fallacy of quality: Quality is achieved through control, not assurance.
- The fallacy of fragmentation: Crises are isolated.
- The fallacy of reactiveness: It is enough to react to crises once they have happened.
- The fallacy of experience and overconfidence: The best-prepared organizations are those that have experienced and survived a large number of crises or that have dealt with crises throughout their history.
- The fallacy of technical and financial quick fixes: It is enough to throw financial and technical quick fixes at crisis management.

Exhibit 6

Psychological Factors in an Organization That Reduce the Ability to Deal with Complex Reality

- Self-centeredness or narcissism
- Defensive mechanisms
  - denial (the expressed refusal to acknowledge threatening realities)
  - disavowal (the acknowledgment of threatening realities but with a discreditation of their importance)
  - fixation (rigid commitment to a particular course of action or attitude)
  - grandiosity (feeling of omnipotence)
  - idealization (feeling of omnipotence through the idealization of another person, object, or organization)
  - intellectualization (the elaborate rationalization of an impulse)
  - projection (the attribution of unacceptable impulses to others)
  - repression (the pushing down of threatening or unacceptable impulses into unconsciousness)
  - splitting (the extreme isolation of different elements, extreme dichotomization, or fragmentation)
- Fatalism or passivity

Thus, the surface factor, level 1, assesses the state of an organization's crisis plans and mechanisms for all the phases of CM, both preventive (signal detection) and reactive (damage containment, short-term and long-term business recovery, postcrisis learning). The second factor or level assesses the organization's infrastructure for CM. For instance, does a CM team exist? Is it well practiced, rehearsed, and facilitated? The third factor assesses whether the organization's culture possesses damaging beliefs that could undermine even the best-laid surface plans for CM. The fourth level digs down even deeper into the organization's culture to assess whether the company possesses any deep mechanisms of denial, which make a sham of any CM program. For instance, if top executives believe they are immune to any disaster, then they and their organization are more crisis prone. The reason is that crisis-prone organizations have been found by previous research to endorse the items in Exhibits 5 and 6 by a ratio of seven to one over organizations that are prepared for crises. In short, the profile of a crisis-prone organization lies toward the right side of the figure, while that of a crisis-prepared organization falls toward the left side.
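The article does not publish the OM's actual scoring rules, so the following is only a minimal sketch of how factor scores might be mapped to the three zones it describes. The 0-to-1 score scale, the cutoff values, and all function and factor names here are assumptions.

```python
# Hypothetical sketch of the onion model's zone scoring.
# The cutoffs and 0.0-1.0 scale are invented for illustration.

def zone(score, safety_cutoff=0.66, danger_cutoff=0.33):
    """Map a normalized factor score to the safety, question mark, or danger zone."""
    if score >= safety_cutoff:
        return "safety"
    if score >= danger_cutoff:
        return "question mark"
    return "danger"

# One respondent's (hypothetical) normalized scores on the four factors.
factors = {
    "plans and mechanisms": 0.8,       # level 1: surface plans
    "CM infrastructure": 0.5,          # level 2: teams, roles, structure
    "cultural beliefs": 0.3,           # level 3: faulty assumptions
    "deep denial mechanisms": 0.2,     # level 4: psychological defenses
}

profile = {name: zone(score) for name, score in factors.items()}
for name, z in profile.items():
    print(f"{name}: {z}")
```

Connecting the four zone placements by a straight line would then yield the kind of profile shown in Exhibit 1, with crisis-prepared organizations falling toward the safety side and crisis-prone ones toward danger.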

The assessments are derived from questionnaires that individuals at various levels and locations of the organization fill out. Thus, the profiles are only as good as the responses. For this reason, each individual filling out the questionnaire is asked to give frank replies to the questions. Also, all responses are held in strict confidence. The names and positions of individuals are not reported - only aggregate results.

This method, of course, does not guarantee that individuals will give honest and frank replies or that the responses will accurately describe the crisis state of their organization. Many individuals at many levels and locations should fill out the questionnaire so the views of different parties can be compared explicitly.

Further, the fact that individuals differ does not mean some are right and others wrong. Instead, they may have access to different data, be making different fundamental assumptions, or have fundamentally different perceptions of the organization.

One of the best ways to use the computer-generated profiles is as the basis of a productive confrontation between different views of an organization. The OM has already been run with the CM teams of several large organizations to determine the consistency between the views of the individual team members.

The software that analyzes the raw questionnaires has been designed so that individuals can, if they wish, respond directly on a Macintosh PC. The computer program scores the individual's assessment of the organization's crisis profile (Exhibit 1). In addition, a printer is available to obtain a hard copy of the person's assessment.

One of the most critical assessments the software performs is a comparison between levels 1 and 4 of the onion model. If an individual perceives the performance of his or her organization to be in the safety zone on level 1 but in the danger zone on level 4, the computer explicitly asks, "How can this be possible?" (See Exhibit 7.) If the core of an organization is "rotten" (that is, if its culture is in danger and hence the company does not take CM seriously or has become lax), then what reason is there to believe that even the best-laid plans will be enacted effectively? While the author is not privy to the data and can only speculate, this situation may have occurred in the cases of Bhopal, Chernobyl, and the Exxon Valdez.
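The level 1 versus level 4 check can be sketched as a simple rule over a respondent's zone profile. The zone names follow the text; the data layout, function name, and return strings are assumptions.

```python
# Hypothetical sketch of the consistency check the article describes.

def check_core_consistency(profile):
    """Flag the contradiction the software probes for: surface plans
    (level 1) rated in the safety zone while the organization's core
    culture (level 4) sits in the danger zone."""
    if profile["level 1"] == "safety" and profile["level 4"] == "danger":
        return "How can this be possible?"
    return "no level 1/level 4 contradiction"

# A respondent who trusts the formal plans but distrusts the culture.
respondent = {
    "level 1": "safety",
    "level 2": "question mark",
    "level 3": "danger",
    "level 4": "danger",
}
print(check_core_consistency(respondent))
```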

Further, the overall measure of performance on the onion model is multiplicative. CM performance (Exhibit 7) is the product of how well or poorly the company does on each of the critical factors. Doing well on one factor does not make up for doing poorly on others. In other words, if 1 represents good performance and 0 represents poor, then the equation is 1 X 0 = 0, reflecting poor overall performance. Each factor is a critical link in the strength of the entire CM chain.
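The multiplicative rule above can be written in a few lines. The 0/1 scale is taken from the text; the function name and the idea of passing the four factor scores as a list are assumptions.

```python
import math

# Sketch of the multiplicative scoring rule: overall CM performance is
# the product of the factor scores (0 = poor, 1 = good), so a single
# zero on any factor drives the whole product to zero.

def cm_performance(factor_scores):
    """Return the product of the individual factor scores."""
    return math.prod(factor_scores)

print(cm_performance([1, 1, 1, 1]))  # strong on every factor -> 1
print(cm_performance([1, 1, 0, 1]))  # 1 x 1 x 0 x 1 = 0: poor overall
```

Doing well on three factors cannot compensate for failing the fourth, which is exactly the weakest-link behavior the chain metaphor implies.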

THE CRISIS PORTFOLIO IS THE SECOND type of crisis management program (after the onion model). One of the first research projects conducted by CCM was reported in an earlier issue of Security Management. (3) Both crises and preventive actions were found to cluster in a small number of families (see Exhibits 8 and 9). The result is that instead of having to prepare for every conceivable kind of crisis, a company can limit its preparations in a rational and systematic way. A sound strategy for any organization is to pick at least one crisis in each family (Exhibit 8). Because the crises within a particular cluster are similar, preparing for one crisis in a cluster provides some preparation for all of them; and because the clusters themselves are fundamentally dissimilar, picking at least one crisis per cluster confers some degree of overall preparation across the full range of crises.
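The portfolio strategy reduces to choosing one representative per family. The family names and member crises below are invented stand-ins, not the actual clusters of Exhibit 8, which are not reproduced here.

```python
# Hypothetical crisis families; the real clusters appear in Exhibit 8.
families = {
    "external attacks": ["product tampering", "terrorism", "sabotage"],
    "information attacks": ["counterfeiting", "malicious rumors"],
    "breaks": ["product recalls", "plant accidents", "operator error"],
    "megadamage": ["environmental accidents"],
}

# Pick one representative crisis per family: preparing for it yields
# some preparation for the whole (similar) family.
portfolio = {family: crises[0] for family, crises in families.items()}
for family, crisis in portfolio.items():
    print(f"{family}: prepare for {crisis}")
```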

Any crisis in Exhibit 8 can act as both the cause and effect of any other crisis. In today's world, all crises are increasingly interconnected.

The software routines that compose the crisis portfolios assess the organization's recent crisis history (how many incidents of each crisis have occurred over the last three years) and the organization's level of preparation for each crisis composing the families in Exhibit 8. Other crisis portfolio routines assess how many of the general crisis prevention activities (Exhibit 9) the organization has adopted over the last three years.

As with the onion model, crisis profiles are printed out (Exhibit 10) so that persons in the organization can easily see whether they are in the safety, question mark, or danger zones. As before, the assessment of the crisis potential of an organization is so important that it cannot be left to the determination of a single person no matter how important or well placed he or she may be within the organization.

PROFILE COMPARISONS ARE THE basis of the third type of CM computer program. As noted earlier, it is important that many individuals respond to the basic questionnaires used to determine the various profiles. Accordingly, the author has also developed software that compares explicitly how similar or dissimilar the resulting profiles are to one another. The software determines whether the distances between profiles are statistically significant, and it determines which profiles are close enough together to be lumped into the same group. With this program, profiles can be compared explicitly and systematically.
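The mechanics of such a comparison can be sketched as a pairwise distance plus a simple grouping step. The article does not specify the distance measure or grouping method, so the Euclidean distance, the single-link merging below, and the fixed threshold (a crude stand-in for the significance test it mentions) are all assumptions.

```python
import math
from itertools import combinations

def distance(p, q):
    """Euclidean distance between two profiles (equal-length score lists)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def group_profiles(profiles, threshold):
    """Lump profiles whose pairwise distance falls under the threshold
    into the same group (simple single-link agglomeration)."""
    groups = [{name} for name in profiles]
    for a, b in combinations(profiles, 2):
        if distance(profiles[a], profiles[b]) <= threshold:
            ga = next(g for g in groups if a in g)
            gb = next(g for g in groups if b in g)
            if ga is not gb:          # merge the two groups
                ga |= gb
                groups.remove(gb)
    return groups

# Hypothetical respondents' scores on the four onion-model factors.
respondents = {
    "manager A": [0.80, 0.70, 0.60, 0.70],
    "manager B": [0.75, 0.70, 0.65, 0.70],
    "worker C":  [0.30, 0.20, 0.40, 0.30],
}
groups = group_profiles(respondents, threshold=0.2)
print(groups)  # the two managers cluster together; the worker stands apart
```

A gap like the one between the managers and the worker is exactly the kind of disagreement the profiles are meant to surface for productive confrontation.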

Natural disasters will always be a feature of nature's plan. However, especially in the latter half of the 20th century, a new type of crisis faces civilization: man-made crises that result from complex technologies operating under imperfect human control systems. The magnitude and frequency of such crises is no longer acceptable. Earthquakes, as acts of God, are unpreventable; the public is far less forgiving of individuals and institutions responsible for human-created crises. In principle, the Bhopal, Chernobyl, and Exxon Valdez disasters, to mention only a few, were preventable.

The field of CM is still in its infancy. Better tools, frameworks, concepts, and raw data - and better-designed organizations - are needed to prevent what ought not to have occurred in the first place. Computer technology will play an increasingly important role in assessing danger points in organizations' CM programs and in prescribing effective treatment.

About the Author . . .

Ian I. Mitroff is the Harold Quinton Distinguished Professor of Business Policy at the School of Business Administration of the University of Southern California in Los Angeles.

(1) Charles Perrow, Normal Accidents (New York: Basic Books, 1984).

(2) Ian I. Mitroff and others, "Do (Some) Organizations Cause Their Own Crises?" Industrial Crises Quarterly, 1989, in press.

(3) Ian I. Mitroff, Terry Pauchant, and Paul Shrivastava, "Crisis, Disaster, Catastrophe: Are You Ready?" Security Management, February 1989, pp. 101-108.
COPYRIGHT 1989 American Society for Industrial Security
Security Management, October 1, 1989