
Avoid Decision Making Disaster by Considering Psychological Bias

In the 1980s, corporate America brought glamor and fame to a number of executives like Lee Iacocca and Donald Trump. Some measure of this fame was due to their appearance as brilliant decision makers; and so, by association, other successful managers and dealmakers came to be viewed as human information processing machines who unerringly sort through mountains of data and quickly arrive at "brilliant" decisions.

The image of these star decision makers makes for great copy because we want our heroes to possess skills superior to those of mere mortals. No doubt successful business people do have superior skills, but decision making infallibility is not one of them. The carefully nurtured, and in some cases expensive, image of these corporate heroes as "rationalist technocrats" is simply false: it hides the human reality of decision makers who are often victims of their own psychological biases.

In reality the decisions which make or break managers and their images are complex, sometimes difficult to recognize, and almost never easy to understand. The information surrounding these decisions is unstructured and does not readily lend itself to a strict rationalist approach. Rather, these complex decisions are subject to idiosyncratic interpretation and psychological forces which shape, guide, and inform the decision making process along irrational lines. These psychological forces, moreover, are such a natural part of the decision making landscape that they remain hidden, even from sophisticated and experienced managers. And so, when modern, technocratic decision makers fail to take these psychological forces into account, the result can be disastrous.

This paper examines several common psychological biases which influence decision making, especially decisions involving complex, unstructured problems. The intent of this paper is to create awareness: awareness of bias, and awareness that everyone, even skilled managers, can fall prey to psychological bias. Several remedies are discussed, but a complete discussion of remedies is misplaced until the subtleties of psychological bias are first understood.

The Problem of Unstructured Decision Problems

You know, market researchers would run the world if they could. But I've seen plenty of products that have done well in test markets end up on their cans . . . The decisions we're making on new products right now are very gut feel decisions with limited market research . . . (1)

The manager's fundamental job is decision making. Managers make decisions that have substantial consequences when formulating a direction for the business. They make decisions that have small consequences when allocating parking spaces. They make routine decisions affecting work attendance or vacations. They make nonroutine decisions when considering acquisitions or the development of new products.

A rationalist approach to decision making requires that managers be consistent, thorough, and systematic. Sometimes being rational is not difficult. For example, in the case of routine decisions, there may be precedent, or well-defined organizational policies which can be relied on to help make a rational choice. Thus, making a high quality decision about work hours or vacations is a straightforward matter. The manager can make rational choices because the problem is structured and readily understood, and there is plenty of unambiguous information available. The manager's job is finding the "right" solution to a problem.

At other times, however, being rational is very difficult. When facing a complex decision such as an acquisition or a nonroutine decision such as a major shift in goals, there may be no precedent or policy to guide the manager. In these decision situations there often is little information available, and the information that is available is incomplete and ambiguous. Moreover, the lack of structure attending complex or nonroutine decisions shifts the object of the manager's concern. Instead of looking for the "right" solution, the manager spends time and effort trying to understand the problem. These unstructured decision problems engage the manager in trying to find the "right" problem to solve. Thus, it is difficult to be rational if one is not sure of the problem that needs solving.

Yet managers do not walk away from complex or nonroutine decisions, which are, after all, the make or break decisions for modern managers. But neither do managers give up trying to be rational. What they do, in an effort to be rational and thorough, is apply heuristics, or decision rules, to complex or nonroutine decisions. The use of decision rules is an effort both to structure the decision (find the right problem) and, once structured, to solve it (find the right solution).

Formal and Informal Heuristics

When dealing with complex and unstructured choice problems managers rely on heuristics, or decision rules, which simplify the problem by reducing the information processing it requires. Applying a rule of thumb is an example of a heuristic. Managers generally have access to both formal and informal heuristics. The best known examples of formal heuristics have been developed by the Management Science discipline, in which systematic decision rules are consciously formulated. This conscious effort results in decision making models, usually mathematically based, which rigorously develop a rational and systematic approach to decisions.

Management Science techniques such as linear programming, break-even analysis, or forecasting can enormously improve managers' capacity to make rational decisions. However, many managers do not use them because they are not aware of them. Thus lack of knowledge contributes to managers' being less than rational in their decision making.

Even decision makers trained in the use of formal heuristics (and this includes almost every graduate of a business school, especially MBAs) may use them inappropriately. They may force-fit rational assumptions onto ambiguous criteria and develop models which "seem" rational because they are so precise, but which aren't rational because the underlying assumptions aren't reasonable.

More common than misusing formal heuristics, however, is the case of managers trained in their use but disinclined to use them at all. That is, managers may not be motivated to use Management Science tools. Why not? Because managers are human beings, and the psychology of human decision making often overrides the inclination to apply formal techniques.

The psychology of human decision making refers to the use of informal heuristics: methods of structuring, understanding, and solving problems which are intuitive or unconscious. Informal heuristics are used all the time, especially for complex or nonroutine problems, because psychologically they are very attractive. They make the decision problem "seem" understandable and solvable. The difficulty with informal heuristics, however, is that they tend to produce bias and error; bias and error that managers are not even aware of.

Examples of Informal Heuristics

Research has uncovered several kinds of informal heuristics which managers commonly use and which lead to bias and error.

Availability

Consider the following:

Which cause of death is more likely in each pair?

1) lung cancer or stomach cancer

2) murder or suicide

These deceptively simple choices illustrate an informal heuristic called Availability (Tversky & Kahneman, 1974). People are inclined to judge the probability of an event by the ease with which information for that event is recalled. That is, an event is judged more likely because its features are readily available in memory. Certainly, frequent events are readily available, but so are highly publicized events, emotional events, and recent events.

Does the evening TV news routinely catalog murders or suicides? Last night's report of a spectacular murder, readily available in memory, might lead us to conclude that murders are more common than suicides. Also, everyone knows about the well advertised relationship between smoking and lung cancer, but does that mean that lung cancer is more common than stomach cancer? In fact, for these examples, the second alternative in each pair is about one and one-half times more likely than the first alternative.

Availability may be a factor in complex or nonroutine decisions in which complete, accurate information is inaccessible, and the decision maker makes a decision based on the exercise of memory and imagination. The important point is that using the intuitively appealing heuristic of availability can lead to bad decisions.

Representativeness

Yet another informal heuristic that decision makers rely on is called Representativeness (Kahneman & Tversky, 1972, 1973). Representativeness is a form of stereotyping which leads us to ignore useful information. Consider the following description (Bazerman, 1990):

Mark is finishing his MBA at a prestigious university. He is very interested in the arts and at one time considered a career as a musician. Is Mark more likely to take a job:

a. in the management of the arts?

b. with a management consulting firm?

Many people, according to Bazerman (1990), approach the problem by analyzing the degree to which Mark is representative of their image of individuals who take jobs in each of the two areas. Consequently they choose "a. in the management of the arts." This choice, however, overlooks valuable base-rate information: many more MBAs take jobs in consulting than in arts management.

While a large number of people in arts management may fit Mark's description, there is a larger absolute number of management consultants fitting Mark's description simply because of the relative preponderance of MBAs in management consulting. But decision makers are inclined to give great weight to individuating information of questionable value (Mark's interest in the arts) and to completely ignore the valuable information about the sample population.

This example suggests that in considering complex or nonroutine decisions, managers may err by automatically overvaluing information which seems to single out or individuate data, even though the information is a stereotype of questionable utility. Interestingly enough, Kahneman & Tversky (1972, 1973) have found that people do use base-rate information correctly when no other information is provided. In the absence of a personal description, people will choose management consulting as a career for Mark.
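A back-of-the-envelope Bayes calculation makes the base-rate point concrete. The sketch below is ours, and every number in it is an illustrative assumption rather than a figure from the cited studies; the logic, however, is exactly the base-rate logic just described.

    # Illustrative Bayes-style calculation for the "Mark" problem.
    # All numbers are assumptions made for this sketch, not figures
    # from Kahneman & Tversky or Bazerman.

    p_arts = 0.02               # assumed share of MBAs entering arts management
    p_consulting = 0.30         # assumed share entering management consulting

    # Assumed chance that the description fits a person in each field.
    p_desc_given_arts = 0.60
    p_desc_given_consulting = 0.10

    # Joint probability of being in a field AND fitting the description.
    joint_arts = p_arts * p_desc_given_arts                    # 0.012
    joint_consulting = p_consulting * p_desc_given_consulting  # 0.030

    # The description is far more typical of arts managers, yet the
    # sheer number of consultants makes consulting the better bet.
    print(f"P(arts and fits description)       = {joint_arts:.3f}")
    print(f"P(consulting and fits description) = {joint_consulting:.3f}")

Even with a description that heavily favors arts management, the base rate dominates: under these assumed numbers, consulting remains two and one-half times more likely.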

Anchor and Adjustment

The misuse of information is a factor in another informal heuristic termed anchor and adjustment (Slovic & Lichtenstein, 1971; Tversky & Kahneman, 1974; Dawes, 1988). Consider the following quote taken from a Fortune magazine article on executive pay:

...you will typically find that about one-third of major companies aim to pay their CEO's at 75th percentile levels of compensation...and precisely no companies say they are aiming for less than the median. Is it any wonder that pay levels move up smartly year after year?(2)

When relying on the anchor and adjustment heuristic the decision maker selects a base rate, usually a number, which serves as an anchor for the decision. The decision will then be an adjustment, usually a modest adjustment, to the anchor. The anchor psychologically takes on great importance because it becomes a kind of threshold which helps define the limits of the decision problem. CEO salaries creep up because the anchor (the desire to be at the 75th percentile) is high to begin with.

But anchors can be important even if they are not relevant to the decision. For example, a research study (Tversky & Kahneman, 1974) had subjects estimate the percentage of African countries in the United Nations. Before making their estimate, however, they were required to spin a wheel of fortune and say whether the number value that came up on the wheel was too high or too low compared to the percentage of African countries in the U.N. Then subjects gave their own estimate. The arbitrary starting values had a strong influence on people's estimates. The median estimate for subjects given a 10% starting point was 25%, while the median estimate for subjects given a 65% starting point was 45%.
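One simple way to summarize these results is a linear anchoring model in which the final estimate is a weighted average of the anchor and the judge's unanchored belief. The model itself is our illustration, not the researchers'; only the two median estimates come from the wheel-of-fortune study described above.

    # A simple linear model of anchor and adjustment:
    #     estimate = w * anchor + (1 - w) * unanchored_belief
    # The model is an illustrative assumption; the two medians are
    # from the Tversky & Kahneman (1974) study described above.

    anchor_low, estimate_low = 10.0, 25.0     # low spin -> median estimate 25%
    anchor_high, estimate_high = 65.0, 45.0   # high spin -> median estimate 45%

    # Solve the pair of equations for the anchor weight w and belief b:
    #     w*10 + (1 - w)*b = 25
    #     w*65 + (1 - w)*b = 45
    w = (estimate_high - estimate_low) / (anchor_high - anchor_low)
    b = (estimate_low - w * anchor_low) / (1 - w)

    print(f"implied weight on the anchor: {w:.2f}")   # about 0.36
    print(f"implied unanchored estimate:  {b:.1f}%")  # about 33.6%

Under this reading, an arbitrary spin of a wheel accounts for roughly a third of the final judgment.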

Framing

The informal heuristics (availability, representativeness, and anchor and adjustment) are intuitively appealing because they seem to make the decision more manageable, yet, as the examples demonstrate, they can introduce biases that lead to error in making even simple decisions. Consider how much more problematic such bias and error are when managers are grappling with complex decisions or consequential decisions in which psychological bias affects not only a choice, but the very structure of the decision problem as well.

The effect of psychological bias on the structure of decisions is the central issue in framing, a phenomenon as simple as it is powerful. In framing, choice, particularly risky choice, depends on the formulation of the decision problem: the way the problem is presented. The formulation or presentation of data or facts is not supposed to affect the rational decision maker, but it does. Psychologically, different frames of reference carry different meanings, and these different meanings can lead to different choices.

There are two factors to consider when dealing with framing (Thaler, 1980; Bazerman, 1984; Tversky & Kahneman, 1986). The first of these has to do with the evaluation of outcomes. Decision makers evaluate negative and positive outcomes differently. Their response to losses is more extreme than their response to gains, which suggests that, psychologically, the displeasure of a loss is greater than the pleasure of gaining the same amount. Thus, decision makers are inclined to take risks in the face of sure losses, and not to take risks in the face of sure gains. Consider the following example (Kahneman & Tversky, 1984):

Imagine that the U.S. is preparing for the outbreak of an unusual disease which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the programs are as follows:

If Program A is adopted, 200 people will be saved.

If Program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.

Which of these programs would you favor?

Notice that the expected value is the same regardless of which program is chosen (i.e., 200 lives). However decision makers tend to prefer Program A, the program which, in their view, guarantees saving 200 people. In other words they choose the sure thing. But what if the choices were framed as follows:

If Program C is adopted, 400 people will die.

If Program D is adopted, there is a one-third probability that nobody will die, and a two-thirds probability that 600 people will die.

Again notice that mathematically all four options are equivalent, and that options A/C and B/D are the same except that the wording (framing) is different. One set of options is framed in terms of people being "saved", and the other set is framed in terms of people "dying". When presented with options C and D, however, decision makers prefer option D. When the frame is negative, the psychological stimulus is stronger and decision makers are inclined to take a risk. Psychologically their options seem like losing propositions no matter what they do, so decision makers will take a chance on losing nothing (or everything).

Thus, presenting decision options in terms of different frames can have profound implications for decision outcomes. When dealing with what they think is an "opportunity", managers will make different choices than when they think they are dealing with a "threat". The hard data, the facts, can be the same in either case, but the meaning of those facts changes with the frame, and so will the choices.
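The arithmetic of the four programs, and one standard explanation for the reversal, can be checked in a few lines. The S-shaped value function below is in the spirit of Kahneman and Tversky's analysis, but its particular form and parameters are assumptions made for this sketch.

    # Expected values of the four disease programs, in lives saved.
    ev_A = 200                        # sure thing: 200 of 600 saved
    ev_B = (1/3) * 600 + (2/3) * 0    # gamble: 200 saved on average
    ev_C = 600 - 400                  # 400 of 600 die -> 200 saved
    ev_D = (1/3) * 600 + (2/3) * 0    # nobody dies w.p. 1/3 -> 200 on average
    print(ev_A, ev_B, ev_C, ev_D)     # 200 200.0 200 200.0

    def value(x, alpha=0.5, lam=2.25):
        """Illustrative value function: concave for gains, convex and
        steeper for losses. Parameters are assumed, not estimated."""
        return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

    # Positive frame: the sure gain beats the gamble -> Program A.
    print(value(200) > (1/3) * value(600) + (2/3) * value(0))    # True

    # Negative frame: the gamble beats the sure loss -> Program D.
    print((1/3) * value(0) + (2/3) * value(-600) > value(-400))  # True

Because the function is concave over gains and convex over losses, the identical 200-life expectation is valued differently in the two frames, reproducing the observed preference reversal.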

The second factor which must be considered with framing is that of the reference point. The evaluation of alternatives can depend on the reference point. Consider the following example:

You are spending the afternoon at the racetrack. You have lost $90 and are considering a $10 bet on a 10:1 long shot in the last race. Are you going to bet the long shot? (Bazerman, 1990)

In this example the decision maker could adopt one of two potential reference points. One potential reference point would have the decision maker consider the last race an event independent of all the previous races. A second potential reference point would have the decision maker see him/herself as "$90 in the hole." Framing would predict that decision makers who adopt the "$90 in the hole" reference point are more likely to make a risky choice. What this means is that decision makers who do not adjust their reference point can find themselves considering risks they would normally think unacceptable (i.e., betting on a 10:1 long shot). Tversky and Kahneman (1981), in fact, contend that this analysis is supported by the popularity of long shots in the last race of the day.

In the horse racing problem, a shift in reference point changes the meaning, the interpretation, of what is being decided. The objective facts don't change, but psychologically the decision is "different" because the facts have a different impact depending on one's point of view.
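A few lines of arithmetic show both why the bet is objectively poor and why the "$90 in the hole" frame makes it tempting. The long shot's true win probability is not given in the example, so the figure below is an assumption for the sketch.

    # The $10 long-shot bet under two reference points.
    stake, odds = 10, 10    # $10 bet at 10:1
    p_win = 0.05            # assumed true chance the long shot wins

    # Reference point 1: the last race stands alone.
    ev = p_win * (odds * stake) - (1 - p_win) * stake
    print(f"expected value of the bet: ${ev:.2f}")    # -$4.50: a bad bet

    # Reference point 2: "$90 in the hole" -- final position for the day.
    end_if_skip = -90                   # walk away down $90
    end_if_win = -90 + odds * stake     # up $10 for the day
    end_if_lose = -90 - stake           # down $100 for the day
    # Only the gamble offers a path back to even, which is why the
    # loss frame makes an objectively poor bet psychologically tempting.
    print(end_if_skip, end_if_win, end_if_lose)       # -90 10 -100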

Consider how a reference point factored into Robert Campeau's ill-fated attempt to become a retailing kingpin in the United States. A Fortune magazine article noted: "Campeau ... has spent $13.4 billion ... buying retailers ... and laid on $11.7 billion of debt ... Perhaps the risks seem trivial to Campeau, when compared with the grim circumstances of his childhood ..."(3) If a major reference point in Campeau's life is the hardship of his childhood, then borrowing billions may not seem daunting.

Thus, overall, framing is an important issue because of changes in meaning attributable to changes in reference point, and because of changes in meaning attributable to viewing outcomes either as positive or negative, as we noted earlier.

Sources of Psychological Bias in Work Organizations

In the world of business psychological bias in decision making is an important issue. For example, when deciding whether or not to sell an asset does the management group consider sunk costs and, after considering sunk costs, decide against the sale because the investment will not be recouped? Rationally, sunk costs are not supposed to be part of the decision. But, psychologically, sunk costs are hard to let go of. Managers who consider sunk costs are using a reference point similar to that in the case of the horse player who sees him/herself "$90 in the hole."
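A hypothetical asset sale shows the difference between the two reference points; every figure here is invented for the illustration.

    # Sunk-cost sketch: the sell/keep choice should turn only on
    # forward-looking values. All figures are hypothetical.
    purchase_price = 10_000_000   # sunk: already spent, unrecoverable
    offer_today = 4_000_000       # what a buyer will pay now
    value_if_kept = 3_000_000     # present value of keeping the asset

    # Rational frame: compare the futures and ignore the past.
    print("sell" if offer_today > value_if_kept else "keep")   # sell

    # Biased frame: anchoring on the sunk cost ("we'd book a $6 million
    # loss") rejects a sale that is worth $1 million more than keeping.
    loss_vs_purchase = offer_today - purchase_price
    print(f"book loss if sold: {loss_vs_purchase:,}")          # -6,000,000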

But the sunk cost example is only one of many decision problems in which psychological bias might play a role. Consider how much representativeness might be a factor when an organization is choosing a new CEO, or hiring anyone for that matter. Consider how much auditors or regulatory investigators might be influenced by anchor and adjustment. Consider how much an "intuitive, hunch playing" decision maker falls prey to availability.

Consider the larger frames of reference which exist in all organizations. Organizational decisions are made against a background of history rich in traditions, rituals, and mythologies. A decision maker's interpretation of this history does two things: 1) it provides a context frame which helps create meaning for a given event; and 2) it provides a kind of goal reference for what the organization values. Thus history and culture provide frames and points of reference, ways to understand and structure decision problems. Although traditions and cultural expectations serve as important anchors for understanding the world around us, these same anchors may also impart bias. And the bias is so value laden, so much a part of what is considered normal and routine, that it is difficult to notice its presence; we are most often unaware of its effects.

To the extent that mythologies, rituals, and traditions serve as frames for decision making in the work place, many of the decisions we believe ourselves to be making in fact are pre-programmed, predetermined by the frame, the reference points that guide and inform our thinking. Thus, it should be no surprise that companies (or governments, or individuals) fall into the same traps, and make the same errors over and over. The same errors are guaranteed unless the frames of the decision are recognized and changed. Consider that organizational change is so very difficult to achieve because the frames of tradition have such a powerful hold on the collective psychology of the organization.

Research Issues

Research examining psychological bias, especially framing effects, has led to two inescapable conclusions: 1) framing effects are powerful; and 2) people generally seem unaware of the presence of psychological bias.

For example, one study (Duchon, Dunegan & Barton, 1989) took place in an international engineering company in which the managers believed quite sincerely that they made decisions based solely "on the data; the cold, hard facts." The researchers proposed to demonstrate that this rational approach to decision making was, in all likelihood, not taking place. They sat down with the firm's top executives and created a decision making scenario typical of the kinds of problems managers faced in the organization.

The researchers created two versions of the scenario which were identical in every detail except one. One version presented the likely outcomes of a choice situation in terms of the probability of being "successful"; the other version used the term probability of being "unsuccessful." These scenarios were then distributed to managers and engineers in the company and, as the researchers expected, but to the shock of top management, the difference of one word led to very different perceptions of the decision and very different choices. Clearly "the facts" did not speak for themselves.

Another study looked at different ways frames might manifest themselves (Dunegan & Duchon, 1990). In a laboratory study the researchers considered how patterns of successful or unsuccessful decision making act as frames, and how these frames may be more or less influential depending on a person's cognitive style and his/her inclination to take risks or to engage in creative thinking. Complex relationships emerged, but, essentially, patterns are meaningful to decision makers, and the exact meaning can vary systematically from person to person.

Yet another study (Ashmos & McDaniel, 1990) found, in a sample of hospital organizations, that a hospital's culture created an organization-wide frame that was related to the degree to which managers and doctors participated in decision making. The specific shared meaning created by the hospital's culture had predictable consequences for the organization's strategic choices. Thus, not only do organizations as a whole seem to have frames, but these frames also influence both decision processes and choice.

Dealing With Psychological Bias

Framing and other psychological biases such as availability, representativeness, and anchor and adjustment are powerful factors influencing decision making. Although their power is impressive, what makes these psychological factors so important is that decision makers generally are not aware of them. That is, modern managers and dealmakers whom we assume to be information processing wizards usually are unaware that "non-data" factors enter into their decisions. What is worse, the complex, nonroutine, unstructured problems which demand clear thinking and careful consideration are the most prone to bias and error.

The lack of awareness of powerful yet subtle psychological factors introduces an unpredictable, almost random element into complex or nonroutine decisions. The psychological biases are present and working their mischief, yet dealmakers unaware of their presence fall prey to misattributions or erroneous assessments of cause and effect. If dealmakers do not understand cause and effect in their decisions then they cannot successfully influence events; they cannot successfully manage.

But modern decision makers do not have to fall prey to psychological biases. They can take steps to lessen the effects of psychological bias and thus increase their own "success rate" in terms of making good decisions. The first thing decision makers can do is be aware of psychological bias. Awareness of the potential decision making disasters associated with psychological biases can lead to an openness, a receptivity, to changing biased decision processes.

Simple awareness, however, will not be enough to produce changes in decision making unless decision makers also realize that psychological biases can affect everyone, even highly skilled managers and dealmakers, even themselves. Thus, self-realization ("this can happen to me...") must accompany awareness.

Ultimately, however, the decision maker must take steps to uncover psychological bias and counter its effect. Uncovering a bias requires decision makers to consider carefully their underlying assumptions or frames. Such consideration can be systematically undertaken in different ways. For example, gathering more data rather than relying on memory can help overcome Availability. Consciously examining problems in terms of base-rate data can help overcome Representativeness. Using different anchors to examine a problem can help overcome Anchor and Adjustment.

In terms of frames and reference points decision makers can identify their frames and reference points and then ask themselves whether or not other frames are just as reasonable. Sometimes getting other people involved can help expose alternative points of view. Managers can actively seek out contrarian points of view, and cultivate a kind of low-grade conceptual conflict by appointing different people to represent different frames. Then they can listen carefully to what the problem sounds like within these different frames, making sure that everyone understands that this is an exercise necessary to make a good decision, and that the people presenting the chosen frame are not "winners" nor are the discarded frame presenters "losers." Everyone wins when a good decision is made.

Good decision makers try to gather and consider their data carefully. Careful consideration, however, means more than just assessing the "facts". Careful consideration, and good decision making, requires dealing with the biases hidden in the psychology of decision making.

Footnotes

(1) Interview with former RJR Nabisco chairman Ross Johnson. Quoted from Fortune, July 18, 1988 (Vol. 118, No. 2), p. 35.

(2) Crystal, Graef S. (1988). The wacky, wacky world of CEO pay. Fortune, June 6, Vol. 117, No. 12, 68-78.

(3) Ballen, Kate (1988). Campeau is on a shopper's high. Fortune, Vol. 118, No. 4, August 15, 70-73.

References

Ashmos, D.P. & McDaniel, R. (1990). Patterns of participation in strategic decision making: The case of hospitals and physicians. Paper presented at the ORSA-TIMS meeting, Las Vegas.

Barton, S.L., Duchon, D., & Dunegan, K.J. (1989). An empirical test of Staw and Ross's prescriptions for the management of escalation of commitment behavior in organizations. Decision Sciences, 20, 532-544.

Bazerman, M.H. (1984). The relevance of Kahneman and Tversky's concept of framing to organization behavior. Journal of Management, 10, 333-343.

Bazerman, M.H. (1990). Judgment in Managerial Decision Making (Second Edition). New York: John Wiley and Sons, Inc.

Dawes, R.M. (1988). Rational Choice in an Uncertain World. New York: Harcourt Brace Jovanovich.

Duchon, D., Dunegan, K.J., & Barton, S.L. (1989). Framing the problem and making decisions: The facts are not enough. IEEE Transactions on Engineering Management, 36, 25-27.

Duchon, D., Ashmos, D.P., & Dunegan, K.J. (1989). Risk taking characteristics of the in-house entrepreneur: A comparison of innovators and non-innovators. Proceedings, 20th Annual Decision Sciences National Meeting, New Orleans.

Dunegan, K.J. & Duchon, D. (1990). Decision making: A study of individual differences and context interactions. Journal of Management Systems, 1.

Hogarth, R.M. (1983). Judgment and Choice: The Psychology of Decision Making. New York: John Wiley & Sons.

Kahneman, D. & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3, 430-454.

Kahneman, D. & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80, 237-251.

Kahneman, D. & Tversky, A. (1984). Choices, values, and frames. American Psychologist, 39, 341-350.

Slovic, P. & Lichtenstein, S. (1971). Comparison of Bayesian and regression approaches in the study of information processing in judgment. Organizational Behavior and Human Performance, 6, 649-744.

Taylor, R.N. (1984). Behavioral Decision Making. Glenview, IL: Scott, Foresman & Co.

Thaler, R. (1980). Toward a positive theory of consumer choice. Journal of Economic Behavior and Organization, 1, 39-60.

Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.

Tversky, A. & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211, 453-458.

Tversky, A. & Kahneman, D. (1986). Rational choice and the framing of decisions. Journal of Business, 59, 251-294.

Wright, G. (1984). Behavioral Decision Theory. Beverly Hills, CA: Sage Publications.

Dennis Duchon is Assistant Professor of Management at the University of Texas at San Antonio, Texas; Donde Ashmos is also Assistant Professor of Management at the University of Texas; and Kenneth J. Dunegan is at Cleveland State University, Cleveland, Ohio.
