Human error vs. design error: accidents are often blamed, incorrectly, on human error. What appears to be inattentive or reckless behavior might actually be a predictable response that product and premises designers should take into account.

An experienced hospital nurse selects a Lasix vial from a cart containing several medications. After checking the label three separate times, she fills a syringe and injects the patient, who subsequently dies. The vial actually contains potassium chloride (KCl), as the label clearly states. The nurse cannot explain the lapse.

A teenager goes to a nearby lake, where he walks on a pier. He passes two "No Diving" signs and reaches a point where "No Diving" is stenciled on the planks at his feet. He dives into the water, strikes his head on the lake bottom, and suffers spinal cord injuries that leave him paraplegic. He does not remember seeing the signs.

Accidents like these have two possible causes: human error and design error. The accident may have occurred because the actor, or "user," (1) was inattentive, careless, or reckless, or because the product's design was unsafe and the user's behavior merely revealed the flaw.

To determine what caused an accident, many people will ask whether the user "could have done otherwise." If circumstances or poor design limited the user's actions, then he or she is blameless. If the user failed to prevent the mishap through possible action, then the cause is human error.

People often interpret the "could have done otherwise" criterion literally, as meaning that the user should always perform the correct action if it is physically possible. An important distinction exists, however, between possible behavior and likely behavior.

Unconscious processes control most of our everyday behavior, and people typically act without conscious deliberation or consideration of risk. (2) People often cannot "do otherwise" for mental reasons that are just as compelling as physical ones. However, product design is sometimes based on an unrealistic, idealized notion of perfect human behavior. A design that requires possible, but unlikely, behavior is just as defective as a design that has electrical or mechanical faults.

Taking the examples above, most people would say that the nurse was inattentive and the teenager was reckless. However, both of these assessments are wrong. Their behavior might be better described as skilled and adaptive.

Carelessness or skilled behavior?

A large body of scientific research shows that designers can predict user behavior with a reasonable degree of accuracy. (3) However, faulty usability design is often missed, in part because such flaws are so pervasive (4) and in part because most people don't have training in human factors, a discipline that applies scientific knowledge of perception, cognition, and response to product design.

Most people--jurors included--judge behavior based on commonsense "folk psychology" and introspection ("What would I have done?") rather than on scientific analysis. However, common sense often rests on several well-documented cognitive biases. (5)

One is "fundamental attribution error," (6) the strong tendency to attribute causation to "dispositional" factors inside the person (such as laziness or carelessness) rather than to environmental circumstances. Another is "hindsight bias," the mistaken belief that an outcome could have been readily predicted before the event occurred. (7)

Hindsight bias is strengthened by "counterfactual thinking," the creation of what-if scenarios. People will usually place causation on the mutable (changeable) aspects of an event. If counterfactual thinking suggests that the user could have changed the outcome by different behavior, then he or she must be the cause. (In contrast, few would say that gravity caused a fall. Gravity is immutable.)

These cognitive biases make for a much simpler world. If the nurse's inattention caused the accident, for example, then the problem could be easily fixed by firing her; if other circumstances were the cause, then the entire system of dispensing drugs might require review. (8) The biases also make the solution cheaper: blaming the nurse avoids costly systemic remedies, such as hiring additional nursing staff.

In contrast, a scientific analysis examines three sets of factors controlling behavior in the situation: physical circumstances; general human abilities, limitations, and predispositions; and individual experiences, expectations, and goals.

The role of physical circumstances is usually obvious. A driver at night might experience dim lighting or short braking time. If physical constraints prevent correct action, then the user could not have done otherwise and is blameless.

Behavioral limits are less obvious. All humans are genetically endowed with abilities and limitations that fall into three categories: perception (input), cognition (processing), and response (output). People can only attend to so much (perception), remember so much accurately (cognition), or act so quickly (response).

Human predispositions have evolved to help people cope efficiently with a complex world. One example is "stimulus-response compatibility," an innate connection between perception and response.

These predispositions are especially important in stressful situations, when users perceptually narrow their information intake. (9) However, efficiency and accuracy sometimes trade off; predispositions provide a "quick and dirty" answer, which is usually adequate but may fail in unusual situations.

For example, a driver who sees another car approaching on a collision course from the left will automatically steer to the right. The natural tendency to move away from objects heading toward us is fortunate because the conscious analysis of a projectile's path could cause a fatal delay. In some cases, however, the rightward steer actually moves the car into the path of the crossing vehicle. Could a driver do otherwise and steer left to avoid the collision? The answer is "yes" in the literal sense but "no" in a more realistic, behavioral sense.

In addition, humans have many purely mental predispositions, called "heuristics" and "biases," that also operate to quickly guide behavior, reduce cognitive complexity, and conserve mental resources. Fundamental attribution error and hindsight bias are examples, but many others figure prominently in accidents.

One is "cue generalization," the switching from a difficult, attention-consuming visual cue to a simpler one. Recognizing a color or shape, for example, is a much simpler perceptual task than reading. Another powerful predisposition is "confirmation bias," the tendency to seek information that confirms already-held beliefs and tune out contradictory evidence.

Perhaps the most important predisposition is the creation of expectations based on experience. People would be virtually paralyzed if they had to stop and consciously analyze every detail of every situation before acting. Instead, we use experience to develop "schemata" and "scripts"--basic knowledge units that create expectations about how products work, what behavior is appropriate, and what effect each behavior will have. This knowledge need not be acquired through perception at the time of product use, so behavior is faster, smoother, and more efficient.

Users learn to focus attention on the important elements of a situation and ignore its irrelevant or unvarying aspects. They become "inattentionally blind" (10) to information sources that do not affect the achievement of their goal. In cognitive psychology, we would say that they have switched from conscious "controlled" behavior to unconscious "automatic" behavior.

Most routine behavior is automatic. People are seldom consciously aware, for example, of how they steer their cars or walk down stairs. These are complex tasks that only seem effortless because we are so skilled at them. Our reliable, highly learned schemata almost eliminate the need to rely on conscious control.

In fact, this is what we often mean by skill: The user has learned to substitute automatic behavior for the slow and consciously controlled behavior typical of beginners. Asking a skilled user to explain his or her automatic behavior is like asking a dropped ball why it fell. It is impossible to verbalize the real causes and scientific principles at work.

Skilled, automatic behavior may fail under unusual circumstances. Since a person pays minimal attention to irrelevant and unvarying aspects of routine tasks, he or she may not notice a change when the situation isn't normal.

For example, the nurse who administered the KCl probably scrutinized labels carefully when she first began her nursing career. Reading is a relatively difficult task requiring close attention and higher cognitive processes. After performing the routine repeatedly without incident, the nurse unconsciously learned that the Lasix vials were always in a particular cart location; the label name, packaging, and location were redundant selection cues. She started selecting vials according to their location in the cart (cue generalization), which was simpler and more efficient because it required less mental processing.

On the day of the accident, the positions became scrambled, and the KCl was located in the Lasix's normal place. The nurse was merely following an unconscious, efficient schema that had always worked. Her adaptation was skilled behavior and not inattention or carelessness. Skilled users "don't solve problems and don't make decisions; they do what normally works." (11)

In another case, an infant received a fatal overdose of a drug after a pharmacist checked a reference but misread the dosage instructions, seeing an extra zero. The pharmacist was a substitute whose previous dispensing experience was with adults, so she unconsciously expected to see a larger, adult-sized dosage. Further, she had consulted a second reference book and made exactly the same error. While seeing may be believing, it is equally true that believing is seeing.

These stories have several morals.

People can sometimes do otherwise in the physical sense, but not in the psychological sense. The nurse, for example, automatically adapted by shifting from reading text to responding to the simpler location cue. Once she "correctly" selected the vial by location, confirmation bias took over: a strong expectation based on heavily reinforced experience caused her to miss or misperceive even the highly obvious. Her two additional checks of the vial's label merely confirmed what she believed was a correct initial selection.

Some may say that this argument is nonsense because the nurse could have read the label and was merely inattentive. I would answer that she could not do otherwise because she was not aware that she was not aware.

Behavior should not be disembodied from its context. Every action occurs in a context that includes the past and the future as well as the present. At the moment of the critical action, the past is reflected in the user's learned expectations and the future in the user's goals.

Both expectations and goals direct attention and determine what should be relevant and what should be ignored. The experience of successfully selecting a medication by cart location creates the expectation that label information can be ignored. A driver whose goal is to turn left at an intersection will look both left and right, but a driver whose goal is to turn right will look only left and ignore the right.

Statements by people about their automatic behavior usually have little value. Since the controls of automatic behavior largely operate outside of awareness, users cannot consciously think about them. The nurse, for example, could not verbalize the reasons for her behavior.

Bad outcomes do not necessarily imply bad behavior. It is easy to blame the nurse in hindsight--after the outcome is known--and to say that she could have done otherwise. Of course, she didn't know the outcome before acting, so the outcome should not be used to judge her behavior. Since most people do not set out to create accidents, their actions must have seemed reasonable to them at the time. To understand the accident, the first question should then be "Why did their behavior seem reasonable to them at the time?"

Circumstances--or, more precisely, the people who create circumstances--often cause the accident. These people have the opportunity to make the deliberate choice to ensure proper design. The tendency of people to adapt by reducing mental effort (by, for example, switching from reading to other cues) is foreseeable. The design of the hospital's drug-administration system failed to take likely human behavior into account and therefore was unsafe. Safe design must take people for what they are, not what we want them to be.

Many authorities attribute 80 to 90 percent of all accidents to human error. (12) However, these studies use statistical analyses that do not consider whether the user's behavior was reasonable given the relevant design or situation. They focus on the user because he or she was nearest in space and time to the mishap, while the designer remains a shadowy figure far from the spotlight.

Users may be blamed for misuse as well. Product designers usually aim for a particular functionality--the car seat is intended to keep the child restrained; the "No Diving" sign is meant to be a warning. User behavior that is not in accord with the designer's intent may be termed product "misuse"--a subtle but direct attempt to make the user the responsible party.

Imagine a designer who intended to create a safe toaster. If it had faulty wiring and electrocuted a user, the manufacturer would doubtless be held responsible because the toaster was poorly designed. Intention is not the criterion for proper design.

In contrast, consider warning signs. If the teenage diver ignored a warning sign, he probably would be blamed for his own injuries because of the assumption that the sign fulfilled its intended function. However, the words on the sign are not intrinsically a warning--they are just words, or perhaps more accurately, just a bunch of lines and squiggles. Whether they will actually function as a warning depends on a large number of design variables. (13)

Most products are capable of fulfilling many functions. The user makes his or her own determination about the product's proper use, a perception partly based on his or her unconscious and automatic mental processes.

Instructions may play a role in novel situations but are unlikely to be significant when the task is routine. If intended use and perceived use do not coincide, the designer's viewpoint has no inherent precedence. The issues are whether the designer created a reasonably usable product, reasonably transmitted the intended use to the user, had reasonable expectations of likely user behavior, and avoided conveying unintended or unwanted functions to the user.

The teenage diver's accident shows how intended and perceived function need not be identical. Removed from the context in which it occurred, his behavior looks foolish, and one might conclude he was just another reckless teenager.

However, the teenager had lived near the lake his entire life, safely dived many times, and seen many other people safely dive off the pier over the years, including many on the fateful day. He had never seen or heard of any diving accidents at the lake or seen the no-diving rule enforced. He had even seen hometown newspaper photographs of people diving from the pier.

From the teenager's perspective, he had to weigh the signs' credibility against a lifetime of direct experience. He didn't even "see" the signs on the pier that day. They were as irrelevant as the feeling of your shoes on the soles of your feet--which you probably hadn't noticed until just now.

In fact, the "No Diving" sign confused real and unreal hazards. The water depth around most of the pier was sufficient to allow safe diving, which is why no previous accidents had occurred. Unfortunately, the teenager dived into the one small place where the water was shallow. By overwarning, the signs "cried wolf," obscuring real risks and destroying their own credibility.

Products may also signal unintended functions. One frequent cause is unintended "affordances," (14) which are innate connections between a product's appearance and its possible uses. People will judge proper use simply by looking at the product because "some product characteristics strongly trigger a particular use action." (15)

For example, in one case, a driver placed papers on his dashboard, causing a windshield reflection that partly obscured his view. His car then struck and killed a pedestrian walking along a road.

The accident occurred because the driver responded to an unintended function suggested by the vehicle's design. The dashboard was a flat horizontal surface that could be used as a table. The driver would never have had the opportunity to make this error if the dashboard had been properly designed, say, with a curved surface. It was completely foreseeable that some drivers would use the dashboard as a table.

Final analysis

This scientific approach to human behavior has several implications for attorneys. First, a user who "could have done otherwise" and prevented an accident is not necessarily at fault. "Could have prevented" does not necessarily imply "should have prevented."

Second, the invisibility of unconscious mental processes leads to gross overestimation of human error and to underestimation of design error.

Third, designs that rely on possible but unlikely behavior are just as defective as designs with improper electrical or mechanical engineering. They ignore human factors design and rely on unrealistic assumptions about perfect human behavior. Such cases should be examined to see whether the error was predictable and whether the design should have been based on a more realistic assessment of likely behavior.

Last, the defendant's human factors development process should be closely examined. Products are often developed without proper human factors design or evaluation.

Notes

(1.) Defined generically, a "user" is a person attempting to employ a "product," which is any man-made artifact, including consumer goods, warnings, steps, and roadways. A "designer" is the person responsible for the product's form.

(2.) See, e.g., Willem Wagenaar, Risk Taking and Accident Causation, in RISK-TAKING BEHAVIOR 257 (John Yates ed. 1992).

(3.) Specific behavior cannot be predicted perfectly, but it is possible to predict general behavioral tendencies. If this were not true, then human factors design would be impossible. However, user testing may be needed to confirm likely behavior--especially if a product is novel, complex, or critical for safety.

(4.) For some everyday examples of faulty design, see Marc Green, Human Error vs. Design Error (2005), available at www.visualexpert.com/Resources/humanvsdesignerror.html (last visited May 1, 2006).

(5.) Marc Green, Skewed View: Accident Investigation, OCCUPATIONAL HEALTH & SAFETY CAN., June 2003, at 24.

(6.) Lee Ross, The Intuitive Psychologist and His Shortcomings: Distortion in the Attribution Process, in 10 ADVANCES IN EXPERIMENTAL SOC. PSYCHOL. 173 (Leonard Berkowitz ed. 1977).

(7.) See, e.g., Reid Hastie et al., Juror Judgments in Civil Cases: Hindsight Effects on Judgments of Liability for Punitive Damages, 23 L. & HUM. BEHAV. 597 (1999).

(8.) See, e.g., Marc Green, Nursing Error and Human Nature, 9 J. NURSING L. 37 (2004).

(9.) J. A. Easterbrook, The Effect of Emotion on Cue Utilization and the Organization of Behavior, 66 PSYCHOL. REV. 183 (1959).

(10.) See generally ARIEN MACK & IRVIN ROCK, INATTENTIONAL BLINDNESS (1998); Marc Green, Let's Not Blame the Victim ... Just Yet: Inattentional Blindness, OCCUPATIONAL HEALTH & SAFETY CAN., Jan./Feb. 2002, at 23.

(11.) HERBERT L. DREYFUS & STUART E. DREYFUS, MIND OVER MACHINE 30 (1986).

(12.) See, e.g., What Causes 90 Percent of All Automobile Accidents?, available at www.visualexpert.com/accidentcause.html (last visited May 1, 2006).

(13.) Marc Green, Why Warnings Fail, OCCUPATIONAL HEALTH & SAFETY, Feb. 2004, at 20.

(14.) JAMES J. GIBSON, THE ECOLOGICAL APPROACH TO VISUAL PERCEPTION (1979).

(15.) Freija Hrosvitha van Duijne, Risk Perception in Product Use (2005) (unpublished Ph.D. dissertation, Delft University of Technology) (on file with author).

MARC GREEN is principal of Visual Expert Human Factors and adjunct professor of ophthalmology at the West Virginia University School of Medicine.