
Maintaining safety margins: general aviation's accident record remains dismally static because we tend to lose focus on important details and allow safety margins to erode.

Students of introductory biology learn a basic lesson about sensory perception from a quirky bit of amphibian lore. By now we all know the story: if a frog is placed in hot water, it will immediately jump out to safety. But if the frog is placed in cool water that is gently heated to boiling, it does not perceive the gradual rise in temperature or the impending danger.

Likewise, when a pilot is presented with a problem or emergency that is an obvious attention-getter, he or she will react quickly to solve the immediate threat--a frog leaping from scalding water. But like the doomed frog sitting patiently in water growing imperceptibly ever warmer, we pilots often miss the cues to the more insidious danger of a mortal threat that results from an accumulation of smaller, less noticeable problems.


We suffer from what is known in the manufacturing world as "process drift" as we accumulate hours behind the yoke. Process drift for pilots is the precise equivalent of that water coming to a slow boil for our hapless frog. Coming out of our initial training, we are focused on procedures and safety, but our attention wanes and we become complacent as we build time. Human nature focuses our attention on novelty; we lose interest in the familiar, making it ever more difficult to take safety seriously as we become more comfortable with, and proficient in, the airplane.

As a consequence, experienced pilots have been known to fly a perfectly good airplane into the ground, venture into a thunderstorm and come out the other side in pieces, dip below minimums on an instrument approach and fall short of the runway, continue on in severe icing only to drop out of the sky, and attempt to run an engine on air rather than fuel. Pilots stubbornly fill our accident reports with incidents of taking off with, and then reacting badly to, simple discrepancies: a door left open, a fuel cap unsecured, the trim run to the stop, or primary flight instruments placarded inoperative. All of these are the result of process drift, a slow, imperceptible degradation in the quality of our procedures and operations.



Manufacturers address process drift by looking for "root causes." Once we understand why processes are drifting, we can implement procedures to prevent the drift. We need to do the same in general aviation, where at least four root causes of process drift and its multiple consequences are obvious: complacency, false confidence, atrophied skills and avionics envy. Let's look quickly at each.

Complacency arises from the inertia of past success and from fortuitously positive results from previous indiscretions. "I've done this a hundred times before." With each successful outcome, we become ever more insensitive to any brewing dangers, willingly dismissing warning signs upon the weighty evidence of favorable outcomes.

False confidence results from our inability to distinguish between dumb luck and skill, usually dismissing the former and claiming the latter. "I've gone through thunderstorms before and came out just fine. I don't see what the big deal is. I know what I'm doing."

Skill atrophy is an inevitable consequence of having a human brain. All of us suffer from a deterioration of unused skills over time--the use-it-or-lose-it phenomenon. Numerous studies in ergonomics from the worlds of commercial and military aviation have demonstrated beyond doubt that skill retention stays steady for a while, but plummets after about six months. This is particularly true for procedures we rarely use in routine flying, such as responding to emergencies we usually encounter only in training. Consider, then, that most GA pilots train only once a year; any skill set for reacting to an emergency is seriously and increasingly compromised as time passes between training sessions. If the six-month rule is valid, most GA pilots are ill-prepared to deal with emergencies something like half the time.

Avionics envy compels us to put new gadgets in the panel that we are poorly prepared to use beyond their most basic functions. I recently heard an experienced pilot flying a turbine airplane ask in a public forum how to insert a waypoint into a flight plan on the Garmin 530W. If he does not know how to do that, he certainly will be unable to use more advanced functions: flying a DME arc, activating a particular leg of a flight plan, or distinguishing between the various GPS approaches with and without glideslopes. How about the nuances of using the OBS and SUSPEND options during an ILS, hold, procedure turn or missed approach?


We also see process drift in weather avoidance. With time and experience, we start shaving our margins: getting closer and closer to cells as we cross a line of thunderstorms, taking greater risks in crosswinds that push the edge of the aircraft's and pilot's capabilities, hanging out in ice just a bit longer before taking evasive action, or flying in heavy rain down the glideslope of an ILS to minimums. Some of that is perfectly acceptable as we improve our capabilities with time and experience. But with process drift, we can cross the line from prudent gains in experience to a dangerous degradation of safety, where our actions exceed our capabilities.

Process drift can be seen in ever more cursory preflight inspections, and in a greater willingness to accept discrepancies before grounding the aircraft. We are--each and every one of us--guilty of this to some degree. It is simply another expression of human nature; some resist better than others. But we are all vulnerable to the idea that, "Hey, if everything was working on the last flight, why bother checking?"

The NTSB's accident reports are full of commercial and GA accidents that could have been easily prevented with a more thorough preflight inspection and proper use of a pre-departure checklist. For example, Delta Air Lines Flight 1141, a Boeing 727, crashed on takeoff when the crew failed to properly set the flaps and slats. A de Havilland Dash-4 crashed after liftoff because the pilot failed to remove the gust lock.

A Cessna 210 landed gear-up because the nose gear was jammed by a tow bar the pilot neglected to remove during the preflight. This pilot might well say, "We've all heard the adage that there are pilots who have landed gear-up and those who will." And I say, "Nonsense." There is no reason you should ever land gear-up. I am not claiming I never will; but if I ever do, the cause will be my failure to control process drift, in that case a drift away from the rigorous use of checklists and fidelity to proper procedures. We are better off ignoring the melodic "kick the tires and light the fires" and instead remembering we would "rather be on the ground wishing we were in the air than in the air wishing we were on the ground." As tired and old as that advice may be, we still see it ignored when pilots take off before the airplane is in a condition, or properly configured, to leave the ground.

We also see process drift in how we manage equipment wear and tear; over time, aircraft systems and avionics need tender loving care to remain within spec. But without a diligent maintenance program, we can allow our aircraft's cables to get saggy, the pressurization system to leak, corrosion to spread, autopilot to oscillate off the desired course or altitude, flaps to wiggle just a bit too much, oil to leak a bit more than it should, or struts to lose some clearance. These types of slow degradation are often insidious, just like that water getting ever, but imperceptibly, warmer.



The best cure for process drift is to acknowledge the phenomenon. We cannot fight complacency if we are unaware of being complacent. We will not be motivated to step up our maintenance program if we do not recognize the process of slow degradation. We will make excuses to avoid training twice a year if we ignore the compelling evidence that not doing so is dangerous.

So what can we do to prevent process drift and tackle the root causes of aviation accidents? Jeffrey Liker, author of The Toyota Way, explained that the automaker bores down to its problems' root causes with "a very 'sophisticated' technique; it is called five-why. We ask why five times." In other words, few problems are separated by more than five "whys" from their ultimate cause. The five-whys principle can be broadly applied to discover root causes in almost any circumstance in virtually any industry, including aviation.

Take an alternator failure. Why did it fail? The belt came loose. Why? The mounting bracket broke. Why? Too much engine vibration. Why? The mounting bracket pads are worn and need to be replaced. Why? Because the maintenance program was not designed to catch wear of the mounting pads.

By being vigilant against the insidious drift regularly robbing us of our safety margin, we can mitigate danger, manage risk properly and be better and safer pilots. We do so by making a concerted and conscious effort to avoid a fate similar to that poor clueless boiled frog. Beware of slow creep. And train twice yearly.

RELATED ARTICLE: Systems Knowledge

Process drift is seen in how we approach initial and recurrent training on aircraft systems. I would be able to afford a Phenom 100 if I had a dime for every time I heard a pilot say, "I don't need to know that." Yes, in general, we do. Knowledge is power.

Perhaps if the crew of Aeroperu Flight 603 had a better grasp of the pitot-static system and its various failure modes, their Boeing 757 might not have crashed after a maintenance worker left tape covering the static ports. In the case of Birgenair Flight 301, the captain noticed his airspeed indicator was faulty on the takeoff roll, but proceeded anyway. The copilot's indicator was working properly. With a blocked pitot tube, the captain's ASI was acting like an altimeter: as the airplane climbed, the indicated airspeed increased. The autopilot, and then the pilots, reacted by raising the nose and reducing power to prevent an overspeed, resulting in a stall and fatal loss of control.




In our initial training, we learned about the powerplant and the electrical, environmental, hydraulic, fuel and pressurization systems; most of us promptly forgot the details before the ink was dry on our ticket. When we go back for our annual training, the subjects are deemed too boring and irrelevant. Big mistake. Nothing helps more in an emergency than understanding the failing system. If you have an electrical fire and need to shut down all electrical power, it would be good to know what will keep working and what won't, and how long battery power will last for critical functions. With smoke in the cabin, understanding the pressurization system will help clear the air.

Such an understanding of systems can make the difference between survival, resulting from an appropriate and timely corrective action based on a deep knowledge of what can go wrong, and a smoking hole in the ground caused by responding inappropriately or too slowly. Or too quickly, going back to the old adage that the first action in any emergency is to wind your watch. That old saw is meant to ensure that you do not act without first thinking, causing more harm than good.

RELATED ARTICLE: Glass-Complacent

One of the more disappointing developments in general aviation in recent years has been the relatively unchanged accident rate involving technically advanced aircraft (TAA). According to the FAA, a TAA is one equipped with at least a GPS navigator, a multifunction display and an autopilot. In 2010, the NTSB said a review of accidents involving light aircraft equipped with glass cockpits suggests "the introduction of glass cockpits has not resulted in a measurable improvement in safety when compared to similar aircraft with conventional instruments. The analyses conducted during the study identified safety issues in two areas:

* The need for pilots to have sufficient equipment-specific knowledge and proficiency to safely operate aircraft equipped with glass cockpit avionics.

* The need to capture maintenance and operational information in order to assess the reliability of glass cockpit avionics in light aircraft."

Meanwhile, the FAA's Risk Management Handbook, FAA-H-8083-2, says, "It is important to remember that [glass panels] do not replace basic flight knowledge and skills. They are a tool for improving flight safety. Risk increases when the pilot believes the gadgets compensate for lack of skill and knowledge. ...

"For the GA pilot transitioning to automated systems, it is helpful to note that all human activity involving technical devices entails some element of risk. Knowledge, experience, and flight requirements tilt the odds in favor of safe and successful flights. The advanced avionics aircraft offers many new capabilities and simplifies the basic flying tasks, but only if the pilot is properly trained and all the equipment is working properly."


Jeff Schweitzer, Ph.D., recently retired from his position as editor of MMOPA Magazine; he flies a JetProp and has just hit 5000 hours total time.
COPYRIGHT 2011 Belvoir Media Group, LLC

Article Details
Title annotation: Risk Management
Author: Jeff Schweitzer
Publication: Aviation Safety
Date: Aug. 1, 2011
