
Quality tools are applicable to local government.

Applying a cost-quality approach can provide insights into an organization's processes, generate information on possible problem areas and offer general guidance on solutions.

An increasing number of local governments are adopting Total Quality Management (TQM). Government officials who recognize the importance of meeting customer quality expectations the first time and every time agree that decisions should be based on facts, not opinions. There is uncertainty, however, about the applicability of the associated quality tools to government.

This three-part article explains some of these tools, provides examples of their use, and discusses some of their strengths and weaknesses. Part one examines the use of flow charts in determining the cost of quality, part two discusses control charts and part three considers cause-and-effect diagrams. What is provided here is a general discussion of selected tools; for a more in-depth exploration of quality tools and statistical process control techniques, the reader is referred to articles and books cited in the list of references accompanying this article.

Flow Charts: Costing Quality

A flow chart is a diagram that shows the specific steps in a process. Although flow charts are extremely flexible in the processes they can diagram, a flow chart by itself is limited in what it can do. It can provide information on process stages and completion times and identify problem areas. But without information on management or customer expectations, or additional statistical or cost data, a flow chart does not tell how well the process measures up to quality expectations.

One area where flow charts are used is in determining the cost of quality. The private sector's experience in analyzing the cost of quality has resulted in the development of the $1, $10, $100 rule of thumb. This rule states that prevention of problems costs the company $1. Correcting or reworking a service or product problem before it gets to the customer costs $10. If the customer receives a poor-quality product or service and walks away dissatisfied, however, it costs the company $100; this is the cost of the loss of that customer's future business and of the effects of negative word-of-mouth advertising.

Experience indicates that the $1 and $10 portions of the rule are applicable to government. The City of Madison, Wisconsin, found in a garage project that, for every $1 spent on preventive maintenance, the city saved $7.15 in vehicle down time. As for the $100 portion, neither academics nor practitioners have spent much time examining the implications of service quality for public attitudes, and it is uncertain whether the $100 figure accurately reflects such costs for local governments.

The cost-of-quality concept can provide insights into an organization's processes, generate information on possible problem areas and offer general guidance on solutions. The City of Austin, Texas, includes cost of quality as part of its overall TQM training program. The City of Fort Collins, Colorado, currently is establishing a cost-of-quality training program. Both cities view cost-of-quality analysis as an important addition to the tools already available to their quality improvement teams.

Quality costs are usually grouped into four categories: prevention, inspection/appraisal, internal failure and external failure. The City of Austin, Texas, uses two cost categories: conformance, which groups prevention and inspection/appraisal together, and nonconformance, comprising internal and external failure. It also views the $100 portion of the rule of thumb as an opportunity lost and does not yet attempt to place a value on it.

Prevention costs are the costs of activities aimed at preventing errors and of building quality into the service, product or process. The types of activities included in these costs are

* system, service or product planning and design;

* quality monitoring and reporting systems;

* quality education and training;

* functional area training related to the product, service or system;

* administration of quality activities;

* supervisory and coordination activities related to the specific process, service or product;

* surveys or research on customer needs;

* recruitment and selection; and

* performance evaluation.

Inspection/appraisal costs are the costs associated with inspection and other activities aimed at determining the extent to which a service or process meets customer requirements. The activities included in this category are

* service, product or process inspection;

* service, product or process evaluation and measurement;

* purchasing inspection and testing; and

* quality audits and certification of service or product suppliers.
Exhibit 2

Category                              Cost     Percent of
                                               Total Cost

Prevention
  Training, staff meetings       $  160.00         6.7%

Inspection/Appraisal
  Steps 1-3                         172.14
  Step 6                             84.32
  Program maintenance                24.00
  Total                             280.46        11.7

Internal Failure
  Step 4                            533.05
  Step 5                             41.99
  Rework time                       110.00
  Total                             685.04        28.7

External Failure
  Steps 1-8                         843.00
  Rework time                       421.00
  Total                           1,264.00        52.9

TOTAL                            $2,389.50       100.0%

Internal failure costs are those costs associated with the correction of a defective service or product before it reaches the final customer. These costs include

* idle time or down time,

* reinspection time,

* rework and repeat service,

* change orders,

* scrap and waste, and

* investigation of the cause of the defects or error.

External failure costs are those costs associated with receipt of a defective service or product by the final customer. These costs include all the steps needed to complete or correct the process plus the cost of dealing with customer complaints and the impact of customer ill will.

Exhibit 1 illustrates the use of a flow chart to diagram the process for recording journal entries and to analyze the cost of quality. The flow chart shows the steps in the monthly journal entry process and identifies the time and costs associated with each. These data are determined by brainstorming to identify each step and by surveying those involved to determine the activities carried out and the time spent on each step. From this information, salary costs can be determined for each step. In summary, Exhibit 1 shows that the process takes two days (16 hours) per month. The monthly full-time employee (FTE) staff requirement is 6.2 people. Total monthly salary costs are $843. Also shown is the percentage of the total time associated with each step in the process; this provides a quick picture of how staff time is allocated throughout the process.

To the cost information listed in Exhibit 1 must be added any training time, staff-meeting time, system maintenance costs, rework costs, and time spent with customers and others explaining the errors. This information can be obtained by reviewing the time taken to correct mistakes and through collective decisions (brainstorming). For the purposes of this analysis, it is assumed that training and staff meetings amount to $160 per month; the maintenance of the computer program costs $24 per month; internal failure requires rework to correct the errors and discussions with supervisors for a total of $110; and external failure includes the cost of all the original steps plus $421 for rework time and discussions with supervisors and others.

Based on the flow chart data and the added information, the costs associated with each of the four cost-of-quality categories can be determined. The results are shown in Exhibit 2.

The cost-of-quality analysis shows that prevention costs are small relative to appraisal and internal and external failure costs. The elimination of errors would eliminate the internal and external failure rework costs, saving $531 or 22.2 percent.
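As a minimal sketch, the Exhibit 2 roll-up and the rework-savings calculation can be reproduced with a few lines of Python. The dollar figures are those shown in Exhibits 1 and 2; the variable names and the script itself are illustrative only, not part of the original analysis.

# Group the monthly journal-entry costs into the four cost-of-quality
# categories and express each as a percentage of the total (Exhibit 2).
cost_of_quality = {
    "Prevention": {
        "Training, staff meetings": 160.00,
    },
    "Inspection/Appraisal": {
        "Steps 1-3": 172.14,
        "Step 6": 84.32,
        "Program maintenance": 24.00,
    },
    "Internal Failure": {
        "Step 4": 533.05,
        "Step 5": 41.99,
        "Rework time": 110.00,
    },
    "External Failure": {
        "Steps 1-8 (repeat of the original process)": 843.00,
        "Rework time": 421.00,
    },
}

grand_total = sum(sum(items.values()) for items in cost_of_quality.values())

for category, items in cost_of_quality.items():
    subtotal = sum(items.values())
    print(f"{category:<22} ${subtotal:>9,.2f}  {subtotal / grand_total:6.1%}")

print(f"{'TOTAL':<22} ${grand_total:>9,.2f}")

# Eliminating the errors removes the internal and external rework costs,
# the savings cited in the text ($110 + $421 = $531, about 22.2 percent).
rework_savings = 110.00 + 421.00
print(f"Potential savings: ${rework_savings:,.2f} ({rework_savings / grand_total:.1%} of total)")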

The private sector's cost of quality ranges from 20 percent in manufacturing to 40 percent in service industries. The cost of quality for governments has been estimated to range from 50 to 60 percent, but some analysts consider 60 percent too extreme, as much depends upon whether the impact of negative word-of-mouth advertising is included in the calculation.

The purpose of cost-of-quality analysis is to reduce costs by shifting more resources to training and prevention, a shift requiring a fundamental change in attitude about training. TQM empowers employees and provides them with a set of tools to help improve the efficiency and quality of the process, service or product. As a result, the organization moves from a few highly skilled, highly trained individuals at the top to more broadly skilled and continuously trained individuals at the work-group level. Training costs will remain high as a consequence of maintaining the skill levels of the larger work force.

Another approach to cost-of-quality analysis involves activity-based costing (ABC), a cost accounting procedure being used with considerable success to track the cost of quality in the private sector. The concepts utilized in ABC are finding their way into government accounting practices. (See the article "Using Activity-Based Costing for Efficiency and Quality" in the June 1993 Government Finance Review.) The City of Austin, Texas, is currently revising its chart of accounts to include ABC concepts to enable better tracking of the cost of quality. For a discussion on the cost accounting changes taking place in the private sector, see Relevance Regained by Thomas Johnson and the Ernst & Young Guide to Total Cost Management on the reference list.

Control Charts

Another quality tool that has proven useful in the manufacturing sector is the control chart. A control chart is used to determine how much variability there is in a process and whether that variability is random or unique. Random variation is considered normal; unique variation is considered controllable.

Control charts are based on the amount of variation around a mean (average). Upper control limits (UCL) and lower control limits (LCL) are identified above and below the mean using specifically designed statistical tables. The events falling within the limits are viewed as acceptable even though there are variations. Those outside are unique and need additional study. In the manufacturing sector, the upper and lower limits are generally established so that, under normal circumstances, 99.74 percent of all observations or activities fall within these limits.
Exhibit 3

Range Control Chart

Fire                                          Avg. Response
Station     Response Time (minutes)           Time (minutes)     Range

A           3.0    3.1    3.4    4.0               3.38           1.00
B           3.2    3.3    3.86   3.9               3.56            .70
C           3.7    3.85   3.9    4.2               3.91            .50
D           3.5    3.87   3.95   3.86              3.80            .45
E           4.1    4.0    4.1    4.5               4.18            .50

                                       Total      18.83           3.15
                                       Average     3.77            .63

In conducting a control chart analysis, a two-step process is generally used. The first step is to determine if the system is internally stable and under control by using a range control chart. This is done by calculating and comparing the range of a series of observations; for example, the difference between the fastest and slowest time or highest and lowest output. If the system or process is determined to be under control, that is, the ranges fall within the upper and lower control limits established by the statistical tables, then the second step--using the mean control chart--is completed. If the system or process is not under control, then the entire system needs to be examined to determine how to stabilize it. The mean control chart examines the performance of individual product or process contributors.

The use of the control chart technique is illustrated in Exhibit 3, which analyzes the average response times of five fire stations, based on four response times for each station. For the purposes of this article, it is assumed that analysis of the range control chart determined that the response times are under control.

The control chart based on the Exhibit 3 data indicates whether the individual stations are performing within or outside the prescribed 99.74 percent performance limits. If a station falls outside the limits, it needs to be examined to determine why. In this example, station A is below the LCL (3.4 minutes) and may well provide information, or serve as a model, on how other stations can improve performance. Station E is just above the UCL (4.1 minutes).
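A minimal Python sketch of the two-step analysis on the Exhibit 3 data follows. The range-chart factors (D3 = 0, D4 = 2.282 for subgroups of four) are the standard Shewhart constants and are an assumption, since the article does not name its tables; the mean-chart comparison simply uses the limits cited above (LCL 3.4 minutes, UCL 4.1 minutes).

# Response times per station (minutes), from Exhibit 3.
response_times = {
    "A": [3.0, 3.1, 3.4, 4.0],
    "B": [3.2, 3.3, 3.86, 3.9],
    "C": [3.7, 3.85, 3.9, 4.2],
    "D": [3.5, 3.87, 3.95, 3.86],
    "E": [4.1, 4.0, 4.1, 4.5],
}

# Step 1: range control chart -- is the system internally stable?
D3, D4 = 0.0, 2.282                         # assumed standard factors for subgroups of 4
ranges = {s: max(t) - min(t) for s, t in response_times.items()}
r_bar = sum(ranges.values()) / len(ranges)  # average range, about .63
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar
stable = all(lcl_r <= r <= ucl_r for r in ranges.values())
print(f"Average range {r_bar:.2f}; range chart limits {lcl_r:.2f} to {ucl_r:.2f}; stable: {stable}")

# Step 2: mean control chart -- which stations fall outside the limits?
LCL_X, UCL_X = 3.4, 4.1                     # limits cited in the text
if stable:
    for station, times in response_times.items():
        avg = sum(times) / len(times)
        status = "within limits" if LCL_X <= avg <= UCL_X else "outside limits"
        print(f"Station {station}: average {avg:.2f} minutes ({status})")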

The chart identifies stations worthy of a more detailed examination, helps prioritize corrective action and spots trends. Investigation may show that station E is slower in its response time because it is covering a larger geographic area than the other stations. If the departmental policy is to continue to have all fire stations respond to a fire within 4.1 minutes more than 99 percent of the time, then a new station may be needed in the area covered by station E.

There are a number of points that need to be made about using control charts. First, if the use of control charts is contemplated, consultation with a statistician or statistical process control expert is advisable. Even if the control charts show that a process is statistically stable and all events fall within the limits, this does not mean the outcome is satisfactory. Input needs to be obtained from customers and stakeholders on their expectations. A consultant can use this information to help set up the control charts, determine the upper and lower control limits, and interpret the results.

Second, although the manufacturing sector sets limits that encompass 99.74 percent of all events, such a target may not be appropriate in all circumstances. It may be more appropriate, for example, to have 95.4 percent of all events fall within the limits. Third, control charts work well only when there is general agreement on what the outcome should be. For instance, it is generally accepted that a faster response to fire and emergency calls is better. On the other hand, while developers may want quick action from the planning department, neighbors and neighborhood groups (stakeholders) may want a slower response time.

A team in the City of Austin, Texas, used a range control chart to determine that the purchasing system of the Health and Human Services department was unstable. The team made a number of recommendations to stabilize and improve the system. One of the recommendations was the continued use of control charts to monitor cycle time and identify when management intervention was needed. The City of Fort Collins, Colorado, is beginning training on the use of control charts for analyzing areas such as fire department response time, the city sign shop production process and police overtime.

Cause-and-Effect Diagrams

The cause-and-effect diagram is a method for analyzing process dispersion. There are three basic types of diagrams for relating causes and effects in a process: dispersion analysis, production process classification and cause enumeration. In the following example of a dispersion analysis, a key problem is identified and then the reasons or actions causing the problem are listed. Generic headings, such as methods, people, machine, materials, environment or training, are frequently used to assist in the analysis. The main causes and subcauses of the problem are identified using brainstorming techniques.

Exhibit 4 shows a cause-and-effect diagram developed by "The Action Takers" team of the City of Altamonte Springs, Florida. The team examined ways to remove brush for spring and fall cleanup. The problem was that dumpsters were not being used adequately, thus costing the city extra money for dumping fees. Under the machine category of causes, subcauses identified were a lack of material-reducing equipment, dumpsters not being filled to full capacity and the number of hauling trips. Exhibit 5 shows a root-cause evaluation matrix that is based upon the Exhibit 4 diagram. The matrix analyzes the root causes further, determining the percent of the total problem that each is responsible for and whether each cause is actionable.
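The structure of such an analysis can be laid out in a short Python sketch. The machine-category subcauses below are those named in the article; the other headings are left to brainstorming, and the shares and actionable flags in the evaluation matrix are hypothetical placeholders rather than the Altamonte Springs team's figures.

# Dispersion analysis: key problem plus causes grouped under generic headings.
problem = "Dumpsters not used to full capacity, costing extra dumping fees"

causes = {
    "Machine": [
        "Lack of material-reducing equipment",
        "Dumpsters not filled to full capacity",
        "Number of hauling trips",
    ],
    # Other generic headings (methods, people, materials, environment,
    # training) would be filled in through team brainstorming.
}

# Root-cause evaluation matrix: (cause, share of problem, actionable?).
# The shares and flags below are hypothetical placeholders.
evaluation_matrix = [
    ("Lack of material-reducing equipment", 0.50, True),
    ("Dumpsters not filled to full capacity", 0.30, True),
    ("Number of hauling trips", 0.20, False),
]

print(f"Problem: {problem}")
for heading, subcauses in causes.items():
    print(f"{heading}: {', '.join(subcauses)}")
for cause, share, actionable in evaluation_matrix:
    print(f"  {cause:<40} {share:5.0%}  actionable: {actionable}")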

Based on the cause-and-effect diagram and the cause evaluation matrix, the team brainstormed solutions. Three solutions were identified: hiring a contractor, renting a chipper in-house and purchasing a chipper in-house. A cost-benefit analysis of these solutions determined that purchasing a chipper would, over a period of five years, save the city $43,429. The purchase of a chipper was included in the 1992-93 budget.

Cause-and-effect diagrams often are most effective when used by teams, because the causes and subcauses are generally identified by brainstorming. Brainstorming results are more complete when larger numbers of people participate. Dispersion analysis has a number of strengths: it helps organize and relate factors, provides a structure for brainstorming and involves everyone on the team. Its major weakness is that it can become very complex and, therefore, requires dedication and patience.


Quality tools are an integral part of the TQM process. They are designed to assist teams and management in identifying ways to improve systems and processes, reduce costs, and identify and meet the needs of customers and stakeholders. This article has discussed several quality tools and their strengths and weaknesses. It also has shown how several quality tools closely associated with the manufacturing environment are being applied in local government.

Flow charts analyzing the cost of quality, control charts and cause-and-effect diagrams can be important additions to the existing tools that municipalities use. Control charts can be used to determine if the performance of a process or system is stable and, if stable, which units or contributors are performing within acceptable limits. Cause-and-effect diagrams provide a structured method for analyzing a set of problems; thus, they help speed the decision-making process and improve the quality of the results.

While the use of flow charts to analyze the cost of quality is extremely practical for teams examining one or two processes, determining the cost of quality for a section, department or organization requires a more comprehensive approach. To that end, cost-of-quality concepts are being integrated into activity-based costing (ABC) systems developed in the private sector and now being adopted by some cities. As more municipalities integrate these tools into their management practices, cost-of-quality analysis will become more common.



* "The Tools of Quality Part IV: Histogram," Quality Progress, September 1990, pp. 75-78.

* "The Tools of Quality Part V: Check Sheets," Quality Progress, October 1990, pp. 51-56.

* Bohan, George P., "Pinpointing the Real Cost of Quality in a Service Company," National Productivity Review, Summer 1991, pp. 309-317.

* Burr, John T., "The Tools of Quality Part I: Going with the Flow Chart," Quality Progress, June 1990, pp. 64-67.

* Daigh, Robin D., "Financial Implications of a Quality Improvement Process," Topics in Health Care Financing, Spring 1991, pp. 42-52.

* Godfrey, James T. and Pasewark, William R., "Controlling Quality Costs," Management Accounting, March 1988, pp. 48-51.

* Keehley, Pat, "Total Quality Management: Getting Started," Public Management, October 1992, pp. 10-15.

* Novack, Robert A., "How to Calculate the Total Cost of Quality," Distribution, August 1989, pp. 108-110.

* Quevedo, Rene, "Quality, Waste and Value in White-Collar Environments," Quality Progress, January 1991, pp. 33-37.

* Sarazen, J. Stephen, "The Tools of Quality Part II: Cause-and-Effect Diagrams," Quality Progress, July 1990, pp. 59-62.

* Shainin, Peter D., "The Tools of Quality Part III: Control Charts," Quality Progress, August 1990, pp. 79-82.


* Asaka, Tetsuichi and Ozeki, Kazuo, eds., Handbook of Quality Tools: The Japanese Approach (Cambridge, MA: Productivity Press, 1988.)

* Ernst & Young, The Ernst & Young Guide to Total Cost Management (New York, NY: John Wiley & Sons, 1992.)

* Hutchins, Gregory B., Introduction To Quality Control, Assurance, and Management (New York, NY: Macmillan Publishing Company, 1991.)

* Johnson, H. Thomas, Relevance Regained: From Top-Down to Bottom-Up Empowerment (New York, NY: The Free Press, 1992.)

* Juran, J.M. and Gryna, Frank M., eds. Juran's Quality Control Handbook, 4th edition (New York, NY: McGraw-Hill Book Co., 1988.)

JAMES J. KLINE is a consultant in Portland, Oregon, whose former government experience includes two years with the State of Oregon Department of Revenue, Local Government Finance Section, and six years in city and county government in Oregon. He is completing work on a Ph.D. in urban studies at Portland State University.