
Quality parts are as simple as SPC.

Because improving the quality of the parts we make is essential to reducing in-process inventories--a primary goal of our manufacturing revitalization program--we sought a simple technique to help identify sources of quality problems, help solve those problems, and ensure that they stay solved. That technique is statistical process control (SPC). It's called statistical operator control at Harley-Davidson because it gives the person at the machine a major tool to manage quality.

SPC is being implemented at H-D as a result of our work with the University of Tennessee and visits to a number of US auto plants, including GM's Saginaw Steering Gear.

Saginaw is the source of much of our information. Although we're in the early stages of implementing the program, we've already achieved significant gains in part quality, reduced scrap and rework, lower costs, and increased worker involvement. Specifically, we're using SPC to determine when fixtures or tools require changing, to isolate sources of error in a process, and to quickly verify that a machine is making parts to specification following setup or adjustment.

Here are a few examples of our success.

Case 1. The problem (or opportunity, if you are an optimist) was a weekly scrap rate of 5 to 8 crankcases machined on an Ex-Cell-O pinion-bore machine. It cost us $20,000 annually.

A control chart was started by sampling 3 consecutive parts, 6 times a shift. After a short time, the chart indicated that the operator was adjusting inserts every 2 to 3 pieces, coolant wasn't circulating in the machine because chips were blocking a drain, and more coolant was needed at the point of cut.
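The charting step above can be sketched in code. The measurements below are illustrative, not Harley-Davidson's actual bore data, and the constants A2, D3, and D4 are the standard control-chart factors for subgroups of three:

```python
# Sketch of the charting step: subgroups of 3 consecutive parts, 6 per shift.
# Bore diameters (inches) are invented for illustration.
subgroups = [
    [0.7502, 0.7505, 0.7501],
    [0.7498, 0.7503, 0.7500],
    [0.7506, 0.7501, 0.7504],
    [0.7499, 0.7502, 0.7497],
    [0.7503, 0.7500, 0.7505],
    [0.7501, 0.7498, 0.7502],
]

means  = [sum(s) / len(s) for s in subgroups]   # subgroup averages
ranges = [max(s) - min(s) for s in subgroups]   # subgroup ranges

x_bar_bar = sum(means) / len(means)             # grand average (center line)
r_bar     = sum(ranges) / len(ranges)           # average range

# Standard control-chart constants for subgroup size n = 3
A2, D3, D4 = 1.023, 0.0, 2.574

ucl_x, lcl_x = x_bar_bar + A2 * r_bar, x_bar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

print(f"averages chart: center {x_bar_bar:.4f}, limits {lcl_x:.4f} / {ucl_x:.4f}")
print(f"range chart:    center {r_bar:.4f},  limits {lcl_r:.4f} / {ucl_r:.4f}")
```

Points falling outside these limits, or patterns like the constant insert adjustments seen here, signal that something other than normal variation is at work.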

We acted by cleaning and flushing out the machine, rerouting the coolant lines, installing a new tooling head and tools, and adding more light so tools could be adjusted more easily.

Continuing the charting showed a significant reduction in piece-to-piece variation and a drop in tool adjustments. Moreover, scrap from the operation over a subsequent 18-week period fell to 8 pieces total. Nonrandom behavior is still present, but is being systematically identified and eliminated.

Case 2. An excessive number of painted cylinder heads were being rejected for blisters in exhaust ports. During assembly the blisters cracked, allowing paint chips to enter either the crankcase or combustion chamber.

The paint department initiated a sampling plan using 100 pieces, which identified an upper-control limit (UCL) at 84.2 percent and a lower-control limit (LCL) at 60.8 percent. Even though these levels were statistically stable, they were obviously unacceptable. The method of hanging the parts was quickly revised, which significantly reduced the reject rate: the UCL dropped to 44.7 percent and the LCL to 17.1 percent. A final method change to remove excess paint after dipping effectively eliminated the problem--the UCL is now 3.9 percent, and the LCL is zero.
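The limits in this case come from an attribute (fraction-rejected) chart. A standard p-chart computes 3-sigma limits from the average reject rate and the sample size; the sketch below backs a center line out of the article's initial limits, but the exact subgroup sizes behind the published figures aren't given, so the computed limits only land in the same neighborhood rather than matching exactly:

```python
import math

def p_chart_limits(p_bar: float, n: int) -> tuple[float, float]:
    """3-sigma control limits for a fraction-defective (p) chart."""
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    ucl = min(1.0, p_bar + 3.0 * sigma)
    lcl = max(0.0, p_bar - 3.0 * sigma)
    return ucl, lcl

# Center line midway between the article's initial limits (84.2% and 60.8%)
p_bar = (0.842 + 0.608) / 2          # 0.725
ucl, lcl = p_chart_limits(p_bar, 100)
print(f"UCL {ucl:.1%}, LCL {lcl:.1%}")
```

The point of the chart in this case was not the limits themselves but the comparison: a stable process can still be a bad one, and only a method change (rehanging the parts, removing excess paint) moves the center line.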

Case 3. Over a 3.5-month period approximately 500 transmission cases were "DMR'd" for a problem with the spot-face and chamfers on a drain. The problem's severity led our manufacturing engineers to use a control chart to understand what was happening in the process. They discovered that a 0.500"-0.520" print dimension actually averaged 0.5077" with an average range of 0.026"; 37 percent of the parts produced by the process were out of specification.

The engineers reconfigured the spot-facing and chamfering tooling as a single operation, which dramatically improved yield. The average for the dimension now is 0.5065", with an average range of 0.002". With the modified process, only 25 percent of the print tolerance is used, versus the 330 percent used before; no pieces are out of specification.

Case 4. We suspected that daily setup time on our BostoMatics was excessive. Control charts were plotted from historical data to determine the time required for tool changes. An initial sampling indicated an average of 3.5 hr/day spent on setup, with a UCL of 4.74 hr and an LCL of 2.26 hr. Further, the setup process exhibited nonrandom behavior.

We purchased additional arbors, which increased availability of cutters on the job site and improved the setup procedure. The result was a reduction in setup costs of $15,000/yr.

Are you asking "How will I ever get my people to use statistics to run a machine?" We did, too; however, the techniques aren't that mysterious. They can be quickly learned and applied by both management and production people.

ABCs of SPC

We use statistics as a way to solve problems on the shop floor. Initially, major reject or quality problems are analyzed using a simple, but effective, Pareto chart that lists problems in decreasing order of severity.
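A Pareto ranking is easy to sketch. The reject causes and counts below are invented for illustration; the point is simply to order problems by severity and show how quickly the worst few account for most of the total:

```python
from itertools import accumulate

# Hypothetical reject tallies from one department over a period
rejects = {"blisters": 412, "porosity": 160, "undersize bore": 95,
           "burrs": 48, "scratches": 25}

# Rank causes in decreasing order of severity, as a Pareto chart does
ranked = sorted(rejects.items(), key=lambda kv: kv[1], reverse=True)
total = sum(rejects.values())
cumulative = accumulate(count for _, count in ranked)

for (cause, count), cum in zip(ranked, cumulative):
    print(f"{cause:15s} {count:4d}  {cum / total:6.1%} cumulative")
```

The cumulative column is what makes the chart useful: effort goes to the top one or two causes first, since they dominate the total.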

Following that, fishbone (cause-and-effect) charts are used to identify problem sources as either people, material, machine, methods, or time. Each potential source can then be quantified and analyzed using statistics.

The first step in actually applying statistics is defining some basic concepts: raw data, histograms, averages, normal (bell-shaped) curves, and range.

Raw data is merely a listing of measurements taken from a part at some point in the machining process. The numbers don't tell much more than that a certain quantity of parts have been checked. If you look carefully, though, it's clear that not every part is the same. Many are, but there's also variation; some are larger than the specification, some smaller. In metalcutting, for example, tool wear, inconsistencies in material composition or hardness, an operator's ability to control the machine, temperature of the workpiece and the work area, and variations in speeds and feeds all can cause variations in part dimensions.

Variation is the basis for statistical analysis. How much a part (or outcome) varies from an average (or specified value) is what's analyzed. You simply measure a sample of parts from a larger population and use mathematics to make accurate projections about the entire group.

A histogram puts raw data into perspective. By placing parts of the same size together, the chart shows their distribution or variation.

With simple calculations, you can determine the average, or midpoint of the distribution, as well as the range, which is the highest value minus the lowest value. By plotting the midpoint and range of the bell curve for a machine or process over time, you can determine the system's capability.
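These calculations can be sketched directly. The measurements below are illustrative, and the text histogram simply stacks parts of the same size together as the chart described above does:

```python
from collections import Counter

# Illustrative pin diameters (inches); the spec is assumed, not from the article
measurements = [0.504, 0.506, 0.505, 0.507, 0.505, 0.503,
                0.506, 0.505, 0.504, 0.506, 0.508, 0.505]

average = sum(measurements) / len(measurements)
spread  = max(measurements) - min(measurements)   # range: highest minus lowest

# Histogram: group identical sizes together to show the distribution
histogram = Counter(measurements)
for size in sorted(histogram):
    print(f"{size:.3f} | {'*' * histogram[size]}")

print(f"average {average:.4f}, range {spread:.3f}")
```

Even a dozen readings are enough to see the bell shape forming around the average, with a few parts trailing off toward either extreme.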

Now you can begin to measure and chart variability by performing short-term and long-term capability studies. Short-term capability is a measure of machine or process variability during a particular hour or day. Long-term capability is the complete range of values expected from a machine over an extended period. It shows performance when all variables are taken into account.

Knowing a machine's capability to produce a part accurately is a key to quality. Unfortunately, many companies, including ours only a few years ago, run capability studies after the fact.

When we started to measure machine capability, for example, we took a large sample of machined parts and thoroughly inspected each one. Usually, the entire lot was completed before we determined a machine's capability. We were merely inspecting quality, not managing it and preventing bad parts from being made. An important breakthrough happened when we visited Saginaw Steering Gear and discovered the mini-capability study.

With this technique, an operator samples 10 parts and records the specified dimension on a simple form. The dimensions are totaled, then divided by 10 to determine the sample mean. Comparing this figure to the blueprint specification quickly tells the operator if the machine is targeted to the middle of the blueprint tolerance.

The next step is determining range. In our mini-capability studies, we multiply the range by two. This factor approximates the range we would expect if a larger, more accurate sample were taken.
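A mini-capability study can be sketched as follows. The ten measurements are invented, and the 0.500"-0.520" print dimension is borrowed from Case 3 purely for illustration:

```python
# Mini-capability study sketch: 10 illustrative measurements of one dimension.
parts = [0.5074, 0.5080, 0.5077, 0.5071, 0.5082,
         0.5076, 0.5079, 0.5073, 0.5078, 0.5075]

sample_mean = sum(parts) / len(parts)

# The article's rule of thumb: double the sample range to approximate
# the spread a larger, more accurate sample would show
estimated_spread = (max(parts) - min(parts)) * 2

nominal = 0.510   # assumed midpoint of a 0.500"-0.520" print tolerance
print(f"sample mean {sample_mean:.4f} vs print midpoint {nominal:.4f}")
print(f"estimated process spread {estimated_spread:.4f}")
```

Comparing the sample mean to the print midpoint tells the operator at a glance whether the machine is targeted correctly, before any deeper analysis.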

After finishing the study, the capability ratio must be calculated. It's determined by dividing the capability by the blueprint tolerance. This percentage is used for assigning priorities to process-control measures and improvement actions.

Divide and conquer

There are four basic classes of operations defined by a capability ratio (see chart entitled Plant process control guide). A class-A operation is an excellent, or close-tolerance, process because under normal conditions it uses less than 50 percent of the blueprint tolerance. A class-B operation, on the other hand, is considered good because it uses from 51 to 70 percent of the available tolerance. A class-C operation is worse because it uses from 71 to 90 percent, and a class-D process uses over 91 percent.
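The classification can be expressed as a small function. The exact boundary handling (for instance, whether precisely 50 percent falls in class A or B) is an assumption here, and the Case 3 spread estimates below use the 2x-range rule of thumb, which gives a somewhat different percentage than the 330 percent the article reports; the resulting class is the same either way:

```python
def capability_class(estimated_spread: float, tolerance: float) -> str:
    """Classify an operation by the percent of print tolerance it consumes."""
    ratio = estimated_spread / tolerance * 100
    if ratio <= 50:
        return "A"   # excellent / close-tolerance
    if ratio <= 70:
        return "B"   # good
    if ratio <= 90:
        return "C"
    return "D"

# Case 3 before/after, using the doubled average range as the spread estimate
tolerance = 0.020                               # 0.500"-0.520" print dimension
print(capability_class(2 * 0.026, tolerance))   # before rework: class D
print(capability_class(2 * 0.002, tolerance))   # after rework: class A
```

Tying the class directly to a number makes the follow-on decisions (how often to gage, whether to chart) mechanical rather than a matter of opinion.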

With a class-A process, operators are required to check parts on a normal frequency, with minimal audits by floor inspectors.

Class-B operations also require an operator to gage parts on a normal frequency. In addition, a roving inspector sees this operation on a normal schedule. We expect to use some kind of control charting for this process.

A class-C operation requires frequent checks by operators and inspectors, and control charting is a must. Finally, a class-D operation may require 100 percent inspection by an operator, frequent inspector visits, and heavy use of control charts.

Charting controls

There are several types of control charts used in SPC. Our Milwaukee and York (PA) plants use the median-and-range chart (commonly called an X-bar and R chart). It's easy to learn, while providing essentially the same information as more sophisticated techniques.

Watching median-and-range data points on a control chart can flag critical changes in a process. One is called distribution shift, which is a significant move in the median point above or below a long-term midpoint without a change in range values. This shift is typical of tool or fixture wear.

Another is a range change. This indicates increased sample variability and that the process capability is deteriorating, which could be due to machine wear, chip buildup, or bad toolholders. The cause can be pinpointed over time, so the operator can anticipate adjustments.
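Both signals can be checked mechanically. The run length of seven used below is a common run-rule but an assumption here, not a figure from the article, and the chart data are invented:

```python
def flag_shift(medians, center, run=7):
    """Flag a distribution shift: a run of consecutive medians all on one
    side of the long-term midpoint (a common run-rule; exact run length
    varies by shop practice). Typical of tool or fixture wear."""
    side = [m > center for m in medians[-run:]]
    return len(side) == run and (all(side) or not any(side))

def flag_range_change(ranges, ucl_r):
    """Flag deteriorating capability: any sample range beyond its limit,
    e.g. from machine wear, chip buildup, or bad toolholders."""
    return any(r > ucl_r for r in ranges)

# Illustrative chart data: medians drifting upward as a tool wears
medians = [0.5101, 0.5102, 0.5104, 0.5104, 0.5105, 0.5106, 0.5107]
print(flag_shift(medians, center=0.5100))                          # shift
print(flag_range_change([0.0004, 0.0005, 0.0013], ucl_r=0.0012))   # range change
```

Because the shift pattern recurs at a roughly predictable rate, the operator can schedule a tool or fixture adjustment before parts drift out of tolerance.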

An essential element in applying statistical techniques is sample size and sampling frequency. Our experience shows 3 to 5 consecutive pieces yield a good approximation of the process. Sample frequency is determined by the equipment's control needs. By using data from a mini-capability study, you can establish an appropriate sampling frequency for a particular process.

We sample class-A operations two times per shift; class-B, two or three times; class-C, four times; and class-D, four to six times. In addition, more rigorous measurements are used on C and D parts. This sampling plan is continued until sufficient data is acquired to establish gaging, toolchange, and control-limit calculations (approximately 125 sample measurements).
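The sampling plan can be sketched as a lookup table. The per-class counts below take the upper end of each stated range, and the five-piece subgroup size is an assumption used only to estimate how long the roughly 125 measurements take to accumulate:

```python
# Checks per shift by operation class (upper end of the article's ranges)
SAMPLES_PER_SHIFT = {"A": 2, "B": 3, "C": 4, "D": 6}

def checks_needed(op_class: str, shifts: int = 1) -> int:
    return SAMPLES_PER_SHIFT[op_class] * shifts

# Rough shifts required to gather ~125 measurements for control-limit
# calculations, assuming 5 consecutive pieces per check
pieces_per_check = 5
for cls in "ABCD":
    per_shift = checks_needed(cls) * pieces_per_check
    shifts = -(-125 // per_shift)   # ceiling division
    print(f"class {cls}: about {shifts} shifts")
```

Under these assumptions a class-D operation reaches its 125 measurements in a handful of shifts, while a class-A operation takes a couple of weeks; the riskier the process, the faster its limits get established.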

SPC is being woven into all phases of our manufacturing process. The key to success is the simplicity of the tools--the mini-capability study and control chart--which permit focusing efforts on real problems. We're convinced that all American manufacturers can (our suppliers must) improve product quality and gain the other benefits that we have realized by using SPC.
COPYRIGHT 1984 Nelson Publishing

Article Details
Author: Hutchinson, Ron
Publication: Tooling & Production
Date: Jun 1, 1984
