
The AI factory; how artificial intelligence will create 'smart plants.' (Cover Story)

Artificial intelligence, expert systems, neural networks and fuzzy logic may sound like exotic concepts that belong in some "Star Trek" futureworld of manufacturing. But in truth, they are here-and-now realities in the form of affordable, off-the-shelf electronic components and standard software modules readily available to designers of manufacturing controls and information systems. These exotic-sounding artificial-intelligence, or "AI," technologies are already appearing in a variety of discrete applications within plastics processing, from machine controls to troubleshooting advisors to CAE programs (see PT, March '91, p. 15).

The next step, which could be one of the most radical developments of the next 10 years, will be to create "artificially intelligent" networks that embrace whole factories, linking together every department of plant operations, and even multiple plants to their corporate headquarters (see Fig. 1). One of the key developments making that possible is the existence of a variety of very high-speed data networking technologies--again, in the form of industry-standard products, readily available and able to interconnect almost any intelligent device and software package. This article is intended to give a foretaste of the kinds of advances that plant-scale AI implementation will bring about. Some of what is presented below is still in the conceptual stages, some is in active development, and some is already here.


Figure 2 presents the traditional "pyramid" concept of computer-integrated manufacturing (CIM). It is a vertical, hierarchical structure, reflecting older management and corporate structures. Those structures imply overhead, complexity, and potential incompatibility between layers. What's more, data flow up and down the pyramid can be slow and inefficient.

The new model for the intelligent factory of the future is the horizontal or "one-layer" peer-to-peer network, also shown in Fig. 2. In this case, all "nodes" on the network ring are compatible equals, or peers, although some nodes may be low-level programmable logic controllers (PLCs) and others corporate mainframes. The key point is that they all share information equally, via instant access to the network, rather than handing it down through various levels. Not incidentally, this is a much lower-cost information "architecture" than the old pyramid structure.

Figure 2 refers to distributed databases and "knowledge bases." The latter will be common features of the AI factory, in which not merely data are shared around the network, but entire "expert systems." Both databases and knowledge bases are becoming too large to be stored at every computer terminal or processing-machine controller that could make use of them. Though physically located at specific computer nodes on the network, databases and knowledge bases can be made transparently available to all other nodes.


To briefly review some of the key technologies that make AI work, we would begin with expert systems. As shown in Fig. 3, these attempt to duplicate the reasoning of a plant's most experienced workers by systematizing their accumulated knowledge into a tree-like structure of "If-Then" logic, following empirically based rules. The figure suggests--in greatly abbreviated form--how the rules might work in the case of solving a short-shot problem when molding PVC with a 10-in. injection stroke.
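The "If-Then" rule idea can be sketched in a few lines of code. This is a minimal, hypothetical rule base for diagnosing a short shot; the rules, field names, and thresholds are illustrative assumptions, not actual molding guidelines or the rules of Fig. 3.

```python
# Hypothetical expert-system rule tree for a short-shot problem.
# Each "If-Then" rule encodes a rule of thumb an experienced
# technician might apply; all thresholds are invented examples.

def advise_short_shot(obs):
    """Walk a small rule tree and return a suggested correction."""
    if obs["shot_complete"]:
        return "No action: part filled completely."
    if obs["cushion_in"] < 0.25:             # rule: cushion too small
        return "Increase shot size (insufficient cushion)."
    if obs["injection_pressure_pct"] >= 95:  # rule: at pressure limit
        return "Raise melt temperature to lower viscosity."
    return "Increase injection pressure."

print(advise_short_shot(
    {"shot_complete": False, "cushion_in": 0.1,
     "injection_pressure_pct": 80}))
```

A real expert system would chain many more rules and let them reference each other, but the branching "If-Then" skeleton is the same.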

The next step toward adding power and sophistication to expert systems is use of neural-network technology (Fig. 4). Based on a model of how neurons interact in the brain, neural nets use the speed-enhancing power of parallel processing. Instead of the fixed branching patterns of a more traditional logic path (as in Fig. 3), each "node" or step in the logic path has a multitude of connections to other nodes, with a weighting factor applied to each link, related to the probability that it is the correct logic path for information reaching the "upstream" node. Rather than proceeding step by step through a conventional branching logic path, answering one question at a time at each node, all data concerning a problem are presented to the input side of a neural network, and it filters through the various probabilistically weighted paths of the network until an output result is obtained.

A neural net may reach the same conclusion as a conventional branching logic path, but it does so much faster. Also, it can handle more than one problem simultaneously, which is not normally possible with standard logic.

Neural networks can be implemented in the form of hard-coded logic hardware or entirely in software. The latter has the advantage of being able to "learn" new logic patterns. It allows one to add new knowledge to the system, and it can learn or correct the weighting relationships between variables through feedback from process outputs.
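The weighted-link-plus-feedback idea can be shown in miniature. Below is a sketch of a single node whose input weights are corrected by feedback from the observed output (a simple delta rule); the inputs, target, and learning rate are all invented for illustration.

```python
# Minimal sketch of learning weighted links from feedback: each input
# reaches the node through a weight, and the error between the node's
# output and the observed process result nudges every weight.

def predict(weights, inputs):
    return sum(w * x for w, x in zip(weights, inputs))

def learn(weights, inputs, target, rate=0.1):
    """Delta rule: adjust each weight by (error * its input * rate)."""
    error = target - predict(weights, inputs)
    return [w + rate * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
for _ in range(50):                      # repeated feedback cycles
    weights = learn(weights, [1.0, 2.0], target=5.0)
print(round(predict(weights, [1.0, 2.0]), 2))
```

After enough feedback cycles the node's output converges on the observed result, which is the sense in which a software neural net "learns or corrects the weighting relationships between variables."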

For example, it is not far-fetched to envision a neural-net "expert" machine controller that could observe the results of different machine setups and learn by itself which settings to use. A neural-net controller could perform, in effect, an automatic series of Taguchi-style designed experiments. The operator could define the variables of interest and allow the controller to perform experiments in setting different values of those variables. The operator might then enter data on resulting part weights or dimensions, and the controller would learn the relative weighting of variables that affect part quality criteria. It is even conceivable that the controller could perform experiments to learn which are the important variables without the operator defining them beforehand.

Likewise, neural nets may help in current industry efforts to make mold-filling simulations self-correcting or self-optimizing, based on closing the loop with real-time molding data.

A natural adjunct to neural-net expert systems is the use of so-called "fuzzy logic." It has the ability to accept imprecise inputs to arrive at a precise output. Unlike conventional logic, which requires statements in a form such as "equal to, greater than, or less than," fuzzy logic can deal with statements like "a little bit" or "close to." Consider the following illustration of conventional logic:

Fact: This tomato is red.

Knowledge: If a tomato is red, it's ripe.

Conclusion: This tomato is ripe.

Now consider an illustration of fuzzy logic:

Fact: This tomato is a little red.

Knowledge: If a tomato is red, it's ripe.

Conclusion: This tomato is a little ripe.

For conventional logic, "a little red" is meaningless -- if it isn't precisely red, it might as well be green.

Fuzzy logic can compare different sets of data and look for pairs that bear the closest relationship to each other. It can come up with a "best guess" at the answer to a problem. Combined with neural-net technology, the "fuzzy expert" system could try out that best guess to see how it works, and learn from the result how to get even closer to the desired solution. Using the example proposed above of the self-setting machine controller, a hypothetical fuzzy controller that already "knows" rules for processing one heat-sensitive material, PVC, could be told that a "new" material, acetal, is also heat-sensitive, even though its nominal processing temperature is higher than that of PVC. Using fuzzy logic, the controller could make an "educated guess" at some of the constraints on processing acetal.

Now let's look at how combinations of these AI technologies will affect different areas of plant operations.


As suggested above, one of the most important potential contributions of AI at the shop-floor level could be in automating machine setup. Getting machines up and running properly and keeping the process optimized are key problems for many plants, given the universal shortage of skilled operators and the growing complexity and variety of machine controls.

As indicated in Fig. 5, in the AI plant of the future, the technician might walk up to the machine and download from a remote database a recipe of machine settings generated by a flow-simulation program. This would be the starting point for an AI controller to optimize the process according to specified criteria such as throughput or quality.

Or, with fuzzy logic, the technician would walk up to the machine and specify just the material type and general part geometry -- such as a "nylon 66 gear" -- and that would be enough for the controller to set up the machine for a first shot. The material and part-geometry choices would be selected from a fixed vocabulary that could appear as a menu on a touchscreen; or, with voice-recognition capability, the technician could speak the words into a headset.

After examining the first shot, the operator would speak or enter on the screen a simple evaluation: Good Part, Short, Flashed, or Burnt -- and any of the latter could be further characterized as "A Little" or "A Lot." The fuzzy controller would attempt a correction on the next shot and so forth.
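A hedged sketch of that shot-by-shot correction loop: the operator's coarse verdict and fuzzy degree ("A Little" / "A Lot") are mapped to a graded adjustment of shot size. The directions, magnitudes, and 5% step are all invented for illustration (a "Burnt" verdict would drive a temperature change instead and is omitted here).

```python
# Fuzzy shot-by-shot correction sketch: verdict gives the direction,
# fuzzy degree scales the size of the move. All numbers are invented.

ADJUST = {"Short": +1.0, "Flashed": -1.0}   # direction of correction
DEGREE = {"A Little": 0.25, "A Lot": 1.0}   # fuzzy amount

def next_shot_size(current, verdict, degree="A Little"):
    if verdict == "Good Part":
        return current                      # leave settings alone
    return current + ADJUST[verdict] * DEGREE[degree] * 0.05 * current

size = 10.0
size = next_shot_size(size, "Short", "A Lot")   # badly underfilled
print(round(size, 2))
```

Each shot's verdict shrinks or grows the next correction, so the controller homes in on a good part in a few iterations.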

Once the mold is running well, the AI controller would run an automatic Taguchi experiment to determine the process window for maintaining acceptable part quality without over-control. The operator would enter critical part dimensions during the designed experiment (perhaps by means of an electronic caliper gauge), which the computer would compare with tolerance information that had been downloaded along with a CAD drawing of the part from a remote database.

Now that the computer has determined the process window, it can use statistical techniques to detect non-random variations. And if, for example, it determines that the material's viscosity has increased, it "knows" from the preceding Taguchi experiments to raise the barrel temperature 10 degrees.
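One classic way to "detect non-random variations" is a control-chart run rule: several consecutive readings on the same side of the historical mean are unlikely under random variation. The sketch below uses that rule with invented pressure data; the seven-point run length is a common convention, not something specified in the article.

```python
# Control-chart run-rule sketch: flag a non-random shift when the last
# several readings all fall on the same side of the historical mean.

def shift_detected(readings, mean, run_length=7):
    """True if the last `run_length` readings are all above (or all
    below) the historical mean -- improbable under random variation."""
    tail = readings[-run_length:]
    return all(r > mean for r in tail) or all(r < mean for r in tail)

pressures = [100, 101, 99, 103, 104, 103, 105, 104, 106, 105, 104]
if shift_detected(pressures, mean=100.0):
    print("Non-random rise detected: viscosity shift suspected")
```

A sustained rise in injection pressure at constant settings is the kind of signature the controller would tie back to its earlier designed experiments before making a correction.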

And while the controller is building up this knowledge base on how to run this particular mold and material on this machine, the information may prove valuable to a technician setting up another machine for a different job elsewhere in the plant. When the technician enters the initial data on material and mold, the controller could automatically query the whole network, looking for any stored base of "experience" in running a similar job. Fuzzy logic would be able to make use of information on similar though not identical tasks.


As suggested in Fig. 6, "intelligent scheduling" will be one of the benefits of AI in areas other than direct machine control. The scheduler will pick not just the first machine available with adequate tonnage and shot size, but will look for the machine with a past history of running that part or similar parts on the fastest cycle, with highest quality and/or at lowest cost.

Unprecedented opportunities for molder profit optimization will become possible by running "What If" scenarios to determine how it is possible to achieve the most cost-effective production. A scheduling expert system would consider alternate run sizes and presses, with or without robotics, based on setup time and cost for the machine, robot, direct labor, and so forth. Material selection could become part of the scenario if there were a choice -- for example, between glass- or mineral-filled material, taking into consideration the probable reject levels with each, ability to use regrind, drying requirements (energy cost), and base material cost.
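A "What If" scenario comparison reduces to a total-cost roll-up per alternative. The sketch below compares two hypothetical ways of running the same job; every cost figure, rate, and cycle time is an invented placeholder.

```python
# "What If" scheduling sketch: cost out alternate press/robot choices
# for one job and pick the cheapest. All figures are invented examples.

def job_cost(setup_hr, cycle_s, parts, rate_hr, labor_hr=0.0):
    """Total cost = (setup + run time) x (machine rate + direct labor)."""
    run_hr = parts * cycle_s / 3600.0
    return (setup_hr + run_hr) * (rate_hr + labor_hr)

scenarios = {
    "small press, manual": job_cost(2.0, 30.0, 10000, 45.0, labor_hr=18.0),
    "large press, robot":  job_cost(4.0, 22.0, 10000, 70.0),
}
best = min(scenarios, key=scenarios.get)
print(best, round(scenarios[best], 2))
```

A scheduling expert system would run many such roll-ups, folding in reject rates, regrind use, and drying energy for each candidate material as well.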

Real-time manufacturing resource planning (MRP-II) would become possible. Instead of keeping a production-scheduling department busy keying in constant changes in order sizes, materials prices, and shop-floor machine/mold availability data, an expert MRP-II program would automatically update itself by searching different databases, from purchasing and order entry to shop-floor production.

Instead of the current practice of generating a schedule once a day, based on yesterday's information -- a schedule that becomes obsolete with today's first unexpected production interruption -- the real-time scheduler would be on-line with the production-monitoring computer. Thus, when any machine goes down, the monitoring computer would note the downtime reason entered by the operator and search past records for the average length of time that machine remains down for that reason. Then the production monitor would provide the scheduling computer with an estimated time when that machine would be back up and running. Likewise, the on-line scheduler would plug in new job-completion times in accordance with any change in good parts yield per hour that occurred for any reason at any machine.

AI would enhance the effectiveness of the purchasing department by recommending vendors for a given product that had the best record of quality, delivery and price. For any given purchase, those factors could be given different weightings in the neural-net logic system. Similarly, fuzzy logic could help anticipate the performance of a vendor on a "new" product not previously supplied.
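The weighted vendor ranking reduces to a weighted sum per vendor. The weights and scores below are hypothetical; in the scheme described above they would sit on neural-net links and shift from purchase to purchase.

```python
# Weighted vendor-ranking sketch: score each vendor's record against
# per-purchase factor weights. All weights and scores are invented.

WEIGHTS = {"quality": 0.5, "delivery": 0.3, "price": 0.2}

VENDORS = {
    "Vendor A": {"quality": 0.9, "delivery": 0.7, "price": 0.6},
    "Vendor B": {"quality": 0.7, "delivery": 0.9, "price": 0.9},
}

def score(record):
    return sum(WEIGHTS[k] * record[k] for k in WEIGHTS)

best_vendor = max(VENDORS, key=lambda v: score(VENDORS[v]))
print(best_vendor)
```

Raising the quality weight for a critical purchase, or the price weight for a commodity one, changes which vendor comes out on top.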

Inventory control could apply AI to examining trends in turnover time for warehouse stocks, balanced against trends in materials pricing. When prices are generally on the upswing, the computer would know to maximize current inventories -- and to do the opposite when materials prices are declining. Fuzzy logic could maintain an overall concept of "Just in Time" while actually letting inventories swell and shrink within certain limits. With the aid of direct computer-to-computer communications with suppliers (known as Electronic Data Interchange, or EDI), the purchasing computer could fire off purchase orders at varying intervals, maintaining minimum sufficient stocks for production needs at maximum cost-effectiveness.
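The swell-and-shrink inventory rule can be sketched crudely: let the stock target move toward an upper bound when prices are rising and toward a lower bound when they are falling. The bounds, price history, and all-or-nothing switch are invented simplifications; fuzzy logic would grade the target between the bounds.

```python
# Price-trend inventory sketch: stock up when prices are rising,
# run lean when they are falling. Bounds and prices are invented.

def target_stock(price_history, low=500, high=2000):
    """Pick an inventory target (units) from the price trend."""
    rising = price_history[-1] > price_history[0]
    return high if rising else low

print(target_stock([1.10, 1.12, 1.18]))   # rising prices: stock up
print(target_stock([1.18, 1.12, 1.10]))   # falling prices: run lean
```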


AI also plays a role in the quality department (Fig. 7). Automated Taguchi experiments, already mentioned, would determine relationships between the process and product. Process interactions -- how one variable affects another -- are often too complex to model mathematically. But AI uses historical data to generate an empirical model without a pure math basis.

That's because AI works in a manner more like "intuitive" human intelligence: I know that if X is too low, I add a little more to Y. I don't know exactly how much, but I have a "feel" for how much to start with, and if that's not enough I'll add a little more on the next try. That suggests how it is possible with AI to "close the loop" cost-effectively on literally hundreds of variables in a process, without the prohibitive cost and complexity of attempting to put full PID servo control on everything that happens in a machine.

AI can make possible "plant capability analyses" that include evaluation of the quality performance of all departments, even when not easily quantified. Using fuzzy logic to apply quality evaluation rules to nonmathematical characteristics, it becomes possible to evaluate the quality performance of sales and marketing departments, service departments, secretarial staff, and so forth, and combine these with conventional production quality measures for an overall plant quality assessment.

AI can sift through masses of data to help managers optimize overall plant quality performance and perhaps suggest changes in operating methods. If, for example, it were determined that most scrap was produced during startups, then managers might seek to lengthen product runs to minimize the number of setups.


Maintenance is one of the most obvious and straightforward applications of AI (Fig. 8). Expert systems can preserve the diagnostic knowledge of a plant's most experienced workers after they leave or retire. An AI maintenance computer can periodically query the central machine monitoring system to examine SPC graphs on various process or machine characteristics. From the shape of the curves, AI could determine the estimated time to failure of a valve, screw motor, or other machine component. The same sort of query could look for non-random trends in machine productivity and report on the probability of screw wear, scale buildup in mold-cooling channels, etc. And, of course, downtime and repair records could be used to generate automatic schedules for maintenance or replacement of components prior to anticipated failure.
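Estimating time to failure "from the shape of the curves" can be sketched as a trend extrapolation: fit a straight line to a drifting machine reading and project it forward to a failure threshold. Real diagnostics would use richer models; the valve-response data and 50-ms threshold below are invented examples.

```python
# Time-to-failure sketch: least-squares slope of a drifting reading,
# extrapolated to an assumed failure threshold. Data is invented.

def estimated_cycles_to_failure(readings, threshold):
    n = len(readings)
    xbar = (n - 1) / 2.0
    ybar = sum(readings) / n
    slope = (sum((i - xbar) * (y - ybar) for i, y in enumerate(readings))
             / sum((i - xbar) ** 2 for i in range(n)))
    if slope <= 0:
        return None                      # no upward drift detected
    return (threshold - readings[-1]) / slope

# valve response time (ms) drifting upward; failure assumed at 50 ms
print(round(estimated_cycles_to_failure([20, 22, 23, 25, 26], 50.0), 1))
```

When the projected failure point falls inside the planning horizon, the maintenance computer schedules replacement ahead of the breakdown.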

Cost-vs.-quality analyses could also be performed with AI to answer such questions as what it actually costs you not to do maintenance at a particular time. Thus, in a scheduling crunch period, a manager could balance the cost of downtime for regularly scheduled maintenance on a machine against the cost of declining quality that might be expected on that machine if maintenance is foregone for the time being.

The concept of shared databases across a network could have tremendous value in troubleshooting and diagnostics. In the future, all the data needed for maintenance and troubleshooting may be supplied on video disc. The laser disc readers could be located off the shop floor; but when a technician approaches a malfunctioning machine, he could download from a remote computer file a CAD drawing of the electrical or hydraulic system of the machine. AI could highlight for him the component most likely to be causing the problem. Animation could illustrate a machine cycle with valves turning on and off. Bringing in process information from another database could allow the technician to overlay an actual pressure curve on a chart or animation of valve sequencing.

To go a step further, AI could anticipate the data needed by the technician before he asks for it. Suppose on-board, automatic self-diagnostics detect a clamp failure. When the maintenance technician logs onto the machine's CRT terminal, it immediately brings up a clamp diagram that highlights the problem's location and displays a message of where to look in the on-line manual for more information. A service record for the machine could also have been extracted ahead of time from a remote database, and be waiting to be accessed in a pull-down window.


"I don't know how many times designers have created a part that couldn't be manufactured. . ." Heard that one before? No doubt you'll be hearing it less often in the future, as AI expert software stands guard against common nono's such as insufficient draft angles, too-sharp radii, and gross changes in section thickness. Design optimization programs (Fig. 9) will scrutinize every design element for such violations of sound practice. AI safeguards of this sort are already used in several mold-analysis programs, and will move "upstream" into CAD Tool and part design. These safeguards will grow more sophisticated and powerful--for example, disallowing placement of water lines closer than a specified distance from a cavity surface.

Design for manufacturability will be optimized by total manufacturing cost analyses that will take into account material cost, anticipated scrap costs (including an historical-based prediction of rejects), tool cost, and costs of secondary operations or assembly. For example, a choice between designing a tool with a hot or cold runner would examine the overall impacts of scrap, regrinding cost, cycle time, tool-building cost, mold setup times, labor cost, and cost of robotics, if any. Cost might be expressed as a bar graph that could rise or fall as each choice is made in the design optimization process.
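The hot- vs. cold-runner comparison is ultimately a total-cost roll-up per design choice, like the rising-and-falling cost bar described above. Every line item and figure below is an invented placeholder.

```python
# Hypothetical total-cost roll-up for the hot- vs. cold-runner choice.
# All cost line items are invented placeholders for illustration.

def total_cost(items):
    return sum(items.values())

cold_runner = {"tool": 40000, "regrind_handling": 8000,
               "cycle_time_penalty": 12000, "scrap": 5000}
hot_runner = {"tool": 65000, "regrind_handling": 0,
              "cycle_time_penalty": 0, "scrap": 2000}

for name, items in [("cold runner", cold_runner),
                    ("hot runner", hot_runner)]:
    print(name, total_cost(items))
```

As each design choice is made, its line items move in or out of the roll-up, and the running total rises or falls accordingly.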

As noted above, manufacturing simulations will play an increasing role in design engineering. Flow, cooling, and shrinkage/warpage analysis programs are already starting to generate recipes of molding conditions and machine setups. This will probably become fairly commonplace. As process simulations become more sophisticated and knowledge bases grow larger, it may even become possible to perform Taguchi designed experiments off-line on the computer, rather than on the molding machine, in order to specify the operating window for quality assurance.


At the highest level of AI implementation (Fig. 10), expert systems will help decide "make-or-buy" questions more quickly and easily. AI will provide rapid, detailed estimates of time required to bring new products to market, taking into account design engineering, process engineering, equipment procurement, startup and testing cycles. Balancing workloads among different plants within a company, gauging the effect on production schedules of taking on fewer larger customers vs. more smaller ones, calculating appropriate machine-hour rates for cost recovery and profit maximization, predicting cash flow and future abilities to finance expansion, and forecasting the right mix of machinery to meet future requirements are all further logical applications.
COPYRIGHT 1991 Gardner Publications, Inc.
Author: Bartholomew, Gary
Publication: Plastics Technology
Article Type: Cover Story
Date: Nov 1, 1991