
Process mapping: why we need a "robust" process mapping system.

On Wikipedia, a map is defined as "a visual representation of an area, a symbolic depiction highlighting relationships between elements of that space such as objects, regions, and themes." "Process mapping," then, is the visual representation of the relationships between the elements of a process, physical or intangible; in this respect, mapping is a critical activity when we have to analyze a process in order to gain a better understanding of it.

Of course, process understanding is much more than a visual representation of connections. We usually do not need the process map per se, but mainly as a starting point for further analysis. In the absence of an accurate and complete representation of the process, sound analysis will be lacking. Consequently, any mistake or weakness in the process representation may jeopardize the entire effort of achieving comprehensive process knowledge.

It is important to appreciate that process mapping is the basis for process modeling or simulation, including risk analysis or statistical simulation. An ideal process mapping activity will have the following attributes:

Robustness: The process can be replicated by different people, with different backgrounds, under different conditions, to yield the same results. This is important to preserve the process knowledge and ensure it is not affected by personal bias or by any other particular condition.

Efficiency: This can be described in terms of how good the communication is between stakeholders despite different backgrounds, and the ability to coordinate efforts according to the complexity of the system and the pre-defined level of information required.


Maintainability: A process map must accommodate change, because the process itself may change; this happens easily when we start to map a process during the development phase. Accommodating change without losing information, redoing work or being forced to perform further analysis is important when completing the process map. I will revisit these specific aspects later.

In other words, a robust, efficient and maintainable process map helps to understand the system and provide explanations by representing all the process functions and connections. Moreover, it facilitates a systematic approach to complex systems through a structured and objective way of representation. In addition, it allows information sharing. Finally, it allows identification of critical aspects connected to the process, giving a shared representation of all the elements involved at each development stage.

Of course there are many other adjectives that might be used in order to define an ideal tool, but we can start from these three in order to make distinctions among the various process mapping tools that are available.

Process Mapping Tools

Various tools can support process-mapping activities in different ways, with differing levels of detail and complexity. Each one is suitable or has been developed for a specific purpose; none is perfect, but there might be some aspects that should be evaluated more carefully with respect to what we consider an "ideal" tool.

More pragmatically, several tools used in the risk management process should be compared in order to balance their pros and cons.

One of the more common process mapping tools is the Block diagram. This is a diagram of a system, in which the principal parts or functions are represented by blocks connected with lines that show the relationships among the blocks. The block diagram is typically used for high-level, less detailed description aimed more at understanding the overall concepts and less at the details of implementation. This level of information usually is not enough to start with further analysis.

The Ishikawa diagram (also called a fishbone diagram or cause-and-effect diagram), used in conjunction with the block diagram, represents the most common approach to process mapping; the Ishikawa diagram states some of the relations between process configuration and process output. Ishikawa, in its interpretation as a "cause-effect" diagram, represents the link between function and output with a causality-based logic, going beyond the sequential logic depicted in the block diagram.

This approach is probably the most widely used in preparation for any risk management process, and its greatest advantage is that it is easy and not time consuming. Conversely, a few drawbacks can limit this approach to a one-off implementation. Risk management, however, is never a one-off implementation, but rather a process and a continuum for the entire life of the product.


There are two major drawbacks to this approach that might limit its use to a preliminary draft implementation. The first drawback is structural and involves maintainability. Changes in the block diagram due to a change in the process, such as an upgrade or fine tuning during further analysis, must also be reflected in the Ishikawa diagram. Due to its relational nature, however, it is difficult to identify which elements of the Ishikawa diagram are logically connected with the change. Block and Ishikawa diagrams are related only in a static way, so it is extremely difficult to see incremental changes or keep track of them.

The second drawback is subtler and not so easy to detect, which can be hazardous. It is related to the robustness of the tool. Whichever set of categories is used to structure the branches (6 Ms, 8 Ps, 5 Ss), it is not possible, in a mechanistic way, to determine whether all the conceivable or relevant relationships have been identified and positioned in the right branch. The real effectiveness largely depends on the team members and their level of knowledge. Therefore, it is not easy to reproduce the same results with a different team, because the rationale is largely based on individual knowledge, and this might be difficult to sustain in front of an auditor.

There are other easier and more intuitive mapping tools, such as the flowchart diagram. The flowchart diagram provides a clear schematic representation of a process. Besides the simple data and flow representation, the flowchart diagram can support decision-making activities during the process in a basic yes/no decision. This kind of method is simple and user friendly.


The level of detail can vary from box to box depending on the physical flow in the process mapping. This representation is static, and the output is a photograph of the process at a well-defined time. Each change in the process will require an update of the whole diagram.

This approach does not present the same level of weakness that undermines the previous approach; maintainability is higher, but there are still problems with robustness.

There are other families of tools, such as the SIPOC diagram (Supplier, Input, Process, Output, Customer), but in general all of them show the same level of weakness as mentioned above.
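As a sketch of the kind of information a SIPOC diagram captures, one row can be written as a simple record. The field names and the pharmaceutical example below are illustrative assumptions, not part of any SIPOC standard:

```python
from dataclasses import dataclass

@dataclass
class SipocRow:
    """One row of a SIPOC diagram (field names are illustrative)."""
    supplier: str
    inputs: list[str]
    process_step: str
    outputs: list[str]
    customer: str

# Hypothetical example: one unit operation in a tableting process.
row = SipocRow(
    supplier="Raw material vendor",
    inputs=["API", "excipients"],
    process_step="Blending",
    outputs=["Homogeneous blend"],
    customer="Compression step",
)
print(row.process_step)  # -> Blending
```

Each row is free-standing, which illustrates both the flexibility of the tool and its weakness: nothing in the structure itself enforces consistency between rows.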

These tools, generally known as "value mapping tools," may be used in some contexts given their scope: identifying the value chain inside the process. One of their strengths is extreme flexibility, allowing easy and quick use, mostly based on information gathered during team working sessions. One drawback of this flexibility, with respect to risk management and full process understanding, is that focusing on a specific value forces a cut-off of information that seems unrelated to the "value" under analysis; this pre-selection necessarily leaves a lot of potential area unexplored for a complete understanding. As a general comment, none of these tools (as far as I'm aware) includes rules that can prove self-consistency. Thus, different teams can generate different diagrams from the same general value, without making exactly the same use of the basic information.


IDEF Family Tool

Another family of process-mapping tools is found in the class of "modeling languages." Approaching the problem of mapping a process using the tool of process modeling is very advantageous and can be very close to the ideal tool. An example of this kind of tool is the IDEF-0 technique.

IDEF, an abbreviation for Integration Definition, refers to a family of modeling languages originally developed by the U.S. Air Force beginning in the 1980s and commonly used in military as well as business modeling.

The IDEF-0 technique belongs to the IDEF family of tools, which allows one to perform various kinds of mapping. These tools can be used in many different fields, from functional modeling to data modeling, simulation, object-oriented analysis/design and knowledge acquisition. IDEF-0 creates a "function model," which is a dynamic, structured representation of the functions, activities or processes of the system under analysis. The main characteristics of IDEF-0 are:

* Hierarchical structure: the system representation consists of a hierarchical series of diagrams, text and glossary cross-referenced to each other.

* Graphical description: the model gives a visual representation of the relationships existing among inputs, outputs, and process functions.

* Highly communicative: it uses simple and shared language that is comprehensive and expressive.

* Organized and systematic way of work: it gives an objective and rigorous way to analyze a system.
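The hierarchical structure above can be sketched as a data model. In the IDEF-0 standard, each activity box carries ICOM arrows (Input, Control, Output, Mechanism) and may decompose into child diagrams, with node numbers following the A0, A1, A11 convention; the Python representation and the tableting example below are my own illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Function:
    """An IDEF-0 activity box with its ICOM arrows and child diagrams."""
    node: str                              # node number, e.g. "A0", "A1"
    name: str
    inputs: list[str] = field(default_factory=list)
    controls: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    mechanisms: list[str] = field(default_factory=list)
    children: list["Function"] = field(default_factory=list)

    def walk(self):
        """Yield this box and every box below it: the hierarchy."""
        yield self
        for child in self.children:
            yield from child.walk()

# Hypothetical two-level model of a manufacturing process.
granulate = Function("A1", "Granulate", inputs=["Blend"],
                     controls=["SOP-014"], outputs=["Granules"],
                     mechanisms=["High-shear granulator"])
model = Function("A0", "Manufacture tablets", inputs=["Raw materials"],
                 outputs=["Tablets"], children=[granulate])
print([f.node for f in model.walk()])  # -> ['A0', 'A1']
```

Because every element hangs off a numbered node, text and glossary entries can be cross-referenced to a specific box, which is what makes the traceability discussed below possible.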



Thus the IDEF-0 technique can be used for function modeling activities. It provides representation of all the elements involved in the system and has several advantages compared to the families of tools previously described.

The first concerns the building of an evolving map. The level of detail can be either general or detailed, depending on the desired analysis, the objective, and the information available. The IDEF-0 draws an evolving map that can be updated when changes happen and maintained coherently within the system.

The second benefit consists of traceability (in this context, a higher degree of maintainability). Process mapping with IDEF-0 assures the complete traceability of each element identified during the analysis. Each level of detail, and each element associated with that level, can be directly associated with a process function. This kind of tool becomes a means to share information, update the process and accelerate changes by assuring a real-time update and a diagram that always reflects the real system status. This traceability is a value that can be exported to the risk analysis tools that will be implemented. In other words, it is always possible to link the IDEF-0 box or representation to the corresponding branch of the FTA (Fault Tree Analysis) or the specific row of an FMEA (Failure Mode Effect Analysis).
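A minimal sketch of this traceability, assuming a hypothetical index in which each FMEA row records the IDEF-0 node it derives from, so that a change to one box points straight at the rows (or FTA branches) that need review:

```python
# Hypothetical FMEA rows, each tagged with its originating IDEF-0 node.
fmea_rows = [
    {"id": 1, "node": "A1", "failure_mode": "Over-granulation"},
    {"id": 2, "node": "A1", "failure_mode": "Under-wetting"},
    {"id": 3, "node": "A2", "failure_mode": "Punch sticking"},
]

def rows_for(node: str) -> list[int]:
    """Return the FMEA row ids traceable to one IDEF-0 box."""
    return [r["id"] for r in fmea_rows if r["node"] == node]

# If box A1 changes, only these rows need to be re-assessed.
print(rows_for("A1"))  # -> [1, 2]
```

The same node tag could equally point at an FTA branch; the point is that the link is structural, not reconstructed from memory after each change.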

These two benefits are embedded in the methodology and fulfill the requirements for an efficient and maintainable tool.

But the main advantage of using tools belonging to a family of modeling languages is internal coherence. This makes it possible to implement rules, in conjunction with what is already defined, that allow one to objectively classify each function (i.e., unit operation) by three levels of knowledge:

1. What is known

2. What is unknown

3. What is believed to be known (a.k.a., the hidden assumption)

This offers a systematic approach to move from a "subjective" to an "objective" mapping process and addresses the problem of robustness.
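The three levels can be made operational by tagging each parameter of a unit operation. The categories below follow the text; the parameter names and their classifications are hypothetical examples:

```python
# The three knowledge levels from the text.
KNOWN, UNKNOWN, ASSUMED = "known", "unknown", "assumed"

# Hypothetical parameters of a granulation step, each tagged.
parameters = {
    "granulation water amount": KNOWN,      # measured and verified
    "blend hygroscopicity": UNKNOWN,        # no data gathered yet
    "scale-up shear equivalence": ASSUMED,  # the hidden assumption
}

def hidden_assumptions(params: dict[str, str]) -> list[str]:
    """Surface what is only believed to be known."""
    return [p for p, level in params.items() if level == ASSUMED]

print(hidden_assumptions(parameters))  # -> ['scale-up shear equivalence']
```

Forcing every parameter into one of the three bins is what makes the exercise reproducible across teams: a second team may disagree with a tag, but it cannot silently skip the question.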

Process modeling, as performed by IDEF-0, not only assures representation of the system, but also models the system functions and the elements of the organization's activities affected by changes and data growth. The process modeling itself gives the user considerable advantages, especially when process complexity must be managed and the system has great variability.

Nevertheless, other characteristics need to be considered, because the model can be more powerful. For example, sometimes the process variability and change within a company are connected to production increases, so process modeling has to be scalable and incremental to maintain the connection between the process and its visual representation. The IDEF-0 technique allows the model to be scalable and incremental because all the elements that can be connected to scale-up activities are tracked in the mapping.

Thinking about a process, we often refer to a physical process that involves physical elements and flows connected to a particular product. However, "process" can also refer to information or data that robust process modeling should be able to represent. IDEF-0 allows representation of both the physical and data processes by using the same model.


IDEF Integration with Other Tools

When the process is being modeled, the information gathered can be used to support other activities, such as risk management or statistical modeling. As a basis for any risk analysis, a clear process representation of all the elements involved allows identification of all potential sources of risks. As mentioned, a systematic process map evaluates all the inputs or parameters related to a specific unit operation connected to the process itself, and reflects their operative meaning.

Furthermore, process modeling can make a clear distinction between what we know, what we believe we know and what we do not know. It makes the user aware of the level of knowledge about the process and provides the ability to implement actions and perform investigative activities. On the basis of process modeling integrated with risk management activities, the level of process knowledge and control is identified.

The process model should also be the basis for statistical analysis. The data gathered with the IDEF-0 can be either qualitative or quantitative, and can be used at different stages of the analysis depending on the level of detail required. Nevertheless, when done correctly there is no need to re-do the mapping in order to collect different sets of data.

When analyzing a system, the main objective is to build a visual representation of the system. Very often, if the system is affected by some variability or is very complex, then a simple process mapping exercise is not sufficient for a significant analysis. Building a model can be the solution. Beyond the regular advantages of mapping tools--such as understanding the process, sharing and communicating information--process modeling helps to make the analysis consistent and robust.

Process modeling facilitates accelerated system knowledge and learning, and makes it possible to manage complexities in the system. The IDEF-0 tool enables construction of a shared platform of information and adds value to the efficiency and competitiveness of the industrial processes.

The correct usage of process information, expressed or hidden, is important to make decisions that permit one to satisfy organizational and product requirements in terms of business goals, quality and safety.

As a standard, IDEF-0 is extremely flexible, but it was not built for specific processes or to transfer to other tools (i.e., risk analysis tools). After several years of using these tools in various environments and consistently applying the logic mentioned above, I see IDEF-0 as a sort of knowledge acceleration tool. We have our own version of IDEF-0 software, and it is easy to see process modeling taking a further step to become a knowledge acceleration tool by offering:

* a high level of flexibility

* a scalable and incremental model

* traceability of all elements

* robustness and real-time updating

Process modeling tools, and in particular the IDEF-0 tool, provide users with real added value, not only for a better process understanding, but also to create a platform for critical activities and to support decision-making processes.

These advantages and benefits are linked to the difference between process mapping and process modeling. Process mapping allows groups to fix the process status. Even though it's quick and easy to manage, process mapping does not assure traceability or sharing of an up-to-date representation of the system.

Model building using an appropriate modeling tool gives the user the necessary rules to achieve consistency and self-checking. Consistency refers to the structured way used to draw the diagram and to gather all the information connected to the system. The risk of missing or forgetting some process element is minimal and controlled. Furthermore, the model is self-checking, which gives assurance that all the critical elements are considered by working step-by-step and building a shared vision of the process.


By Paolo Mazzoni PTM Consulting

Paolo Mazzoni is founder and chief executive officer of PTM Consulting, a global company specializing in product and process development and optimization, from early stages to industrialization and full-scale manufacturing, for pharma and medical device companies. He can be reached at
COPYRIGHT 2012 Rodman Publishing

Article Details
Title Annotation:Process Mapping
Author:Mazzoni, Paolo
Publication:Contract Pharma
Date:Mar 1, 2012
