Collaborative Production Management in the process industries.
Any discussion about the use of plant operating data with plant management, operational staff, or suppliers of plant information management systems will quickly spawn one of the above clichés. The clichés are right: we need to improve our ability to make efficient use of our automation and plant information system investments. But the questions are how, and where, is the value truly being delivered to the organization?
CALL TO ACTION
This call to action is being driven by competitive pressures requiring mills to maximize capacity utilization and productivity; optimize operational performance; and ensure compliance with environmental, safety, and corporate responsibilities. The impact of this effort has been a decrease in operating resources in all process industries, not just pulp and paper. To put the challenge in perspective, Figure 1 shows the number of refineries in North America and their crude processing capacity. From 1980 to 1998, total staffing in the oil and gas industry decreased from 700,000 to 300,000. Clearly, we are being asked to do more with less. Data management is an essential element of the solution to this challenge.
[FIGURE 1 OMITTED]
DATA TO KNOWLEDGE
Over the last 20 years, all of the process industries, including pulp and paper, have invested heavily in automation and plant information systems, so that the data is now accessible in most mills. As a result, we should now be able to put it to productive use. But can we? The challenge with raw data, no matter how accessible, is that it still requires a lot of work before it can be turned into knowledge. In most cases, the data needs to be validated, analyzed, and converted into a level of knowledge that is actionable. This can require a significant investment of time and resources.
The Key Performance Index (KPI) has been the first step in putting data into a context that is more aligned with organizational goals. Every plant functional group has high-level objectives and targets, and if the raw operational data can be converted in real-time or near real-time into KPIs, then non-compliance to operational targets can quickly be identified, decisions can be made and actions taken. While converting this data into contextualized KPIs is a necessary first step, it does not guarantee the desired operational improvements. If the KPIs are not managed effectively, companies often simply transform the problem of "data overload" into the problem of "KPI overload."
As an example, consider the application of "Control Asset Performance Management" (CAPM). In the paper, chemical, oil and gas industries, 75% of a plant's physical assets are under some form of automation or process control. Companies are now focused on the fact that optimizing control performance can improve plant performance by 3% to 5%, with little or no additional capital investment. Thus, the objective of the CAPM program is to automatically collect the raw data from the DCS control systems and then convert this raw data into higher level KPIs, such as utilization and performance. As shown in Figure 2, most CAPM programs, such as Matrikon's ProcessDoctor, will convert real-time measurements of controller operating mode, present value, set point and output into daily indicators such as variance index, oscillation index, valve stiction index, utilization index, economic performance index, etc. As a result, it is much easier to understand whether the control system is performing optimally, i.e. according to the plant's operational and business goals, by monitoring these high level utilization and performance-based KPIs.
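The roll-up described above can be sketched in a few lines. The following is an illustrative example only, not Matrikon's actual implementation: it condenses raw controller samples (mode, present value, set point) into two of the daily indices the article names, a utilization index (fraction of time in automatic) and a variance index. All field names and the sample data are assumptions.

```python
# Hypothetical sketch of a CAPM daily KPI roll-up.
# Raw per-sample controller data -> daily utilization and variance indices.
from statistics import pvariance

def daily_loop_kpis(samples):
    """samples: list of dicts with keys 'mode', 'pv' (present value), 'sp' (set point)."""
    n = len(samples)
    in_auto = [s for s in samples if s["mode"] == "AUTO"]
    # Utilization index: fraction of the day the loop spent in automatic.
    utilization = len(in_auto) / n if n else 0.0
    # Variance index: population variance of control error while in AUTO.
    errors = [s["pv"] - s["sp"] for s in in_auto]
    variance_index = pvariance(errors) if len(errors) > 1 else 0.0
    return {"utilization": utilization, "variance_index": variance_index}

# Illustrative day: four samples in AUTO, one in MANUAL.
samples = (
    [{"mode": "AUTO", "pv": 50.0 + d, "sp": 50.0} for d in (-1.0, 0.5, 1.0, -0.5)]
    + [{"mode": "MANUAL", "pv": 48.0, "sp": 50.0}]
)
kpis = daily_loop_kpis(samples)
```

A production CAPM system would of course add the oscillation and valve stiction indices over much longer sample histories, but the principle is the same: many raw measurements in, a handful of comparable indices out.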
[FIGURE 2 OMITTED]
The consolidation of raw data into KPIs or performance metrics is a necessary step, but if not managed carefully, it will simply change the nature of the problem. If we consider the CAPM example above, a mill faced with the challenge of monitoring and sustaining the performance of 1,000 control loops may find it every bit as difficult to act on the results of a CAPM program that computes several KPIs per control loop (and hence thousands of KPIs per day). Unfortunately, the transformation of data into KPIs alone seldom delivers the true improvements we are seeking.
THE NEED FOR VISUALIZATION
The visualization layer is essential to extracting value from any KPI-based monitoring system. We have all seen the promises of the "digital dashboard" and speedometer-like displays of plant efficiency delivered in real time through a web-based environment. But the true power of the visualization layer is its interactive ability to quickly sort and display the consolidated performance metrics in order to highlight the high-priority requirements and provide guidance on the actions required.
This is performed through a combination of filtering, sorting, and drill-down analysis techniques. More sophisticated visualization techniques, such as ProcessDoctor's patented Treemap Technology, allow users to visualize hundreds of assets in a single view and rapidly identify the key focus areas. Such techniques are delivering a step change in our ability to act rapidly on the information presented within a KPI-based environment.
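The filter-and-sort step can be illustrated with a small sketch. This is not ProcessDoctor's interface; it simply shows, with invented tags and thresholds, how thousands of per-loop KPIs can be reduced to a short, prioritized worklist.

```python
# Illustrative prioritization of loop KPIs (tags and limits are assumptions).
loops = [
    {"tag": "FC101", "utilization": 0.95, "variance_index": 0.2},
    {"tag": "TC205", "utilization": 0.40, "variance_index": 1.8},
    {"tag": "PC310", "utilization": 0.88, "variance_index": 2.5},
    {"tag": "LC412", "utilization": 0.92, "variance_index": 0.1},
]

# Filter: flag loops that are rarely in automatic or are highly variable.
flagged = [l for l in loops if l["utilization"] < 0.8 or l["variance_index"] > 1.0]

# Sort: worst variance first, so engineers start with the biggest problem.
worklist = sorted(flagged, key=lambda l: l["variance_index"], reverse=True)
```

With 1,000 real loops in place of four, the same two lines of filtering and sorting are what turn "KPI overload" back into an actionable list.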
Internal studies at Matrikon in the area of CAPM have shown that well-designed metrics, combined with powerful visualization techniques, can allow plant personnel to improve the identification of high-priority automation problems by 100%. More importantly, they can complete the task in less than 10% of the time required when using traditional analysis techniques. Figure 3 shows examples of both the sorting/filtering and Treemap visualization layers applied to CAPM.
[FIGURE 3 OMITTED]
WHAT ABOUT WORKFLOW?
By today's standards in the process industries, any company with a real-time, web-based KPI environment for operational views and decision-making is considered a pacesetter in its effective use of data. So has the "real-time web-based enterprise" delivered on the promises? And what is the next step for these pacesetters?
To realize the full value these systems promise, you must act on the meaningful knowledge they generate. This requires integration with the plant's workflow processes. The consolidation of data into KPIs, and its visualization, is often still deployed in a data-centric view that places it in a functional silo. If we consider CAPM again, many systems that compute the automation layer performance metrics and present them through a set of visualization tools are designed only for process control engineers. In effect, the information is channeled through a human funnel before it is dispatched to the wider group of resources who have to act on it. This model, as shown in Figure 4, neither empowers the organization nor facilitates work processes. To ensure that action is taken to correct problems, we must move from a data-centric view to a functional "process-centric" view, where the system can directly support the higher-impact business processes.
[FIGURE 4 OMITTED]
As an example, consider the case of a poorly performing valve on a boiler gas supply. This poor performance, caused by valve wear or mechanical complications, can lead to serious process upsets and a possible unit trip. A traditional CAPM program would only consolidate the valve and control data into performance metrics for the process control engineer to review. But this poorly performing valve has a significant economic impact on the mill, with a scope well beyond that of the process control engineer.
From a functional or business process perspective, the poorly operating valve has a significant effect on the operation of the mill and should directly inform decisions made by the following functional roles:
1. Maintenance: Must understand the valve's maintenance requirements, potential failure modes, and priority level, to ensure an action plan is in place for replacement or in case of a shutdown.
2. Operations: Must understand through both CAPM and alarming information the rate of degradation in performance and the risk of unit trip or required shutdown.
3. Process Control: Must understand the degradation in control performance and the root cause along with the economic impact associated with poor performance.
4. Process Engineering: Must understand the impact on overall mill performance and the cost associated with poor control.
5. Management: Must understand the current economic loss because of poor performance and potential future losses due to trips or shutdowns.
This necessary distribution of knowledge requires not only an understanding of the various relevant functional roles, but also the integration of several data sources or knowledge bases. Both the data and workflow requirements for this to happen are shown in Figure 5.
[FIGURE 5 OMITTED]
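The role-based distribution described above can be sketched as a simple dispatch table. The role names follow the article's list; the event fields, values, and view definitions are illustrative assumptions, not any vendor's schema.

```python
# Hypothetical routing of one valve-degradation event to functional roles:
# each group receives only the slice of the event it needs to act on.
event = {
    "asset": "boiler gas supply valve",
    "stiction_index": 0.9,          # high value -> mechanical problem likely
    "economic_loss_per_day": 4200,  # hypothetical figure, currency units
    "trip_risk": "high",
}

ROLE_VIEWS = {
    "maintenance":     ["asset", "stiction_index", "trip_risk"],
    "operations":      ["asset", "trip_risk"],
    "process_control": ["asset", "stiction_index", "economic_loss_per_day"],
    "management":      ["asset", "economic_loss_per_day", "trip_risk"],
}

def dispatch(event, role_views):
    """Return role-specific slices of a single event."""
    return {role: {k: event[k] for k in keys} for role, keys in role_views.items()}

views = dispatch(event, ROLE_VIEWS)
```

The point of the sketch is the shape, not the fields: one validated event, many role-appropriate views, with no human funnel in between.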
Although the boiler valve example might be an extreme case, it demonstrates the need to understand the overall data and workflow requirements if these systems are expected to support business processes and deliver their full return on investment. Without workflow integration, the promises of the integrated operating environment will always exceed the reality delivered.
ENABLING THE WORKFLOW PROCESSES
Collaborative Production Management (CPM) is often defined as a method to unify disparate systems to achieve operational excellence. This unification must be performed along two lines. We must combine the data/information layer and the functional layer into a single workflow environment. This will allow plant resources, from operators to managers, to get away from complicated workflows where they must interface with multiple systems in order to assess situations and perform tasks. This unified workflow environment enables collaboration and helps the different functional roles work together with an understanding of their specific requirements in the context of a view of the bigger picture.
If we again consider the boiler gas valve malfunction, the data/knowledge integration requirements and the functional user-level integration are shown in Figure 6. The sharing of the data, knowledge and functional views ensures that each functional group understands the operational situation and its role in improving it. In essence, this is the integration needed to truly deliver on the promises of collaborative manufacturing.
[FIGURE 6 OMITTED]
Most companies that set out to achieve operational excellence through a "web-enabled real-time enterprise platform" are actually hoping that it delivers the collaborative production management environment described above. For companies that are successful, the benefits to the organization are significant and the implementation will change the way people work. Typical benefits include:
* Improved capacity utilization (3-5%)
* Increased equipment reliability (5-8%)
* Improved compliance reporting (environmental and safety)
* Improved staffing efficiency and productivity (10-50%)
* Improved energy efficiency (5-15%)
THE NEXT STEP: EXCEPTION-BASED MANAGEMENT
So what does the future hold for CPM? The push forward will not end with a real-time integrated environment providing seamless access to data and streamlined workflow. Rather, decision-makers will push for even greater efficiency by minimizing the time people spend asking questions and monitoring KPIs. Instead, users will be alerted when things go wrong (and when they go very well). Thus, an information delivery system, based upon predetermined business targets and logic, will alert users to noncompliance with goals and give them insight into the situation, as well as an action plan to resolve the problem. Finally, the system will track noncompliance through a resolution tracking environment and ensure each item is dealt with in a timely manner.
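The exception logic itself is simple enough to sketch. The targets, KPI names, and alert fields below are assumptions chosen for illustration; the point is that only violations of predetermined business targets generate alerts, and each alert enters a tracking state until it is resolved.

```python
# Hypothetical exception-based check: business targets in, open alerts out.
TARGETS = {
    "utilization":    (">=", 0.85),  # loop should be in AUTO at least 85% of the day
    "variance_index": ("<=", 1.0),   # control error variance must stay under limit
}

def check_exceptions(kpis):
    """Return an alert record for every KPI that misses its target."""
    alerts = []
    for name, (op, limit) in TARGETS.items():
        value = kpis[name]
        ok = value >= limit if op == ">=" else value <= limit
        if not ok:
            alerts.append({
                "kpi": name, "value": value, "target": limit,
                "status": "open",   # remains open in the resolution tracker
            })
    return alerts

alerts = check_exceptions({"utilization": 0.70, "variance_index": 0.4})
```

A user monitoring this system sees nothing on a good day; on a bad one, the single open alert arrives with its context attached, rather than being buried among thousands of in-spec KPIs.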
We have all heard the promises of how more data and more knowledge will deliver significant benefits to plant operations. But before we embark on building the "real-time enterprise" and providing seamless access to every piece of data, it is important to understand where and how the value is truly delivered. Data access, KPI generation, digital dashboards, web-based visualization and a collaborative workflow environment are all essential pieces of the puzzle. We also need to walk before we run. Understanding the stages and having a strong vision of where you need to go are essential first steps in adopting a staged approach to a successful Collaborative Production Management System.
RELATED ARTICLE: WHAT YOU WILL LEARN
* How information can deliver value to pulp and paper companies
* How to use a Key Performance Index (KPI) to put data into a context that is more aligned with organizational goals
* The future of Collaborative Production Management
* "PIMA IT: Collaboration and integration keys to success," by Ted McDermott, Solutions!, November 2005. To access this article, type the following product code in the search field on www.tappi.org: 05NOVSO06. Or call TAPPI Member Connection at 1-800-332-8686 (US); 1-800-446-9431 (Canada); +1 770 446 1400 (International).
* "Mill technology utilization-UPM's holistic approach," by Ted McDermott, Solutions!, May 2005. Product Code: 05MAYSO51
* "PIMA IT: don't just survive, thrive!," Solutions!, November 2004. Product Code: 04NOVSO45
MIKE BROWN, MATRIKON INC.
ABOUT THE AUTHOR
Mike Brown, M.A.Sc., P.Eng. is vice president, technology, Matrikon Inc. He has served for the past 15 years as an advanced process control consultant providing implementation expertise and technology guidance for a large number of operating companies. Mike sits on the ISA-SP18, Instrument Signals and Alarm Standards committee, where new alarm management standards and practices are updated to reflect the changes in automation control systems and alarming functionality. Contact him at 1.780.448.1010, or by e-mail: email@example.com.
Title Annotation: PROCESS CONTROL
Publication: Solutions - for People, Processes and Paper
Date: Jun 1, 2006