Putting Data to Work
Once a CMMS collects data, it is up to managers to use that data to make smart decisions
Editor’s note: This article is the second part of a two-part article on CMMS data management. The first part, in our December 2005 issue, discussed data-collection strategies.
Responsibility for the accuracy of data collected from a computerized maintenance management system (CMMS) ultimately lies with management. It is the job of maintenance and engineering managers to write procedures, perform training, enforce data-identification requirements, and review employee performance in order to ensure data accuracy and success.
It is just as important that front-line technicians understand the way these activities affect the organization. Showing technicians the link between the data they record and the potential consequences of non-conformance is an effective learning tool.
This process also can help instill a sense of awareness, promote ownership of the system, and illustrate technicians’ role in the organization’s overall long-term success.
Interpreting Data
The way a CMMS categorizes work-order data can lead to dramatically different statistical results. In turn, this can affect a manager's decisions about department operations.
The two figures below present one year’s worth of data collected from the activities of a 20-person maintenance shop. Figure 1 shows collected data broken down by work-order type, while Figure 2 shows the same data, but classified differently.
Because each figure presents a different set of circumstances, a manager’s conclusions based on the data will differ. For example, organizations often earmark budget dollars for specific activities, such as the maintenance of real property and installed equipment.
First, note the percentage difference in service hours. Figure 2 indicates that technicians spend less time performing core maintenance functions, compared with Figure 1. If Figure 1 data serves as the basis for the annual budget, this difference could violate department guidelines describing the use of maintenance-funded staff-hours, which could lead to reduced funding from this source in the coming budget cycle.
Also, some organizations charge customers back for performing non-maintenance services. Assuming this is the case in these two examples, a manager using data from Figure 2 could conclude that customer accounts would pay for 15 percent more of the department's full-time equivalents (FTEs), or three more technicians: 20 employees x 0.15 = 3 in Figure 1, versus 20 employees x 0.30 = 6 in Figure 2.
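As a rough sketch of this arithmetic, the short example below (written in Python, with the 20-person shop and the 15 percent and 30 percent chargeable shares taken from the scenario above; the variable names are purely illustrative) computes the chargeable FTEs implied by each classification:

    # Illustrative sketch: chargeable FTEs implied by each classification scheme.
    # The 15% and 30% non-maintenance shares are the example values from the text.
    shop_size = 20                      # technicians in the shop
    chargeable_share = {"Figure 1": 0.15, "Figure 2": 0.30}

    for figure, share in chargeable_share.items():
        ftes = shop_size * share        # full-time equivalents billable to customers
        print(f"{figure}: {ftes:.0f} FTEs chargeable to customer accounts")

    # Figure 1: 3 FTEs; Figure 2: 6 FTEs. A three-technician swing produced by
    # nothing more than how the same work orders were categorized.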
Next, Figure 2 shows a smaller repair percentage than Figure 1, while both show a consistent share of preventive/predictive maintenance (PM/PdM). Because Figure 2 data indicate fewer components are breaking down, a manager using this data could assume the PM/PdM program is working much more effectively than originally anticipated.
Managers also can extract data and compare PM/PdM activities to repair activities. Adjusting the data to a 100-percent scale reveals dramatically different PM/PdM-to-repair ratios, as well as markedly different organizational profiles: 46 percent PM/PdM to 54 percent repair in Figure 1, versus 57.5 percent PM/PdM to 42.5 percent repair in Figure 2.
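To make the normalization step explicit, the sketch below assumes illustrative raw shares of total hours (23 percent PM/PdM in both figures, with 27 percent repair in Figure 1 and 17 percent in Figure 2; these values are not from the article and were chosen only because they reproduce the ratios cited above) and rescales them to a 100-percent PM/PdM-to-repair split:

    # Illustrative normalization of PM/PdM vs. repair hours to a 100-percent scale.
    # The raw shares of total hours below are assumed, chosen to match the ratios
    # cited in the text.
    raw_shares = {
        "Figure 1": {"pm_pdm": 0.23, "repair": 0.27},
        "Figure 2": {"pm_pdm": 0.23, "repair": 0.17},
    }

    for figure, shares in raw_shares.items():
        total = shares["pm_pdm"] + shares["repair"]
        pm_pct = 100 * shares["pm_pdm"] / total      # PM/PdM share of the two categories
        repair_pct = 100 * shares["repair"] / total
        print(f"{figure}: {pm_pct:.1f}% PM/PdM vs. {repair_pct:.1f}% repair")

    # Figure 1: 46.0% vs. 54.0%; Figure 2: 57.5% vs. 42.5%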
In many organizations, alterations and modifications represent resources spent on modernizing facilities and equipment. A manager operating under these assumptions could conclude that the drop in repair rates in Figure 2 results from its 5 percent increase in modernization work.
If such work traditionally is performed by outside contractors using funds from the capital budget — instead of funds from the operations budget — a manager might expect, incorrectly, a labor surplus that either can be re-assigned to new activities or can serve as justification for a reduced work force.
An Accurate Foundation
These examples might oversimplify the way differing statistical profiles can affect decision making. But while historical work-order data should accurately reflect technician activities, relying solely on data undermines the goal of making a fully informed decision.
Analyzing budget reports, reviewing commodity use, and meeting with supervisors and technicians are examples of steps managers need to take to corroborate findings before making important choices. A work-backlog forecast that contradicts statistical indications of shrinking maintenance requests, or a budget report that lacks evidence of the expected surplus from an increase in chargeable services, would refute the statistical analyses above.
Clearly, managers need to use a well-rounded approach to verify indications from data and to establish a higher degree of validity. But regardless of whether all indicators agree, they tell a story about the department just the same. The reasons they do not corroborate one another are just as important as the reasons they do.
The process of making good decisions is built on a foundation of accurate data collection and identification. Understanding data’s origins, developing management controls, training responsible employees, and performing reviews to verify its authenticity are essential functions for operating a maintenance organization in a business-like manner.
Asking important questions about the intent of such a process and setting expectations for the results are two good ways to start. Among the important questions are these:
- What data categories must be analyzed?
- How will managers use or interpret the results?
- Who or what will influence data sources?
- How will the CMMS format, generate, and present the data?
- What procedures, training, and follow-up activities will be needed to ensure success?
The answers will vary by organization, but the ultimate goal is the same — a sound data-management strategy that never leaves a manager wondering if work-order data is telling the truth.
It is difficult to overemphasize the need for a manager to review closed work orders. This step is the only way to verify employee performance with any certainty and to ensure that data generated using the CMMS portray the department accurately.
Managers can achieve this goal by creating reports that show all appropriate disposition criteria and field comments for work orders closed in a given time period, usually the previous day or week. Managers then can review this information for accuracy, and they can highlight items that require modification and return them to data-entry personnel for correction.
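Most CMMS packages can export work-order records to a spreadsheet or delimited file. As one possible way to assemble such a review report, the hypothetical sketch below (the file name and column names are assumed, not tied to any particular CMMS product) pulls the previous day's closed work orders from a CSV export and lists each disposition code and field comment for review:

    # Hypothetical review report: closed work orders pulled from a CMMS CSV export.
    # The file name and column names (status, closed_date, disposition, comments)
    # are assumptions for illustration, not a specific CMMS schema.
    import csv
    from datetime import date, timedelta

    review_date = (date.today() - timedelta(days=1)).isoformat()

    with open("work_orders_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["status"] == "CLOSED" and row["closed_date"] == review_date:
                print(row["work_order_id"], row["disposition"], row["comments"])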
The process becomes an effective training tool that fosters an atmosphere of continuous improvement and demonstrates the department’s commitment to data reliability. The process also allows managers to benchmark error rates, highlight recurring mistakes, and schedule follow-up training so the process can improve as needed.
Frank Lucas is assistant director of work management for the University of Nevada, Las Vegas.
Data Decisions: Employees’ Role
Some organizations have the resources of an extensive work-control operation to enter data into a computerized maintenance management system (CMMS) and review it for accuracy. Others need to rely on a limited clerical staff or technicians themselves.
Whatever the situation, employees play a central role in the way the organization is portrayed statistically. It is absolutely vital that technicians write good comments, not only for management’s purposes, but also to serve as a viable maintenance tool for learning, diagnosis, and prevention.
Debate continues over the kind of information that constitutes a good comment, and clear answers are hard to define. While comments shouldn’t be novels, they should contain enough information so managers can determine appropriate qualifiers, based upon the definitions established by the organization.
For example, if a work order was created to “cut a key”, a comment of “done” or “complete” might be good enough. But a work order created to address a “too hot” complaint will require more elaboration to describe steps taken to solve the problem.
Beyond the comments themselves, organizations might require additional information, such as the operating condition in which a piece of equipment was left, fill-in-the-blank answers to task questions, and follow-up activities that might need to occur.
While good comments are vital, it is equally important that employees interpret, classify, and code those comments in a manner that truly represents the work performed. As alluded to earlier, statistical reports are generated not directly from field comments but from the identifiers that represent field-comment text, so it is important that each comment be categorized based on organizational guidelines.
If field comments do not allow for accurate determinations, the technicians must add more information to the work order. Work orders also might include ancillary materials in the form of check sheets, logs, and other attached documents. Data-management procedures must address guidelines for handling this information, as well.
— Frank Lucas