The Importance of Master Data in Enterprise Asset Management


Scott Barrett, Senior Director, EAM Solutions, Utopia Global

In today's information-rich culture, it's easy to be amazed at the depth of data that is readily available. With a quick Google search and an outlay of only $99, I can own a complete set of master data (over 30,000 parts, schematics, and maintenance/repair manuals) for my Toyota Land Cruiser. However, for a multi-billion dollar refinery, factory, or power plant, the same level of data is often impossible, or at least incredibly difficult, to locate. In this article we'll discuss why that is the case, and how organizations can address this issue.

For most heavy-asset organizations, the challenge begins during the construction/build process of a new facility. The Engineering, Procurement, and Construction (EPC) contractor, supported by innumerable sub-contractors and suppliers, designs and builds a turnkey facility. As part of the handover process, the EPC delivers engineering drawings, equipment manuals, parts lists, maintenance recommendations, and many other data-rich items. Historically this would consist of a hodge-podge of vendor manuals, documents, and drawings, delivered by the pallet-load via 18-wheeler. Today it is usually delivered electronically, consisting of both structured and unstructured data. Extracting the parts lists, descriptions, and so on from the unstructured sources (documents, drawings, PDFs) is not simple. So, many organizations bring only a fraction of the needed data into the system of record. This results in systems that are of minimal use for critical areas such as Plant Maintenance.

So the most critical elements to address in the commissioning of a new plant/refinery/factory are to:

1) Identify the specific documents and drawings that are the primary source material for every essential information element;

2) Establish a robust workstream to ensure the data loaded into the operational system is cleansed, classified, correctly created, and consolidated. Most system integrators leave this clean-up work to the client organization, who are often ill-equipped to handle the task.
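The cleanse/classify/consolidate workstream described above can be sketched in a few lines of code. This is a minimal illustration only; the field names, keyword rules, and matching key are hypothetical, not a real EAM or MDM schema.

```python
# Hypothetical sketch of a cleanse -> classify -> consolidate pipeline
# for material master records extracted from EPC handover documents.

def cleanse(record):
    """Normalize free-text fields so duplicate records become comparable."""
    return {
        "description": " ".join(record["description"].upper().split()),
        "manufacturer": record["manufacturer"].strip().upper(),
        "part_number": record["part_number"].replace(" ", "").upper(),
    }

def classify(record):
    """Assign a material class from simple keyword rules (illustrative only)."""
    keywords = {"PUMP": "ROTATING", "VALVE": "PIPING", "MOTOR": "ELECTRICAL"}
    for word, cls in keywords.items():
        if word in record["description"]:
            return {**record, "class": cls}
    return {**record, "class": "UNCLASSIFIED"}

def consolidate(records):
    """Collapse records that share a manufacturer/part-number key."""
    unique = {}
    for rec in records:
        key = (rec["manufacturer"], rec["part_number"])
        unique.setdefault(key, rec)  # keep the first occurrence
    return list(unique.values())

raw = [
    {"description": "  centrifugal pump 3in ", "manufacturer": "Acme ", "part_number": "cp 300"},
    {"description": "Centrifugal Pump 3in", "manufacturer": "ACME", "part_number": "CP300"},
    {"description": "gate valve 2in", "manufacturer": "ValCo", "part_number": "GV200"},
]
golden = consolidate([classify(cleanse(r)) for r in raw])
print(len(golden))  # the two pump variants collapse into one record -> 2
```

In practice each stage is far richer (abbreviation dictionaries, taxonomy standards, fuzzy matching), but the shape of the workstream is the same: normalize, classify, then deduplicate before anything reaches the system of record.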

Recently, I was working with a large oil refiner in the Middle East on a Phase II project for a new refinery. In their extant Phase I system, they estimated that less than 25 percent of the data had actually been loaded. For years, this severely hampered their ability to perform critical maintenance, increasing the time needed for shutdown/turnaround planning by as much as 50 percent. The resulting over-stocking of spare parts produced redundancies of as much as 100 percent. They vowed not to let this happen at their new brownfield site.

We recently moved back into our home, which we had been renting out for several years. Before moving back in, I had it painted. The paint crew spent nearly three weeks at my house. They cleaned, scraped, sanded, painted, and stained every single paintable/stainable surface in and on my house. When they were finished, I wrote them a very large check. Then my wife and I sat back and enjoyed our fresh, crisp, perfect surfaces. About five minutes later, one of our kids put a black mark on the wall above the stairs while carrying a load of stuff up to their room.

It’s all well and good to get things cleansed and loaded in your operational system. But five minutes after go-live a myriad of systems, users, and processes begin to make changes to the as-built plant. Unless the asset master data is updated to reflect the ever-changing configuration of the as-built assets, the usefulness of the asset master data quickly erodes. Many organizations have made the painful discovery that ‘get it clean’ investments in asset master data are wasted without corresponding investments to ‘keep it clean’.

So let’s assume our heavy-asset organization did view its equipment and material master data as a critical element for the success of the new capital project, or as an area for improvement in an existing system. It then took the time to ensure that consistent, valid, standardized data was populated for objects like Equipment, Functional Locations (FLOCs), Bills of Materials (BoMs), Task Lists, and Materials. Now a set of checks and measures must be put in place to ensure that the creation/modification/deletion of this data is only done following the business rules that were developed during the build/fix process. Each new request must be validated and approved by a data steward, and finally approved by a manager, preferably with a full audit trail, and hopefully in a simple and comfortable interface for the requester. Only then will the organization be able to sustain the quality of the data within the system.
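A change-request gate of this kind can be sketched as a small state machine: rule validation, steward review, then manager approval, with every step logged. The class, field names, and rules below are hypothetical illustrations, not a real MDM product API.

```python
# Hedged sketch of a master-data change-request gate: every
# create/modify/delete passes rule validation, steward review, and
# manager approval, with a full audit trail. All names are illustrative.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeRequest:
    requester: str
    action: str          # "create" | "modify" | "delete"
    payload: dict
    status: str = "submitted"
    audit: list = field(default_factory=list)

    def _log(self, actor, event):
        self.audit.append((datetime.now(timezone.utc).isoformat(), actor, event))

    def validate(self, rules):
        """Business rules developed during the build/fix process run first."""
        ok = all(rule(self.payload) for rule in rules)
        self.status = "validated" if ok else "rejected"
        self._log("system", f"rule check: {self.status}")
        return ok

    def steward_review(self, steward, approved):
        if self.status != "validated":
            raise ValueError("request has not passed rule validation")
        self.status = "steward_approved" if approved else "rejected"
        self._log(steward, self.status)

    def manager_approve(self, manager, approved):
        if self.status != "steward_approved":
            raise ValueError("request has not been steward-approved")
        self.status = "approved" if approved else "rejected"
        self._log(manager, self.status)

# Example rule: a new equipment record must carry a functional location.
rules = [lambda p: bool(p.get("functional_location"))]
req = ChangeRequest("planner1", "create",
                    {"equipment": "P-101", "functional_location": "REF1-U200"})
req.validate(rules)
req.steward_review("steward1", approved=True)
req.manager_approve("manager1", approved=True)
print(req.status, len(req.audit))  # -> approved 3
```

The point of the sketch is the ordering: no record reaches the operational system without passing the same rules that governed the original build, and the audit list preserves who approved what, and when.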


Finally, after the data has been cleansed and loaded, and a ‘firewall’ of sorts has been established to maintain the quality of that data, a unified view of all the master data, both structured and unstructured, must be created. Let’s say, for example, that there is a particular type of pump that is used widely throughout the factory. The operational system will contain information about that pump (equipment description, functional location, component parts, etc.). The engineering system will contain the structured CAD/CAM representations of the pump. The GIS system may contain the actual location of the pump, shown on a map, or the Linear Asset Management system details for transmission lines or pipelines. And finally, the Unstructured/Document Management system will contain user and repair manuals, spare-parts exploded-view diagrams, maintenance documents, and more.

In order to effectively work in the field, plan shutdown/turnarounds, or otherwise get a complete view of a pump, pipe, or tanker truck, the worker needs to see all of this data easily, in a single view or linked system. So the organization must ensure that these disparate sources are linked, so that the full picture can be reached from a single point of reference.
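The pump example above can be sketched as a join across systems on a shared equipment tag. The system names, fields, and tag format are hypothetical; real integrations would go through each system's own APIs rather than in-memory dictionaries.

```python
# Illustrative sketch of a single "360-degree" equipment view, assembled
# by linking each system of record on a shared equipment tag ("P-101").
# All data and system names are hypothetical.

operational = {"P-101": {"description": "Centrifugal pump", "floc": "REF1-U200"}}
engineering = {"P-101": {"cad_model": "pump_cp300.dwg"}}
gis         = {"P-101": {"lat": 26.07, "lon": 50.55}}
documents   = {"P-101": ["repair_manual.pdf", "spare_parts_exploded_view.pdf"]}

def unified_view(tag):
    """Merge every system's record for one equipment tag into a single dict."""
    return {
        "tag": tag,
        "operational": operational.get(tag, {}),
        "engineering": engineering.get(tag, {}),
        "gis": gis.get(tag, {}),
        "documents": documents.get(tag, []),
    }

view = unified_view("P-101")
print(sorted(view))  # every facet of the pump, from one lookup
```

The hard part in practice is not the merge but the key: the same physical pump must carry a consistent, governed identifier in every system, which is exactly what the cleansing and governance work earlier in the article establishes.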

So, in conclusion, heavy-asset organizations must treat their asset data as a critical part of their infrastructure, just like a physical pump, a transmission line, or a train engine. They must ensure the master data is clean, reliable, complete, and available for easy reference as needed. They must have a business process that ensures this clean data is not allowed to degrade; the same business rules and processes used initially must continue to be followed. Finally, they must take a wide enough view of master data to ensure it encompasses operational/structured data as well as document/unstructured data, with linkages across the various systems so a 360-degree view can be obtained. Only then can asset-intensive organizations feel comfortable in their ability to effectively execute new capital projects, shutdown/turnarounds, and other critical operations.
