Digital-led decarbonisation
The deployment of digital twins across the industry is accelerating rapidly. Despite this, adopting new solutions remains a challenge for asset owners. Many still struggle to realise business value, whether by optimising operations and maintenance or by enabling new revenue streams.
Recently, Wood was featured in Gartner research that gathered lessons and best practices for delivering digital twin initiatives that generate lasting business impact. Unsurprisingly, the report highlights that reliable data acquisition, maintenance, and delivery are key to success.
The deployment of digital twin solutions is proving to be an important catalyst for establishing reliable data foundations for industrial assets. Most asset owners' portfolios contain a mix of new, recent and ageing assets, some perhaps delivered on paper, so it is inevitable that the widespread adoption of digital twins has started to surface numerous data challenges.
While the industry has generally been effective at managing dynamic operational and maintenance data, for example by utilising process data historians and maintenance management systems, it is the engineering and asset data, 3D scans and 3D models that are proving to be the most challenging.
From our experience delivering capital projects, operating assets and managing data across the lifecycle, we believe there are three key aspects that help drive success.
The first is data standards. The industry is actively collaborating to develop and mature data standards and interchange formats. Not only do these standards bring a level of consistency and predictability across the supply chain, they also allow operators to decouple data from authoring systems and proprietary solutions.
Current data standards are tailored to capital project delivery and the subsequent handover to operations. They define the data to be collected for each type or class of equipment, with the aim of achieving a complete and compliant dataset at the point of handover. However, these standards continue to have gaps when it comes to the operating phase.
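To make this concrete, class-library style requirements can be expressed and checked programmatically. The sketch below is purely illustrative: the equipment classes, attribute names and validation logic are hypothetical stand-ins for what a published handover specification would actually define.

    # Illustrative sketch: checking a handover record against a
    # class-library style data standard. The classes and attributes
    # here are hypothetical, not taken from any specific standard.

    REQUIRED_ATTRIBUTES = {
        "centrifugal_pump": {"tag_number", "design_pressure_barg",
                             "design_temperature_degc", "material_of_construction"},
        "pressure_vessel": {"tag_number", "design_pressure_barg",
                            "design_temperature_degc", "corrosion_allowance_mm"},
    }

    def missing_attributes(equipment_class: str, record: dict) -> set:
        """Return required attributes that are absent or empty."""
        required = REQUIRED_ATTRIBUTES.get(equipment_class, set())
        return {attr for attr in required if not record.get(attr)}

    record = {"tag_number": "P-101A", "design_pressure_barg": 16.0}
    print(missing_attributes("centrifugal_pump", record))
    # {'design_temperature_degc', 'material_of_construction'}

A check like this, run at handover and again whenever data changes, is one way completeness against a standard can be made measurable rather than assumed.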
More challenging still, a portion of the data handed over to operations has limited utility once the asset is running. Trying to sustain all the data within a handover standard over decades of operations is therefore impracticable and costly.
The second is data maintenance. Current industry approaches vary widely, ranging from the blanket approach of trying to keep all data up to date, to extremely targeted approaches that only update data on request, as it is needed. Regardless of the approach, however, there remains a widespread challenge in keeping the most critical data current, particularly when changes are made to the physical asset.
In the same way that physical assets are maintained, data needs a more considered maintenance approach, one that balances criticality, reliability, value impact, and the inevitable cost.
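One way to picture such an approach is as a simple prioritisation score. The sketch below is a hypothetical illustration, not an established method: the scoring formula, weights and item names are assumptions, chosen only to show how criticality, reliability, value impact and cost might be traded off.

    # Illustrative sketch: scoring data items so that maintenance effort
    # is focused where it matters most. Formula and values are assumed.

    def maintenance_priority(criticality: float, reliability: float,
                             value_impact: float, update_cost: float) -> float:
        """Higher scores suggest the item is worth keeping current.

        criticality, reliability and value_impact are normalised 0..1;
        (1 - reliability) captures how suspect the stored value already is;
        update_cost is a relative effort estimate (> 0).
        """
        benefit = criticality * value_impact * (1.0 - reliability)
        return benefit / update_cost

    items = {
        "relief_valve_set_pressure": maintenance_priority(0.9, 0.4, 0.9, 1.0),
        "paint_specification":       maintenance_priority(0.2, 0.6, 0.3, 1.0),
    }
    for name, score in sorted(items.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {score:.2f}")
    # relief_valve_set_pressure scores far higher, so it is updated first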
The third is trust in data. Digital twin users are exposed to large amounts of data, information and insights. While only some of this may influence their operational decisions, our experience shows that any data errors, even those that are not relevant or critical, can significantly erode trust in all the data presented in the digital twin.
It is imperative that users have visibility of the origin of each piece of data, including when it was last updated and validated, enabling them to gauge its reliability. We have found that supporting this with a process that enables users to query data and request updates significantly builds trust and gains buy-in from users, enhancing digital twin adoption.
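As an illustration, provenance can be carried as lightweight metadata alongside each data item. The field names and the staleness rule below are assumptions made for the sake of example, not a description of any particular product.

    # Illustrative sketch: provenance metadata attached to a data item
    # so users can gauge its reliability. Field names are hypothetical.

    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class Provenance:
        source: str          # originating system or document
        last_updated: date
        last_validated: date

        def is_stale(self, max_age_days: int = 365) -> bool:
            """Flag data not validated within the agreed window."""
            return date.today() - self.last_validated > timedelta(days=max_age_days)

    design_pressure = Provenance(source="P&ID rev C",
                                 last_updated=date(2022, 3, 14),
                                 last_validated=date(2022, 3, 14))
    if design_pressure.is_stale():
        print("Flag for review and let the user request an update")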
Effective digital twin solutions enable asset owners to gain insights, improve decision-making and optimise the operation and maintenance of their assets. But without a foundation of comprehensive, up-to-date and trusted data, challenges with adoption frequently occur.
Digital twin users need to build trust in the solution and its data for it to succeed. If the data isn't up to scratch, experience shows that users won't hesitate to vote with their feet and revert to previous systems and ways of working, eroding business impact.
The explosion of generative AI is bringing a whole new inflection point to this data challenge. Without the same level of human involvement in assessing data and insights, the risk posed by data issues will be multiplied, perhaps significantly. Therefore, the case for holistic, robust data practices and standards has never been stronger.
Through our Digital Asset and DataOps solution, we deliver the building blocks for digital transformation. Discover how we can solve your data challenges and decode your digital future: