By Kevin Price, Vice-President for Portfolio Strategy and Enablement, Asset Lifecycle Intelligence Division, Hexagon

In the past two years, the UK has faced an increasing number of extreme weather events that have revealed the vulnerability of its power grid. In February 2022, Storm Eunice damaged the country’s power grid in 1,800 locations and left 90,000 homes in South West England without power for several days. The power supply was also disrupted by heat waves during the summers of 2022 and 2023, with temperatures over 40°C causing power lines to expand and sag.

The challenges posed to utility providers by climate change are compounded by skill shortages in the industry, at a time when the baby boom generation is retiring in large numbers: 20% of the existing energy workforce is predicted to retire by 2030. According to ManpowerGroup’s 2023 Global Talent Shortage Survey, utility providers are also the employers that report the greatest difficulty in finding skilled talent.

Today, leading water and electricity companies are hoping that artificial intelligence and machine learning can help them reduce the impact of both problems by using asset data to identify impending failures, incorporate external information such as weather events into their systems and optimise the productivity of their maintenance teams.

Diving into troves of historical data

Implementing machine learning algorithms is not without its challenges. Utility companies typically own vast amounts of operational and maintenance data, such as work and outage history, real-time condition data and energy consumption patterns, which can all play a central role in making artificial intelligence models effective. However, this data often remains undigitised or stored in disparate systems, making it difficult to cross-reference the impact of external factors, such as temperature and weather conditions.

To find relevant patterns, the first step is to create a registry of all the utility provider’s assets – from power and water treatment plants to distribution lines to service trucks – and track relevant metadata including where the asset resides, its model, condition and documentation, and who is responsible for its upkeep. This data needs to be consolidated in a single platform, such as a digital twin or an enterprise asset management (EAM) system. 
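As a minimal sketch of what such a registry record might look like, the snippet below models an asset and its metadata in Python. The field names and values are illustrative assumptions, not taken from any specific EAM product.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """Illustrative asset registry record; field names are hypothetical."""
    asset_id: str
    asset_type: str        # e.g. "transformer", "pump", "service truck"
    location: str          # where the asset resides
    model: str
    condition: str         # e.g. "good", "fair", "poor"
    owner: str             # team responsible for upkeep
    documents: list = field(default_factory=list)

# A registry is then simply a lookup keyed on asset ID.
registry = {}

def register(asset: Asset) -> None:
    registry[asset.asset_id] = asset

register(Asset("TX-0042", "transformer", "Bristol substation",
               "Model-X", "fair", "South West maintenance team"))

print(registry["TX-0042"].condition)  # fair
```

Consolidating records in this shape is what makes it possible to cross-reference asset condition against external factors later on.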

Forecasting the impact of future events

Once the necessary asset data has been gathered, utility providers can use algorithms to drive appropriate actions or employ machine learning models to perform “what if” analyses and predict what could happen under varying circumstances. By feeding in external data sources, such as weather forecasts or satellite images, they can understand how different patterns of weather can affect the performance and longevity of their equipment. 

Different weather patterns affect equipment in different ways, but because utility providers routinely inspect their equipment, they know precisely what condition it is in. This information lets them predict failure points and make sure the right people are available before a storm arrives. For example, a utility company facing a Category 3 storm can use modelling to determine whether its pumps will hold up to the expected storm surge and trigger work orders to inspect the predicted failure points.
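The pump example above can be sketched as a simple rule-based check: compare each pump’s rated surge tolerance against the forecast surge and raise inspection work orders for likely failure points. All thresholds, field names, and the work-order shape are illustrative assumptions, not a real EAM API.

```python
def pumps_at_risk(pumps, forecast_surge_m):
    """Return IDs of pumps whose rated surge capacity is below the forecast."""
    return [p["id"] for p in pumps if p["rated_surge_m"] < forecast_surge_m]

def create_inspection_orders(at_risk_ids):
    # In a real EAM system this would call a work-order API; here we
    # simply build illustrative order payloads.
    return [{"asset_id": pid, "task": "pre-storm inspection"}
            for pid in at_risk_ids]

pumps = [
    {"id": "P-01", "rated_surge_m": 3.5},
    {"id": "P-02", "rated_surge_m": 2.0},
    {"id": "P-03", "rated_surge_m": 4.2},
]

at_risk = pumps_at_risk(pumps, forecast_surge_m=3.0)
orders = create_inspection_orders(at_risk)
print(orders)  # [{'asset_id': 'P-02', 'task': 'pre-storm inspection'}]
```

In practice the threshold comparison would be replaced by a trained model, but the flow from forecast to predicted failure point to work order is the same.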

Adapting to varying levels of competency and data skills 

As the use of AI becomes more widespread, utility providers often struggle to find data scientists with experience in their field. According to research from the UK Government, 215,000 roles requiring hard data skills currently need to be filled across the UK. It is particularly difficult to find data science engineers who understand how to capture and apply data such as weather patterns and equipment failure records, although these roles are gradually being filled.

The solution is twofold. First, similar analyses can serve as templates across the industry: as algorithms and data are increasingly democratised, they evolve into more accessible templates that can be applied vertically in sectors such as utilities and energy, building on overall equipment effectiveness ratings that have been in use for a long time. Second, algorithms can be incorporated in ways that require less technical skill from end users. An EAM solution often includes a framework that allows technical users to run scripts in Python, the standard programming language for data science models, while letting non-technical users draw on existing libraries.
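To illustrate the kind of ready-made calculation such a framework might expose to non-technical users, the sketch below implements the standard overall equipment effectiveness (OEE) formula, OEE = availability × performance × quality. The function and figures are assumptions for illustration, not part of any particular EAM product.

```python
def oee(run_time_h, planned_time_h, actual_output, target_output, good_output):
    """Standard OEE score: availability x performance x quality."""
    availability = run_time_h / planned_time_h   # uptime vs planned time
    performance = actual_output / target_output  # throughput vs target
    quality = good_output / actual_output        # share of output in spec
    return availability * performance * quality

# Example: a pump ran 20 of 24 planned hours, delivered 900 of a
# targeted 1,000 units, of which 880 met spec.
score = oee(20, 24, 900, 1000, 880)
print(round(score, 3))  # 0.733
```

A non-technical user would only supply the five figures; the library function handles the rest.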

When implementing machine learning, it is important to remember that algorithms are only as good as the data they are trained on. A crucial factor is therefore the quality of the data collected in the field, especially by field workers during inspections and repairs. How easily they can enter information and enrich it with pictures or metadata is key to effectively leveraging machine learning algorithms. In the world of utilities, even the most sophisticated artificial intelligence is still bound by the quality of human-generated data.
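One practical way to protect that data quality is to validate field-entered records at the point of capture. The sketch below checks an inspection record for missing fields and an unknown condition rating before it reaches the training data; the required fields and rating scale are assumptions for illustration.

```python
REQUIRED = ("asset_id", "inspector", "condition", "photos")
VALID_CONDITIONS = ("good", "fair", "poor")

def validate_inspection(record: dict) -> list:
    """Return a list of data-quality problems (empty if the record is clean)."""
    problems = [f"missing {k}" for k in REQUIRED if not record.get(k)]
    condition = record.get("condition")
    if condition and condition not in VALID_CONDITIONS:
        problems.append("unknown condition rating")
    return problems

record = {"asset_id": "P-02", "inspector": "jdoe",
          "condition": "poor", "photos": ["pump_seal.jpg"]}
print(validate_inspection(record))  # []
```

Flagging problems while the worker is still on site is far cheaper than discovering them later, when the model's predictions have already drifted.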