December 21, 2024

Guest Editorial | Redefining Resilience Post-COVID-19 and Why Adapting Your Data Strategy Matters

by Francois Laborie, Cognite North America

In 2020, resilience has taken on an expanded meaning as utility companies worldwide adopt new, evolving strategies for new, evolving norms. Defined as “the capacity to recover quickly from difficulties; toughness,” resilience is usually thought of in relation to an operator’s ability to bounce back from severe weather events and unexpected grid outages.

Today, this operational pillar remains as important as it ever was. In August 2020, Tropical Storm Isaias battered the East Coast and caused more than two million power outages. That same month, Hurricane Laura, a Category 4 storm, pounded the Gulf Coast for hours, leaving hundreds of thousands of people without power in Louisiana and Texas. Meanwhile, on the West Coast, California faced new, unexpected blackouts due to an intense heatwave, an overloaded grid and offline generation. Not only are these examples indicative of the pressures to come; they highlight the fact that operational resilience is a moving target, making it difficult to sustainably achieve related KPIs.

But because the complex grid system is composed of complicated, interrelated physical and digital elements, similar thinking about resilience should be applied to the workforce, customers and even the utility business model.

The traditional onsite working model, for example, is being disrupted by the pandemic as the need for safer, more flexible options for remote work arises. This includes a strong push towards more digital solutions and access to remote operations. How quickly can utilities empower this new model while returning productivity to the same or a higher rate? To accomplish this, utilities must explore the latest technology to promote the safety of their employees and ensure continuity of service for customers, while remaining flexible enough to adapt to a rapidly changing environment.

Then there’s the concept of customer resilience: How can utilities in competitive markets provide improved, differentiated customer experiences that reduce churn and streamline customer management? While half of this is an operational problem – keeping the lights on – the other half depends on quickly identifying emerging customer risks and opportunities and being flexible enough to respond with a unique offer or experience.

Lastly, utilities should think about resilience for their business models. This includes the means to quickly and accurately respond to market opportunities like renewable energy and microgrids, or to secure market share against new entrants. Disruption can and will come quickly, in the form of new storage technologies and breakthroughs in generation. Is your utility resilient enough to mobilize a new strategy at the speed of the market?

The rapidly evolving age of data-driven resilience

It is no secret that improving resilience across the four dimensions mentioned above starts with data. Today, the US utility sector is investing heavily in data collection and analysis to better identify and interpret the signals that impact their operations. But because resilience is a time-bound problem, the value of this analysis boils down to the efficiency of the entire value chain by which data transforms into insight and action.

This is where traditional data management strategies start to break down. Existing strategies work against the speed and agility needed to deliver resilience in time-sensitive scenarios, such as outages that must be resolved or competitive windows of opportunity that are closing. The trail of failed proofs of concept that never quite make it into production is evidence enough of the need for a smarter data management strategy designed around the trajectory and potential of the business. This is where utilities should focus their attention now, especially during unprecedented industry change.

Why is the industry at this crossroads? First, more data than ever is being collected from new sources and stored across the enterprise. Second, analytics tools are becoming more mature, accessible and valuable, leading to a new class of data consumers (citizen data scientists, analysts, engineers, etc.). While both should be net positives over the long term, they expose entrenched utility data architectures that are not set up to support the potential of digital use cases at scale.

Complexity: the enemy of data agility & digital productivity

This is the reality that many utilities are now waking up to: an overly complex, compartmentalized data architecture that does not readily support data access, agility or scale – all key requirements of a healthy, prosperous and ROI-delivering digital transformation.

Until data usage is ubiquitous across more consumers and more applications, utilities will struggle to act rapidly on their resilience initiatives. On one front, utilities face physical barriers to data ubiquity that center on fundamental connectivity and access to the relevant data silo(s) that must be tapped to solve the problem.

Take, for instance, the health monitoring of a transformer. Data from the transformer control system (one silo) is useful for analyzing performance trends over time. But this set of information is only part of the overall maintenance workflow: engineers must also correlate documented events such as repairs and faults, which live in a separate system. Additional information such as weather patterns and demand forecasts sits unconnected in other silos, further complicating the monitoring, diagnostic and decision-making process.
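
To make the silo problem concrete, the sketch below shows, at the simplest level, what correlating control-system readings with maintenance events from a separate system can look like once both are reachable. It is a minimal illustration only, assuming pandas and purely hypothetical asset IDs, column names and thresholds; it is not Cognite’s implementation.

```python
import pandas as pd

# Silo 1: time-series readings from the transformer control system
readings = pd.DataFrame({
    "timestamp": pd.to_datetime(["2020-08-01 00:00", "2020-08-01 06:00", "2020-08-01 12:00"]),
    "asset_id": ["TX-117"] * 3,
    "top_oil_temp_c": [61.0, 68.5, 83.2],   # degrees Celsius
    "load_pct": [74, 88, 97],
})

# Silo 2: documented maintenance events (repairs, faults) from a separate system
events = pd.DataFrame({
    "timestamp": pd.to_datetime(["2020-08-01 11:40"]),
    "asset_id": ["TX-117"],
    "event": ["cooling fan fault reported"],
})

# Contextualize: attach the most recent documented event to each reading
combined = pd.merge_asof(
    readings.sort_values("timestamp"),
    events.sort_values("timestamp"),
    on="timestamp",
    by="asset_id",
    direction="backward",
)

# A health flag that only becomes meaningful once both silos are joined:
# high oil temperature *and* an open fault report on the same asset
combined["at_risk"] = (combined["top_oil_temp_c"] > 80) & combined["event"].notna()
print(combined[["timestamp", "top_oil_temp_c", "event", "at_risk"]])
```

The join itself is trivial; the hard part in practice is that the two inputs live in different systems with different owners, formats and access paths.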

This access problem may be simple enough to overcome for a straightforward use case or a static proof of concept, but what are the implications when the application moves from development to production? Will the data flow from the silos to the application remain intact? What other security or quality issues will manifest?

In addition to these physical barriers, utilities will also face knowledge barriers having to do with the specialization of these data sets. Just as it is not feasible for every citizen data scientist or developer to become a subject matter expert in the data they’re working with, it is also not efficient for the data’s subject matter expert to help every data consumer understand the data set. This highlights the need for a better way to contextualize embedded data knowledge across types, sources, silos and domains. Fortunately, these problems are being addressed head-on with the next wave of industrial data management software.

Overcoming digital complexity with DataOps as a best practice

While there’s no slowing down the evolution of the modern digital infrastructure stack, implementing emerging DataOps practices is one of the few concrete steps to get ahead of the impending digital complexity. Just as DevOps redefined the process for application development, so too is DataOps redefining industrial data management practices for utilities. Gartner defines DataOps as “a collaborative data management practice focused on improving the communication, integration and automation of data flows between data managers and data consumers across an organization.” As an automated, process-oriented methodology, DataOps can improve the quality and reduce the cycle time of data analytics. DataOps uses technology to automate the design, deployment and management of data delivery with appropriate levels of governance, and it uses metadata to improve the usability and value of data in a dynamic environment. Put simply, DataOps enables efficiency and effectiveness for a previously complex and time-intensive process.
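
As a rough illustration of what “automating data delivery with governance and metadata” can mean in code, the sketch below treats a data flow as a declared pipeline in which every step carries an owner and emits lineage. The step names, teams and the validation rule are hypothetical, and this does not depict any particular vendor’s product.

```python
from dataclasses import dataclass, field
from typing import Callable

Rows = list[dict]

@dataclass
class Step:
    name: str
    run: Callable[[Rows], Rows]
    owner: str                                   # governance: the accountable team
    metadata: dict = field(default_factory=dict)

def run_pipeline(steps: list[Step], rows: Rows) -> tuple[Rows, list[dict]]:
    """Execute the steps in order and record lineage for every hop."""
    lineage = []
    for step in steps:
        rows = step.run(rows)
        lineage.append({"step": step.name, "owner": step.owner,
                        "rows_out": len(rows), **step.metadata})
    return rows, lineage

# Hypothetical delivery flow: ingest from a historian, validate, publish
pipeline = [
    Step("ingest_scada", lambda rows: rows, "ot-team", {"source": "scada-historian"}),
    Step("validate_voltage",
         lambda rows: [r for r in rows if 110 <= r["voltage_kv"] <= 140],
         "data-eng", {"rule": "110-140 kV"}),
    Step("publish", lambda rows: rows, "data-eng", {"format": "parquet"}),
]

clean, lineage = run_pipeline(pipeline, [{"voltage_kv": 121.0}, {"voltage_kv": 152.0}])
print(clean)    # the out-of-range reading has been dropped
print(lineage)  # who touched the data and what each step did
```

The specifics matter far less than the principle: delivery, checks and publication are defined once, versioned and automated, rather than re-implemented for each use case.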

DataOps focuses on enabling the entire data value chain from the point of collection, through contextualization and analysis, and ending with the distribution of digital applications. To date, the digital and analytics software market has provided solutions to individual pieces of this data value chain (think data historians, AI & Machine Learning), but this is exactly where the complexity exists. Given the fragmented market, fundamentally sound DataOps practices must, at a minimum, solve the following holistic data problems to return significant value.

  • Large scale management, contextualization and transformation of structured & unstructured data
  • Automating quality checks & enriching data for advanced analytics, hybrid models and ad-hoc reporting (a rough sketch of this step follows this list)
  • Comprehensive query & visualizations with open delivery and distribution of data + insights
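
For the second point above, the sketch below shows what an automated quality check and enrichment step can look like. The columns, thresholds and asset registry are hypothetical and the code assumes pandas; in a real pipeline, checks like these run automatically before data ever reaches an analyst or model.

```python
import pandas as pd

def quality_check(readings: pd.DataFrame) -> pd.DataFrame:
    """Flag duplicates, out-of-range values and stale gaps instead of silently passing them on."""
    checked = readings.drop_duplicates(subset=["timestamp", "asset_id"]).copy()
    checked["out_of_range"] = ~checked["voltage_kv"].between(110, 140)  # illustrative kV band
    checked["stale"] = checked["timestamp"].diff() > pd.Timedelta(hours=1)
    return checked

def enrich(readings: pd.DataFrame, asset_registry: pd.DataFrame) -> pd.DataFrame:
    """Attach asset context (zone, age) so downstream models need no manual lookups."""
    return readings.merge(asset_registry, on="asset_id", how="left")

# In-memory stand-ins for two silos: raw readings and an asset registry
readings = pd.DataFrame({
    "timestamp": pd.to_datetime(["2020-09-01 00:00", "2020-09-01 00:00", "2020-09-01 03:00"]),
    "asset_id": ["TX-117", "TX-117", "TX-117"],
    "voltage_kv": [121.0, 121.0, 152.0],
})
registry = pd.DataFrame({"asset_id": ["TX-117"], "zone": ["F"], "age_years": [23]})

print(enrich(quality_check(readings), registry))
```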

By integrating DataOps into their digital solution stack, utilities can reduce the overhead (and lower the marginal costs) of data management and contextualization and execute more effectively on use cases tied to resilience. For example, being able to automatically map the relationships that exist across a network of transmission substations makes it possible to query on operator-centric knowledge. The query evolves from “show me time series data from substation x, y, z” to “show me the events log from substations in zone F with transformers that are older than 20 years.” The initial subject matter expertise involved in knowing the right assets becomes codified in the data model so that others can leverage the mapping.
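
A minimal sketch of that shift is shown below. The tables and names are purely illustrative (this is not Cognite’s data model or API); the point is that once the substation-to-transformer relationships are codified, a filter on zone and age replaces a hand-maintained list of asset IDs.

```python
import pandas as pd

# Codified relationships standing in for a contextualized data model
substations = pd.DataFrame({"substation": ["S1", "S2", "S3"],
                            "zone": ["F", "F", "B"]})
transformers = pd.DataFrame({"transformer": ["TX-101", "TX-117", "TX-230"],
                             "substation": ["S1", "S2", "S3"],
                             "age_years": [12, 23, 31]})
events = pd.DataFrame({"transformer": ["TX-117", "TX-117", "TX-230"],
                       "event": ["oil sample: high moisture", "cooling fan fault", "bushing replaced"]})

# Without context, an engineer must know the asset IDs up front, e.g.
#   events[events["transformer"].isin(["TX-117"])]
# With the relationships mapped, the question can be asked in operator terms:
zone_f_aging = (transformers.merge(substations, on="substation")
                            .query("zone == 'F' and age_years > 20"))
print(events.merge(zone_f_aging[["transformer"]], on="transformer"))
```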

Codifying knowledge in this way has short- and long-term advantages. Immediately, digital teams can better solve prioritized use cases by equipping their workers with access to high-quality, contextualized data. A common priority, for instance, is getting better operational visibility into the health of the entire grid network. Then, once DataOps practices are established, utilities are better positioned to solve the long list of emerging use cases and nimbly adapt their digital architecture for future data types, sets and volumes. This includes solving the long-term challenges associated with successfully integrating DERs, microgrids and other technologies.

Avoiding the consequences of inaction

By addressing fundamental challenges with data infrastructure and culture now, utilities not only better serve their immediate business interests, but also avoid challenges that will compound as digital complexity and the pace of change grow. Without action, additional data silos will be created, making it harder to access and mobilize the data required for a given business problem. Over time, these new silos will erode trust in the overarching data quality required for advanced modeling and predictive, machine learning-based analytics. Then, without proper automation of data contextualization and pipelining, fewer models will get deployed, and at higher costs. This ultimately results in longer adoption timelines, lower ROI and lost competitive advantage.

Leveraging the current industry inflection point to improve utility resilience

Out of disruption comes opportunity and creativity. As utilities re-examine their operations in response to changing customer patterns, volatile weather events and other macro pressures, now is the ideal time for executives to reevaluate and bolster their digital strategies for the next era of accelerated change. Given the growing needs for operational, workforce, customer and business model resilience, only by taking advantage of their data will utilities be able to evolve and adapt at the speed of the market.

To make this actionable, below are several strategy recommendations and best practices that may bring clarity in this time of uncertainty.

  1. Aggressively pursue the commoditization of data and delivery of insights
    Data will not cease to be a valuable asset to the organization and will continue to expand in volume, types, sources and complexity. Because of this, the dominant utilities of the future will be the ones able to collect, mobilize and apply their data effectively and at the lowest marginal cost. But to do this well, leadership must take serious, honest stock of their current digital maturity in order to properly identify gaps and opportunities.
     
  2. Think beyond the traditional data KPIs
    As business models shift, utilities are already moving their KPIs to be more performance-based. The same thinking should be applied to data initiatives. Rather than focusing on the quantity of data collected, for instance, metrics around data quality, time to deployment, reduction of workflow steps, data onboarding time, etc., will matter much more moving forward and will also highlight the largest areas of current inefficiency.
     
  3. Help drive the industry’s most significant upskilling to date
    This is where the true value of industrial digitalization lives – in transforming data culture through ubiquitous access and more on-demand application to real-world problems. This requires investing not only in DataOps architectures and analytics tools but also in the change management and education of the experts closest to the business problems. When data barriers are no longer a problem, these experts can focus their efforts on what matters most.

Utilities have a significant opportunity to emerge from the COVID-19 crisis in a stronger position than they entered by doubling down on their digital strategies. By solving fundamental issues with data access, mobility, contextualization and operationalization, utilities can accelerate the way that data is used to respond to time-sensitive resilience use cases that exist across the organization.

Francois Laborie is president of Cognite North America, overseeing Cognite’s expansion and operations in the U.S. and Canada. Laborie has had an extensive career in the technology industry, serving in both research and executive roles. Before shifting to lead Cognite North America, Laborie managed Cognite’s overall marketing activities, including product marketing and the Cognite partner network, as Chief Marketing Officer. He has master’s degrees in computer science and engineering from the National Institute of Applied Sciences and a Ph.D. from the Toulouse Computer Science Research Institute.