January 21, 2025

Guest Editorial | Three Key Steps for Successfully Bridging the Gap Between IT and OT

by Ben Dwinal, TRC

In the last issue of Electric Energy T&D Magazine, my colleague Craig Cavanaugh discussed a topic that should be on the radar screen of every utility: the importance of breaking down the walls between IT and OT systems. His article, “Bridging the Gap Between OT/IT: A Critical Step for Navigating a Perfect Storm of Challenges,” makes a compelling case for why the separation between IT and OT systems has become an enormous hindrance to utilities. As he puts it so well in his commentary:

“Maintaining that gap between OT and IT may once have been a recipe for reliability, but today it is a liability in the face of the massive changes facing utilities. To solve the multi-faceted challenges that the industry is confronting, utilities must build a bridge between these systems as a foundation for success.”

Cavanaugh then discusses the critical role that convergence of IT and OT plays in unlocking the ability of utilities to respond effectively to challenges such as grid resilience, climate change, rapid electrification of the economy, graying of the workforce and more. His article ends with a call to action for utilities to start bridging the gap between those two technological systems.

Once an organization understands that urgency and wants to take action, what is the right strategy for moving forward? It’s a question I am asked regularly, and the answer I give usually surprises people. One might assume that a conversation about such a sweeping technology issue has to start at 50,000 feet, with a view of every IT and OT system the utility uses. I understand the instinct to take that macro view, but my advice is the opposite: start at the micro level, with the bits and bytes of data themselves. It may be counterintuitive, but getting down close to the data in this way will give your organization clarity about how to shape its overall strategy.

In the IT/OT convergence initiatives my team and I have worked on with utilities, three guiding principles emerge when you take this micro view:

  • Enabling continuous, real-time data flows is critical
  • Cleaning and enriching data sets should not be overlooked
  • Ensuring data access at work sites is another critical success factor
     

Let’s start with data flows. Legacy IT and OT systems typically perform data aggregation, reporting and analysis periodically. Rather than moving continuously through the system to support real-time operational needs, data often sits and sits and sits. A reading from a sensor or a piece of equipment may sit in a holding pattern for 12 hours, an entire day, or even a full week until the next scheduled batch run. Periodic execution of batch processes to synchronize data across the enterprise may have made sense in the past, but today it is a massive obstacle to the kind of real-time, continuous reporting and analysis needed for managing smart grids, preventing outages, supporting real-time decision-making by field crews, and much more.
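To make the contrast concrete, here is a minimal, purely illustrative Python sketch. The sensor feed, reading format and schedule are hypothetical and not drawn from any particular utility system; the point is simply that readings held for a scheduled batch run arrive hours late, while a continuous flow hands each reading downstream as soon as it is captured.

```python
from datetime import datetime, timedelta
from typing import Iterator, NamedTuple


class Reading(NamedTuple):
    sensor_id: str
    value: float
    captured_at: datetime


def sensor_feed() -> Iterator[Reading]:
    """Simulate a field device emitting hourly readings over a morning."""
    start = datetime(2025, 1, 21, 6, 0)
    for hour in range(4):
        yield Reading("recloser-07", 120.0 + hour, start + timedelta(hours=hour))


def batch_sync(feed: Iterator[Reading], run_at: datetime) -> None:
    """Legacy pattern: readings accumulate and wait for the scheduled batch run."""
    held = list(feed)  # data 'sits' until the schedule fires
    for r in held:
        latency = run_at - r.captured_at
        print(f"[batch]  {r.sensor_id}={r.value} processed {latency} after capture")


def continuous_sync(feed: Iterator[Reading]) -> None:
    """Converged pattern: each reading is handed downstream as soon as it arrives."""
    for r in feed:
        print(f"[stream] {r.sensor_id}={r.value} available at {r.captured_at:%H:%M}")


if __name__ == "__main__":
    # Same readings, two very different latencies.
    batch_sync(sensor_feed(), run_at=datetime(2025, 1, 21, 18, 0))
    continuous_sync(sensor_feed())
```

In a real environment the "feed" would be a message bus or change-data-capture stream rather than an in-memory generator, but the architectural question is the same: does a reading wait for a schedule, or does it move the moment it exists?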

I should note that the negative impact is not limited to big IT/OT initiatives. Batch-bound data also produces huge operational inefficiencies that cause headaches, slow down workflows, interfere with decision-making, increase truck rolls and add to operational costs. Addressing the need for continuous, real-time data flows therefore delivers significant efficiency gains in addition to giving utilities a foundation for solving the big challenges that Craig previously discussed.

Focusing on continuous, real-time data flows should be the guiding principle that drives the strategy for IT/OT convergence. When you begin examining your technology systems to see what stands in the way of real-time aggregation, reporting and analysis of data, it becomes very clear what changes need to be made. As you follow data from where it originates to where you want it to go, your strategy should be to systematically remove the barriers to faster data flows. That exercise will drive the key decisions the organization makes about hardware, software and cloud architecture at the macro level.

The second guiding principle for the success of IT/OT convergence is that data quality matters. Having real-time, continuous data flows is critical, but the accuracy and richness of the data are just as important. That is particularly true for location-based data, such as the layers of data embedded in digital maps and GIS databases. To maximize the insights utilities can derive from their data, both accuracy and richness are essential. But not all data is created equal. A significant amount of the data in utilities’ IT and OT systems needs to be scrubbed, made more accurate and enriched. Flawed data is no secret to those who work with it every day at utilities, including crews in the field and GIS professionals. It creates operational inefficiencies that slow down mission-critical decision-making and the day-to-day completion of projects. It also stands in the way of any large-scale initiative to bridge IT and OT systems.

Enhancing the accuracy and richness of utility data has traditionally been a time-intensive process that makes it difficult and costly to do at scale. But that is changing thanks to AI. AI-driven “data conflation” is being successfully used to automate the process of assessing the quality of data, correcting inaccuracies and enriching the data to bring it up to the standards needed by the applications across the organization. This “data conflation” topic could fill up an entire article on its own, so I won’t go into too much detail here. The key takeaway is that the process of data enhancement can be automated in ways that remove one of the obstacles to IT/OT convergence and solving the macro challenges that my colleague mapped out in his article.
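As one deliberately simplified illustration of the underlying idea (the asset records, field names and 25-meter tolerance below are hypothetical, and production conflation tools use far more sophisticated, AI-driven matching), conflation boils down to matching records from two sources by location and merging their attributes so that gaps are filled, positions are corrected and records are enriched:

```python
import math
from typing import Optional

# Hypothetical asset records: a GIS layer with coarse locations and missing
# attributes, and a field survey with precise points and richer data.
gis_layer = [
    {"asset_id": "TX-1041", "lat": 44.801, "lon": -68.778, "kva": None},
    {"asset_id": "TX-1042", "lat": 44.812, "lon": -68.790, "kva": 50},
]
field_survey = [
    {"lat": 44.80112, "lon": -68.77791, "kva": 25, "install_year": 2019},
]


def distance_m(a: dict, b: dict) -> float:
    """Rough equirectangular distance in meters; adequate at this scale."""
    dlat = math.radians(b["lat"] - a["lat"])
    dlon = math.radians(b["lon"] - a["lon"]) * math.cos(math.radians(a["lat"]))
    return 6_371_000 * math.hypot(dlat, dlon)


def conflate(gis: list, survey: list, tolerance_m: float = 25.0) -> list:
    """For each GIS record, adopt attributes from the nearest survey point
    within tolerance: fill missing values, correct the position, enrich."""
    merged = []
    for asset in gis:
        match: Optional[dict] = min(
            (s for s in survey if distance_m(asset, s) <= tolerance_m),
            key=lambda s: distance_m(asset, s),
            default=None,
        )
        record = dict(asset)
        if match:
            record.update({k: v for k, v in match.items() if record.get(k) is None})
            record["lat"], record["lon"] = match["lat"], match["lon"]  # correct position
        merged.append(record)
    return merged


print(conflate(gis_layer, field_survey))
```

Where AI earns its keep is in doing this matching, scoring and correction at the scale of millions of features with far messier inputs than this toy example suggests.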

The last guiding principle for your IT/OT strategy is to look closely at data in a very specific environment: the field, where work crews need to put it to use on-site. The tablets that field crews use for working with IT and OT data typically have onboard GPS and mapping engines. That means data processing can happen at the device level, filtered by exact location, rather than in a centralized data center. In many cases that work is executed offline, because work sites may not have adequate connectivity. For these reasons, back-end systems need to be very deliberate about what information is delivered to work crews. Too much information overwhelms the tablets and the apps. Not enough information interferes with workflows and can even delay a project. Outdated or inaccurate information likewise leads to confusion and delay.

Looking closely at data accessibility in the field will steer you through important decisions about how to bridge the gap between IT and OT. Evaluating role- and workflow-specific data needs for your field teams can reduce data volumes, making offline analysis and access to information more achievable. Using a device’s current location as a filter to reduce the amount of data being processed, presented and refreshed is another strategy that lets critical field workflows run without overwhelming device resources or back-end systems. Geospatially enabled applications and map-centric interfaces can be a key enabler here, allowing real-time operational data to be delivered to field crews and business leaders, improving decision-making, reducing inefficiencies, and increasing organization-wide awareness of on-the-ground conditions.
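As a rough sketch of what that role- and location-based filtering might look like (the layer names, role mapping and bounding-box size are hypothetical, not a description of any specific field application), a back-end service can cut an enterprise data set down to the small package a tablet actually needs to cache for offline work at a site:

```python
from dataclasses import dataclass


@dataclass
class Feature:
    feature_id: str
    lat: float
    lon: float
    layer: str  # e.g. "transformer", "pole", "gas_main"


# Hypothetical role-to-layer mapping and a slice of the enterprise data set.
ROLE_LAYERS = {"line_crew": {"pole", "transformer"}, "gas_crew": {"gas_main"}}

features = [
    Feature("P-221", 44.8013, -68.7781, "pole"),
    Feature("TX-1041", 44.8011, -68.7779, "transformer"),
    Feature("GM-77", 44.9502, -68.6455, "gas_main"),
]


def offline_package(all_features: list, role: str,
                    device_lat: float, device_lon: float,
                    half_window_deg: float = 0.01) -> list:
    """Build the small, work-site-specific data set a tablet caches for offline
    use: keep only the layers the crew's role needs, within a simple bounding
    box around the device's current GPS position."""
    layers = ROLE_LAYERS.get(role, set())
    return [
        f for f in all_features
        if f.layer in layers
        and abs(f.lat - device_lat) <= half_window_deg
        and abs(f.lon - device_lon) <= half_window_deg
    ]


# A line crew's tablet near the work site receives only the two nearby assets.
print(offline_package(features, "line_crew", device_lat=44.8012, device_lon=-68.7780))
```

A production system would use proper geospatial indexing and service territories rather than a naive bounding box, but the principle is the same: send the device only what its user, at that location, actually needs.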

Getting up close to your data in these ways brings the lessons for successfully bridging IT and OT into clear focus. But as each of the examples above shows, there are also significant operational efficiencies to be gained beyond supporting a successful IT/OT strategy. The steps you take to improve data flows, data quality and data accessibility in the field will also allow you to eliminate inefficiencies, reduce the time it takes to complete workflows and accomplish more with the team you have. The benefits are not just on the horizon. They are immediate and impactful.

Ben Dwinal is the vice president of Solution Architecture at TRC. Dwinal is an experienced solution architect with over 25 years of experience in remote sensing, geospatial intelligence, application development and related technologies. His background includes 12 years in military and defense technologies and more than 13 years of work in the GIS electric and gas utility industry. He earned his degree from the University of Maine.