December 26, 2024

Security Sessions | Secure? Who cares – it complies with the regulations!

by William T. (Tim) Shaw, PhD, CISSP / C|EH / CPT


Over the last few years I have had ample opportunity to see organizations attempting to implement cyber security in industrial plants and facilities (such as power generating stations), all too often using guidance and best practices borrowed from the IT world. In many of the cases the results would be funny if they weren't just plain sad. Even all these years after the initial FERC call to implement cyber and physical security at the SCADA/EMS facilities that watch over and control the grid, we are still having arguments over what needs to be protected and how to achieve adequate protection. It seems to me that we spent less time (and maybe less money) putting astronauts on the moon back in the 1960s.

Why has this happened?
Electric utilities and other bulk-power/grid entities continue to be 'under the gun' to comply with the NERC CIP standards, even as those standards undergo continuing debate and discussion. Some entities have made good-faith efforts to achieve regulatory compliance, but have not actually ended up making their critical cyber assets all that secure. They may have even expended a lot of time, money and manpower in the effort.

Some entities recognized the risk and liability that a major grid event would represent, invested in the right people and technologies, and ended up establishing an adequate level of both physical and cyber security; yet their program might not pass a CIP compliance audit due to missing paperwork.

Other entities apparently have a lot of lawyers sitting around and have treated the cyber and physical security regulations as so many legal documents that can be picked apart and reinterpreted in order to find loopholes and avoid, or at least delay, implementing much of anything. Watching them send endless requests for clarification and generate tons of paperwork in an attempt to bury the FERC staff is mostly boring and discouraging. I enjoyed watching one generating plant attempt to claim that each of its units was a separate plant so that it could fall under the 300 MW threshold and not apply any cyber protections. We all know how NERC responded to that attempt.

Other entities have handed the problem over to their corporate IT departments, the same people who just a few years back were not welcome in most industrial facilities because of their arrogant, one-size-fits-all approach to cyber security and their dangerous (and potentially deadly) accepted IT practices that definitely did NOT fit well in an industrial setting. This last group provides me with the greatest number of incredible (but true!) stories.

A lot of those stories stem from the fact that industrial facilities are filled with microprocessor-based (digital) devices that don't look, feel or operate like a PC or server running any of the popular commercial operating systems, and they can't be monitored or protected in the same manner.

Devices like digital trend recorders, smart annunciator panels, 'smart' transmitters and other instruments, PLCs, RTUs and digital protective relays have blinking LEDs and might even sport a USB port, a memory card slot or an Ethernet NIC. These are generally referred to collectively as IEDs (no, not the kind causing deaths and injuries among our troops in the Middle East) or Intelligent Electronic Devices. IEDs pose a real problem to traditional IT personnel, who are driven to apply IT best practices to these devices without putting in the time and effort to find out how such devices actually work. So-called 'embedded devices' are very different, in both hardware and software design, from your basic PC or server.

This past year I witnessed a heated discussion between plant I&C personnel and an outside IT consultant about exchanging data between two separate systems without creating a cyberattack pathway. The I&C personnel had attached a PLC to each of the two systems and then cross-wired some analog and contact I/O signals between the two PLCs. This allowed the transfer of a handful of process measurements and some operational status flags between the two systems. The IT consultant insisted that this created a potential attack pathway between the two systems because “Modems use analog signals and they transmit digital message traffic.” It was pointed out that you would have to covertly install some pretty tricky programming into both of the PLCs to make them do this, and you would also have to put some interesting programming onto both of the systems to cause them to treat the PLCs as communication devices. And if you had sufficient access to make all that happen, then creating a covert communication channel was probably superfluous. (One wag pointed out that it would be much easier to use the contact I/O to send binary messages, but the other I&C folks told him to keep his mouth shut till the consultant left the site.) Needless to say, a lot of time and effort was spent fighting over this issue, which the I&C folks ultimately won. Basically the whole problem was caused by the IT consultant's total lack of knowledge of IACS and I&C technology.
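To make the point concrete, here is a minimal sketch (in Python, with invented signal ranges and values) of what a hard-wired analog link between two PLCs actually carries: one scaled measurement per wire pair. Unless someone reprograms both PLCs and both host systems, there is no general-purpose message channel there to exploit.

```python
# Minimal sketch (hypothetical values): what crosses a hard-wired analog I/O
# link between two PLCs is a single scaled measurement, not message traffic.

ENG_LO, ENG_HI = 0.0, 600.0      # assumed engineering range, deg F
MA_LO, MA_HI = 4.0, 20.0         # standard 4-20 mA current loop

def to_milliamps(value_eng: float) -> float:
    """Sending PLC: scale an engineering value onto the 4-20 mA output."""
    frac = (value_eng - ENG_LO) / (ENG_HI - ENG_LO)
    return MA_LO + max(0.0, min(1.0, frac)) * (MA_HI - MA_LO)

def from_milliamps(ma: float) -> float:
    """Receiving PLC: recover the engineering value from the analog input."""
    frac = (ma - MA_LO) / (MA_HI - MA_LO)
    return ENG_LO + max(0.0, min(1.0, frac)) * (ENG_HI - ENG_LO)

if __name__ == "__main__":
    reading = 451.0                    # hypothetical process temperature
    wire = to_milliamps(reading)       # what is physically on the wire
    received = from_milliamps(wire)
    print(f"{reading} degF -> {wire:.2f} mA -> {received:.1f} degF")
```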

In another case the IT consultant (a different one) was very concerned about a set of panel-mounted digital trend recorders because they all had USB ports which could be used to load replacement firmware and to load/save the trend data and configuration settings. He wanted them equipped with mechanical port locks so that the USB ports would not be accessible. The issue was a new cyber concern called 'Bad USB.' In a PC, when you insert a USB device into an open port, the device sets its address to zero (0) and raises an electrical signal that tells the PC that there is a new device on the USB serial bus. The PC then queries the device, tells it what actual address to use and asks it to identify itself and what kind(s) of functions it supports: keyboard, mouse, communications, audio, bulk storage, etc. The PC then goes off to its huge pre-loaded library of registered USB device drivers (and you wondered why Windows needs a big disk drive?) and loads all of the drivers needed to interact with the device. This is called enumeration. The Bad USB scenario involves a USB device that looks like a simple “thumb drive” but, when plugged in, tells the PC that it is a keyboard and then sends the PC commands as if they were being typed on the keyboard. Some of those commands would entail copying files from the USB device and installing and running them on the PC (I am simplifying this a bit, but that is the basic idea).

A VERY important part of what I just said was 'huge pre-loaded library of registered drivers.' The digital trend recorders have no library of USB device drivers, do an abbreviated enumeration, and were factory programmed to only recognize and talk to USB bulk storage devices, and only for basic read/write operations. They had no library of drivers for other USB devices, since it makes no sense to plug a keyboard or mouse into a trend recorder. So even though they had USB ports, these were what I call limited-functionality USB ports which, by their nature, would not be vulnerable to a Bad USB attack. Again, the issue here was assuming that a USB port is a USB port is a USB port and they all work the same as on your PC. It took a technical discussion with the vendor's engineering staff to confirm that this was the case and that the threat did not exist.
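The difference between the two kinds of USB hosts can be illustrated with a short sketch. This is not real USB stack code; the driver tables and class names below are hypothetical stand-ins, but they show why a device that enumerates as a keyboard gets a driver loaded on a PC and simply gets rejected by a limited-functionality host that only knows how to talk to bulk storage.

```python
# Illustrative sketch only (not real USB stack code): how a "Bad USB" device
# that claims to be a keyboard is handled by a general-purpose PC host versus
# a limited-functionality embedded host such as a trend recorder.

PC_DRIVER_LIBRARY = {            # hypothetical stand-in for a huge driver set
    "mass_storage": "usbstor",
    "keyboard": "hidkbd",
    "mouse": "hidmouse",
    "communications": "usbcdc",
}

RECORDER_DRIVER_LIBRARY = {      # the trend recorder only knows one class
    "mass_storage": "raw_block_read_write",
}

def enumerate_device(claimed_class: str, driver_library: dict) -> str:
    """Very simplified enumeration: bind a driver for whatever class the
    device claims to be, or reject it if the host has no such driver."""
    driver = driver_library.get(claimed_class)
    if driver is None:
        return f"rejected: no driver for '{claimed_class}'"
    return f"accepted: bound driver '{driver}'"

# A Bad USB stick claims to be a keyboard so it can inject keystrokes.
print("PC host:       ", enumerate_device("keyboard", PC_DRIVER_LIBRARY))
print("Trend recorder:", enumerate_device("keyboard", RECORDER_DRIVER_LIBRARY))
```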

IT folks are very concerned about malware getting into devices and systems and corrupting them. That is a very reasonable concern, since there is a huge and constantly growing repository of malware and exploits designed to attack computers. Getting malware into a system doesn't involve magic. If done remotely, it means that the system being attacked has some 'service' (program) that is waiting to receive a connection request from some other program in some other computer out across a shared network (like the Internet), rather like a company sales associate waiting for an incoming call from someone wanting to place an order. To attack the system a hacker would have to look through the program code of that service and find a spot where they can trick the program into blindly accepting so much data that it spills over into program memory and replaces some of the existing program instructions (something called a buffer overflow attack). For this to work there are several requirements:

  • First, that the attacker get their hands on the program code for the service they want to attack. Easily done with Windows, Linux, OS X or any of the commercially available software products, since anyone can buy a copy of the software. Not so easy with an I&C product running vendor-proprietary operating software.
  • Second, that there be an exploitable flaw in that program code where you can send a lot of extra stuff. Most I&C products support industrial protocols in which the messages are strictly defined and anything that deviates is treated as a bad message and discarded (a minimal illustration follows this list).
  • Third, that the attacker's inserted program code can call on the operating system of the device/system to do critical things like creating a file, starting a file transfer process, installing a new task, running a program, etc. Fine if you are running a commercial OS, but unlikely in a smart device, where there probably won't even be a file system and there definitely won't be user-callable operating system functions.
  • Fourth, that you can overwrite existing program instructions with new ones that change the function of the device/system. Not an issue in a PC, where all programs, including the OS itself, run in RAM.
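As the promised illustration of the second point, here is a loosely Modbus-flavored sketch; the frame length, function codes and limits below are assumptions for illustration, not any vendor's actual implementation. A strictly framed protocol handler simply discards anything that doesn't match the expected layout, leaving nothing oversized to overflow a buffer with.

```python
# Minimal sketch of a strictly framed industrial protocol request parser.
# Frame layout, sizes and limits are illustrative assumptions only.

import struct

EXPECTED_LENGTH = 8                      # assumed fixed request frame size
ALLOWED_FUNCTIONS = {3, 4}               # assumed read-only function codes

def parse_request(frame: bytes):
    """Return (unit, function, address, count) or None if the frame is bad.
    (CRC verification is omitted in this sketch.)"""
    if len(frame) != EXPECTED_LENGTH:    # wrong size: discard, no overflow
        return None
    unit, func, addr, count = struct.unpack(">BBHH", frame[:6])
    if func not in ALLOWED_FUNCTIONS or count > 125:
        return None                      # out-of-spec content: discard
    return unit, func, addr, count

print(parse_request(bytes([1, 3, 0, 0, 0, 10, 0xC5, 0xCD])))  # well-formed
print(parse_request(b"A" * 500))          # oversized junk is simply dropped
```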

But many IEDs have no hard drive, and their basic program code is burned into some form of ROM which cannot be easily altered, even if that code is copied into RAM when the device boots up. The point being that many IEDs are not even slightly susceptible to the malware used against conventional laptops, PCs and servers. I am not saying that you couldn't invest a great deal of time and energy into custom-crafting a cyberattack against a digital chart recorder, but is the bang really worth the buck? If you were going to use your one shot at launching a cyberattack on a plant, would you squander it making a trend recorder display bad data? The attack would be specific to a given version of hardware and firmware and would become ineffective if the firmware were updated. Also, rebooting the IED would probably bring it back into normal operation. I have listened in on endless conversations between I&C personnel (who can't be blamed for not knowing the detailed technical design specifics of every IED they use) and IT personnel who insist that a computer is a computer is a computer and anti-malware must be provided.
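The 'rebooting restores it' point can be shown with a toy sketch: if the executable image lives in ROM and is merely copied into RAM at boot, a RAM-only patch by an attacker disappears the next time the device is power-cycled. The image bytes below are obviously invented.

```python
# Toy illustration (not real firmware code): a RAM-only modification does not
# survive a power cycle; the next boot simply re-copies the original image.

ROM_IMAGE = bytes.fromhex("deadbeef00112233")   # hypothetical burned-in code

def boot() -> bytearray:
    """Boot: copy the immutable ROM image into working RAM."""
    return bytearray(ROM_IMAGE)

ram = boot()
ram[0] = 0x00                     # attacker patches the running copy in RAM
print("tampered:  ", ram.hex())

ram = boot()                      # operator reboots the device
print("after boot:", ram.hex())   # back to the factory image
```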

Industrial facilities usually do have a lot of PCs and servers that are components of a DCS, PLC or in-plant SCADA system. They are the same devices used by IT for office automation, but in such cases they must be looked at holistically, from a system perspective, and not as individual, stand-alone components. This is not to say that many IT cyber security measures can't be applied to such systems. In fact, all of the major automation vendors have discovered that cyber security is a money-making opportunity to sell more goods, and especially services, to their installed base. Most IACS incorporate some level of self-checking and automatic fail-over. There may be a basic underlying set of message traffic between and among system components used for synchronization and updating, for verifying backup availability, and for triggering automated fail-over and reconfiguration. Not all of that message traffic is well documented (or documented at all) by the system vendor.

In another case of IT best practices versus IACS reality, the corporate IT representative insisted on using internal firewalls to segment the plant automation system LAN to limit attack penetration and block the spread of malware, which is not a bad practice and is even an ISA-99 recommendation. But the IT consultant insisted that any message traffic that could not be documented as being used by an identified and documented application should be blocked by those firewalls. The plant IACS personnel did not have the technical expertise to argue why this might be a bad idea, and in fact initially everything still seemed to work properly. It was only several months later that it was discovered that engineering changes being made on a supervisory workstation were no longer making their way down into all of the plant systems automatically, as they had in the past. Rather than fight with the corporate IT folks to get firewall rules changed (since that was apparently on par with taking a case to the Supreme Court), the plant people just revised their work processes and configuration management procedures to manually transfer updates to all of the plant systems that no longer received them automatically. Another instance where a lack of understanding of how IACS technology works ended up wasting time and manpower.
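A hypothetical sketch of that 'block anything you can't document' policy shows how the vendor's undocumented synchronization traffic gets caught in the net. The host names and the port 55000 flow below are invented for illustration; 502 and 44818 are simply examples of well-known industrial protocol ports that might appear on a documented allow-list.

```python
# Hypothetical default-deny internal firewall policy: only documented flows
# are permitted, so undocumented vendor synchronization traffic is dropped.

ALLOWED = {                                   # documented flows (illustrative)
    ("hmi_workstation", "plc_lan", 502),      # e.g. documented Modbus/TCP polls
    ("historian", "plc_lan", 44818),          # e.g. documented EtherNet/IP reads
}

def permitted(src: str, dst: str, port: int) -> bool:
    """Default-deny rule check: allow only flows on the documented list."""
    return (src, dst, port) in ALLOWED

flows = [
    ("hmi_workstation", "plc_lan", 502),
    ("eng_workstation", "plc_lan", 55000),    # undocumented vendor sync traffic
]
for src, dst, port in flows:
    verdict = "allow" if permitted(src, dst, port) else "drop"
    print(f"{src} -> {dst}:{port}: {verdict}")
```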

I am just about to celebrate my 64th birthday and I have plans to hang up my spurs in the next year. It would be great to see man finally land on Mars before I die. I am not sure that will happen but I am hopeful. It would be fun to see man back on the moon before I die, even if it is the Chinese who are most likely to make that happen. I am less hopeful about living to see our critical national infrastructure achieve an adequate level of immunity from cyberattack before I die, recalling that it is most likely the Chinese who pose the greatest cyber threat to that national infrastructure.

About the Author

Dr. Shaw is a Certified Information Systems Security Professional (CISSP), a Certified Ethical Hacker (C|EH) and a Certified Penetration Tester (CPT), and has been active in designing and installing industrial automation for more than 35 years. He is the author of Computer Control of BATCH Processes and CYBERSECURITY for SCADA Systems and co-author of the latest revision of Industrial Data Communications. Shaw is a prolific writer of papers and articles on a wide range of technical topics and has contributed to several other books. Dr. Shaw has developed, and is an instructor for, a number of ISA courses and also teaches on-line courses for the University of Kansas continuing education program. He is currently Principal & Senior Consultant for Cyber SECurity Consulting, a consultancy practice focused on industrial automation cyber security and technologies. Inquiries, comments or questions regarding the contents of this column and/or other security-related topics can be emailed to timshaw4@verizon.net