April 19, 2024

Security Sessions | Sure It's 'Smart', but Is It Really, Truly All That Smart?

by Dr. Tim Shaw

William T. (Tim) Shaw
PhD, CISSP / C|EH / CPT

Industrial cyber security is challenging for a number of reasons, one being the range of “smart” devices that we commonly use for monitoring, analysis, protection and control. These “smart” devices contain microprocessors and stored programming, so they kind of have some of the properties of a ‘computer’. But you can’t protect them in the same way you protect a Microsoft Windows Server 2008 box. More to the point, because of their differences you may not even need to protect them all that much. Many of these smart devices are nearly invulnerable to cyber compromise, and attempting to protect them is a waste of time and money.

In the electric power industry, as in other industrial segments, there are pressures (including regulatory, political and economic) pushing you to ensure that your critical infrastructure facilities are protected against both physical attacks and cyber attacks. Putting up fences and gates and hiring guards costs money, but you get a clear and obvious return on your investment every time you drive up to the gate and have to prove that you have the right to enter the facility. On the other hand, investing in cyber security usually looks like spending money to protect your plant from something that probably would never have happened anyway, and if it is working well, nothing bad actually does happen.

But regardless of how you may personally feel about cyber security, there is overwhelming evidence that we are under constant cyber attack by those who wish to steal our technology, gain a competitive advantage, learn our secrets or just rob us via cyber means. An appalling number of domestic industrial facilities have discovered that their systems and networks have been invaded and infested with malicious software, even if such infestations have not caused obvious harm or damage. Such discoveries often accompany their efforts to establish cyber security defenses; they find out that they have already been successfully attacked when they turn on their shiny new detection mechanisms. So I don’t intend to argue for the need for adequate cyber security – that should be a given by now. But I do often see misguided efforts to place protections around digital I&C components and smart devices that may not require such protections, usually as a result of trying to apply IT cyber security methods to industrial automation.

One example of this is something I came across recently. At one facility the list of identified important digital assets included a large number of smart transmitters that supported HART protocol rev 5. These transmitters were being used in analog mode, providing a 4-20 mA signal to their respective controllers. HART communications were used only for calibration/configuration purposes, via a hand-held calibrator. The IT folks were very concerned about having malware spread from transmitter to transmitter via a cyber compromised hand-held calibrator. Stop laughing, I am being totally serious. It was pointed out to the IT folks that this was highly unlikely to be a viable attack vector for several reasons. First, the program code in the transmitters themselves was in ROM and not field alterable via cyber means. Only user-alterable settings stored in flash/RAM could be manipulated, and improper settings would merely result in an invalid analog value being output. Besides, the calibration process included a settings check by a second tech for any instrument used in a critical or dangerous process. Second, the HART protocol has very specific message types with pre-defined formats, and there were no commands that would allow for the access and modification of program code (any improperly formatted message would be detected and rejected as ‘bad’). And third, the hand-held calibrators being used also had ROM-based program code and could not be field reprogrammed or have their “firmware” field updated.
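
For readers who like to see it spelled out, here is a minimal sketch (in Python, purely for readability) of the kind of command dispatcher such a transmitter runs. The command numbers, settings and values are invented for the example and are not the actual HART rev 5 command set; the point is structural: the program code sits where no message can reach it, and the only thing a command can change is a setting.

# A toy, HART-style command dispatcher: program code lives in ROM where no
# message can reach it, and the only writable items are configuration settings.
# The command numbers, settings and values below are invented for illustration;
# they are NOT the actual HART rev 5 command set.

PROGRAM_ROM = "compiled transmitter firmware, masked ROM, not writable at runtime"

settings = {                      # user-alterable data held in flash/RAM
    "lower_range_value": 0.0,     # the 4 mA point
    "upper_range_value": 100.0,   # the 20 mA point
    "damping_seconds": 1.0,
}

SUPPORTED_COMMANDS = {0, 1, 35}   # e.g. read ID, read primary variable, write range

def handle_command(cmd_number, payload):
    """Dispatch one received command; anything unrecognized is rejected."""
    if cmd_number not in SUPPORTED_COMMANDS:
        return "ERROR: command not implemented"      # unknown/malformed -> rejected
    if cmd_number == 0:
        return {"manufacturer": "Example Co.", "device_type": "pressure"}
    if cmd_number == 1:
        return {"primary_variable": 42.7, "units": "psi"}
    if cmd_number == 35:                             # write range values
        lrv, urv = payload                           # only settings change; there is
        settings["lower_range_value"] = lrv          # no command that can touch
        settings["upper_range_value"] = urv          # PROGRAM_ROM at all
        return "OK"

print(handle_command(1, None))                 # normal read of the measurement
print(handle_command(35, (0.0, 250.0)))        # re-ranging: worst case is a bad setting
print(handle_command(99, b"\x90" * 64))        # 'exploit' attempt -> simply rejected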

You would think that this would have settled the matter, but you would have been wrong. The IT folks had apparently been doing a lot of reading about the ‘supply chain threat’ and felt that it might be relevant in this case. For those who may not know this term, it refers to actions taken by an adversary, for example at the vendor/manufacturer’s facility, to introduce unauthorized software functionality into the valid program code for a device/system, or to substitute poor quality components. It can also apply to using the trust granted to a vendor’s field service group to gain access to their systems at customer facilities for the purpose of malicious tampering. It can also encompass the possibility of using that same field service/support channel to provide malicious patches and software updates to customers. The IT folks were concerned that there could be hidden malicious functionality in the program code of the smart transmitters, introduced via the supply chain attack vector, such that at some point, based on some trigger condition, the transmitters would malfunction in a dangerous manner (e.g., swinging the analog value up and down between the maximum and minimum calibrated values). In response it was pointed out that smart transmitters, unlike PCs, do not have time-of-day clocks and in fact do not keep track of time in any manner. Thus a trigger condition could not be time/date based, and a number of transmitters would have no way to synchronize their malfunctioning. It was also pointed out that the transmitters had no way of knowing what kind of plant or process they were being applied in, and that manufacturers make such devices in quantity and sell them through stocking distributors, so there is no association between a specific device and where it will eventually end up being used. The point being that an adversary who got into the manufacturer’s software development group could not control where the modified products ended up being used, and it is likely that wildly malfunctioning transmitters would be popping up all over the place, and not necessarily at the targeted plant. Much the same situation would exist for the hand-held calibrators. The upshot was that those smart transmitters were determined not to require any special cyber security protections.

Of course there are other smart I&C devices that are far more complex than a smart transmitter, and some may need some level of cyber protection. A good example would be a low-end digital trend recorder that supports data dumps to removable media and asynchronous serial communications via an RS-232 link, using the MODBUS-RTU protocol to enable the fetching of its current input measurement values. Today many of these devices also support field-updatable firmware via portable media, possibly through a USB port. Many such devices also support optional contact outputs that are activated by crossing user-defined alarm limits on designated inputs. Although such contacts could be used for device control, they are normally intended for triggering external alarm annunciation. To an IT person such a device seems ripe for cyber compromise via its numerous, ‘obvious’ attack vectors: it has a bi-directional communication channel, it has one or more interfaces (including USB) for the connection of portable media, and it allows for the over-writing and replacement of its programming. And lastly, it could be tricked into ignoring alarm conditions or into creating false alarms.
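
As an aside, the alarm-contact behavior just described is nothing more than limit checking on designated inputs, along the lines of the sketch below. The channel names and limits are hypothetical, and a real recorder would typically add deadband/hysteresis and latching options.

# Sketch of the alarm-contact behavior: each designated input has user-defined
# limits, and crossing a limit energizes the associated contact output (normally
# wired to external annunciation, not to control). Channels/limits are examples.

alarm_limits = {
    "boiler_drum_level": {"low": 20.0, "high": 80.0},   # percent
    "feedwater_temp":    {"low": None, "high": 230.0},  # deg C, high limit only
}

def evaluate_contacts(measurements):
    """Return which contact outputs should be energized for these readings."""
    contacts = {}
    for channel, value in measurements.items():
        limits = alarm_limits.get(channel)
        if limits is None:
            continue                                     # channel is not alarmed
        low, high = limits["low"], limits["high"]
        in_alarm = (low is not None and value < low) or \
                   (high is not None and value > high)
        contacts[channel] = in_alarm
    return contacts

print(evaluate_contacts({"boiler_drum_level": 15.2, "feedwater_temp": 210.0}))
# {'boiler_drum_level': True, 'feedwater_temp': False}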

Putting aside for the moment the value to the attacker of messing up a trend recorder, either to inaccurately display process measurements or to improperly process alarm conditions, the supply chain issue would be pretty much the same for this kind of device as it was for the smart transmitter example presented previously. The major difference here is that field alteration of the firmware is possible, as long as a malicious (or manipulated/unwitting) insider can get access to the device and insert specially prepared portable media. Sounds simple, right? Actually, many such devices store their entire operating program in flash (non-volatile) memory and copy it to RAM when they are rebooted/turned on. Firmware updates overwrite the entire contents of flash with an entirely new operating program, and this is done under user control via an integral menu system (not just because you inserted portable media). Unlike a PC, where you can add a new program/task to the running operating system, in this case you must provide complete program code that performs ALL of the normal trend recorder functions and then also includes the additional malicious functionality and trigger logic. Because many such devices use custom hardware designs and low-end CPUs (to keep manufacturing costs low), the optimal approach would be to reverse-engineer/disassemble the program code in the flash memory and then alter it appropriately. This is no small feat of engineering, and that brings us back to the question of the value to the attacker of expending such an effort. And we haven’t touched on how that new code would get onto portable media that is then carried into the control room and used to update the recorder. Also note that all such changes could be reversed merely by reloading the prior version of non-altered firmware.
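
One practical consequence of that monolithic-image design is that tampering is both coarse-grained and detectable: the whole flash image gets replaced, so comparing an update file (or a dumped image) against the vendor’s known-good copy reveals any alteration, and reloading the prior image undoes it. Below is a minimal sketch of that check, assuming the operator has kept the vendor’s original image or its published hash; the file names are hypothetical.

# Sketch of the point about monolithic firmware: the recorder's operating
# program is one flash image, so a tampered image can be detected by comparing
# it against the vendor's known-good copy, and reversed by reloading the prior
# image. File names below are hypothetical.

import hashlib

def image_digest(path):
    """SHA-256 of a firmware image file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_before_flashing(candidate_image, known_good_digest):
    """Refuse the update unless the image matches the vendor-published hash."""
    return image_digest(candidate_image) == known_good_digest

# Usage (hypothetical file names):
#   good = image_digest("recorder_fw_v3.2_vendor_copy.bin")
#   ok = verify_before_flashing("recorder_fw_on_usb_stick.bin", good)
#   if not ok: reject the media and reload the prior, unaltered image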

But what about the MODBUS communication link to the trend recorder? Doesn’t it need to be protected? If I can tap into that communication link, can’t I wreak some havoc without having to deal with inserting modified firmware into the device? The answer depends on what functionality is supported by the link. In this case the assumption is that the trend recorder provides a read-only capability that enables some other system to fetch its current input values (probably via a READ MULTIPLE REGISTERS command). Injecting false message traffic on the communication channel can be done to ‘spoof’ either end of the channel: an attacker can send falsified responses, incorporating bad data, to a read register command so that the ‘other system’ sees bad values. Or the attacker can send messages to the trend recorder.
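
For anyone who has not looked inside MODBUS-RTU, the sketch below shows roughly what that read-only polling traffic looks like and why spoofing is so easy on the wire: a frame is nothing but an address, a function code, some data and a CRC, with no authentication of who actually produced it. Function code 0x03 (Read Holding Registers, i.e. reading multiple registers) is assumed, and the slave address and register values are made up for the example.

# MODBUS-RTU has no authentication: a frame is just address, function code,
# data and a CRC-16, so anyone on the wire can fabricate either side of the
# exchange. Function 0x03 (Read Holding Registers) is assumed; the slave
# address (0x11) and register values are hypothetical.

def crc16_modbus(frame: bytes) -> bytes:
    """Standard MODBUS CRC-16 (poly 0xA001), appended low byte first."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc.to_bytes(2, "little")

def read_registers_request(slave, start, count):
    """Poll: 'give me <count> registers starting at <start>'."""
    body = bytes([slave, 0x03]) + start.to_bytes(2, "big") + count.to_bytes(2, "big")
    return body + crc16_modbus(body)

def registers_response(slave, values):
    """Reply carrying 16-bit register values."""
    data = b"".join(v.to_bytes(2, "big") for v in values)
    body = bytes([slave, 0x03, len(data)]) + data
    return body + crc16_modbus(body)

# The legitimate exchange: the 'other system' polls the recorder for its inputs.
print(read_registers_request(0x11, 0, 4).hex(" "))
print(registers_response(0x11, [4017, 3988, 12, 750]).hex(" "))

# A spoofed reply is just another well-formed frame with fabricated values.
print(registers_response(0x11, [0, 0, 0, 0]).hex(" "))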

Of course, the trend recorder probably only supports two MODBUS commands: read register and read multiple registers. Any other command (or those commands with bad parameters), or any malicious message sequence, will be rejected by the recorder. And to sustain that spoofing, the attacker would have to disconnect the recorder and connect (and leave in place) a computer device that continues to respond to data requests with bad values, something that might be hard to do without someone noticing. Again that brings us back to the question of the value to the attacker of expending such an effort. Mind you, there can be MODBUS communications that offer a serious attack value, particularly if they provide for remote operation and control of plant equipment. But each case needs to be considered for the particular access and impact that could be provided to an attacker who can get onto the communication channel.
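
Continuing the same sketch, this is roughly what the recorder-side handling looks like: only the two read functions are recognized, so anything else, including a write attempt, is bounced back as a MODBUS ‘illegal function’ exception. Again, the slave address and register contents are made-up examples.

# Recorder-side handling: only the two read functions are implemented, so any
# other function code (including write attempts) gets a MODBUS exception reply,
# "illegal function". Slave address and register contents are hypothetical;
# crc16_modbus is the same helper as in the previous sketch, repeated here so
# this example stands alone.

def crc16_modbus(frame: bytes) -> bytes:
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc.to_bytes(2, "little")

SLAVE_ADDR = 0x11
REGISTERS = [4017, 3988, 12, 750]        # current input values, already scaled
SUPPORTED_FUNCTIONS = {0x03, 0x04}       # read holding / read input registers

def handle_frame(frame: bytes) -> bytes:
    addr, function = frame[0], frame[1]
    if addr != SLAVE_ADDR or crc16_modbus(frame[:-2]) != frame[-2:]:
        return b""                                     # not ours, or corrupted: ignore
    if function not in SUPPORTED_FUNCTIONS:
        body = bytes([addr, function | 0x80, 0x01])    # exception 0x01: illegal function
        return body + crc16_modbus(body)
    start = int.from_bytes(frame[2:4], "big")
    count = int.from_bytes(frame[4:6], "big")
    if start + count > len(REGISTERS):
        body = bytes([addr, function | 0x80, 0x02])    # exception 0x02: illegal data address
        return body + crc16_modbus(body)
    data = b"".join(v.to_bytes(2, "big") for v in REGISTERS[start:start + count])
    body = bytes([addr, function, len(data)]) + data
    return body + crc16_modbus(body)

# A 'write single register' attempt (function 0x06) is simply bounced back:
attack = bytes([SLAVE_ADDR, 0x06, 0x00, 0x00, 0xFF, 0xFF])
attack += crc16_modbus(attack)
print(handle_frame(attack).hex(" "))     # address, 0x86, 0x01, CRC: request refused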

I could continue the discussion by working my way up through other smart devices such as protective relays, digital PID controllers, analyzers, smart control elements, etc. But I hope the point has been made. Just because a device is “smart” does not necessarily mean that it is cyber attackable, or that the effort an attacker would have to put into compromising it would be worth the “bang for the buck.” Each case needs to be looked at in light of the technical and functional capabilities and design of the particular device (or class of device). But by doing such assessments it should be possible to justify the correct level (if any) of cyber protection provided for your smart devices.

Cyber security for our essential/critical digital automation assets is important. But cyber security costs time and money and consumes personnel resources, and thus its application needs to be focused where it does the most good. Attempting to protect things that have little likelihood of being successfully compromised is wasteful and unproductive. It isn’t actually all that hard to develop criteria that help you categorize your digital assets into groups that require no, some, or even a lot of cyber security protection so that you can optimize your efforts. But that will have to be the subject matter for a future column.
 

About the Author

Dr. Shaw is a Certified Information Systems Security Professional (CISSP), a Certified Ethical Hacker (C|EH) and a Certified Penetration Tester (CPT), and has been active in designing and installing industrial automation for more than 35 years. He is the author of Computer Control of BATCH Processes and CYBERSECURITY for SCADA Systems and is the co-author of the latest revision of Industrial Data Communications. Shaw is a prolific writer of papers and articles on a wide range of technical topics and has also contributed to several other books. He has developed, and is an instructor for, a number of ISA courses. He is currently Principal & Senior Consultant for Cyber SECurity Consulting, a consultancy practice focused on industrial automation security and technologies. Inquiries, comments or questions regarding the contents of this column and/or other security-related topics can be emailed to timshaw4@verizon.net.