December 22, 2024

But all my friends are doing that!
SECURITY SESSIONS: Volume 8 No. 2

by William T. (Tim) Shaw, PhD, CISSP
Welcome to this installment of Security Sessions, a regular feature focused on security-related issues, policies and procedures. Several cyber security incidents have gotten industry media attention over the past couple of years and have started people thinking about how to protect the ‘less than obvious’ potential avenues of cyber attack. That’s a good thing. But one bad thing I still find when evaluating and planning cyber security for industrial automation systems is the lack of a realistic assessment of the value and impact (particularly the negative impact) of some of the things done to try to protect them. On occasion it seems to me that the cure has in fact made things worse. Blindly applying security controls just because they are called out in some standard or recommended practice may lead to unnecessary expense and an actual reduction in your security posture. – Tim.

In previous columns I have made the point that blindly applying IT security mechanisms and practices to industrial automation systems is not always (or maybe ever) a good idea. There are many security controls and practices that make very good sense in an IT environment, but which may actually have a negative impact on overall cyber security if applied to a plant automation system and plant environment. But rather than just repeating that simple statement, this time I would like to give some prime examples.

The helpful and talented folks at NIST (the National Institute of Standards and Technology) have published a long list of recommended cyber security best practices. I consider their “800 series” documents to be mandatory reading for anyone who considers themselves to be a cyber security expert. One of their documents (specifically Special Publication 800-53, “Recommended Security Controls for Federal Information Systems and Organizations”) is a comprehensive set of recommended technical and administrative security “controls” for application to government IT facilities and IT implementations. I have often heard this document referenced when discussing how to implement cyber security in industrial settings, but I do not believe that was what the authors intended, as they are very clear about its ‘IT’ orientation. The authors also encourage making informed decisions about the application of their recommendations. Let me enumerate some of the controls from that document that are clearly questionable in an industrial setting:

AU-5 Response to Audit Processing Failures – This control deals with maintaining system use and access logs and suggests (as one alternative) shutting down a computer (preferably failing over to a backup, if one is available) if it loses the ability to continue generating and storing audit records. Loss of auditing helps to hide the actions of an attacker who is tampering with a system, so some attackers intentionally shut down auditing functions. Shutting the computer down, however, would NOT be the typical recommended course of action with an industrial automation system, since maintaining control of the process is paramount.
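
To make the contrast concrete, here is a minimal Python sketch (the log location and free-space threshold are invented for the example; they do not come from AU-5 or any vendor product) of the two possible responses to a loss of audit capacity:

    import logging
    import shutil

    AUDIT_LOG_DIR = "/var/log/audit"      # assumed location of the audit records
    MIN_FREE_BYTES = 50 * 1024 * 1024     # illustrative threshold for "capacity lost"

    def audit_capacity_ok() -> bool:
        """Return True if audit records can still be generated and stored."""
        try:
            usage = shutil.disk_usage(AUDIT_LOG_DIR)
        except FileNotFoundError:
            return False
        return usage.free > MIN_FREE_BYTES

    def on_audit_failure_it_server():
        # The AU-5 style response: stop (or fail over) rather than run unaudited.
        raise SystemExit("audit capacity lost -- halting per security policy")

    def on_audit_failure_control_system():
        # The plant-floor alternative: alarm loudly, but keep controlling the process.
        logging.critical("audit capacity lost -- notify security staff, keep running")

    if __name__ == "__main__":
        if not audit_capacity_ok():
            on_audit_failure_control_system()

The whole trade-off sits in that last line: in an IT data center the first response is defensible, but on a running process the second one is.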

SC-20 Secure Name/Address Resolution Service – This control deals with making sure that when a user or application needs to convert a computer or domain name into an IP address, this process isn’t co-opted by an attacker providing a false address (something called ‘DNS poisoning’). This functionality is mainly needed by users sending email and browsing the World Wide Web. Users of industrial automation systems really ought not to be doing such things, as doing so implies connectivity with the Internet. In addition, most industrial automation systems are composed of a small number of computers, so it is viable to maintain a fixed, pre-defined list of the computer names and addresses (a “hosts” file) that the system needs to know about.
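
As an illustration, the Python sketch below (the node names and addresses are made up) shows the fixed-list approach: names are resolved only from a local, pre-defined table, so a poisoned or spoofed DNS reply has nothing to redirect.

    # A small control network can resolve names without DNS at all.
    KNOWN_HOSTS = {
        "hist-server":   "192.168.10.5",
        "eng-station-1": "192.168.10.21",
        "opc-gateway":   "192.168.10.30",
    }

    def resolve(name: str) -> str:
        """Resolve a node name from the local, pre-defined table only."""
        try:
            return KNOWN_HOSTS[name]
        except KeyError:
            # No fall-back to DNS: an unknown name is treated as an error,
            # not as something to go ask an outside server about.
            raise LookupError(name + " is not an authorized node on this network")

    print(resolve("opc-gateway"))   # -> 192.168.10.30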

SC-17 Public Key Infrastructure Certificates – This control deals with obtaining and verifying digital certificates (“certs”) so that when your computer is making a connection to another computer, you can confirm its identity by asking a trusted third party (the supplier of the “cert”) to verify the certificate it offers as identification. This is great when doing on-line shopping, but again it implies a need to make arbitrary connections over the Internet – not something that should be happening with an industrial automation system. Someone will probably argue that you can use a PKI architecture within a plant or corporate network and not have any Internet connectivity involved. This is true, but then you end up with a central server acting as a certificate authority, which can be a potential single point of failure. Several years ago the California ISO used this approach and had the digital certificates in their equipment spread all around the state expire on the same day at the same time. The results were, needless to say, ugly. The various computers in their systems stopped talking to each other because they were told, by their own central certificate server, that the certificates being offered as identification were no longer valid.
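
That particular failure mode is easy to guard against with a periodic expiry check. The sketch below is purely illustrative (it assumes the third-party Python cryptography package and a hypothetical directory of PEM certificate files) and simply warns when any certificate in the fleet is within 90 days of expiring:

    import datetime
    import pathlib
    from cryptography import x509   # third-party package; an assumption for this sketch

    CERT_DIR = pathlib.Path("/opt/plant/certs")   # hypothetical location of the PEM files
    WARN_WINDOW = datetime.timedelta(days=90)

    def expiring_soon():
        """Yield (file name, expiry date) for certificates nearing expiry."""
        if not CERT_DIR.is_dir():
            return
        now = datetime.datetime.utcnow()
        for pem_file in sorted(CERT_DIR.glob("*.pem")):
            cert = x509.load_pem_x509_certificate(pem_file.read_bytes())
            expires = cert.not_valid_after   # naive UTC; newer library versions also offer not_valid_after_utc
            if expires - now < WARN_WINDOW:
                yield pem_file.name, expires

    for name, expires in expiring_soon():
        print("WARNING: " + name + " expires " + expires.strftime("%Y-%m-%d") + " -- renew well before then")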

SC-25 Thin Nodes – Managing all applications and databases centrally and only offering users a “remote desktop” makes software license management, malware scanning and software updating/patching much easier. Thus, an IT system manager may greatly prefer a thin-client design. But industrial control systems may not benefit from such a design. Most distributed control system (DCS) designs, including those based on PLCs, are in fact “fat client” designs in which each workstation (operator or engineering) contains a copy of all of the software needed by its various users. This is usually done as a means of providing fault tolerance and redundancy. Such a design does not have a single point of failure, such as a central server where all applications actually run. This is not to say that some industrial automation vendors have not gone to a thin-client design; there are SCADA systems that have that architecture. But it is not appropriate in every application.
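
A toy illustration of that single-point-of-failure argument (the class names are invented for this sketch, not taken from any product): the thin client can show nothing when the central server is unreachable, while the fat client keeps working from its local software copy.

    class ThinClientWorkstation:
        def __init__(self, central_server_up: bool):
            self.central_server_up = central_server_up

        def open_operator_display(self) -> str:
            if not self.central_server_up:
                return "NO DISPLAY: application server unreachable"
            return "display served from the central application server"

    class FatClientWorkstation:
        def open_operator_display(self) -> str:
            # All required software is installed locally, so loss of a
            # central server does not take the operator display down.
            return "display runs from the local software copy"

    print(ThinClientWorkstation(central_server_up=False).open_operator_display())
    print(FatClientWorkstation().open_operator_display())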

SC-13 Use of Cryptography – This control, as specified in the NIST document, is primarily intended to protect the confidentiality of sensitive/secret information when transmitted over a network or retained in computer storage. There are industrial automation systems that contain confidential information related to trade secrets (e.g., product formulations) and competitive marketing data. But for the most part the information in most industrial automation systems is not worth the effort to encrypt, and in fact encryption/decryption could have a negative impact on performance and timing. Encryption capabilities don’t tend to exist within current process controller and PLC product offerings, so adding them would be messy and of questionable value. Encryption may well be appropriate for interconnections between independent systems that need to exchange data – for example, between a regional reliability coordinator or ISO and an EMS system – but that is not so much for confidentiality as to ensure that message traffic over an insecure network cannot be “spoofed” (i.e., fake messages sent with bad/malicious data values).
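
To illustrate that distinction, here is a brief sketch (the shared key handling and the message format are invented for the example) of authenticating rather than encrypting a message: the data travels in the clear, but a forged or altered message fails verification at the receiving end.

    import hashlib
    import hmac

    SHARED_KEY = b"pre-shared key distributed out of band"   # illustrative only

    def sign(message: bytes) -> bytes:
        return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

    def verify(message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(sign(message), tag)

    reading = b"2024-12-22T10:15:00Z,feeder-7,breaker=CLOSED,amps=412"
    tag = sign(reading)

    # The payload is readable by anyone -- confidentiality is not the goal --
    # but a spoofed or modified message will not carry a valid tag.
    assert verify(reading, tag)
    assert not verify(b"2024-12-22T10:15:00Z,feeder-7,breaker=OPEN,amps=0", tag)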

SC-10 Network Disconnect – This control requires that ‘sessions’ (communications between computers) be terminated after a defined period of inactivity. This is so that a user who walks away from their workstation/PC (or goes home for the day) gets logged out automatically, thus preventing a take-over of their still-active workstation by someone for nefarious purposes. In many plant control rooms there are operator workstations that communicate with plant controllers and display constantly-updated plant information, but which do not receive any user input (i.e., mouse movement or keyboard activity) for hours or even days on end. Having an operator workstation go ‘blank’ and require a user login to reactivate it, just because no operator banged on a key or wiggled a mouse in the last 15 minutes, would not be considered acceptable in most plants I’ve visited. In fact, it would be seen as dangerous and a safety risk.
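
One way to reconcile the control with control-room reality is to scope it, as in this illustrative sketch (the role names and the 15-minute figure are assumptions, not a product feature): idle interactive sessions are locked, but the view-only operator displays are left alone.

    import time

    IDLE_LIMIT_SECONDS = 15 * 60

    def should_lock(session_role, last_input_time, now=None):
        """Apply the inactivity timeout only where it does not create a safety risk."""
        now = time.time() if now is None else now
        if session_role == "operator-display":
            return False                      # never blank the control-room displays
        return (now - last_input_time) > IDLE_LIMIT_SECONDS

    hour_ago = time.time() - 3600
    print(should_lock("engineering", hour_ago))        # True: lock the idle engineering login
    print(should_lock("operator-display", hour_ago))   # False: leave the operator console running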

I could come up with many more examples from the long list of recommended controls enumerated in that NIST document. Again, I am not saying that there is anything wrong with the controls recommended by NIST in Special Publication 800-53. Actually, I have high regard for the work done by the people at NIST in this area. I am saying, however – and I think the authors of that document would agree – that those controls were specifically identified as being appropriate and important in an IT environment. You need to consider their purpose and security basis when making the decision to apply them to an industrial control system in a plant environment.

Every control, be it a physical control (e.g., locked doors and cabinets), a technical control (e.g., a firewall or malware-scanning software) or an administrative control (also called operational and management controls; e.g., policies, procedures and training), addresses a potential threat, attack pathway or vulnerability. By understanding these you can make an informed choice about which controls make sense and which don’t. I have done some work recently in trying to write up the cyber security basis for each of the controls specified in the NIST document. I plan to offer some examples shortly, but that will have to be the subject matter for a future column. – Tim.
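
For what it is worth, even something as simple as the following sketch (purely illustrative, and not the actual write-up format used for that work) can record a control’s security basis next to the decision about whether it belongs in the plant:

    from dataclasses import dataclass

    @dataclass
    class ControlBasis:
        control_id: str        # e.g., "SC-10"
        control_type: str      # "physical", "technical" or "administrative"
        threat_addressed: str  # the threat, attack pathway or vulnerability it counters
        plant_decision: str    # how (or whether) to apply it in the plant

    sc10 = ControlBasis(
        control_id="SC-10",
        control_type="technical",
        threat_addressed="take-over of an unattended, still-active session",
        plant_decision="exempt view-only operator displays; apply to remote and engineering sessions",
    )
    print(sc10)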

About the Author

Dr. Shaw is a Certified Information Systems Security Professional (CISSP), a Certified Ethical Hacker (C|EH), a Certified Penetration Tester (CPT), and has been active in industrial automation for more than 35 years. He is the author of Computer Control of BATCH Processes and CYBERSECURITY for SCADA Systems. Shaw is a prolific writer of papers and articles on a wide range of technical topics; he has also contributed to several other books and teaches several courses for the ISA. He is currently Principal & Senior Consultant for Cyber SECurity Consulting, a consultancy practice focused on industrial automation security and technologies. Inquiries, comments or questions regarding the contents of this column and/or other security-related topics can be emailed to tim@electricenergyonline.com.