
The Bigger Picture | It Must be Working - the LEDs are Blinking!

by Dr. Tim Shaw

In my line of work I get the occasional opportunity to come in and examine the results of plant cyber security implementation programs within various organizations. And one general impression I have come away with is that the quality and effectiveness of those programs is invariably directly proportional to the skill-set and expertise of the team responsible for planning and implementation.

I suppose the response from some of you, to my beginning statement, would be a resounding “well, duh!” It should be obvious that a more experienced and talented team will produce better results. And yet surprisingly often I have witnessed plant cyber security programs staffed in a haphazard manner, seemingly by whoever didn’t run away screaming when the topic was raised. All too often the team officially consists, at least on paper, of a large number of personnel and consultants with wide-ranging expertise. But the reality is that most of those people never actually participate, or have little time to get involved. And so the actual planning and implementation often falls to a handful of plant engineering staff who may have limited access to corporate IT people if they have questions (and know how to ask nicely). As could be predicted, the net result is usually a cyber security program that takes much longer to complete than expected, squanders funds and manpower on things that provide no cyber security benefit, and leaves the plant’s automation systems still vulnerable to cyber compromise. Even worse, the plant and project team usually think they are properly protected.

In prior articles I have discussed the differences between IT cyber security and industrial automation cyber security, and the fact that IT ‘best practices’ do not always translate well (or can be downright dangerous) in an industrial automation environment. So it is usually not a good idea to let IT be in charge of planning and executing a plant cyber security program. And yet the IT department may have the only available expertise in things essential to a cyber security program, such as Ethernet switches and routers and firewalls. Plant engineering and automation personnel have had to become familiar with some things that used to be the exclusive province of IT, such as Ethernet networks, because most I&C gear today comes with an Ethernet port. But there is a difference between being ‘familiar with’ something and being an expert, or even just reasonably knowledgeable. I have worked with plant engineers and instrument techs whose understanding of Ethernet switches was limited to: ‘if the little lights are blinking then it must be working’. They had no idea of the various settings that could impact the way the switch worked (including stopping it from working), such as QoS (quality of service) or VLAN parameters. They didn’t know that someone could get into the switch via telnet and mess things up from elsewhere on the network of which the switch was a part. They didn’t know that you could prevent that from happening by setting a password on the switch.
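One of the simplest checks that ‘familiar with’ tends to miss is whether a switch will even answer a telnet connection. The sketch below is a minimal, hypothetical example (the function name and defaults are my own, not from any vendor toolkit) of probing for an open telnet port from a maintenance laptop:

```python
import socket

def telnet_open(host: str, port: int = 23, timeout: float = 2.0) -> bool:
    """Return True if `host` accepts a TCP connection on the telnet port.

    A managed switch that answers here, with no password configured,
    can be reconfigured by anyone else on the same network.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False
```

Running a probe like this against every switch on the plant network, and then disabling telnet (or at least setting a password) wherever it answers, is about as basic as network hygiene gets.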

But a lack of sufficient expertise and domain knowledge can be a problem for the IT folks as well. In one such instance, as their major task in a plant cyber security program, the corporate IT group was assigned responsibility for implementing a network intrusion detection system (NIDS) to monitor message traffic entering and exiting the plant and passing between the various systems and LAN segments within the plant. Such systems look at all messages in real-time and analyze them, both individually and as part of a series of message exchanges, in order to detect malicious message traffic. Doing this, depending on the underlying technology being used, often requires the development of ‘rules’ and ‘signatures’ that guide the analysis process. Those rules and signatures are protocol-specific, and the IT world has extensive experience in building up rules for commonly used protocols like http (web browsing), ftp (file transfer) and smtp (email). The problem, of course, is that industrial automation systems and devices make use of a lot of uncommon protocols (at least uncommon in IT circles) such as EtherNet/IP, PROFINET and Modbus/TCP.

And so in this instance the NIDS was configured with rules and signatures for all the protocols familiar to the IT folks, but in effect it was nearly blind to message traffic to and from and among the automation systems. If you want to send a malicious command to an automation system or maliciously manipulate parameters in such a system or device, you are going to speak to the system or device in a protocol it understands – probably one of those industrial protocols I was just mentioning. So watching for and analyzing that kind of message traffic is rather essential for cyber security in a plant environment. But hey, the LEDs were all blinking, so it must have been working, right?
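To make that concrete, here is a hypothetical sketch of the kind of protocol-aware check such a NIDS rule embodies, using Modbus/TCP as the example. The frame layout follows the published Modbus/TCP framing (a 7-byte MBAP header followed by the function code); the function and constant names are my own:

```python
# Modbus function codes that modify coils or registers in a device.
WRITE_FUNCTIONS = {5, 6, 15, 16, 22, 23}

def is_modbus_write(frame: bytes) -> bool:
    """Return True if a raw Modbus/TCP frame requests a write operation.

    The MBAP header occupies bytes 0-6 (transaction id, protocol id,
    length, unit id); the function code is the byte that follows.
    The protocol identifier is 0 for Modbus.
    """
    if len(frame) < 8:
        return False
    protocol_id = int.from_bytes(frame[2:4], "big")
    function_code = frame[7]
    return protocol_id == 0 and function_code in WRITE_FUNCTIONS
```

A NIDS that cannot make even this distinction, between a read request and a command that changes a setpoint or forces an output, is blind to exactly the traffic that matters in a plant.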

Sometimes the best of intentions, coupled with a lack of sufficient knowledge and inadequate training, can produce results that seem successful, at least on the surface, but are actually a failure. In another instance a plant engineer had been advised that a firewall needed to be installed at a certain point in the plant network to comply with some regulatory requirements. The IT department shipped him the firewall and the engineer installed it as directed. To the relief of the engineer, shortly after the firewall was powered up, all of the systems and equipment on both sides of the firewall seemed to return to their normal operation. (And yes, the LEDs were all blinking on the firewall’s front panel.) To the plant engineer this was the definition of success, and it was only during a cyber security audit the following year that it was discovered that the only rules in the firewall were the factory defaults of ‘allow everything going in’ and ‘allow everything going out’. (For those not conversant with firewalls, ‘rules’ are conditions checked against a message in order to decide whether to let it pass through the firewall or to block it.) So rather than actually functioning as a firewall, the device was merely an expensive space-heater with blinking LEDs. The engineer thought that the IT folks had the firewall all set up and configured, and the IT folks thought the plant personnel knew about setting up firewalls. A friend of mine used to call that ‘mutual mystification’ and I believe that term describes it nicely. In that case, even if the IT department had tried to pre-configure the firewall prior to shipping it to the plant, it probably would only have been configured with rules associated with the commonly used IT protocols and not with any for the industrial protocols used at the plant. And even if the plant engineer had realized the need to put rules into the firewall, and was well aware of the industrial protocols being used, he did not have the training and skills to develop the necessary rules and configure them into the firewall.
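For readers who have never looked inside a rule set, a toy model helps show why the factory defaults were worthless. This is not any real firewall’s syntax, just a sketch of first-match rule evaluation contrasting a single ‘allow anything’ rule with a policy that only permits Modbus/TCP on its registered port, 502:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    action: str              # "allow" or "deny"
    dst_port: Optional[int]  # None matches any destination port

def evaluate(rules, dst_port: int) -> str:
    """Return the action of the first rule matching dst_port.

    Falls through to "deny" when nothing matches -- a sensible
    default the factory configuration in the story did not have.
    """
    for rule in rules:
        if rule.dst_port is None or rule.dst_port == dst_port:
            return rule.action
    return "deny"

# The factory default from the story: one rule that allows everything.
factory_default = [Rule("allow", None)]

# A minimal plant policy: permit only Modbus/TCP (TCP port 502).
plant_policy = [Rule("allow", 502)]
```

Under the factory default, a telnet attempt (port 23) sails through; under even this one-line plant policy it is dropped. Writing a real rule set is more involved, but the principle is no deeper than this.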

Lack of experience and appropriate expertise can also lead to situations which start as a success but end up as a failure. In another instance a plant had worked with corporate IT to design and build a portable media anti-virus (AV) scanning station, and had put in place work procedures that required all portable media (but especially USB ‘thumb drives’) to be AV scanned prior to being used in any plant automation system. Most AV software makes use of unique code/data fragments extracted from captured malware (generally called ‘signatures’) to identify the presence of such malware on the portable media. The problem is that new and constantly changing malware requires those signatures to be regularly updated in order for the AV scanning to remain effective. When corporate IT set up the scanning station they ensured that the latest and greatest AV signatures were loaded. But they also assumed that plant personnel would be responsible for keeping the signature library updated. It turned out that the one plant engineer who was instructed in how to perform updates left the company a few months later, and no one else took over the task because the engineer in question had never trained his replacement or formally documented the procedure. A subsequent cyber security audit discovered that the AV signatures were over a year out of date, which means that for over a year the effectiveness of the AV scanning had been steadily dropping. But when media was inserted into the scanning station the LEDs still started blinking, so it was presumed to be working.
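The missing control in that story would have been trivial to automate. Here is a hedged sketch (the names and the seven-day threshold are my own assumptions, not any standard) of a signature-age check that could run on a schedule and raise an alert:

```python
import datetime

# Assumed policy: signatures older than this are considered stale.
MAX_AGE = datetime.timedelta(days=7)

def signatures_stale(last_update: datetime.datetime,
                     now: datetime.datetime,
                     max_age: datetime.timedelta = MAX_AGE) -> bool:
    """Return True if the AV signature library is older than max_age.

    A check like this, run daily and wired to an alarm or email,
    would have flagged the year-old signatures long before the
    audit did.
    """
    return (now - last_update) > max_age
```

The point is not the few lines of code; it is that ‘is this control still doing its job?’ needs to be somebody’s documented, monitored responsibility rather than an assumption.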

So why bring up all these examples? In every one of these cases the people involved were trying to do the right thing, and even thought they had done a good job. But because of a lack of communication and coordination among the various groups supporting the cyber security implementation program, and a lack of applicable technical expertise, the actual results were less than stellar. In each case cited, the result was a lack of any cyber security benefit from the labor and monies expended, and a false sense of being cyber secure. And it really didn’t have to happen that way. If appropriate personnel resources had been allocated, and if the necessary expertise had been made available at the right points in the program, even if that required going outside the organization, possibly to a knowledgeable vendor, then all of these examples of failure could have been examples of success. Staffing a team tasked with establishing a plant cyber security program requires management commitment and a recognition of both the diverse skill-sets needed and the amount of time each individual assigned to the team must be prepared to devote to the effort. It is important to consider where your organization may have technical weaknesses in critical areas, and to be prepared to seek out external resources to shore up those weaknesses. As I said back at the beginning, this stuff is pretty obvious. It may fall into the category of project management 101. And yet time and time again organizations look for false shortcuts and cost savings that will end up costing them more in the long run. I have many more such tales I could relate to you, but the LED on my laptop has stopped blinking, and so those will have to be the subject matter for a future column.
 

About the Author

(William) Tim Shaw (PhD, CISSP, C|EH, CPT) has been active in industrial automation for more than 35 years and is the author of Computer Control of BATCH Processes and CYBERSECURITY for SCADA Systems and co-author of Industrial Data Communications. Tim has contributed to several other books, and is a prolific writer of papers and articles on a range of technical topics. Tim has been directly involved in the development of several DCS and SCADA system products and regularly teaches courses for the ISA and the University of Kansas on a range of topics from cyber security to process automation. Inquiries or comments about this column may be directed to Tim at timshaw4@verizon.net.




