November 22, 2024

Power to the (security) people!
SECURITY SESSIONS

by William T. (Tim) Shaw, PhD, CISSP / C|EH / CPT
There was a television program in recent years that involved firing questions at a group of people, round-robin, until one of them was designated “the weakest link” (and called that name by a shrill British woman who seemed to enjoy doing so). My point is not about the poor quality of television programming, but rather the notion that a person in your organization could be the weakest link in your security program. The latest statistics still show that the majority of physical and cyber security breaches are enabled by the actions, or inactions, of personnel, possibly even personnel with security responsibilities. Let’s consider how and why this continues to happen.


Today most organizations have begun to accept the fact that both physical and cyber security are essential aspects of their business operations, whether because they are regulated (e.g. electric utilities, nuclear power plants) and required to address security in order to avoid fines and sanctions, because they are subject to specific laws (such as HIPAA and SOX), or because they have been ‘shown the light’ by their underwriters, who have explained that increased risk exposure means increased insurance rates. Some organizations take security seriously, particularly those that have experienced or witnessed security-breach impacts. Others are still thinking that ‘it can’t happen here’ and just doing the minimum required to avoid fines and legal liabilities. You would think that after learning about Stuxnet, and seeing company after company announce data theft, the non-believers would see the light. I guess it is like climate change (a.k.a. ‘global warming’): you either believe it is happening or you don’t, and no amount of evidence will change your mind if you think it is a hoax.

But I digress (which happens more often as I grow older). Ignoring intentional malicious actions for the moment, why is it that people continue to be the weakest link in most security programs? Most studies and research show that it is because of inadequate training, or the total lack of applicable training. A corollary is that appropriate training was provided but never periodically refreshed. I don’t know about others, but my brain has a small leak: when I fill it with new information, that information immediately starts to drain away, and within a year it is as if I had never put anything in there at all. One of the reasons I like to teach is that it affords me a means of constantly refilling my leaky brain.

I have personally witnessed examples of cyber and physical security being badly messed up, in spite of the best efforts and intentions of the individuals involved, because of their lack of applicable training.

One such situation involved the implementation of an intrusion detection system (IDS) which, for those who may not know, is the cyber equivalent of a quality-control inspector who watches product passing down a production line and checks each item to make sure it meets the specifications. An IDS inspects every message passing through a communication channel, examines it for malicious content or improper structure, and may even evaluate it against prior related messages that it previously examined. An important part of that last sentence is “every message.” For an IDS to be effective, that condition (inspecting EVERY message) must hold. In the situation I am referencing, the person in charge of implementing the IDS had no formal training, or vendor-specific training, on the implementation of the IDS product. Moreover, they had limited knowledge of networking and of network elements such as Ethernet switches. On the plus side, however, they did have a book about IDS technology and the vendor’s product literature. The individual had decided to use a spare port on a switch to connect the IDS to the network. This switch had 16 ports, all running at 100 Mb/second (100 megabits per second, so-called ‘Fast’ Ethernet), and 14 of them were connected to various system components. They configured the switch to replicate all messages going in or out of all of the other ports and to send those replicated messages to the port where the IDS was connected (this is called port mirroring or, if you are a Cisco user, setting up a SPAN port). That all sounds good, but in reality it can’t work under heavy load conditions. There are 14 ports running at 100 Mb/second and only one port (also running at 100 Mb/second) into which all that traffic needs to flow to be delivered to the IDS.

If all 14 devices connected to the switch are sending message traffic at once, some of the replicated messages will be lost, which means that the IDS won’t see everything and can’t be trusted to identify attacks and malicious content such as viruses and worms. The person setting up the IDS ran some simple tests and everything seemed to work. Of course, that testing was done ‘off-line’ and not under actual operating conditions on the production systems.
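
To put numbers on that, here is a minimal back-of-the-envelope sketch (Python, purely illustrative) using the figures from this example: fourteen 100 Mb/s ports all mirrored to a single 100 Mb/s port. It assumes only one direction of traffic per port; mirroring both directions would roughly double the total.

# Back-of-the-envelope check of the mirrored-traffic scenario described above.
# Figures come from the example: 14 active ports on a 100 Mb/s (Fast Ethernet)
# switch, all replicated to a single 100 Mb/s mirror/SPAN port feeding the IDS.

FAST_ETHERNET_MBPS = 100   # line rate of each switch port, in megabits/second
ACTIVE_PORTS = 14          # ports whose traffic is replicated to the IDS
MIRROR_PORT_MBPS = 100     # egress capacity of the port the IDS is plugged into

# Worst case: every active port carries traffic at line rate at the same time.
worst_case_mbps = ACTIVE_PORTS * FAST_ETHERNET_MBPS

oversubscription = worst_case_mbps / MIRROR_PORT_MBPS
print(f"Worst-case mirrored traffic: {worst_case_mbps} Mb/s")
print(f"Mirror-port capacity:        {MIRROR_PORT_MBPS} Mb/s")
print(f"Oversubscription ratio:      {oversubscription:.0f}:1")
# 1400 Mb/s offered to a 100 Mb/s port is 14:1 oversubscription, so under heavy
# load the switch silently discards replicated frames that the IDS never sees.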

After we discussed the situation, and the bandwidth limitations were explained, the implementation was changed to use network ‘taps’ that aggregated all of the traffic, and the IDS was equipped with a gigabit Ethernet adapter so that under full-load conditions no messages would be lost. An equivalent physical-security analogy would be a video monitoring and surveillance system in which the cameras stopped working for a few minutes every so often. It was clear that the person setting up the IDS was well intentioned but had no idea that the setup would not be effective. It was equally clear that if the individual had been allowed to attend the IDS vendor’s training, the installation and configuration would have been done properly in the first place. Does this sound like a case of ‘penny wise and pound foolish’? Two issues were at play here: there was insufficient bandwidth to deliver all of the message traffic for examination, and the person lacked the expertise to ensure that the rules, signatures, and pattern-matching configuration of the IDS were set up properly.
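
For comparison, here is the same sort of sketch contrasting the original mirror-port arrangement with the tap-plus-gigabit-adapter fix. Only the 100 Mb/s and 1 Gb/s capacities come from the example above; the sustained per-port load used below is a hypothetical value chosen for illustration, not a measurement from the actual system.

ACTIVE_PORTS = 14
ASSUMED_LOAD_MBPS_PER_PORT = 40   # hypothetical sustained load per port

aggregate_mbps = ACTIVE_PORTS * ASSUMED_LOAD_MBPS_PER_PORT   # 560 Mb/s total

for label, capacity_mbps in (("100 Mb/s mirror port", 100),
                             ("1 Gb/s tap aggregation", 1000)):
    utilization = aggregate_mbps / capacity_mbps
    verdict = "drops traffic" if utilization > 1 else "keeps up"
    print(f"{label:23s}: {utilization:.0%} loaded -> {verdict}")
# Even at this modest assumed load the 100 Mb/s mirror port is more than five
# times oversubscribed, while the gigabit monitoring link runs at roughly 56%
# utilization and delivers every frame to the IDS.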

Another situation that I see on an all-too-regular basis is the successful use of social engineering to gain unauthorized entry into supposedly secure areas, as well as to compromise computers and obtain confidential information. This is another case where untrained (or trained-too-long-ago) personnel are the weakest link. Unlike the previous example, where I was dealing with people who were involved in implementing cyber security, susceptibility to social engineering ploys tends to be a problem with personnel in general and senior management in particular. Social engineering is basically tricking people into doing something they would not normally do, such as giving out confidential account information like a password. Intruders also use a method called ‘piggybacking’ (or tailgating), in which they get an authorized person to let them into a secure area on that person’s credentials. Social engineering attacks are best countered by providing all personnel with familiarization training (with periodic refreshers) and by having policies and procedures aimed at preventing such manipulations from being successful.

Today the most common means for cyber attackers to gain access to company computer systems and networks is the phishing attack. This is a form of social engineering that uses email to entice a victim into ‘clicking’ on a link or opening an attached document, which results in their PC being infected. There are technical countermeasures that can be deployed to block such attacks, but the constant arms race between hackers and your IT department is invariably won by the hackers: they are continually finding new exploitable vulnerabilities, so phishing attacks continue to be successful.
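
As a small illustration of the ‘link that isn’t what it appears to be’ trick, and of the kind of automated check a mail gateway can apply, here is a minimal Python sketch; the message body, domains, and class name in it are entirely made up for the example.

from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkChecker(HTMLParser):
    """Collects anchor tags whose visible text names a different host than the href."""
    def __init__(self):
        super().__init__()
        self._href = None       # href of the anchor currently being parsed
        self._text = []         # visible text collected inside that anchor
        self.suspicious = []    # (displayed text, real destination) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            shown = "".join(self._text).strip()
            real_host = urlparse(self._href).hostname or ""
            shown_host = urlparse(shown if "://" in shown else "http://" + shown).hostname or ""
            # Flag links whose visible text looks like a URL for a different host.
            if "." in shown_host and shown_host != real_host:
                self.suspicious.append((shown, self._href))
            self._href = None

# Hypothetical email body: the text claims to be a bank site, but the link is not.
body = '<p>Please verify your account at <a href="http://evil.example.net/login">www.mybank.com</a></p>'

checker = LinkChecker()
checker.feed(body)
for shown, real in checker.suspicious:
    print(f"Suspicious link: displayed '{shown}' but actually points to '{real}'")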

Regardless of the vulnerabilities in a browser, or in one of its ‘plug-ins’, the victim still has to ‘click’ on a link or open an attachment. If personnel are well trained to recognize social engineering ploys, and phishing scams in particular, the chances of their falling for one are greatly reduced. One organization I know of began social engineering familiarization training, including generating its own phishing scams to test personnel awareness, and recorded a dramatic decrease in PC infection incidents over a two-year period.

The bottom line is that skimping on training, both for the personnel responsible for your cyber and physical security and for your workforce as a whole, is probably going to end up costing you more than you would have spent providing the training.

I started this column by excluding intentional malicious actions on the part of personnel. In reality we don’t have that luxury; the malicious insider threat is one of the most serious. There are things that can be done to limit the amount of damage that can be caused by a malicious insider and things that can be done to reduce their ability to act without being detected … but that will be the subject matter for a future column.

About the Author

Dr. Shaw is a Certified Information Systems Security Professional (CISSP), a Certified Ethical Hacker (C|EH), and a Certified Penetration Tester (CPT), and has been active in designing and installing industrial automation for more than 35 years. He is the author of Computer Control of BATCH Processes and CYBERSECURITY for SCADA Systems. Shaw is a prolific writer of papers and articles on a wide range of technical topics and has also contributed to several other books. He has also developed, and is an instructor for, a number of ISA courses. Dr. Shaw is currently Principal & Senior Consultant for Cyber SECurity Consulting, a consultancy practice focused on industrial automation security and technologies. Inquiries, comments, or questions regarding the contents of this column and/or other security-related topics can be emailed to timshaw4@verizon.net.