November 23, 2024

Security Sessions
Volume 2 No. 1

by William T. (Tim) Shaw, PhD, CISSP

Not too long ago, on one of the SCADA cyber security blogs that I routinely monitor, this interesting question was posed: What is the difference between “security” and “compliance”? It’s a relevant question because as of today, most ‘entities’ that are subject to the NERC CIP guidelines need to be in compliance with those requirements, since the schedule of implementation issued by NERC requires them to have achieved “compliance” – and for some, “auditable compliance” – right about now.

In a prior column I provided this dictionary definition of Security: Safety; freedom from worry; protection. By contrast, the definition of Compliance is: The act or process of complying to a desire, demand, proposal or coercion; or alternatively: Conformity in fulfilling official requirements. In my experience, I’ve found that compliance is frequently addressed as a mixture of the two definitions. Very often it amounts to a legalistic strategy whereby an organization does just enough to meet official requirements under threat of legal and financial coercion.

In the case of the NERC CIP rules, it appears that some utilities may have taken the approach of doing just enough to claim compliance in order to avoid being hit with huge fines and also to be able to claim due diligence and proper governance in a court of law, should a legal question or challenge ever arise. The salient point here is that security and compliance are very definitely not synonymous.

One would hope that well-intended official requirements such as the NERC CIP rules would – if properly addressed and implemented – lead to being secure. However, that may not be the case. Without a doubt, the underlying objectives of the NERC recommendations – starting with the original NERC-1200 and NERC-1300 standards and continuing through the CIP rules – have all sought to provide guidance and to suggest best practices for establishing acceptable levels of security for critical infrastructure assets.

Among the most often voiced concerns about the CIP guidance is that it leaves far too much to the unilateral discretion of the target entities. There are several areas where the rules get highly specific, such as requiring port scanning. But others – such as the selection of an assessment methodology – leave the interpretation of the rules (as one of my college professors used to say) “to the reader as an exercise.” This has created a firestorm of arguments and articles about what NERC could/should do to improve the CIP guidance.
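To make the “highly specific” end of that spectrum concrete, the kind of port scanning the rules call for can be approximated with a simple TCP connect scan. The sketch below is purely illustrative – the host, port list and timeout are my own arbitrary choices, not anything the CIP documents prescribe:

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds (port open)
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example: probe a few well-known service ports (502 = Modbus/TCP)
print(scan_ports("127.0.0.1", [22, 80, 443, 502]))
```

In practice an entity would run such a scan against its own critical cyber assets to confirm that only the expected service ports are listening.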

Realistically, a tightly specified ‘one-size-fits-all’ approach would have been far worse than what we have, and I can imagine that it would have generated just as much, if not more, controversy and debate. I have always interpreted the NERC approach as allowing the various entities the flexibility needed to adjust their compliance response, within a limited range of variability, to their particular situations. But the question remains: if you comply, are you also secure? If all an organization aims for is the minimum necessary to achieve legal compliance, then the odds are good that its actual security level won’t be all that great.

One of the reasons many organizations fall short of achieving a (legitimately) secure environment is that effective security requires a management commitment and ongoing management and employee support.

If employees sense that management is treating security as just another bureaucratic process to be endured, they won’t ever take it seriously. And that is a very big mistake – one that can actually expand vulnerability rather than diminish it.

Many of the technical and physical mechanisms (i.e., security countermeasures) put in place for the purpose of cyber/electronic and/or physical security can be inadvertently – or in some cases, deliberately – neutralized by the imprudent actions of an employee. Passwords are a good example. If employees think security isn’t important, they won’t bother following the IT department’s advice to pick hard-to-guess passwords, mainly because that usually makes passwords harder to remember. And writing passwords down and leaving them in obvious places – like on a sticky note slapped onto a PC screen – is another no-no.

Likewise, if no one really worries about security then why bother making doubly sure that doors are locked or that sensitive information is properly disposed of, not to mention obsolete company computer equipment? Remember, poorly trained, uninformed or unmotivated employees can easily neutralize all of that money spent on the effective security hardware, software and procedures. Bottom line: A password like “password” is no protection – nor is a lock that isn’t locked!
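The “password” example can be made concrete with a toy strength check – a simplified sketch of the sort of rule an IT department might enforce, where the word list and thresholds are arbitrary choices of mine rather than any official requirement:

```python
def is_weak(password, min_length=8):
    """Flag passwords that an attacker would guess almost immediately."""
    common = {"password", "letmein", "123456", "qwerty", "admin"}
    if password.lower() in common:
        return True                      # on every attacker's first-guess list
    if len(password) < min_length:
        return True                      # too short to resist brute forcing
    classes = [any(c.islower() for c in password),
               any(c.isupper() for c in password),
               any(c.isdigit() for c in password)]
    return sum(classes) < 2              # require at least two character classes

print(is_weak("password"))            # → True: the classic non-protection
print(is_weak("Tr4verse-Relay-9"))    # → False: long, mixed character classes
```

Of course, no mechanical check substitutes for employees who actually care whether their passwords protect anything.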

One of the reasons that most of the individual CIP standards specify that a senior manager (or delegate) must “review and approve” security policy is to elevate the visibility of security to a management level and make at least the designated senior manager accountable for security. Security is not a one-shot process. Whatever you put in place last year may not be adequate next year, based on changes in technology, changes in your business objectives, or changes in the security “threatscape” as it is sometimes called.

So, in order to remain secure you need an evolving security management program; one that has committed staffing, a firm budget and management priority. If security concerns and objectives are not treated as one of the annual business processes and employees are not periodically reminded to consider security as part of their daily job requirement, security effectiveness will suffer. On the other hand, a well-trained and motivated employee base can significantly improve overall security, quite possibly more than any other factor or strategy.

On another front, we often make the mistake of thinking that cyber attacks on computer systems depend solely on electronic communications vulnerabilities. While that is certainly one dimension of cyber attacks, it is usually the final step – not the first one. In preparation for an attack, the perpetrator(s) will usually have already spent considerable time and effort gathering information for the ultimate attack using a range of reconnaissance methods.

Devastating and well-publicized cyber attacks against commercial firms often have been successful because the attackers used so-called “social engineering” techniques to gather vital information prior to launching the electronic attack. Stealing or buying a company computer, going through the company dumpster, calling in to the various departments, tricking people into giving out passwords and even physically entering the company facilities are all tactics that have been successfully used to gather information that enables successful attacks. It is likely that someone planning to attack a SCADA system will learn from those successful commercial attacks and try the same strategies. However, a well-thought-out security policy that provides clear guidance and specific rules for employee behavior can help thwart these common data-gathering methods.

Operational security is the general term and category most often used to encompass things like employee training and motivation, the use of background checks and the administration and enforcement of company policies and procedures. These measures are also sometimes referred to as Administrative Countermeasures. Operational security is but one of the three main aspects of overall security, along with physical and cyber/electronic security. It is a necessary and vital component of a security program and at the forefront of several of the NERC CIP rules. But, more on that in a future column…

About the Author
William T. “Tim” Shaw (PhD, CISSP) has been active in industrial automation for more than 30 years and is the author of Computer Control of BATCH Processes and CYBERSECURITY for SCADA Systems. Tim has contributed to several other books and is a prolific writer and presenter on a range of technical topics. He is currently a senior security consultant for SecuriCon, an information security solutions firm, based in Alexandria, Virginia. Tim has been directly involved in the development of several DCS and SCADA system products and regularly teaches courses for ISA (International Society of Automation) on various topics. Inquiries or comments about this column may be directed to Tim at Tim@electricenergyonline.com.