The InfoSec Blog

The fatal flaw in IT Risk management

Posted by Anton Aylward

Interviewing is a much better method than self-certifications and a checklist, if time and resources allow.
Two points:

In the ISO-27001 forum, my friend and colleague Gary Hinson has repeatedly pointed out, and I fully support him in this, that downloading check-lists from the 'Net and adopting question lists from there is using a solution to someone else's problem. If that.

Each business has both generic problems (governments, sunspots, meteor strikes, floods & other apocalyptic threats and Acts of God) and ones specific to its way of working and configuration. Acts of God are best covered by prayer and insurance.

Gary recommends "open-ended questions" during the interview rather than ones that require a yes/no answer. That's good, but I see problems with that. I prefer to ask "Tell me about your job" rather than "Tell me how your job ... can be made more efficient".

My second point is that risk management will *ALWAYS* fail if the risk analysis is inadequate. How much of the RA should be done by interviewing people like the sysadmins I don't know, but I have my doubts. I look to the Challenger Disaster. I started in the aviation business, where we refined FMEA - Failure Mode Effect Analysis. Some people think of this in terms of "impact", but really it's more than that; it's also causal analysis. As Les Bell, a friend who is also a pilot and interested in aviation matters, has pointed out to me, "Root Cause Analysis" is no longer adequate: failure comes about because of a number of circumstances, and it may not even be a single failure - the 'tree' fans both ways!

Yes, FMEA can't be done blindly, but failure modes that pertain to the business - which is what really counts - and the fan-in/fan-out trees can be worked out even without the technical details. Rating the "risk" is what requires the drill-down.
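To make that concrete, here is a minimal sketch - my own illustration in Python, not anything from the aviation world - of the rating arithmetic classic FMEA uses: each failure mode gets a Risk Priority Number, RPN = severity x occurrence x detectability, and the drill-down effort goes to the highest-scoring modes first. The failure modes and the numbers are entirely hypothetical.

    # A hypothetical illustration of FMEA-style rating.
    # RPN = severity x occurrence x detectability, each rated 1-10
    # (a harder-to-detect failure scores higher on detectability).

    failure_modes = [
        # (failure mode, severity, occurrence, detectability)
        ("Backup job fails silently",          8, 4, 7),
        ("Firewall change goes unreviewed",    6, 5, 5),
        ("Single admin holds all credentials", 9, 3, 8),
    ]

    def rpn(severity, occurrence, detectability):
        """Risk Priority Number: the higher the score, the sooner the drill-down."""
        return severity * occurrence * detectability

    # Rank the failure modes so the drill-down goes where the business impact is.
    for name, s, o, d in sorted(failure_modes, key=lambda m: -rpn(*m[1:])):
        print(f"RPN {rpn(s, o, d):4d}  {name}")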

All of which gets back to Donn Parker's point in a number of his books, though he never states it this way. The FMEA tree can be heavily pruned using diligence, as he says: standards, compliance, contracts, audits, good practices, available products. The only things he leaves out are Policy and Training. Policy gives direction and is essential to any purpose: the choice of standards and products, and identifying what training is needed.

All in all, the article at https://blog.anitian.com/flawed-it-risk-management/ takes a lot of words to say a few simple concepts.


The #1 Reason Leadership Development Fails

Posted by Anton Aylward

http://www.forbes.com/sites/mikemyatt/2012/12/19/the-1-reason-leadership-development-fails/
I wouldn't have thought, based on the title, that I'd be blogging about this, but then again one can get fed up with purely InfoSec blogs, ranting and raving about technology, techniques and ISO27000 and risk and all that.

But this does relate somewhat to security awareness training, sort of ...

My problem with training per se is that it presumes the need for indoctrination on systems, processes and techniques. Moreover, training assumes that said systems, processes and techniques are the right way to do things. When a trainer refers to something as “best practices” you can with great certitude rest assured that’s not the case. Training focuses on best practices, while development focuses on next practices. Training is often a rote, one directional, one dimensional, one size fits all, authoritarian process that imposes static, outdated information on people. The majority of training takes place within a monologue (lecture/presentation) rather than a dialog. Perhaps worst of all, training usually occurs within a vacuum driven by past experience, not by future needs.

An “11th Domain” book.

Posted by Anton Aylward

http://www.infosectoday.com/Articles/Persuasive_Security_Awareness_Program.htm

Gary Hinson makes the point here that Rebecca Herold makes elsewhere:
Awareness training is important.

I go slightly further and think that a key part of a security practitioner's professional knowledge should be about human psychology and sociology, how behaviour is influenced. I believe we need to know this from two aspects:

First, we need to understand how our principals are influenced by non-technical and non-business matters, the behavioural persuasive techniques used on them (and us) by vendor salesmen and the media. Many workers complain that their managers, their executives, seem to go off at a tangent and ignore "the facts". We speak of decisions driven by articles in "glossy airline magazines" and by often distorted cultural myths. "What Would the Captain Do?", or Han Solo or Rambo, might figure more than "What Would Warren Buffett Do?" or "What Does Peter Drucker Say About A Situation Like This?". We can only be thankful that most of the time most managers and executives are more rational than this, but even so ...

Tight budgets no excuse for SMBs’ poor security readiness

Posted by Anton Aylward

http://www.zdnet.com/tight-budgets-no-excuse-for-smbs-poor-security-readiness-2062305005/

From the "left hand doesn't know what the right hand is doing" department:

Ngair Teow Hin, CEO of SecureAge, noted that smaller companies
tend to be "hard-pressed" to invest or focus on IT-related resources
such as security tools due to the lack of capital. This financial
situation is further worsened by the tightening global and local
economic climates, which has forced SMBs to focus on surviving
above everything else, he added.

Well, let's leave the vested interests of security sales aside for a moment.


I read recently an article about the "IT Doesn't matter" thread that basically said part of that case was that staying at the bleeding edge of IT did not give enough of a competitive advantage. Considering that most small (and many large) companies don't fully utilise their resources, don't fully understand the capabilities of the technology they have, don't follow good practices (never mind good security), this is all a moot point.

Social Engineering and sufficiency of awareness training

Posted by Anton Aylward

Someone asked:

If you have a good information security awareness amongst
the employees then it should not be a problem what kind of attempts
are made by the social engineers to glean information from
your employees.


Yes but as RSA demonstrated, it is a moving target.

You need to have it as a continuous process, educate new hires and educate on new techniques and variations that may be employed by the 'social engineers'. Fight psychology with psychology!

Which Risk Framework to Use: FAIR, FRAP, OCTAVE, SABSA …

Posted by Anton Aylward

What framework would you use to provide for quantitative or qualitative risk analysis at both the micro and macro level?  I'm asking about a true risk assessment framework, not merely a checklist.


Yes, this is a bit of a META-Question. But then it's Sunday, a day for contemplation.

When does something like these stop being a check-list and become a framework?

COBIT is very clearly a framework, but not for risk analysis, and even the section on risk analysis fits into a business model rather than a technology model.

ISO-27K is arguably more technology (or at least InfoSec) focused than COBIT, but again risk analysis is only part of what it's about. ISO-27K calls itself a standard[1] but in reality it's a framework.

The message that these two frameworks send about risk analysis is

Context is Everything

(You expected me to say that, didn't you?)

I'm not sure any RA method works at layer 8 or above. We all know that managers can read our reports and recommendations and ignore them. Or perhaps not read them, since being aware of the risk makes them liable.

Ah. Good point.
On LinkedIn there was a thread asking why banks seem to ignore risk analysis ... presumably because their doing so has brought us to the international financial crisis we're in (though I don't think it's that simple).

The trouble is that RA is a bit of a 'hypothetical' exercise.
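The arithmetic itself is simple enough; it's the inputs that are hypothetical. For anyone wondering what the "quantitative" end of the original question looks like at its most basic, here is a minimal sketch (in Python) of the textbook calculation: Annualized Loss Expectancy = Single Loss Expectancy x Annual Rate of Occurrence. This isn't FAIR or any of the other frameworks named above, just the arithmetic they elaborate on, and every figure below is invented for illustration.

    # A minimal sketch of textbook quantitative risk arithmetic,
    # not any particular framework. All figures are invented.

    def single_loss_expectancy(asset_value, exposure_factor):
        """SLE: expected loss from a single occurrence of the event."""
        return asset_value * exposure_factor

    def annualized_loss_expectancy(sle, annual_rate_of_occurrence):
        """ALE: expected loss per year, weighed against the annual cost of a control."""
        return sle * annual_rate_of_occurrence

    sle = single_loss_expectancy(asset_value=200_000, exposure_factor=0.3)
    ale = annualized_loss_expectancy(sle, annual_rate_of_occurrence=0.5)
    print(f"SLE = ${sle:,.0f}, ALE = ${ale:,.0f} per year")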

Are *YOU* ready to give up yet?

Posted by Anton Aylward

Apparently (ISC)2 did this survey ... which means they asked the likes of us ....

http://www.darkreading.com/security-monitoring/167901086/security/security-management/229219084/under-growing-pressure-security-pros-may-be-ready-to-crack-study-says.html

Faced with an attack surface that seems to be growing at an overwhelming rate, many security professionals are beginning to wonder whether their jobs are too much for them, according to a study published last week.

Right. If you view this from a technical, bottom-up POV, then yes.

Conducted by Frost & Sullivan, the 2011 (ISC)2 Global Information Security Workforce Study (GISWS) says new threats stemming from mobile devices, the cloud, social networking, and insecure applications have led to "information security professionals being stretched thin, and like a series of small leaks in a dam, the current overworked workforce may be showing signs of strain."

Patching madness, all the hands-on ... Yes I can see that even the octopoid whiz-kids are going to feel like the proverbial one-armed paper-hanger.

Which tells me they are doing it wrong!

Two decades ago a significant part of my job was installing and configuring firewalls and putting in AV. But the only firewall I've touched in the last decade is the one under my desk at home, and that was when I was installing a new desk. Being a Linux user here I don't bother with AV.

"Hands on"? Well yes, I installed a new server on my LAN yesterday.
No, I think I'll scrub it, I don't like Ubuntu after all. I'm putting
in Asterisk. That means re-doing my VLAN and the firewall rules.
So yes, I do "hands on".  Sometimes.

At client sites I do proper security work. Configuring firewalls, installing Windows patches, that's no longer "security work". The IT department does that. It's evolved[1] into the job of the network admin and the Windows/host admin. They do the hands-on. We work with the policy and translate that into what has to be done.

Application vulnerabilities ranked as the No. 1 threat to organizations among 72 percent of respondents, while only 20 percent said they are involved in secure software development.

Which illustrates my point.
I can code; many of us came to security via paths that involved being coders, system and network admins. I was a good coder, but as a coder I had little "leverage" to "Get Things Done Right". If I were "involved" in secure software development I would not have as much leverage as I might have if I took a 'hands off' role and worked with management to set up an environment for producing secure software by the use of training and orientation, policy, tools, testing and so forth. BTDT.

There simply are not enough of us - and never will be - to make security work "bottom up" the way the US government seems to be trying. We can only succeed "top down", by convincing the board and management that it matters, by building a "culture of security".

One view of an Enterprise Information Security Architecture (EISA) framework.

This is not news. I'm not saying anything new or revolutionary, no matter how many "geeks" I may upset by saying that Policy and Culture and Management matter "more". But if you are one of those people who are overworked, think about this:

Wouldn't your job be easier if the upper echelons of your organization, the managers, VPs and Directors, were committed to InfoSec, took it seriously, allocated budget and resources, and worked strategically instead of only waking up in response to some incident, and even then just "patching over" instead of doing things properly?

Information Security should be Business Driven, not Technology Driven.

[1] Or devolved, depending on how you look at it.


People under extreme stress may behave unpredictably and have limited capacity for rational thought

Posted by Anton Aylward

Les Bell, another ex-pat Brit who lives in Australia, was discussing the importance of training and reinforcement in such matters as DR/BCP.  Les is also a pilot, and so many of his analogies and examples have to do with piloting and aircraft.

Part of our discussion has a much wider scope.
Les had said:

"People under extreme stress may behave unpredictably and have limited capacity for rational thought"

This is the basis of much of pilot training, particularly in simulators, where procedures that are too dangerous to be attempted in a real aircraft can be repeated until drills are automatic.

Don't quote me on this, but I seem to recall reading in an aviation safety-related article that in an emergency, something like 50% of people lose it to the extent that they are completely unable to cope, 25% are capable of functioning with some degree of impairment, and 25% of people are able to complete required tasks correctly. Training by means of drills and rehearsals is able to correct that situation to a considerable extent.

Therefore in BCP/DRP planning, it's important to - as far as possible - simulate an emergency, rather than just story-boarding it, or doing a whiteboard walkthrough. Hence the requirement for fire drills, evacuation drills and the like; repetition conditions the mind to perform the task correctly under stressful conditions.

Most of us don't get the chance to do a full interruption test for our DRP, but the closer we can get, the better.

Training - drill and reinforcement so that you can carry out the actions automatically even when extreme stress has completely blanked your cognitive functions - is an important part of military "boot camp" training and one reason I find it so comical that CISSP course training gets called "boot camp".

Les is quite right.  For a variety of reasons most people "lose it" under extreme stress.  This is why military heroes, people who can hang in there and think clearly and make critical decisions, are held in such esteem.  Similarly test pilots (and those test pilots who became the early astronauts).  Having lightning-fast reactions (racing drivers) and being in top physical condition helps, but there is something more.

Some authorities look to the old American 'gunslingers' and speculate about how the adrenaline rush in such situations is handled by the body and the brain.  Typically all that adrenaline pumps up the muscles for "fight or flight", and in such panic or near-panic situations rationality is not the key issue.  But if we shift from the evolutionary context to the 'gunslinger', standing still means that there is a lot of 'shakes'.  Being able to stay calm and not have the shakes leads to being a successful 'gunslinger'.  Evolution in action?

There are other forms of stress as well.  I've seen sysadmins who have been up for more than 30 hours trying in vain to solve a problem that to me, well rested, is simple and obvious.

The lesson here is two-fold.  The first is the point that Les makes: train and reinforce.
The second is that when the disaster does strike, be aware that stress will compound fatigue, and that stressed and fatigued people do not make good decisions.  Rest, shifts, alternates, and standard plans and scenarios that can work to relieve the stress are important.