Social Engineering and the sufficiency of awareness training

Someone asked:

If you have good information security awareness amongst
the employees, then it should not be a problem what kind of attempts
the social engineers make to glean information from
your employees.


Yes, but as RSA demonstrated, it is a moving target.

You need to have it as a continuous process, educate new hires and educate on new techniques and variations that may be employed by the ‘social engineers’. Fight psychology with psychology!

Over and above that, I would recommend technical controls: at least one level of mail filtering to block not only the very obvious spam (Viagra, fake watches, pharma, solicitations) and the “Spanish Prisoner” variations, but also to filter mail with attachments, or at the very least quarantine it.
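The quarantine part of such a policy can be sketched in a few lines. This is a minimal illustration using Python’s standard `email` module, assuming messages arrive as raw RFC 822 text (say, piped from the MTA); the extension list and the quarantine decision are hypothetical examples, not a vetted policy.

```python
# Minimal sketch of an attachment-quarantine check.
# RISKY_EXTENSIONS is illustrative only -- a real policy would be
# driven by the organisation's own rules (or quarantine everything).
from email import message_from_string

RISKY_EXTENSIONS = {".exe", ".js", ".vbs", ".scr", ".doc", ".xls", ".pdf"}

def should_quarantine(raw_message: str) -> bool:
    """Return True if the message carries an attachment we don't trust."""
    msg = message_from_string(raw_message)
    for part in msg.walk():
        filename = part.get_filename()
        if filename is None:
            continue  # inline text/HTML parts carry no filename
        # Flag risky extensions; a stricter policy, as suggested above,
        # would quarantine any message with an attachment at all.
        if any(filename.lower().endswith(ext) for ext in RISKY_EXTENSIONS):
            return True
    return False
```

The point is not the particular list of extensions but that the decision happens before the mail reaches a user who can be talked into opening it.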


Until recently, many of my clients had a policy that all MS-Office attachments were either blocked or discarded because they could contain embedded executables. For Word and Excel there were plain-text equivalents, and PDF used to be preferred where applicable. But now PDF has been subverted.

I realise that there are patches to deal with many of these threats, but the reality is that we are playing catch-up with the Bad Guys, even if we apply the patches the instant they are issued, even if those patches work perfectly, and even if the end users do what they are supposed to and don’t find ways to subvert them.

I used to give a workshop “Why Employees Don’t Follow Policy (and what you can do about it)”. I found that people really don’t think in terms of Policy and are quite willing to argue as to why a particular policy does not or should not apply to them. Realistically, you have to approach this from how people perceive their jobs, what they think they are supposed to be doing to earn their wage.

As has been pointed out many times, if ‘security’ gets in the way of that – the perception, regardless of the reality – then they will try subverting security. To this day I face managers who believe without questioning that security is about saying “NO!”, that security will slow down business and add cost. Trying to get them to see security as an enabler (“car brakes let you drive faster”), as a means of focus, or as a means of loss avoidance (“well, employees shouldn’t be pilfering in the first place…”) is often difficult.

Most people focus on getting their jobs done; it takes a lot of pressure and reinforcement to make them stop and think about the security implications of every action they take. It’s us who are “paid to be paranoid”, not them.


About the author

Security Evangelist


  1. Building on “The Art of Deception”, Kevin Mitnick’s latest book – “Ghost in the Wires” – is replete with examples of how someone with sufficient skills, knowledge and cojones can often succeed by social engineering, especially in conjunction with hacking. Of course, he barely mentions the inevitable failed social engineering attempts, and merely hints at the huge effort involved in establishing the background knowledge required to make the attacks succeed (e.g. finding out whom to attack, how to contact them, and the lingo needed to make the requests seem legitimate), but one thing does shine through: he is a persistent bugger who treats the entire exercise as a challenge. We white-hats tend to think that if we put a sufficiently strong control in place, we can depend upon it. Some merely see our barriers as speed bumps, or as something to bypass, undermine or slip past. Defense-in-depth is the traditional way of dealing with this, but do we ever really go far enough? Shouldn’t the quality and breadth of our defense-in-depth measures reflect the risk of failure? Following that train of thought brings a new emphasis to risk assessment, threat modelling and all that jazz.


    PS Having only just finished reading it, I’ll publish a review of Kevin’s book soon, probably at
