Learning to Counter Threats – Skills or Ethics?

Fellow CISSP Cragin Shelton made this very pertinent observation and gave me permission to quote him.

The long thread about the appropriateness of learning how to lie (con, 'social engineer', etc.) by practising lying (conning, 'social engineering', etc.) is logically identical to innumerable arguments about whether "good guys" (e.g. cops and security folk) should teach, learn, and practice

  •  writing viruses,
  •  picking locks,
  •  penetrating firewall-protected networks,
  •  cracking safes,
  •  initiating and exploiting buffer overflows, or
  •  engaging in any other practice that is useful to and used by the bad guys.

We can’t build defenses unless we fully understand the offenses. University professors teaching how to write viruses have had to explain this problem over and over.

Declaring that learning such techniques is a priori a breach of ethics is short-sighted. This discussion should not be about whether white hats should learn by doing. It should be about how to design and carry out responsible learning experiences and exercises. It should be about developing and promoting a culture of responsible, ethical practice. We need to decide why, when, and how these skills should be learned, and by whom.

We must not pretend that preventing our white-hatted, good-guy, ethical, patriotic, well-intentioned protégés from learning these skills will somehow ensure that the unethical, immoral, low-bred, teen-vandal, criminal, terrorist crowds will eschew such knowledge.

I have grave reservations about teaching such subjects.
It’s not that I disagree with Cragin that practitioners need to know about such things. Well, OK, perhaps not everyone needs to know every one of them in depth; we’re not meant to be encyclopaedic individuals, and we do specialize. But it comes down to ethics.

The problem is that the Dark Side and the White Side are like a Möbius strip, and many who are attracted to learning these skills may not hold to the strictest moral code. It’s too easy for them to start off with the best of intentions, attracted by the sheer geekery and technology, and one way or another end up Greyer and Greyer, every step of the way seeming quite reasonable. Novels have been written about such people and such paths. It’s not about Evil or anything so dramatic.

That’s what makes it so frightening for me. I firmly believe that Cragin is right: we do need to be aware of these skills. Someone has to show that voting machines can be subverted, that hotel doors are not secure, that the TSA’s security measures are all just ‘theatre’. And it has to be The Good Guys who do this, and they have to have the freedom to do this research without being hassled by the DMCA, patent lawyers, outraged vendors, or politicians. Nor, for that matter, should they need a special licence. History has shown that many great discoveries come about by accident or by ‘looking sideways’ at some other problem.

Pin and tumbler lock picking

As it is, many of the security groups I subscribe to on LinkedIn regularly have people wanting to “break in to security” – yes, that’s the term they use: ‘break in‘. Many make it clear that they want to be a ‘1337 H4x0r‘, and display their naivety by asking which certification will give them that skill – a CISSP, a CISA, or a Security+. Really, though, what’s their motivation? If classes on lock-picking and virus-writing are open, as are the classes on Ethics and Professional Practice, and just as optional, which do you think will be more attractive?

Oh, right, yes: which is harder to teach, and which is harder to test for?
There is that too.


About the author

Security Evangelist


  >> Declaring that learning such techniques is a priori a breach of ethics is short-sighted.

    I used to be a professor, so I’m biased in favor of public disclosure and widespread education. But declaring it a breach of ethics is far more than short-sighted: it is a slippery slope of its own, the same kind that offering the knowledge risks. It is the aspect of the dark side that led to the blocking of the Hugo Awards broadcast ( https://www.ustream.tv/blog/2012/09/03/hugo-awards-an-apology-and-explanation/ ).

    I would also like to point out that in grad school I had a friend who had a portion of his PhD thesis classified, preventing him from publishing. (This was in the early days of public key crypto, and this wasn’t an exploit, just insight into how things worked.)

    Certain kinds of training *are* harmful. Certain kinds of tools to protect us from ourselves have unintended consequences which can be dire.

    This is not easy.
