The InfoSec Blog

Confusion over Physical Assets, Information Assets – Part Two

Posted by Anton Aylward

So I need to compile a list of ALL assets, information or otherwise,

NO!
That leads to tables and chairs and powerbars.

OK so you can't work without those, but that's not what I meant.

Physical assets are only relevant insofar as they are part of information processing. You should not start from those; start from the information and look at how the business processes make use of it. Don't confuse your DR/BC plan with your core ISMS statements. ISO Standard 22301 addresses that.

This is, ultimately, about the business processes.
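
As a rough sketch of that information-first ordering (the asset names, owners and classifications below are hypothetical, not taken from any real register), you might model it like this in Python: each information asset records which business processes use it, and physical or technical assets only appear as supports for those processes.

# Minimal sketch of an information-first asset register (hypothetical names).
# Physical assets show up only because a business process that handles the
# information depends on them - not as tables, chairs and power bars.

from dataclasses import dataclass, field
from typing import List

@dataclass
class BusinessProcess:
    name: str
    # Supporting assets matter only because this process depends on them.
    supporting_assets: List[str] = field(default_factory=list)

@dataclass
class InformationAsset:
    name: str
    owner: str
    classification: str            # e.g. "Confidential", "Internal", "Public"
    used_by: List[BusinessProcess] = field(default_factory=list)

# Start from the information, then record how the business uses it.
payroll_data = InformationAsset(
    name="Payroll records",
    owner="HR Director",
    classification="Confidential",
    used_by=[
        BusinessProcess("Monthly payroll run",
                        supporting_assets=["HR application server", "Payroll DB"]),
        BusinessProcess("Year-end tax reporting",
                        supporting_assets=["Finance workstation image"]),
    ],
)

if __name__ == "__main__":
    for process in payroll_data.used_by:
        print(f"{payroll_data.name} -> {process.name}: {process.supporting_assets}")

The point of the shape is the direction of traversal: you enumerate the information and the processes first, and the hardware list falls out as a by-product rather than being the starting point.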

The #1 Reason Leadership Development Fails

Posted by Anton Aylward

http://www.forbes.com/sites/mikemyatt/2012/12/19/the-1-reason-leadership-development-fails/
I wouldn't have thought, based on the title, that I'd be blogging about this, but then again one can get fed up with purely InfoSec blogs, ranting and raving about technology, techniques and ISO27000 and risk and all that.

But this does relate somewhat to security as awareness training, sort of ...

My problem with training per se is that it presumes the need for indoctrination on systems, processes and techniques. Moreover, training assumes that said systems, processes and techniques are the right way to do things. When a trainer refers to something as “best practices” you can with great certitude rest assured that’s not the case. Training focuses on best practices, while development focuses on next practices. Training is often a rote, one directional, one dimensional, one size fits all, authoritarian process that imposes static, outdated information on people. The majority of training takes place within a monologue (lecture/presentation) rather than a dialog. Perhaps worst of all, training usually occurs within a vacuum driven by past experience, not by future needs.

The Decline of the Physical Desktop

Posted by Anton Aylward

http://www.eweek.com/c/a/IT-Management/As-Foretold-by-Desktop-Managment-Tools-588370/

What's interesting here is that this isn't preaching "The Cloud" and only mentions VDI in one paragraph (two in the expanded on-line version).

Also interesting is the real message: "Microsoft has lost it".

Peter Drucker, the management guru, pointed out that the very last buggy-whip manufacturer in the age of automobiles was very efficient in its processes - it *HAD* to be to have survived that long. (One could say the same about sharks!)

"Keeping desktop systems in good working order is still a labour of Sysiphus .."

Indeed. But the Linux desktop and Mac OS X seem to be avoiding most of the problems that plague Microsoft.

A prediction, however.
The problem with DOS/Windows was that the end user was the admin and could fiddle with everything, including downloading and installing new code. We are moving that self-same problem onto smart-phones and tablets. Android may be based on Linux, but it's the same 'end user in control' model that we had with Windows. It's going to be a malware circus.


What drives the RA? Need or Fashion?

Posted by Anton Aylward

A colleague in InfoSec made the following observation:

My point - RA is a nice to have, but it is superfluous. It looks nice but does NOTHING without the bases being covered. What we need is a baseline that everyone accepts as necessary (call it the house odds if you like...)

Most of us in the profession have met the case where a Risk Analysis would be nice to have but is superfluous, because the baseline controls that were needed were obvious and 'generally accepted'. That makes me wonder why any of us support the fallacy of RA.

It gets back to the Hollywood effect that is Pen Testing. Quite apart from the many downsides it has from a business POV, it is non-logical in the same way that RA is non-logical.

Career Insights from Stephen Northcutt, CEO of SANS

Posted by Anton Aylward

http://www.bankinfosecurity.com/articles.php?art_id=2914

Fascinating.

I get a lot of enquiries from wannabes who, as they put it, want to "break into security". I presume they see it as more interesting than the work they are doing.

They come in all varieties, from high-school kids asking about what degree they should take to people with no actual work experience asking if they should take a CISSP or CISM.

The luminaries of our profession, be they CISSPs or people like Marcus Ranum and Bruce Schneier who lack such certifications, all came up the same way that Stephen Northcutt did and many of us here did - the long way - and gained the practical experience and understanding of the issues along the way.

The Need for Social Engineering in InfoSec

Posted by Anton Aylward


When I took my undergraduate Engineering degree, the attitude of my professors was that if we had chosen engineering as our career then a few things were going on.

First, technology is changing, so teach fundamentals and principles and show how to apply them but don't get hung up on specific technologies. (Who would have guessed then that the RF theory work on transmission  lines would have an impact on writing software for PCB layout and even chip design!)

Second, that if we stayed in engineering, then within three to five years we would have "managerial" responsibilities, so we had better know about "managerial" things such as budgeting, logistics/supply-chain, writing proposals and reports.

I mention this to make the point that being a CISSP is not about being a techie-geek. Knowing all there is about crypto, pen testing, or any vendor or product is inherently self-limiting: you have put a cap on the authority and influence you have.

To be effective in InfoSec you need to be able to do that "social engineering" - as a recent article says,

"... the application of social science to the solution of social
problems," he said. "In other words, it's getting people to do
what you want by using certain sociological principles."

What you want is for your managers to implement certain strategies that you believe are for the good of the company and society (see our code of ethics and associated guidelines). This means you need communication skills.

I realise many people reading this are in fact managers, but they too have to report to higher authorities. Some here have MBAs. Management is more than the technical skill of an MBA course - that's another form of geekiness. (I know of one very good technical guy who saw the Dilbert Principle being applied in his firm and went and got an MBA. The trouble is that he never had any 'people skills' and the MBA course didn't supply them!)

So we get back to a parallel thread - "Trust".

Occasionally I run a workshop, "Why people don't follow Policies and what you can do about it". It's for technical managers, those who have to enforce many policies, not least of all InfoSec ones, and manage those who are carrying out the associated Procedures. It's always a difficult workshop since it's about seeing the patterns in behaviour, something technical managers are quite capable of but have never been taught before.

It's my belief that InfoSec is meaningless unless it deals with the social and psychological issues. Right now we treat the term "social engineering" the way we do "risk", as something that has *only* a negative meaning. That has to stop. Management don't see "risk" as being bad, and as far as threats go, we know that People are the source of them all! First and foremost, InfoSec practitioners need to be able to deal with People. Technology is for geeks. If you want to bring about change you have to deal with people.

"Social Engineering" - in the broadest and positive sense - is every bit as key as any other of the domains of the CBK. Its omission just shows how technology-centred the profession is, despite the threats and despite what needs to be done by practitioners to fulfil their roles.
