Cannataci also argued forcefully that mass surveillance was not the way to
handle the threat from terrorism and pointed to a report by the Dutch
intelligence services that argues that point. “To get real terrorists, you have
to go for good old-fashioned infiltration,” he argued, wishing that the security
services would spend less money on computers and more on real people who go out
and get real, actionable intelligence on what people are up to. “It’s time to be
realistic and actually examine what evidence shows.”
Where have I heard that before?
If you think technology can solve your security problems, then you don’t understand the problems and you don’t understand the technology — Bruce Schneier
Essentially what he’s saying is summed up by another Schneier quote:
People often represent the weakest link in the security chain and are chronically responsible for the failure of security systems — Bruce Schneier, Secrets and Lies
My friend Alan Rocker and I often discuss ideas about technology and tradeoffs. Alan asked about SSDs for Linux:
> I haven’t been following hardware developments very closely for a while, so I
> find it hard to judge the arguments. What’s important?
Ultimately what’s important is the management software, the layer above the drivers, off to one side. That applies regardless of the media and means that the view the applications take of storage is preserved regardless of changes in the physical media.
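As a minimal sketch of that layering (the class and method names here are invented for illustration, not any real kernel API), the application writes through one stable interface while the medium behind it can be swapped freely:

```python
# Toy sketch: applications talk to one storage interface; the physical
# medium (HDD, SSD, whatever) is interchangeable underneath it.
from abc import ABC, abstractmethod

class BlockStore(ABC):
    """What the management layer above the drivers presents to applications."""
    @abstractmethod
    def read(self, block: int) -> bytes: ...
    @abstractmethod
    def write(self, block: int, data: bytes) -> None: ...

class DictBackedStore(BlockStore):
    """Stand-in for any medium; a real backend would wrap a device driver."""
    def __init__(self) -> None:
        self._blocks: dict[int, bytes] = {}
    def read(self, block: int) -> bytes:
        # Unwritten blocks read back as zeroes, like a fresh device.
        return self._blocks.get(block, b"\x00" * 512)
    def write(self, block: int, data: bytes) -> None:
        self._blocks[block] = data

def app_save(store: BlockStore, block: int, payload: bytes) -> None:
    # The application neither knows nor cares what medium sits below.
    store.write(block, payload)
```

Swap `DictBackedStore` for an SSD- or HDD-backed implementation and `app_save` never changes, which is the point: the view applications take of storage is preserved regardless of the physical media.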
> The first question is, what areas are currently the bottlenecks and
> constraints, at what orders of magnitude?
In theory, consumers and businesses could punish Symantec for these
oversights by contracting with other security vendors. In practice, there’s
no guarantee that products from other vendors are well-secured, either
— and there is no clear way to determine how secure a given security
product actually is.
Too many firms take an “appliance” or “product” (aka “technology”) approach to security. There’s a saying that’s been attributed to many security specialists over the years but is quite true:
If you think technology can solve your security problems,
then you don’t understand the problems and you don’t
understand the technology.
Cyber risk should not be managed separately from enterprise or business risk. Cyber may be only one of several sources of risk to a new initiative, and the total risk to that initiative needs to be understood.
Interviewing is a much better method than self-certifications and a checklist, if time and resources allow.
In the ISO-27001 forum, my friend and colleague Gary Hinson has repeatedly pointed out, and I fully support him in this, that downloading check-lists from the ‘Net and adopting question lists from there is using a solution to someone else’s
problem. If that.
Each business has both generic problems (governments, sunspots, meteor strikes, floods & other apocalyptic threats and Acts of God) and ones specific to its way of working and configuration. Acts of God are best covered by prayer and insurance.
Gary recommends “open ended questions” during the interview rather than ones that require a yes/no answer. That’s good, but I see problems with that. I prefer to ask “Tell me about your job” rather than “Tell me how your job … can be made more efficient”.
My second point is that risk management will *ALWAYS* fail if the risk analysis is inadequate. How much of the RA should be done by interviewing people like the sysadmins I don’t know, but I have my doubts. I look to the Challenger disaster. I started in the aviation business, where we refined FMEA – Failure Mode Effect Analysis. Some people think of this in terms of “impact”, but really it’s more than that; it’s also causal analysis. As Les Bell, a friend who is also a pilot and interested in aviation matters, has pointed out to me, “Root Cause Analysis” is no longer adequate: failure comes about because of a number of circumstances, and it may not even be a single failure – the ‘tree’ fans both ways!
Yes, FMEA can’t be done blindly, but failure modes that pertain to the business – which is what really counts – and the fan-in/out trees can be worked out even without the technical details. Rating the “risk” is what requires the drill-down.
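The ranking step can be sketched without any drill-down at all. In classic FMEA each failure mode gets a Risk Priority Number, RPN = severity × occurrence × detection, each rated 1–10. A toy illustration (the failure modes and ratings below are invented, not from any real analysis):

```python
# Toy FMEA sketch: rank failure modes by Risk Priority Number.
failure_modes = [
    # (failure mode, severity, occurrence, detection) -- all ratings 1-10
    ("Backup job silently fails", 8, 4, 7),
    ("Firewall rule typo opens a port", 9, 3, 5),
    ("UPS battery exhausted", 6, 2, 3),
]

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Classic FMEA risk priority number; higher means look at it first."""
    return severity * occurrence * detection

# Sort highest-RPN first; that ordering tells you where the drill-down goes.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"{rpn(s, o, d):4d}  {name}")
```

The drill-down effort then concentrates on whatever floats to the top of the ranking, which is where the business-level view earns its keep.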
Which gets back to Donn Parker’s point in a number of his books, though he never states it this way. The FMEA tree can be heavily pruned using diligence, as he says: standards, compliance, contracts, audits, good practices, available products. The only things he leaves out are Policy and Training. Policy gives direction and is essential to any purpose, to the choice of standards and products, and to identifying what training is needed.
What exactly do you mean by “cyber security”? Or “cyber” for that matter. Please explain.
It seems to be one of those Humpty-Dumpty words that the media, the government and others use with — what’s the current politically correct phrase to use now when one would, 50 years ago, have said ‘gay abandon’? — because it’s currently “in”?
I see it used to mean “computer” and “network” in the specific and “computers” and “networks” in the general, as well as specific functions such as e-banking, & other e-commerce, “Big Data”, SCADA, POTS and its replacements.
I see it used in place of “Information” in contexts like “Information Security” becoming, as above, “Cyber Security”. But you don’t know that it means that.
Are we here to protect the data? Or just the network? Or just the computer?
Until a few years ago “Cyber” still did mean “steersman”, even if that was automated rather than a human presence. No-one would call the POTUS a “Cyber-man” in the sense of being a steersman for the republic.
Perhaps we should start a movement to ban the use of “Cyber-” from use by the media.
Perhaps we might try to get some establishments to stop abusing the term.
I doubt very much we could do that with media such as SC Magazine, but perhaps we could get the estate of the late Norbert Wiener to threaten some high-profile entities, like the State Department, for the misuse of the term?
In my very first job we were told, repeatedly told, to document everything and keep our personal journals up to date – not just with what we did, but the reasoning behind those decisions. This was so that if anything happened to us, no knowledge about the work, the project, what had been tried and thought about, was lost – if, perhaps, we were ‘hit by a bus on the way to work’.
At that point whoever was saying this looked toward a certain office or a certain place in the parking lot. One of the project managers drove a VW bus and was most definitely not a good driver!
So the phrase ‘document everything in case you’re hit by a bus’ entered into the work culture, even after that individual had left.
And for the rest of us it entered into our personal culture and practices.
Oh, and the WHY is very important. How often have you looked at something that seems strange and worried about changing it, in case there was some special reason for it being like that which you did not know of?
Unless things get documented … heck, a well-meaning ‘kid’ might ‘clean it out’, ignorant of the special reason it was like that!
So here we have what appear to be undocumented controls.
Perhaps they are just controls that were added and someone forgot to mention; perhaps the paperwork for these ‘exceptions’ is filed somewhere else or is referred to by the easily overlooked footnote or mentioned in the missing appendix.
It has been pointed out to me that having to document everything, including the reasons for taking one decision rather than another, “slows down work”. Well that’s been said of security, too, hasn’t it? I’ve had this requirement referred to in various unsavoury terms and had those terms associated with me personally for insisting on them. I’ve had people ‘caught out’, doing one thing and saying another.
But I’ve also had the documentation saving mistakes and rework.
These days, with electronic tools, smartphones, tablets, networking, and things like wikis as shared searchable resources, it’s a lot easier.
Sadly I still find places where key documents such as the Policy Manuals and more are really still “3-ring binder” state of the art: PDF files in some obscure location, with no mechanism for commenting or feedback, or any way they can be updated.
Up to date and accurate documentation is always a good practice!
And what surprises me is that when I’ve implemented those I get a ‘deer in the headlights’ reaction from staff and managers much younger than myself. Don’t believe what you read about ‘millennials’ being better able to deal with e-tools than us Greybeards.
I'm not sure whether to quote "Up the Organisation" ("If you must have a policy manual, reprint the Ten Commandments"), or "Catch-22" (about the nice "tidy bomb pattern" that unfortunately failed to hit the target), in support of the article.
Industry-wide metrics can nevertheless be useful, though it's fatal to confuse a speedometer and a motor.
However not everyone in the group agreed with our skepticism and the observations of the author of the article.
> And Anton, aren’t the controls you advocate so passionately best practices?
NOT. Make that *N*O*T*!*!*! Even allowing for the lowercase!
But this does relate somewhat to security as awareness training, sort of …
My problem with training per se is that it presumes the need for indoctrination on systems, processes and techniques. Moreover, training assumes that said systems, processes and techniques are the right way to do things. When a trainer refers to something as “best practices” you can with great certitude rest assured that’s not the case. Training focuses on best practices, while development focuses on next practices. Training is often a rote, one directional, one dimensional, one size fits all, authoritarian process that imposes static, outdated information on people. The majority of training takes place within a monologue (lecture/presentation) rather than a dialog. Perhaps worst of all, training usually occurs within a vacuum driven by past experience, not by future needs.
I go slightly further and think that a key part of a security practitioner’s professional knowledge should be about human psychology and sociology – how behaviour is influenced. I believe we need to know this from two aspects:
First, we need to understand how our principals are influenced by non-technical and non-business matters – the behavioural persuasion techniques used on them (and us) by vendor salesmen and the media. Many workers complain that their managers, their executives, seem to go off at a tangent and ignore “the facts”. We speak of decisions driven by articles in “glossy airline magazines” and by often-distorted cultural myths. “What Would the Captain Do?”, or Han Solo or Rambo, might figure more than “What Would Warren Buffett Do?” or “What Does Peter Drucker Say About a Situation Like This?”. We can only be thankful that most of the time most managers and executives are more rational than this, but even so …
Fellow CISSP Cragin Shelton made this very pertinent observation and gave me permission to quote him.
The long thread about the appropriateness of learning how to lie (con, ‘social engineer,’ etc.) by practising lying (conning, ‘social engineering’, etc.) is logically identical to innumerable arguments about whether “good guys” (e.g. cops and security folk) should teach, learn, and practice engaging in any other practice that is useful to and used by the bad guys.
We can’t build defenses unless we fully understand the offenses. University professors teaching how to write viruses have had to explain this problem over and over.
Declaring that learning such techniques is a priori a breach of ethics is short-sighted. This discussion should not be about whether white hats should learn by doing. It should be about how to design and carry out responsible learning experiences and exercises. It should be about developing and promoting the culture of responsible, ethical practice. We need to know why, when, how, and who should learn these skills.
We must not pretend that preventing our white hatted, good guy, ethical, patriotic, well-intentioned protégés from learning these skills will somehow ensure that the unethical, immoral, low breed, teen-vandal, criminal, terrorist crowds will eschew such knowledge.
From the ‘left hand doesn’t know what the right hand is doing’ department:
Ngair Teow Hin, CEO of SecureAge, noted that smaller companies
tend to be “hard-pressed” to invest or focus on IT-related resources
such as security tools due to the lack of capital. This financial
situation is further worsened by the tightening global and local
economic climates, which has forced SMBs to focus on surviving
above everything else, he added.
Well, let’s leave the vested interests of security sales aside for a moment.
I read recently an article about the “IT Doesn’t Matter” thread that basically said part of that case was that staying at the bleeding edge of IT did not give enough of a competitive advantage. Considering that most small (and many large) companies don’t fully utilise their resources, don’t fully understand the capabilities of the technology they have, don’t follow good practices (never mind good security), this is all a moot point.
If you have good information security awareness amongst
the employees, then it should not be a problem what kind of attempts
are made by the social engineers to glean information from them.
Yes but as RSA demonstrated, it is a moving target.
And this doesn’t actually stop them from making use of ‘insider information’; they just have to declare it within 30 days.
No, wait, sorry … you mean that the legislators are saying that legislators shouldn’t do something that is illegal anyway? Or that, if they do something that is already illegal, it is OK as long as they declare it within 30 days? …
I’d like to claim the system is rigged so ‘the rich get richer’, but if I did that some people who claim they are right-wing would accuse me of being left-wing. Indeed, this tells me that their political outlook has not progressed since 20 June 1789. This one-dimensional view fails to describe the rich variety of political attitudes in Washington, never mind the rest of the USA and points elsewhere on the physical compass.
Just those two show we need more than 4 axes to describe a political stance. But as I mentioned in a previous post, journalists are simple-minded and expect the rest of the world to be as limited in outlook and understanding.
How does this all relate to InfoSec, you ask.
Well part of that Political Compass is a view of ‘how authoritarian’.
And that gets back to issues we have to deal with such as Policy and Enforcement, Do We Let Employees have Access to the Internet, and the like.
Hans Eysenck pointed out that the right wing (e.g. Fascism and Nazism) had a lot in common with the left wing (Communism). Both are repressive, undemocratic and anti-Semitic. So on these issues, at least, the left-right distinction is meaningless.
How many more simplistic distinctions, such as those foisted on us by journalists, are equally meaningless?
Some while ago my Australian fellow ex-pat Les Bell, who apart from being a CISSP is also a pilot, pointed out to me that the method of ‘root cause analysis’ is no longer used in analysing plane crashes. The reality is that “it’s not just one thing”; there are many factors. We all know that applies in most areas of life.
I suspect most people know that too; it’s not restricted to the digerati.
There is the old ditty that explains how because of a nail an empire was lost, but no-one is proposing that we fix the failing of the “American Empire” by manufacturing more nails.
BUT he was hectoring us and telling us that the Devil is out there gathering sinners (aka botnets) and tempting us (with web sites and spam), and just watch what he says: we must open our hearts to Christ (aka his company’s products) and be SAVED by following the One True Faith (only buying his company’s products) and repenting for our sins (having his company come in and do all the scans, consulting and so forth).
I was inoculated against the religious hectoring meme at a young age, but it’s still fascinating to watch. But as with religion, there are always people who are susceptible and, sadly, always groups willing to give such people a platform.
To be fair, that day’s event also had some good speakers. It had some straightforward and ‘humble’ people who explained matters clearly and without drama, stated the issues and the scope of the threats and vulnerabilities, and explained how and why their product did what it did. All without the drama, all without the hectoring or intimidation.
You gotta love the low-tech solution. It’s really never NOT about people, is it? 🙂
Darn tooting right!
It’s always people. Any way you look at it.
Which is why I go on about The 11th Domain.
Why does the CBK place so much emphasis on technology when the (ISC)2 motto is “Security transcends technology”? And why is the “people” aspect neglected — the social structures of organizations, behavioural psychology, group psychology and a lot more, all of which are “about people” and probably have greater leverage as far as InfoSec “Getting Things Done” (especially in a stress-free manner)?
As I said previously, I think we’re doing it wrong; and I don’t mean just Risk Assessment!
Faced with an attack surface that seems to be growing at an overwhelming rate, many security professionals are beginning to wonder whether their jobs are too much for them, according to a study published last week.
Right. If you view this from a technical, bottom-up POV, then yes.
Conducted by Frost & Sullivan, the 2011 (ISC)2 Global Information Security Workforce Study (GISWS) says new threats stemming from mobile devices, the cloud, social networking, and insecure applications have led to “information security professionals being stretched thin, and like a series of small leaks in a dam, the current overworked workforce may be showing signs of strain.”
Patching madness, all the hands-on … Yes I can see that even the octopoid whiz-kids are going to feel like the proverbial one-armed paper-hanger.
Which tells me they are doing it wrong!
Two decades ago a significant part of my job was installing and configuring firewalls and putting in AV. But the only firewall I’ve touched in the last decade is the one under my desk at home, and that was when I was installing a new desk. Being a Linux user here I don’t bother with AV.
“Hands on”? Well yes, I installed a new server on my LAN yesterday.
No, wait. I think I’ll scrub it; I don’t like Ubuntu after all. I’m putting in Asterisk. That means re-doing my VLAN and the firewall rules.
So yes, I do “hands on”. Sometimes.
At client sites I do proper security work. Configuring firewalls, installing Windows patches – that’s no longer “security work”. The IT department does that. It’s evolved into the job of the network admin and the Windows/host admin. They do the hands-on. We work with the policy and translate that into what has to be done.
Application vulnerabilities ranked as the No. 1 threat to organizations among 72 percent of respondents, while only 20 percent said they are involved in secure software development.
Which illustrates my point.
I can code; many of us came to security via paths that involved being coders, system and network admins. I was a good coder, but as a coder I had little “leverage” to “Get Things Done Right”. If I was “involved” in secure software development I would not have as much leverage as I might have if I took a ‘hands off’ role and worked with management to set up an environment for producing secure software by the use of training and orientation, policy, tools, testing and so forth. BTDT.
There simply are not enough of us – and never will be – to make security work “bottom up” the way the US government seems to be trying. We can only succeed “top down”, by convincing the board and management that it matters, by building a “culture of security”.
This is not news. I’m not saying anything new or revolutionary, no matter how many “geeks” I may upset by saying that Policy and Culture and Management matter “more”. But if you are one of those people who are overworked, think about this:
Wouldn’t your job be easier if the upper echelons of your organizations, the managers, VPs and Directors, were committed to InfoSec, took it seriously, allocated budget and resources, and worked strategically instead of only waking up in response to some incident, and even then just “patching over” instead of doing things properly?
Information Security should be Business Driven, not Technology Driven.
Just in the last 15 years, since microwave technology aboard satellites
produced images of water vapor in the atmosphere, scientists have come
to realize that most major winter rainstorms over California, and
virtually all flooding episodes, are the result of the unloading of
airborne streams of tropical moisture that have come to be called
“Atmospheric Rivers.” (Hence the name, ARk – Atmospheric Rivers 1,000.)
The scenario envisions nearly a month of uninterrupted rainfall over
northern and southern California.
“The hypothetical storm depicted here would strike the U.S. West Coast
and be similar to the intense California winter storms of 1861 and 1862
that left the central valley of California impassible,” the authors
said. “The storm is estimated to produce precipitation that in many
places exceeds levels only experienced on average once every 500 to
1,000 years.”
In addition to property and “business interruption” losses of anywhere
from $725 billion to $1 trillion, the team estimated that emergency
managers would be faced with the task of evacuating 1.5 million people
during the storm and its aftermath. “The numbers that have been
presented here are shocking, no doubt about it,” observed co-author
Laurie Johnson, a private planning specialist who worked on Hurricane
Katrina recovery. Such a storm could pose “a fiscal crisis that will
cascade through every level of government.”
All that it says is that 1,000-year storms exist, and can occur. The only thing new here is that they understand more about the mechanisms of these 1,000-year storms when they do happen, not that one is imminent.
I’ve got some more news for you: one day, the sun will go Red Giant and engulf the entire Earth. The damages will exceed a trillion dollars. The probability of this is 1.0 … on astronomical time-scales.
The logic of risk analysis that equates a once-in-five-billion-years event that has an impact of trillions of dollars with monthly events that cost hundreds of dollars is lunacy.
There are many inconvenient events that do occur on a monthly basis [again with probability 1.0] that cost hundreds, even thousands, of dollars, and we ‘just live with them’. If you doubt that statement, look at the incidence of automobile deaths and injuries, and of deaths and disabilities due to pollution. I’m sure any insurance company or government statistics office will be happy to supply you with the details.
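The standard annualized-loss arithmetic shows where that lunacy comes from. A toy sketch (all figures below are invented to illustrate the argument, not real statistics):

```python
# Annualized loss expectancy: ALE = annual rate of occurrence x single loss.
def ale(annual_rate: float, single_loss: float) -> float:
    """Expected loss per year from one class of event."""
    return annual_rate * single_loss

# A monthly nuisance: 12 events a year at a few hundred dollars each.
nuisance = ale(12, 300)           # $3,600 a year

# The sun going Red Giant: once in ~5 billion years, say a trillion dollars.
red_giant = ale(1 / 5e9, 1e12)    # roughly $200 a year on paper

# Naive expected-value arithmetic puts the two on the same scale -- which is
# exactly the kind of equation the paragraph above calls lunacy.
print(f"nuisance: ${nuisance:,.0f}/yr, red giant: ${red_giant:,.2f}/yr")
```

The arithmetic is trivially correct and still useless here, which is the point: multiplying an absurdly small rate by an absurdly large impact produces a tidy number that tells you nothing about where to spend your security budget.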
One thing is very clear: we are not good at recognizing where the real threats and risks are.