On Uncertain Security

by alex on April 3, 2010

One of the reasons I like climate studies is that the world of the climate scientist is not dissimilar to ours.  Their data is fraught with uncertainty, it has gaps, and it might be kind of important (regardless of your stance on anthropogenic global warming, I think we can all agree that when the climate changes, crazy things can happen).

Recently, the mainstream press has begun to pick up on this and to explain what science is doing.  One such example is this Times (UK) story called

Scientists Need The Guts To Say, “I Don’t Know”

In it, the author (David Spiegelhalter – Professor of the Public Understanding of Risk at the University of Cambridge) discusses uncertainty in past (and forward) looking predictions.  Yes, it’s worth noting that the science of prediction applies to all three states of time: past, present, and future.

As a security professional, I always encourage the representation of uncertainty.  Depending on the audience, I’ll represent uncertainty technically, or at a high level with words like “back of the napkin, very rough, a lot of unknowns, fairly certain, pretty good idea…”  I’ve found that as long as they are properly qualified, demonstrations of risk with high degrees of uncertainty are not unuseful.


They really *are* two great tastes that taste great together….

One of the great reasons for the IT Risk management/Security team to communicate uncertainty (esp. to others with money) is that if you say “here’s what we think, but we’re not sure”, you can then tell the business owner “and if you give me $funding we can decrease that uncertainty by gaining visibility into $whatever”.  If they decline, they’re accepting both the risk and the probability that you’re wrong.  But if they’re uncomfortable with the uncertainty, now you have a pretty good qualitative way of knowing that their tolerance for this level of risk is pretty low, and you might even be able to skip right past the “buy more visibility” step above and move right into “of course, we can just spend $Y and take care of the whole thing: visibility, risk reduction and all….”
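That “pay to reduce uncertainty” trade can actually be priced. Here’s a minimal sketch in Python, in the spirit of Hubbard’s expected value of perfect information; the loss figure, control cost, and probability range are all made-up numbers for illustration, not anything from a real assessment:

```python
import random

random.seed(0)

LOSS = 1_000_000      # loss if the incident occurs (assumed figure)
MITIGATE = 200_000    # cost of the control (assumed figure)
P_LO, P_HI = 0.05, 0.50   # all we know: annual probability is somewhere in here

N = 200_000
draws = [random.uniform(P_LO, P_HI) for _ in range(N)]

# Decision under current uncertainty: act on the mean probability.
p_mean = sum(draws) / N
cost_now = MITIGATE if p_mean * LOSS > MITIGATE else p_mean * LOSS

# With perfect information, we'd buy the control only when it's actually worth it,
# paying whichever is cheaper for each possible value of p.
cost_informed = sum(min(MITIGATE, p * LOSS) for p in draws) / N

evpi = cost_now - cost_informed
print(f"expected cost deciding today: ${cost_now:,.0f}")
print(f"expected cost with perfect info: ${cost_informed:,.0f}")
print(f"visibility is worth up to about: ${evpi:,.0f}")
```

The gap between the two expected costs is the ceiling on what the “buy more visibility” spend is worth: if the business owner won’t fund even that, they’ve told you something about their risk tolerance too.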

Similarly, if you, the security manager, keep getting risk analyses back that have significant uncertainty in them – you know that these are areas where you really don’t have much control.  They may represent reasons or opportunities to strengthen policies, processes, capabilities (w00t everybody goes to training in Cancun!) and so forth.

So while it’s the enemy of accuracy, uncertainty can also be your friend.

One last note, having to do with uncertainty: in the article the author uses the Taleb definition of “Black Swan”.  Again, calling a rare event a “Black Swan” is a misnomer.  Rarity in frequency is only one aspect of what the concept of a Black Swan represents.  A much better definition of a Black Swan is “an occurrence which is not representable at all given our prior distributions.”  Certainly, even before Prof. Spiegelhalter corrected the model for double-yolked eggs, the occurrence of six is not a true Black Swan.  We could have run MCMC sims until our computers melted into hot lumps of toxic waste, and various occurrences of double-yolked eggs would have been represented.
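To make the egg example concrete, here’s a back-of-the-napkin sketch using the commonly quoted one-in-a-thousand double-yolk rate, and an assumed within-box rate for the corrected (correlated) model; both rates are illustrative, not Spiegelhalter’s actual figures:

```python
# Naive model: yolks independent, ~1/1000 double-yolk rate per egg (assumed figure).
p_single = 1 / 1000
p_box_naive = p_single ** 6       # all six eggs in one box double-yolked

# Corrected model: double-yolkers come disproportionately from young hens and get
# graded together by size, so eggs in one box are correlated.  Assume a 10%
# double-yolk rate within such a batch (illustrative).
p_batch = 0.10
p_box_corrected = p_batch ** 6

print(f"independent model: {p_box_naive:.1e}")    # astronomically small
print(f"correlated model:  {p_box_corrected:.1e}")  # rare, but very much on the radar
```

Either way the event carries nonzero probability under the prior, which is exactly the point: astonishing-but-representable is not the same thing as a Black Swan.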


[…] Uncertain Security (#infosecblog) April 3, 2010 — Tom Olzak On Uncertain Security is an excellent blog post about the uncertainty principle in risk […]

by On Uncertain Security (#infosecblog) « Tom Olzak on Security on April 3, 2010 at 7:25 pm.

Excellent post. In case there are any nay-sayers, I’ve used this in practice to scale back large initiatives until we collected more evidence on the cost/benefit. E.g. audit says we need a comprehensive data inventory and data loss prevention solution, or security needs a complete network monitoring, detection, and forensics solution. Instead of committing millions and reprioritizing resources we didn’t have, we committed to pilots to collect evidence and form control strategies, saving loads of time and not letting the audit or FUD tail wag the dog. I’m certain of it 🙂

by jared pfost on April 4, 2010 at 11:24 pm.

Yes, excellent post, no uncertainty there 😉

Also, expressing uncertainty in an honest fashion will get you more trust and co-operation from the customers than if you make pronouncements from on high that they don’t actually believe.

by shrdlu on April 5, 2010 at 2:23 pm.

It’s worthwhile though in a business setting as the “expert” in security/risk to put a stamp on it and MAKE THE CALL.

I agree, generally, that we cannot be definitive in every instance, but the business wants to be assured that we’re making a fairly accurate assessment of the probability and the impact of any given situation. Here, uncertainty clouds the conversation, and I personally prefer to message my strong beliefs.

When asked to make calls on less precise topics, I agree that we should stipulate uncertainty. To borrow from science: uncertainty, precision, and error all play a role in these assessments, and it’s worthwhile to call each what it is.

Our main purpose is to be the trusted advisor.

by PhilA on April 6, 2010 at 3:53 am.

Right on Alex. When I am training “new” risk assessors I tell them the following:

1. You have to be 100% certain that the issue you are documenting is indeed an issue. This is one of the first credibility checks.
2. In cases where there is a lot of uncertainty in the frequency and magnitude – err on the side of more information gathering before declaring a qualitative risk rating. Partner with the non-security SMEs.
3. Use modifiers appropriately. For some issues, there are other attributes associated with the issue that are driving that uncertainty; from a FAIR perspective think UNSTABLE or FRAGILE.
4. Go into the “uncertainty” conversation with some estimates of what the opportunity cost is (time and or money) to reduce uncertainty further. (Hubbard)

Finally, it takes special skills to be able to consistently have effective risk discussions. I have worked with some great risk assessors that fall short on the communications side; sometimes resulting in poorly informed decisions by management.

by Chris Hayes on April 6, 2010 at 12:36 pm.

“Scientists Need The Guts To Say, ‘I Don’t Know’”

Exactly, this is why Weathermen are more accurate in predictions than doctors. Weathermen know that they are just guessing and their predictions are close to 50%, doctors wrongly assume they can predict and so their prediction rate is around 15%.
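The underlying point is calibration, and calibration can be scored. A minimal sketch (invented forecasts and outcomes, not Montier’s data): the Brier score rewards a forecaster who honestly says “50%” when they don’t know over one who confidently says “90%” and is frequently wrong.

```python
def brier(forecasts, outcomes):
    """Mean squared error of probabilistic forecasts (lower is better)."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

outcomes = [1, 0, 0, 1, 0, 1, 0, 0]   # what actually happened (invented)

humble    = [0.5] * 8                 # "honestly, about 50/50"
confident = [0.9] * 8                 # "trust me, it'll happen"

print(brier(humble, outcomes))        # 0.25
print(brier(confident, outcomes))     # 0.51 -- overconfidence scores worse
```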

I recommend to you the work of James Montier


by gunnar on April 6, 2010 at 3:46 pm.

I’ve found that the key to successfully managing uncertainty is to explain what I know about a situation or decision versus where I’m making a best guess or extrapolating, and based on what experiences or inputs.

Even showing that you “know what you don’t know” goes a long way towards managing uncertainty as a business problem. As you noted, this is also often a great opening to say, “and if you give me $funding we can decrease that uncertainty by gaining visibility into $whatever”.

Sometimes, though, it’s easier to just answer with, “It depends” and go on.

by Chandler Howell on April 6, 2010 at 6:44 pm.
