What is Information Security: New School Primer

by alex on November 28, 2010

Recently, I’ve heard some bits and pieces about how Information Security (InfoSec) can be “threat-centric” or “vulnerability-centric”. This struck me as funny for a number of reasons, mainly because it showed a basic bias toward what InfoSec *is*. To me, InfoSec is too complex to be described as “threat-centric” or “vulnerability-centric”, and yet still simple enough to be described at a high level in a few paragraphs of a blog post. So I thought I’d write a “primer” post on what InfoSec is, to create a reference point.

First, InfoSec is a hypothetical construct. It is something we can all talk about, but it is not directly observable, and therefore not measurable the way, say, speed is, which we can describe in km/hr. “Directly” is to be stressed there, because there are many hypothetical constructs of subjective value for which we do create measurements and measurement scales in order to create a state of (high) intersubjectivity between observers (I don’t love the Wikipedia definition; I use the term to mean that you and I can understand roughly the same thing in the same way).

Because it’s a hypothetical construct, what is “secure enough” is also subjective to the observer: a subjective assessment that is then immediately, almost subconsciously, compared to the risk tolerance of the owner. This presents many challenges in managing a security program, not the least of which is establishing that high degree of intersubjectivity mentioned above.

Second, security is not an engineering discipline, per se. Our industry treats it as such because most of us come from that background, and because the easiest way to try to become “more secure” is to buy a new engineering solution (hence security product marketing). But the bankruptcy of this way of thinking is evident in both our budgets and our standards. A security management approach focused solely on engineering fails primarily because of the “intelligent” or adaptive attacker. If security were pure engineering, it would be like building a bridge or getting an airplane into the air. In those cases, the forces applied to the infrastructure do not adapt or change tactics to cause failure. At worst, when engineering against nature, we merely have a difficult time anticipating forces we did not foresee.

But InfoSec has to deal with the behaviors of attackers.  Their sentience includes creativity and adaptability.  The wind does not act to deceive.  Gravity and rust do not go “low and slow” to evade detection.  Rain does not customize its raindrops to bypass umbrellas.  But sentient attackers do change to evade defenses and reach their goal.

And because InfoSec is not solely a “computer/software engineering” problem, it requires an understanding of both technology and non-technology fields. Yes, this includes software engineering, hardware engineering, and network engineering, but it also means that concepts from fields like management science and behavioral analytics (among many others) should have their place in understanding all the phenomena that create a state of “secure”.

For example, the outcome of having more than a “vulnerability-centric” view of InfoSec (from above) is that “secure” would reasonably be measured by understanding both the force that attackers can apply and our ability to resist that force (1). In this way, “threat-centric” security (the study of the ability to apply force) is useless without “vulnerability-centric” security (the study of the ability to resist). It’s like trying to measure “distance-centric” speed without reference to “time-centric” speed: each is equally useless on its own.
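To make the force/resistance idea concrete, here is a minimal, hypothetical sketch in the spirit of FAIR (1): treating “vulnerability” as the probability that an attacker’s threat capability exceeds our resistance strength. The uniform ranges and percentile numbers below are mine, purely for illustration; a FAIR practitioner would use calibrated distributions rather than uniform ones.

```python
import random

def vulnerability_estimate(tcap_range, rs_range, trials=100_000):
    """Estimate 'vulnerability' in the FAIR sense: the probability that
    an attacker's threat capability (TCap) exceeds our resistance
    strength (RS).  Both are modeled as uniform percentile ranges here
    purely for illustration."""
    hits = 0
    for _ in range(trials):
        tcap = random.uniform(*tcap_range)  # force the attacker can apply
        rs = random.uniform(*rs_range)      # force we can resist
        if tcap > rs:
            hits += 1
    return hits / trials

# Hypothetical scenario: attacker capability somewhere in the 40th-90th
# percentile, control resistance somewhere in the 50th-80th percentile.
v = vulnerability_estimate((40, 90), (50, 80))
print(f"Estimated vulnerability: {v:.2f}")
```

The point of the sketch is that neither input is meaningful alone: the threat distribution (ability to apply force) and the resistance distribution (ability to resist) only produce a measurement when combined.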

Finally, InfoSec is a subset of Information Risk Management (IRM). IRM takes what we know about “secure” and adds concepts like probable impacts and resource allocation strategies. This can be confusing to many because of the many definitions of the word “risk” in the English language, but that’s a post for a different day.
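As a rough, hypothetical sketch of what the IRM layer adds, consider turning an event-frequency and loss-magnitude estimate into an annualized loss expectancy and a spend/no-spend comparison. Every number below is invented purely to make the arithmetic visible.

```python
# A minimal sketch of the IRM step: turning "secure" measurements into
# expected loss and a resource-allocation decision.

def annualized_loss(event_frequency, loss_magnitude):
    """Annualized loss expectancy: expected events per year times
    what each event costs us."""
    return event_frequency * loss_magnitude

# Hypothetical inputs: one event every two years, $200k per event.
baseline = annualized_loss(event_frequency=0.5, loss_magnitude=200_000)
# A control that (we estimate) cuts frequency to once a decade.
with_control = annualized_loss(event_frequency=0.1, loss_magnitude=200_000)

control_cost = 50_000  # annual cost of the control
net_benefit = (baseline - with_control) - control_cost
print(f"Baseline ALE: ${baseline:,.0f}, with control: ${with_control:,.0f}")
print(f"Net annual benefit of the control: ${net_benefit:,.0f}")
```

The security measurement feeds the frequency estimate; IRM is the layer that attaches probable impact and decides whether the control is worth its cost.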

THE NEWSCHOOL APPROACH – THE MONEYBALL-ING OF INFORMATION SECURITY?

Moneyball is a fantastic book about how new approaches to measuring and modeling the performance of baseball players created market advantages for those baseball teams that were better able to use the data they had at hand. In a sense, the New School of InfoSec seeks to foster the collection of data and the development of new and better models for managing security.

But that simplistic analogy belies other important concepts. A call for the application of the scientific method, the recognition that our standards are really just hypotheses about “how to secure”, the requirement that claims of success be backed up with data and not just a logical argument or isolated anecdotes, data sharing, transparency: these are all fundamental premises, needs even, of the New School of Information Security.

Because security is a hypothetical construct, one that requires a great deal of intersubjectivity and a broad array of applicable knowledge to understand, the evidence of history suggests that a New School approach, which is to say a scientific approach, is the most efficient way of making progress.

—-

(1) Jack Jones – The Factor Analysis of Information Risk

11 comments

One of the few measures I know of that most people understand when it comes to security is, “Are we still surviving?” or “Have we been embarrassed/threatened by a security event?” Both of which are sort of after the fact…

I liked my first read of this, until I got down to the last 2 paragraphs. I got the feeling that you were actually arguing *against* being able to measure these things well until those paragraphs. Then I got the feeling you want to say we need to measure better so that we can better defend our hypotheses? Maybe I’ve been thrown off by the dropping of the term, “scientific method.” That alone implies, to me, the ability to hypothesize, test, measure, recreate.

Maybe you’re saying security is strange, but we need to find new ways to measure so that we can achieve a better scientific method? Of course, I’d probably buy that if coupled with what those measurements are. 🙂 The nice thing about data taken from sports (football, baseball…) is even when shuffled and taken in new ways (passer rating against above .500 teams that also have injuries but whose defensive pass rating is in the top 50%…), they’re still not very subjective. Either someone completed a pass or not. Either someone stole a base when the pitcher didn’t check the runner or not.

PS: I think there is a word misplaced in sentence 1, paragraph 7. 🙂

by LonerVamp on November 28, 2010 at 5:49 pm. Reply #

I may have made things too conveniently complex on this subject (i.e. complicating things just to further my own point), especially when I poked at the sports stats. Perhaps there is better measurement derived largely from the audit/compliance arena… Maybe I just resist the jump from infosec following “best practices” to having to justify (measure) why best practices are best practices… *shrug* Who knows… Still, I like this post a lot! 🙂

by LonerVamp on November 28, 2010 at 5:57 pm. Reply #

> “I liked my first read of this, until I got down to the last 2 paragraphs. I got the feeling that you were actually arguing *against* being able to measure these things well until those paragraphs.
>
> Maybe you’re saying security is strange, but we need to find new ways to measure so that we can achieve a better scientific method?”

Perhaps not “strange” but rather “complex”. And I’m presupposing that the measurement requirement is a given. Maybe I should add a paragraph about the necessity of measurement, but that’s kind of Adam and Andrew’s book (and Andrew Jaquith’s, and others’). I was focusing on the nature of that measurement.

> “The nice thing about data taken from sports (football, baseball…) is even when shuffled and taken in new ways (passer rating against above .500 teams that also have injuries but whose defensive pass rating is in the top 50%…), they’re still not very subjective.”

Beautiful. It’s a whole other blog post. LV, read your first paragraph and think hard about your last. Change the words “subjective” and “objective” to “uncertain” and “concrete”. The outcomes are, for all purposes, concrete: “sports win”, “InfoSec failure”. The “cause” is more uncertain, up for study if you will, especially when trying to identify what to do next.

Sabermetricians (Moneyballers) understand the value of studying performance independent of the outcome. A great example: Andy Murray lost a tennis match yesterday to Rafa Nadal. Murray actually won more games than Nadal, but lost two sets in tiebreakers. Given the relative randomness in any given point or small set of points, it’s hard to argue that Nadal played a better match than Murray, *though the outcome was certain for Nadal* (Nadal advanced and is playing Federer now as I write this).

In the quest to address future uncertainty (i.e., who will win the next time they meet), would you discount Murray’s performance today because of the certainty in the outcome? Of course not! If you were Murray’s coach, should you break down their respective tennis games to focus on aspects of Murray’s game to optimize performance next time (aces/double faults, IMHO)? Sure!

In InfoSec, we’re going to have to do a better job at collecting, sharing and correlating metrics to outcomes – both ultimate and contributory.

by alex on November 28, 2010 at 6:11 pm. Reply #

Your last sentence really sums it all up, for both of us: “In InfoSec, we’re going to have to do a better job at collecting, sharing and correlating metrics to outcomes – both ultimate and contributory.”

Amen! 🙂

I buy what you’re saying, and it makes sense! You completed my thought nicely with the outcome part: how you can look at concrete stats and still be uncertain of the outcome. That might underlie what caught my eye about scientific method, since the “repeat” part of that is really important.

by LonerVamp on November 28, 2010 at 6:16 pm. Reply #

I’ve worked a lot with people who have attempted to solve problems without having good fundamentals of engineering and systems-thinking concepts under their belt, and, let me tell you, they are usually not very successful in implementing a solution that is effective in terms of cost or resources.

I’m sure that it’s possible to produce successful work in a technological security arena without implementer know-how, but it’s quite rare in my experience and usually involves leveraging some unseen skill. This is where I usually use that cobbler gnomes analogy that I seem to be fond of employing.

More and more I’m becoming convinced that a single philosophical approach to quality is not sufficient when it comes to managing elements of these complex systems. When I was helping a friend of mine, a lawyer, prepare to chair a compliance conference, I realized that information security/compliance/assurance/whatever is very much like economics, in that people can be very well qualified and still have massive disagreements on advisable courses of action. Sometimes even on basic principles.
http://www.youtube.com/watch?v=d0nERTFo-Sk

I also very much appreciate how you, like me, advocate collecting data to actually demonstrate the effectiveness and success of a project or program.

As you say, management science and behavioral understanding seem to be the most elusive skillsets among those actually tasked with achieving success in these areas.

Hopefully, either through data-driven metric and framework efforts or a management fashion movement, we can emerge from the Vendor Compliance Industrial Complex that we as an industry seem to be trapped in at present. Until then, I’ll continue to enjoy employment remediating the latest Compliance In A Box or SDL Made Easy programs 🙂

What I would like to hear more of are accounts of people who have built programs that are operationally thrifty or efficient, rather than programs that are merely not abject failures, or that are “winning” by number of products deployed, legions marching, or the volume of obvious statements written in trade journals. Then again, as I’ve seen mentioned recently, CISOin’ ain’t easy.

I’m hoping that the real total costs of operational inefficiency, consulting, contractors, low-bid outsourcing, and other high operational and remediation costs will someday be accounted for in the same way budgets and staffing levels are. Take it from me: the more consultants and contractors on permanent staff, the less likely the organization is healthy.

I’m totally going to use the Moneyball example with super sportsfans. 😀

by Ian on November 28, 2010 at 7:38 pm. Reply #

Oh, I didn’t complete my thought at the bottom there. What I was intending to say was that once the full cost of perceived piecemeal cost savings is fully appreciated:

For example, I’ve seen circumstances where better pay rates could have been used to keep rock stars on staff, instead of paying ten to twenty times that cost (or more) to staff an outsourced team without the insider knowledge needed to be effective.

Paying world class technology managers like world class financial managers has not been acceptable even though many of the exposures are at least as severe. It’s been left to the octopus of human resources to manage compensation and people they do not understand.

Another facet of the systematic problem, but possibly one that can be addressed when there is a sufficient amount of data.

by Ian on November 28, 2010 at 8:29 pm. Reply #

Alex – it’s like you covered about four books’ worth of topics in this blog post; impressive compression ratio. This post is fueling thoughts I’ve been having about the role of InfoSec and how success is measured. To your tennis analogy, we cannot measure failure in terms of an individual security breach, nor is success marked by the lack of a breach. You nailed it with “studying performance independent of the [individual] result” (having no knowledge of “MoneyBall”, I updated your quote; hopefully it fits). That means we want to measure the effectiveness of actions across the board and correlate that to breach methodology/data.

But you also bring up the adaptive adversary. With that in mind, not only do we need all sorts of data, but we also need to understand when that data should be challenged/updated because of adversarial adaptation; that ain’t the case with physics, as you point out. Nice post.

by Jay Jacobs on November 29, 2010 at 3:32 am. Reply #

Hey, Alex –

Interesting post –

I will remind you of our discussion in Barcelona about decoupling threat and event frequency from loss magnitude, and why, in many circumstances, it’s just impossible to make a reasonable estimate of event frequency. We still need to understand threats, controls, and vulnerabilities, but how to relate them is maybe not as straightforward as we might have thought in our naïve FAIR days.

Re: tennis and Nadal v Murray – I would argue that competitive tennis is all about risk management. Given the structure of the game and its scoring, the winner of a close match is the player who manages/wins more of the key points. In a close match, it’s usually the player who out-thinks/out-adapts the other.

Do you remember how Sampras, at the end of a set when he was up a break, would appear to shift into low gear on his opponent’s serve? He’d just kind of let the other guy hit aces, conserve his energy, and then serve it out with four serves at 5-4.

It may make us feel good to battle fiercely to 40-40 in each game, but if we lose the next two points, we lose the game, and the set, maybe 6-0.

For this reason, I would disagree with your statement that Murray played a better match than Nadal. Nadal won; that’s the metric that matters in the end. It may benefit Murray to study how he came to lose the key points, but he probably knows that already, and knowing vs. doing at that level is not necessarily easy. I saw a statistic a while back stating that Federer, when he was the undisputed best player in the world, won only 55% of points on average.

Maybe it’s the same thing in infosec: we do have all these metrics and measures that we can track, but at the end of the day it really comes down to whether we won or lost, and how much it costs us when we lose.

by Patrick Florer on November 29, 2010 at 3:07 pm. Reply #

Patrick Florer states that the only metric that matters is who won the match. Similarly, I would say that the only metric that counts is whether any end behaviors took place other than those listed and pre-authorized as acceptable according to the business rules, which ultimately can be broken down into a per-user allow-or-deny decision (data access or a functional executable).

There is nothing subjective about assessing whether those rules were upheld: they either were or were not, as determined by observable behaviors and outcomes. (I believe you wrote a post about this once.)

This rules out arguments about engineered solutions failing to adapt to creative attackers. The business rules should be a constant supporting the goals of the business, and thus your controls need to be as well.

Obviously this is a different model, but until infosec maps more directly to the business rules, metrics will lack utility. A pretty good post just the same. Just my two cents.

by Rob Lewis on November 30, 2010 at 1:25 am. Reply #

[…] “vulnerability-centric?” The New School of information Security answers this question for us. Click here to read what they had to say and it’s a guarantee good […]

by Top 3 NoVA Infosec Blog Posts of the Week | NovaInfosecPortal.com on December 3, 2010 at 3:19 pm. Reply #

