The information security community loves lists, cycles, and other guides to action. We have important steps that must be followed, but no two investigations are exactly the same, and every one requires a bit of improvisation.
So how do you balance these needs? It is the paradox of disciplined steps mixed with room to adapt to the situation. Many groups are posing solutions: some useful, some not so useful, and some completely misconstrued.
The OODA Loop
OODA is easily the most misunderstood cycle on this list. It is simple but powerful; so powerful that it tempts people to overthink it. At its core, OODA is an abstraction of the typical human thought process. Developed by fighter pilot and tactician Col. John Boyd to understand fighter engagements, the OODA loop is applicable to most decision cycles, including incident response decisions.
Boyd's OODA Loop / Wiki
While the annotated version above is complex, at its core the OODA Loop is made up of four simple steps.
➢ Observe: Gathering data that may be relevant for action.
➢ Orient: Next, the information is contextualized by the decision maker.
➢ Decide: Finally, based on the contextualized data, a decision is made.
➢ Act: The decided upon course of action is carried out.
At this point the cycle restarts based on observing the results of the action.
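The four steps above can be sketched as a minimal loop. This is only an illustration of the cycle's shape; the function names and data structures are my own, not anything Boyd prescribed:

```python
def observe(environment):
    # Gather raw data that may be relevant for action.
    return environment.copy()

def orient(raw_data, experience):
    # Contextualize the data against prior knowledge and experience.
    return {"context": raw_data, "models": experience}

def decide(situation):
    # Choose a course of action from the oriented picture:
    # act if we have context to act on, otherwise keep gathering.
    return "act" if situation["context"] else "gather_more"

def act(decision, environment):
    # Carry out the decision; the result feeds the next observation.
    environment["last_action"] = decision
    return environment

def ooda_cycle(environment, experience, iterations=3):
    # The cycle restarts each pass by observing the results of the action.
    for _ in range(iterations):
        raw = observe(environment)
        situation = orient(raw, experience)
        decision = decide(situation)
        environment = act(decision, environment)
    return environment
```

The key structural point the sketch preserves is that `act` mutates the environment, so the next `observe` sees the consequences of the previous decision.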
OODA in Incident Response
“Under OODA loop theory every combatant observes the situation, orients himself…decides what to do and then does it. If his opponent can do this faster, however, his own actions become outdated and disconnected to the true situation, and his opponent’s advantage increases geometrically.” - John Boyd
So how does this actually apply to incident response? In a tactical sense this happens all the time:
➢ Observe: Monitoring via network (Intrusion Detection), system (Anti-virus), and environment sensors (Social Network Monitoring).
➢ Orient: Data is compared against known-bad indicators and known-good whitelists, and is weighed against analyst experience.
➢ Decide: An analyst decides what to do based on data discovered. This could be to decide it is a true positive (start remediating), a false positive (make a plan for tuning), or inconclusive (gather more data).
➢ Act: The analyst initiates their plan to either start remediating, request tuning, or continue gathering more data.
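The Orient/Decide portion of this defender loop can be sketched as a toy triage function. The field names and indicator sets here are hypothetical; a real pipeline would be far richer:

```python
def triage(alert, known_bad, known_good):
    """Classify an alert into the three outcomes above.

    Assumes an alert is a dict with an 'indicator' field, and that
    known_bad / known_good are sets of indicator strings. These are
    illustrative structures, not a real product's schema.
    """
    indicator = alert.get("indicator")
    if indicator in known_bad:
        return "true_positive"    # start remediating
    if indicator in known_good:
        return "false_positive"   # make a plan for tuning
    return "inconclusive"         # gather more data
```

Note that the third branch is what restarts the loop: an inconclusive decision sends the analyst back to Observe.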
And the cycle begins again. Where this gets even more fascinating is when you compare this against the OODA loop being carried out by an attacker:
➢ Observe: The attackers gather information based on implants, usually including scanners, keyloggers, etc.
➢ Orient: The access available is contextualized based on goals at the current phase of the compromise: whether persistence has been achieved, the nature of current access, etc.
➢ Decide: The attacker makes a plan to expand their access, solidify current access, evade investigators, or complete objectives (steal data, turn up centrifuges, etc).
➢ Act: The attacker executes their plan using the capabilities they have available.
These two loops run simultaneously, competing with each other every second. Whichever actor can complete the loop faster, and keep completing it, is most likely to achieve their objective: over time, the group that OODAs faster holds the decision and action advantage. The big caveat is that in an adversarial setting these loops overlap, meaning the data collected during the observation phase may no longer be accurate by the action phase. This is why the relative speed at which each actor completes the loop is so important.
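The relative-speed argument can be made concrete with a toy model: over a fixed window, the actor who completes more loops acts on fresher observations. The cycle times below are invented for illustration, not measurements of real defenders or attackers:

```python
def cycles_completed(loop_seconds, window_seconds):
    # Full OODA cycles an actor finishes within the window.
    return window_seconds // loop_seconds

def decision_advantage(defender_loop_s, attacker_loop_s, window_s=3600):
    """Toy comparison: whoever completes more loops per window
    holds the decision/action advantage. Illustrative only."""
    d = cycles_completed(defender_loop_s, window_s)
    a = cycles_completed(attacker_loop_s, window_s)
    if d > a:
        return "defender"
    if a > d:
        return "attacker"
    return "neither"
```

For example, a defender cycling every 5 minutes against an attacker cycling every 15 completes three times as many loops per hour, so the attacker's observations are stale for most of that hour.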
Takeaways from OODA
The great thing about OODA is that it gives us a framework to talk about how we make decisions. At its core, what Col. Boyd showed in a concrete way is something everyone implicitly understands: whoever makes decisions and acts fastest has a distinct advantage.
“The future has already arrived. It’s just not evenly distributed yet.” - William Gibson
For incident response, our goal ultimately needs to be to OODA faster than attackers. In most cases, improving a DFIR capability comes down to being able to OODA faster as a team. New sensors improve your ability to observe, while threat intelligence data improves orientation. OODA is not something you put into practice directly, but it can structure how you think about decision making and what you can do to improve the speed with which you make decisions.
Ideas for Orient
1. Build a robust toolbox of mental models.
The more mental models you have at your disposal, the more you have to work with in creating new ones. During a presentation at the Air War College in 1992, Boyd warned his audience of the way in which strict operational doctrines can stifle the cultivation of a robust toolbox of mental models:
“The Air Force has got a doctrine, the Army’s got a doctrine, Navy’s got a doctrine, everybody’s got a doctrine. But if you read my work, ‘doctrine’ doesn’t appear in there even once. You can’t find it. You know why I don’t have it in there? Because it’s doctrine on day one, and every day after it becomes dogma. That’s why….”
Doctrines have the tendency to harden into dogmas, and dogmatism has the tendency to create folks with “man with a hammer syndrome” – it causes people to keep trying to apply that same old mental model even if it’s no longer applicable to the changing environment. You see “man with a hammer syndrome” in businesses that stick to a tried and tested business model even though the market is moving in another direction. Kodak is a perfect example of this. So too is Blockbuster. They continued making hard-copy movie rental a primary part of their business even though more and more consumers were streaming movies via the internet. Blockbuster eventually tried to shift their business model, but it was too little, too late.
You also see “man with a hammer syndrome” in folks who discover some pet theory and start applying it to every. single. situation in life without considering other factors. People who are fans of evolutionary psychology are prone to this. To them, all human behavior can be explained by it. Why are men more jealous than women? Because in primitive times they couldn’t know if they were really the father of a baby or not. Why do we get depressed? It used to help people concentrate on their problems and figure out how to remove themselves from a bad situation. While our evolved psychology certainly plays a big role in our behavior, other factors are also involved in why we do what we do. It’s foolish to discount those.
It’s for this reason that Boyd advocated for familiarizing yourself with as many theories and fields of knowledge as possible, and continuing to challenge your beliefs, even when you think you’ve got them figured out:
“Well, I understand you’re going to have to write doctrine, and that’s all right… But even after you write it, assume it’s not right. And look at a whole lot of other doctrines – German doctrine, other kinds of doctrines – and learn those too. And then you’ve got a bunch of doctrines, and the reason you want to learn them all is so that you’re not captured by any one, and you can lift stuff out of here, stuff out of there…. You can put your snowmobile together, and you do better than anyone else. If you got one doctrine, you’re a dinosaur. Period.”
The more doctrines, or mental models, we have at our mental fingertips, the more materials we have from which to construct our figurative snowmobiles.
Charlie Munger advanced a similar argument for the necessity of having a widely varied library of mental knowledge in a speech he gave at the USC Business School in 1994:
“You’ve got to have models in your head. And you’ve got to array your experience — both vicarious and direct — on this lattice work of models. You may have noticed students who just try to remember and pound back what is remembered. Well, they fail in school and in life. You’ve got to hang experience on a lattice work of models in your head. What are the models? Well, the first rule is that you’ve got to have multiple models — because if you just have one or two that you’re using, the nature of human psychology is such that you’ll torture reality so that it fits your models, or at least you’ll think it does…So you’ve got to have multiple models. And the models have to come from multiple disciplines — because all the wisdom of the world is not to be found in one little academic department.”
Munger has repeatedly emphasized in his speeches that reality is an interconnected ecosystem of factors that influence one another. Thus, to understand this ecosystem, you need to apply multiple models in an interconnected fashion. John Muir put it best when he noted: “When we try to pick out anything by itself, we find it hitched to everything else in the universe.”
So all this talk of having multiple mental models raises the question: what sort of models should you put in your toolbox?
Both Boyd and Munger give some suggestions. In his presentation of Strategic Game of ? and ?, Boyd lays out seven disciplines every military strategist (or any person strategizing how to win any kind of conflict or competition) ought to know:
➢ Mathematical Logic
➢ Conflict (Game Theory)
Boyd emphasized that his list was of course not exhaustive and that other mental concepts should be pursued as well. In other presentations, Boyd hinted that biological evolution and quantum mechanics are additional mental models every master strategist should have a grasp of.
Munger’s list includes the following mental models:
➢ Math (Munger is particularly fond of the algebraic idea of inversion, that is, to solve a problem you address it backwards)
➢ Accounting (and its limits)
➢ Engineering (according to Munger, the ideas of redundancies and break-points are applicable outside of engineering and can be applied to business)
➢ Psychology (specifically the cognitive biases that cause us to make terrible decisions)
➢ Evolutionary biology (can provide insights into economics)
I would personally add philosophy, literature (and its accompanying models of interpretation), and basic common law principles (like torts, contract law, and property law) to the list.
Because Boyd and Munger are thinking “Big Picture,” their mental model examples are purposely general and abstract. But it’s important to remember that mental models can be specific and concrete. To thrive in your job, you’ll need certain mental models specific to your career. To survive a lethal encounter, you’ll need certain mental models unique to tactical situations. Learn as many mental models as you can, and create as exhaustive a lattice work as possible, so you have more to work with in the creation and destruction process.
Some of these subjects can certainly be intimidating for people with no experience in them. To get started, take a look at the resources section in our article on lifelong learning – particularly the online courses. Coursera, for instance, has a number of introductory classes on calculus, econ, competitive strategy, etc.
2. Start destroying and creating mental models.
Your fluency in destroying and creating mental models will only come with practice, so start doing it as much as you can. When you’re faced with a new problem, go through the domains above in a checklist-like fashion and ask yourself, “Are there elements from these different mental concepts that can provide insight into my problem?”
Perhaps there’s a principle from engineering, the works of Plato, and biology that can help create a new mental model that matches up with your new reality.
Start a journal with your destruction and creation experiments. Suss out new mental concepts with writing and doodles. You may be surprised by the insights you’ll gain from this exercise.
As you practice destroying and creating mental models, you’ll find that it becomes easier and easier to do on the fly. It will become almost intuitive. In Mastery, Robert Greene described the great military strategists of history as having a “fingertip feel” for how to proceed on the battlefield. These strategists were simply effective and efficient at orienting. They didn’t have to deliberately think about the process; they just did it. That should be your goal.
3. Never stop orienting.
Because the world around you is constantly changing, orientation is something you can never stop doing. “ABO = Always Be Orienting” should become your mantra. Make it a goal to add to your toolbox of mental models every day, and then immediately start atomizing those models and fashioning new ones.
4. Try to validate mental models before operation.
Ideally, according to Boyd, you want to be fairly confident that your mental models or concepts will work before you actually need to use them. This is especially true in combat or life-or-death situations where rapid cycling of the OODA Loop is crucial (more on tempo below).
How do you validate mental models before operation? You study what mental concepts have and haven’t worked in similar situations, and then practice, train, and visualize using those concepts. Think of a basketball team down by one basket with just seconds left on the clock, inbounding the ball. They’ve already spent weeks practicing plays designed for exactly these circumstances; now they just have to execute the plan.
Having field-tested mental concepts at the ready is important even when time isn’t of the essence. In business, you can read case studies of what has and hasn’t worked for other companies and have models, concepts, and strategies at the ready that you can implement immediately when similar situations arise. Of course, if those don’t work, you’ll need to continue the process of orientation until you create a new mental model better suited for the situation.
When your observations about your environment match up with certain proven mental models, you don’t have to do any destroying and creating, you just have to act. If you look at the complex diagram of the Loop above, you’ll notice that Boyd makes room for the ability to skip the “Decide” step — note the line that goes from “Orient” to “Act” and bypasses “Decide.” Boyd called the ability to quickly orient and act “Implicit Guidance and Control.” It’s something similar to the “fingertip feel” Greene talks about. A person who has achieved mastery in a specific domain should be able to quickly notice when reality lines up with a specific mental model and then execute that mental model without having to decide. You just act.
I can’t stress enough the importance of the orientation step. It’s at the heart of the OODA Loop and determines whether you implement it successfully. If you don’t act on the mental model that most closely matches your environment, you’re going to lose no matter how quickly you cycle through the Loop.
ABO = Always Be Orienting.