Field Notes

Gambling and security, a reflection

Sometimes, even when you win, it’s because you made a bad move. As security leaders, we often have more in common with gamblers than we do with engineers.

While engineering tends to be fairly deterministic, security, much like cards, leans heavily on probabilities. Our problem space extends beyond the systems we manage and into the humans populating the real world who act in not-always-rational ways.

When things go wrong in information security (a breach, a leak, or another impact we aim to prevent), it doesn't necessarily mean that something went wrong with our system or planning. It's entirely possible to have a breach while running a tight security program. Likewise, you may never experience a breach despite an extremely inefficient, misguided program. Drawing the right lessons from each experience is what separates the lucky from the best.

You could cover all the basics well and still have a major data leak due to an engineer exfiltrating data. Ideally, you should have caught and prevented that, but in practical terms, it depends on your threat model, risk assessments, and other tools used to inform your priorities.

Your tools may have highlighted that external risks took precedence over the insider threat risk posed by an engineer. Does that mean the tools were wrong? Again, not necessarily.

Just as a card player making statistically optimal plays isn't guaranteed to win every hand, a security leader basing decisions on the best available data and risk calculations won't prevent every security incident.

Possibilities in infosec are infinitely more numerous and complex than those of card games. This has the added joy of making the probability calculations fuzzier than month-old yogurt from the back of the fridge - something most of us have experienced firsthand when trying to convince a skeptical CFO why we need budget to prevent an adverse event with a 73.49% chance of occurring.
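That CFO conversation usually boils down to expected-value arithmetic. A minimal sketch of the reasoning, using entirely illustrative numbers (the probabilities, dollar figures, and function names here are assumptions for the example, not real data):

```python
# Hypothetical sketch: comparing expected annual loss with and without a control.
# Every figure below is an illustrative assumption.

def expected_annual_loss(probability: float, impact: float) -> float:
    """Expected annual loss: chance of the event times its cost if it occurs."""
    return probability * impact

incident_probability = 0.7349   # estimated chance of the adverse event this year
incident_impact = 2_000_000     # estimated cost in dollars if it happens
control_cost = 250_000          # annual cost of the proposed control
residual_probability = 0.10     # estimated chance with the control in place

loss_without = expected_annual_loss(incident_probability, incident_impact)
loss_with = expected_annual_loss(residual_probability, incident_impact) + control_cost

print(f"Expected loss without control: ${loss_without:,.0f}")
print(f"Expected loss with control:    ${loss_with:,.0f}")
```

The fuzziness the essay describes lives in those input estimates; the arithmetic is the easy part, which is exactly why iterating on the estimates matters more than the formula.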

This inherent complexity and uncertainty makes continuous analysis, and continuous updates to your decision-making framework, the key to long-term success.

For the insider threat example, you may want to ask: Did I correctly assess the insider threat risk? Did I overestimate the external threats? Were the controls I picked the best choices? Why, and what could I do to improve going forward?

This constant iteration is the only way to succeed over the long run: honing your models, decision making, and communication over time to reduce uncertainty and drive better outcomes.

We may not be able to make the probabilities less fuzzy, but we can improve how we account for them when conveying risk and making decisions.