The other day I was talking with a leadership team about their upcoming cybersecurity compliance audit. They were stressed about documentation, checking boxes, and meeting arbitrary deadlines. What struck me was how none of their concerns had a material impact on their actual security. This pattern keeps repeating: companies confuse compliance with security, and in doing so they make things more complex and, ultimately, less secure.
I think this happens because of a fundamental misunderstanding about how complex systems stay secure. We want to believe that if we just write enough rules and build enough boundaries, we'll be safe. It's a comforting thought. But it's wrong.
Richard Cook, an anesthesiologist, gave a talk back in 2013 on “Resilience in Complex Adaptive Systems,” which probably only sounds interesting if you’re into risk, but it’s worth the 20-minute watch!
Most compliance frameworks are built around a simple idea: define boundaries, enforce them, and security will follow. Anyone who's actually operated complex systems knows this isn't how things work. What's remarkable isn't that systems fail - it's that they work at all.
Take log monitoring. Every compliance framework requires it. Seems reasonable, right? But watch what happens in practice: understaffed IT teams drown in alerts, eventually becoming numb to them. The compliance box is checked, but security actually decreases as they run up against the Unacceptable Workload Boundary. I've seen this pattern so many times it's almost predictable.
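To make the arithmetic concrete, here's a minimal sketch with made-up numbers (the alert volumes, staffing, and triage times are hypothetical, not from any real team) showing how the workload boundary gets crossed: the monitoring requirement is satisfied on paper, yet most alerts never get a human's attention.

```python
from dataclasses import dataclass

@dataclass
class AlertLoad:
    alerts_per_day: int       # alerts the monitoring stack emits
    analysts_on_shift: int    # people actually available to triage
    minutes_per_alert: float  # realistic time to investigate one alert

def triage_gap(load: AlertLoad, shift_minutes: int = 480) -> float:
    """Return the fraction of alerts that will never be looked at."""
    capacity = (load.analysts_on_shift * shift_minutes) / load.minutes_per_alert
    if load.alerts_per_day <= capacity:
        return 0.0
    return 1.0 - capacity / load.alerts_per_day

# A hypothetical understaffed team: the compliance box ("logs are monitored")
# is checked, but roughly two thirds of alerts go uninvestigated.
load = AlertLoad(alerts_per_day=1200, analysts_on_shift=2, minutes_per_alert=2.5)
print(f"{triage_gap(load):.0%} of alerts are effectively ignored")
```

Tuning noisy rules, adding automation, or staffing the team changes those numbers; checking the box doesn't.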
The problem isn't just that boundaries are ineffective. They're actively dangerous when people start believing in them too much. There's this phenomenon called "normalization of deviance" that explains why.
As Cook describes, imagine you're teaching a three-year-old to stay out of the kitchen while you're cooking. They'll stand right at the boundary, then stick a toe over. Nothing bad happens. They'll do it again. Still fine. Gradually, they'll push further and further until they're standing right next to the stove. This is precisely how organizations handle security boundaries.
That "temporary" admin access that never gets revoked? The security bypass that becomes permanent? These aren't (yet) accidents. They're the natural result of successfully pushing boundaries without immediate consequences.
What's really insidious about the compliance trap is how it distorts incentives. I've seen companies spend staggering amounts on compliance programs and tools while understaffing their security teams and underusing the solutions they already have.
And compliance work itself pulls resources away from things that matter, like thinking through security architecture and improving systems, to say nothing of the opportunity cost of never building actual security capabilities.
The best-run systems I've seen don't succeed because of their compliance programs. They succeed because they've built resilient operations with deep system knowledge. This isn't just theory - you can see it in action if you know where to look.
I know a company that spends less on security tools than their competitors but has far fewer incidents. The difference? They invest heavily in their operators' capabilities. Their team can tell you exactly how their systems behave, where the weak points are, and how they'd detect and respond to various scenarios. No compliance framework measures this, but it's what actually matters.
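One way to exercise that kind of operator knowledge is a regular detection drill. The sketch below is entirely hypothetical (the `alert_fired` stub and the scenario names are mine, not any particular product's API): each scenario the team claims to cover is paired with a check they can actually run against their own logging and alerting pipeline.

```python
from typing import Callable

def alert_fired(alert_name: str) -> bool:
    """Stub: ask the alerting backend whether the expected alert fired.

    Hypothetical; wire this to whatever system the team actually uses.
    """
    raise NotImplementedError("connect this to your alerting backend")

# Each drill scenario maps a plain-language description to a concrete check.
scenarios: dict[str, Callable[[], bool]] = {
    "credential stuffing against the login endpoint":
        lambda: alert_fired("auth.bruteforce"),
    "service account used from an unexpected network":
        lambda: alert_fired("iam.anomalous_source"),
}

def run_drill() -> None:
    for name, check in scenarios.items():
        try:
            detected = check()
        except NotImplementedError:
            detected = False
        print(f"{'DETECTED' if detected else 'MISSED  '}  {name}")

if __name__ == "__main__":
    run_drill()
```

The value isn't the script; it's that the team knows, scenario by scenario, what they would and wouldn't catch.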
The good security teams I know measure things like:
Notice how none of these are compliance metrics.
If you want to build genuinely secure systems, you need to flip the traditional compliance model on its head. Instead of starting with frameworks and working inward, start with operators and work outward.
This means:
The hard truth is that no compliance framework will make you secure. Security isn't a state you achieve - it's a capability you build. And like any capability, it comes from continuous practice, learning, and adaptation.
The biggest obstacle to better security isn't technical - it's psychological. We want security to be something we can buy, install, or achieve through compliance. The reality is messier. Security is more like health than a certification - it requires constant attention and adaptation.
This is why experienced leadership matters so much. Good leaders understand that security isn't achieved through checkbox compliance but through building resilient systems and empowered teams. They know that real security comes from operators who understand their systems deeply and can adapt to changing conditions.
The companies that get this right don't just pass audits - they build genuine security capabilities. They measure what matters, invest in their people, and focus on actual resilience rather than compliance theater. It's harder than just following a framework, but it's the only approach that actually works.
Rather than chasing compliance checkboxes, organizations should focus on:
The choice is yours: you can optimize for passing audits, or you can optimize for actual security. Just don't fool yourself into thinking they're the same thing.