I've spent over a decade watching organizations get compromised - both as an attacker and as the person called in afterward to help ensure it doesn't happen again.
The Path of Least Resistance
The pattern is always the same: a series of small mistakes that compound into catastrophic failure. Each oversight seems minor in isolation. But like water leaking through a foundation, security flaws allow hackers to follow the path of least resistance to your data.
Let me tell you about one particularly illuminating case. The details have been changed to protect the innocent (and the guilty), but the lessons remain painfully relevant.
When Security Only Looks Good in Theory
On a Tuesday morning in March, Sarah Chen, Acme Corp's newly appointed CISO, stood in front of the board trying to explain how $47 million in customer data had been stolen. Her predecessor had left her a "fortress" - at least that's what the compliance documents claimed. Multiple security certifications hung on the wall behind her. The crown jewels, their data processing system, sat behind state-of-the-art access controls and encryption. The board had approved millions in security spending over the years. So how were they here?
The answer began, as many security catastrophes do, with something seemingly robust and innocuous: the company's website, running on a corporate content management system (CMS). Like many enterprises, Acme had checked all the right boxes. They had their dedicated security team, regular penetration tests, and a thick binder of security policies. But what they didn't have was institutional memory or enough focus on systems thinking.
The CMS itself was unremarkable - a standard enterprise solution that dutifully received its patches. But beneath its modern interface lay years of accumulated complexity, each feature and configuration option a small landmine. While it came with decent security defaults, no one had ever performed a systematic hardening review. The security team was too busy fighting other fires, the marketing team just needed to be able to add content, and besides, it was "just a website."
The First Domino Falls
The breach probably stemmed from Alex, a junior IT tech who had joined during the pandemic. (We'll set aside, for now, leadership's failure to invest in secure architecture and configuration initiatives.) Tasked with supporting the installation of a modern CMS, he needed power users on the Marketing team to be able to easily upload image assets. He found a Stack Overflow post that solved his problem and made a configuration change. Power users were happy, the ticket was closed on time - what could go wrong?
Nine months later, at 3 PM on a Friday, that seemingly innocent file upload function became the company's undoing. Through careful research and some luck, we discovered that the upload sanitization could be bypassed with a specific combination of file extensions. One carefully crafted file later, we had our foothold on one of the CMS servers.
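The exact bypass isn't the point, but the pattern is common enough to sketch. The snippet below is a hypothetical validator (not Acme's actual code) that checks only the first extension of an uploaded filename, so a name like logo.png.php slips through and gets stored with an executable suffix; the stricter variant shows one way to close that gap.

```python
# Hypothetical sketch of the flawed check (illustrative, not Acme's code):
# the validator looks only at the FIRST extension, while the web server
# decides how to handle the stored file based on the LAST one.

ALLOWED_EXTENSIONS = {"png", "jpg", "jpeg", "gif"}

def is_allowed_naive(filename: str) -> bool:
    # BUG: "logo.png.php" passes because parts[1] == "png", yet the file
    # is stored with a .php suffix that a permissive server may execute.
    parts = filename.lower().split(".")
    return len(parts) > 1 and parts[1] in ALLOWED_EXTENSIONS

def is_allowed_stricter(filename: str) -> bool:
    # Safer: require exactly one extension and validate the FINAL one,
    # since that's what the server will act on.
    parts = filename.lower().split(".")
    return len(parts) == 2 and parts[-1] in ALLOWED_EXTENSIONS

if __name__ == "__main__":
    for name in ("banner.png", "logo.png.php", "shell.php"):
        print(f"{name}: naive={is_allowed_naive(name)} strict={is_allowed_stricter(name)}")
```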
In a properly segmented network, this should have been contained. The CMS was supposed to sit in an isolated zone, like a vestibule or sally port. But two years earlier, during a push to "improve documentation workflows," someone had punched a few holes in the walls. It was supposed to be temporary. It wasn't.
One of those holes led to an internal file server, which had access to an engineering team's backup system. Within those backups lay the keys to the kingdom: production access tokens, carelessly included in system snapshots. The encryption was robust, but in a cruel twist of irony, the decryption keys sat in the same backup set - a choice probably made during a late-night deployment to "make disaster recovery easier."
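Defensively, that failure mode is cheap to check for before it matters. Here's a minimal sketch - assumed tooling, not what Acme actually ran - of a pre-archive scan that looks for obvious credential material (private keys, tokens, API keys) in a backup staging directory before the snapshot is written to the backup set.

```python
# Minimal sketch (assumed tooling): scan a backup staging directory for
# obvious credential material before the snapshot is archived, so tokens
# and private keys never land in the backup set in the first place.

import re
from pathlib import Path

SECRET_PATTERNS = [
    re.compile(rb"-----BEGIN (?:RSA |EC |OPENSSH )?PRIVATE KEY-----"),
    re.compile(rb"(?i)(api[_-]?key|access[_-]?token|secret)\s*[:=]\s*\S+"),
    re.compile(rb"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
]

def scan_snapshot(root: str) -> list[tuple[str, str]]:
    """Return (path, matched_pattern) pairs for files that look like they hold secrets."""
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            data = path.read_bytes()
        except OSError:
            continue
        for pattern in SECRET_PATTERNS:
            if pattern.search(data):
                findings.append((str(path), pattern.pattern.decode(errors="replace")))
                break
    return findings

if __name__ == "__main__":
    # "/backups/staging" is a hypothetical path for illustration.
    for file_path, rule in scan_snapshot("/backups/staging"):
        print(f"possible secret in {file_path} (matched {rule})")
```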
From there, the path to the financial data was a bitter lesson in the difference between security theater and security reality. The network segmentation that looked so good in architecture diagrams had slowly eroded through years of "just this once" exceptions. The production credentials were gold, providing direct database access. While the customer records were encrypted, we found a decryption library sitting in an internal code repository, documented with helpful examples.
The final act - exfiltrating the data - was anticlimactic. The compromised CMS server, still dutifully serving website visitors, became the perfect conduit. After all, who would question outbound web traffic from a web server?
As Sarah concluded her presentation, the real lesson became clear. Security isn't just about technology - it's about how people, processes, and technology fit together as a system. Each decision that led to this breach had seemed sensible in isolation; at least the incentives behind it were logical. But together, they created a path through their paper fortress, one small compromise at a time.
The Hidden Cost of "Just This Once"
The unsettling part? Few of the individual decisions that enabled this attack were obviously wrong in isolation. Each one (aside from the exposed backup file) had a legitimate business justification at the time. But together, they created a vulnerable pathway through virtually every defense.
Sometimes, the most dangerous security weaknesses aren't the obvious ones. They're not the unpatched systems or the weak passwords. Those are identified through audits and penetration tests. The real dangers are the accumulated exceptions - the small compromises made over years of operations, each reasonable on its own but collectively adding straw to the camel’s back.
The missing security controls we abused do matter, of course, but the more interesting takeaway is about how we think about security exceptions and medium- or low-severity findings. Every time we make an exception to security policy or deprioritize a finding, we're not just poking one hole - we're potentially creating one piece of a path that could later connect to other holes.
Why Traditional Security Audits Fail
The solution isn't to never make exceptions - that's unrealistic in a business environment. Instead, we need to think about exceptions differently. Treat them like technical debt that needs to be regularly reviewed and refactored. Document them. And, importantly, trace how they could potentially interact with each other.
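What that can look like in practice: give every exception a structured record with an owner, the systems it touches, and an explicit review date, then flag anything overdue. The sketch below is illustrative only - the field names and review policy are assumptions, not a description of Acme's actual process.

```python
# A sketch of treating security exceptions like technical debt: each
# exception is a record with an owner, a justification, the systems it
# touches, and an explicit review date. (Field names are assumptions.)

from dataclasses import dataclass
from datetime import date

@dataclass
class SecurityException:
    identifier: str
    justification: str
    owner: str
    affected_systems: list[str]
    granted: date
    review_by: date

    def is_overdue(self, today: date | None = None) -> bool:
        return (today or date.today()) > self.review_by

EXCEPTIONS = [
    SecurityException(
        identifier="EXC-042",
        justification="Temporary firewall rule for documentation workflow",
        owner="it-ops",
        affected_systems=["cms-web", "internal-file-server"],
        granted=date(2021, 6, 1),
        review_by=date(2021, 9, 1),
    ),
]

def overdue_exceptions(exceptions: list[SecurityException]) -> list[SecurityException]:
    """Flag every exception whose review date has passed."""
    return [e for e in exceptions if e.is_overdue()]

if __name__ == "__main__":
    for exc in overdue_exceptions(EXCEPTIONS):
        print(f"{exc.identifier} overdue for review; touches {', '.join(exc.affected_systems)}")
```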
In Acme's case, they've since implemented more robust controls and risk management processes. Every security exception now requires documentation of the business justification and a more detailed analysis of how it could affect other systems. Exceptions are automatically flagged for review, and dedicated red team exercises focus specifically on identifying these attack paths.
The sad reality is that most major security breaches aren't the result of sophisticated zero-day exploits. They're the result of attackers finding creative ways to connect the dots between existing weak points. As information security leaders, we need to get better at identifying these connections before they do. It isn't about building perfect walls. It's about understanding how imperfections can connect in unexpected ways. The most dangerous vulnerabilities often hide in the shadows of our assumptions.
Connect the Dots Before Attackers Do
The critical vulnerability here wasn't a misconfigured CMS - it was the backup system containing production access tokens and decryption keys.
The attack worked because of a chain of "reasonable" exceptions. You may have segmented your networks, but are those segments still as restricted as you expect after years of exceptions?
Given enough time, compromise of a complex system like a CMS is almost inevitable. Instead of trying to prevent it entirely:
Get creative, whiteboard, and think through the architecture of your systems, connections, and users. Attackers think in graphs; defenders think in lists. Don't let that be true of your organization.
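One way to practice that graph thinking is to model assets and access relationships explicitly and enumerate every path from an untrusted entry point to your crown jewels. The toy model below mirrors the story above; the node names are illustrative, not a real inventory, and it assumes the networkx library is available (pip install networkx).

```python
# A toy model of "thinking in graphs": nodes are assets, edges are the
# access relationships created by configurations and exceptions.

import networkx as nx

g = nx.DiGraph()
edges = [
    ("internet", "cms-web", "public website"),
    ("cms-web", "file-server", "firewall exception for documentation workflow"),
    ("file-server", "backup-system", "engineering backup share"),
    ("backup-system", "prod-database", "access tokens stored in snapshots"),
]
for src, dst, reason in edges:
    g.add_edge(src, dst, reason=reason)

# Enumerate every path from an untrusted entry point to the crown jewels.
for path in nx.all_simple_paths(g, source="internet", target="prod-database"):
    print(" -> ".join(path))
    for src, dst in zip(path, path[1:]):
        print(f"    via: {g.edges[src, dst]['reason']}")
```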