Last updated at Fri, 30 Aug 2024 19:31:43 GMT

I woke up from a dream this morning. Maybe you can help me figure out what it means.

Your company hired me to build a security program. They had in mind a number of typical things. Build a secure software development lifecycle so app developers didn't code up XSS vulnerabilities. Improve network security by deploying new firewalls and rolling out IDS sensors. Set up training so people would be less likely to get phished. Implement a compliance program based on NIST or ISO. And you wanted all of that rolled out in a way that didn't disrupt operations, upset employees, or slow down the business people signing partnership deals and doing M&A work.

I didn't do that. Instead, I hired a bunch of red team attackers. I bought them Metasploit Pro, WiFi attack hardware, and literally anything else they thought would be cool.

I also convinced the CEO to do something odd. (You know I'm an actual hypnotist, right? Perhaps you find yourself smirking just a little at that thought. Feels good, doesn't it?) I convinced her to let me pay the red team minimum wage. And I convinced her that if the red team was able to acquire certain key bits of intellectual property, source code, customer data, marketing plans, or financial data, they'd be able to claim a bounty. That bounty would be the quarterly bonus of their victims. If the red team can get those flags, they stand to win big. They stand to claim quarterly bounties measured in the millions.

I announced this plan at the quarterly all hands. I heard some murmuring in the audience. When people went to the microphone to ask questions they seemed stunned, and asked for clarifications, if I was really serious, and if the CEO approved. It wasn't until the next morning that I faced the angry mobs. People from across the company were lined up outside my office.

The head of HR was first. He pointed to printouts of the graphs the security team gave him each month. These graphs showed that his team was the most likely team in the company to get phished. By a lot. And most likely to not use a password manager (meaning they were almost certainly using weak passwords, and reusing them in multiple corporate and external sites). And that they were least likely in the company to keep their systems patched and free of personal applications. He pointed out that HR in general, but Recruiting in particular, were sitting ducks. After all, he reminded me, what do the recruiters do all day? They receive emails from people they don't know with attachments purporting to be resumes. They click on all those purported PDFs and Word documents without question. And they click on any links that might either help them understand a candidate's history, or lead to a phishing or malware site.

He said the team was concerned that under those conditions, there was no hope of keeping their bonuses. If just one recruiter slipped up, everyone in Recruiting would have to explain to their spouses why they were getting less money this quarter. He started to raise his voice at me. “Have you seen the stats showing how bad companies are at patching? Have you seen the stats on the percentage of companies that are compromised by phishing? Have you even read the Verizon DBIR, Bob?!”

I reminded him of some recent decisions that he had approved. I reminded him that he had several important people on his team who had previously demanded to bring their own laptops from home and to connect them to the corporate network. I reminded him that he had personally overridden the security team's concerns about malware, data loss, and other security issues. He was furious. “This is SERIOUS, Bob!” (I almost asked, but did not, “Wasn't it serious when it was just customer data on the line? What about you personally losing money made it more important? Never mind, I just answered my own question.” )

I asked him what he suggested.

“First, I don't know how you can sit there in good conscience with so little control over the laptops. Given the example of recruiting, why on earth would you let IT give them a general purpose computing device with admin rights? That's insane! Take those away and give them Chromebooks with hardware backed certificates, and lock down the network so they can't bring other machines. And don't let me catch you rolling these out without a full time VPN so Infosec has complete pcaps for inspection even if they are off the corporate network!”

I stammered a little. “W-What if something goes wrong with their laptop, or it's lost or stolen? You can't have people out of work over technology failure!”

“Haven't you been listening to me, Bob? It's far more important that we do everything we can to protect our IP and customer data than it is to guarantee 100% uptime for every employee 100% of the time. If someone is out sick and didn't bring their laptop home, they won't work and we'll deal with that. If a remote employee has a laptop stolen, they can wait 24 hours until the new one gets shipped out. Bob, each of the risks of not doing these things looks small on its own. But they add up, and in ways that are hard to predict. Those small advantages are how the red team will get in, either through clever hacking or social engineering. And I'm not going to have my team members lose income because you and IT gave them technology that was insecure by design.”

With that he stormed out of my office. But they weren't done with me yet. Next up was an engineer I barely recognized.

As he sat down he immediately said “Looks like you're going to be having a good day! Here's the deal. I do all the right things from a security perspective, but I'm pretty much alone in that regard on my little team. I have two problems. First, the way we do builds. Second, the way we do appsec. For builds, as you know, all developers have the entire code repo on their laptops. Now if source code is one of the red team targets, we're doomed. Bob, I make an OK living as a junior dev, but not so well that I can do without that bonus. Here's my pitch: You need to move development into the datacenter. Make it so there is never any source code outside of a secure enclave. It's simply got to be easier to manage the security of a central system with known inputs and outputs, right? Plus, get this: The other devs are going to like it. Why? Because they get to do their builds on a $30,000 machine rather than a $2,000 machine. And that machine is on a 10G network rather than on wifi, so pulling a fresh copy of the tree takes seconds. Now, if source code is stolen, there's no way you can take my bonus from me. It simply cannot be my fault. It's going to be the security team that loses their bonus! Sorry, but I care more about me getting my bonus than you getting yours. True fact.”

“OK,” I said. “What was your other concern? Appsec, was it?”

“Yeah. We say we do application security here, but you and I both know that's a joke. Some engineers take security seriously and do really great code reviews. But most have no idea what to look for. They didn't learn any security basics in college and we've done almost nothing to remedy that sad fact. So here's what I'm going to ask the VP of Engineering, so I'm asking you also. We need 2 full quarters to ramp up. We need to stop what we're doing and start security boot camp.”

“But, we have that,” I protested.

“What, the one-day class that half the engineers opt out of? That's not what I mean. I mean a real boot camp where we not only learn secure coding practices, but spend time in labs learning to attack code. And you don't get permission to check in any more code until you pass both the offensive and defensive tests. Most won't pass the first time, and that's OK. Learning to think like a hacker is a major mindset change. Bob, you can't really learn to write code securely unless you've spent time acting as the attacker. And once you get into the groove, it's actually a ton of fun.”
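The attack-then-defend lab he's describing could be as small as a single exercise. Here's a hedged sketch of one: the same lookup written two ways, so students first inject through the string-built query and then watch the parameterized version shrug off the identical payload. The table and payload are invented for illustration; a real bootcamp would use a fuller lab environment.

```python
import sqlite3

# Toy in-memory database for the exercise.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", 1), ("bob", 0)])

def find_user_vulnerable(name):
    # Offensive half of the lab: the query is assembled by string
    # formatting, so attacker-controlled input can rewrite its logic.
    query = "SELECT name FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Defensive half: a parameterized query treats the input strictly
    # as data, never as SQL, so the same payload matches nothing.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"                  # classic injection payload
print(find_user_vulnerable(payload))     # leaks every row in the table
print(find_user_safe(payload))           # returns an empty list
```

The point of pairing the two functions is exactly the mindset change he's after: you only really believe the fix once you've made the attack work yourself.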

“But that's not going to take 2 quarters. Maybe a month…”

“Yeah, but here's the problem. If any of our existing non-secure code is implicated in the attack path, some team, maybe MINE, will lose their bonus because of code we wrote over 2 years ago. That's not fair! If we're going to make things fair, we need not only time to become modestly proficient in secure coding, but we need to review any code that the red team might use against us. We need to make sure everyone understands static and dynamic code analysis, and how to do a proper code review. We have to find critical code modules that should only be touched by gurus. We need to assign each critical module to an appropriate owner. We need tooling that will prevent obviously broken code from going into production. And the list goes on. And all that is going to take time. I'm confident that we can make the code close to bulletproof, but it's a long way from there now and we need the time.”
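The "tooling that will prevent obviously broken code from going into production" could start as a pre-merge gate. This is a minimal sketch, not anyone's actual pipeline: the `BANNED` deny-list and the demo file are hypothetical, and a real gate would run a full static analyzer such as Bandit or Semgrep rather than a handful of regexes.

```python
import re
import sys

# Hypothetical deny-list of patterns that should never reach production.
# A real gate would delegate to a proper static analysis tool.
BANNED = {
    r"\beval\(":           "eval() on untrusted input",
    r"\bpickle\.loads\(":  "unpickling untrusted data",
    r"verify\s*=\s*False": "TLS verification disabled",
}

def scan(filename, source):
    """Return a list of (filename, line_no, reason) findings."""
    findings = []
    for line_no, line in enumerate(source.splitlines(), start=1):
        for pattern, reason in BANNED.items():
            if re.search(pattern, line):
                findings.append((filename, line_no, reason))
    return findings

def gate(changed_files):
    """changed_files maps filename -> source. Nonzero means block the merge."""
    findings = []
    for name, source in changed_files.items():
        findings.extend(scan(name, source))
    for name, line_no, reason in findings:
        print(f"{name}:{line_no}: blocked: {reason}")
    return 1 if findings else 0

if __name__ == "__main__":
    # Hypothetical changed file that should be rejected.
    demo = {"handler.py": "data = eval(request_body)\n"}
    sys.exit(gate(demo))
```

Wired into CI as a required check, even a gate this crude changes the incentive: shipping a known-dangerous construct stops being a code-review judgment call and becomes a build failure.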

“I'm not sure how I'm going to convince the Product teams to let you go dark for 2 quarters…” I warned.

“Those guys?!” he blurted out with snark that almost rivaled infosec engineer snark. “How long do you think it will take the red team to completely own them? I think they'll get religion in the next few days. Or maybe by lunch.”

Up next was a Director in the Finance team. He wasted no time. “So what are you going to do about the Finance people who move millions of corporate dollars around on the same machine they use for Facebook? Huh?! They all use old versions of Excel and Word on an old version of Windows. They insist on IE rather than Chrome or Firefox. Have you seen how many IE toolbars they have? How can those machines NOT be compromised?! It's a miracle we haven't sent millions of dollars to some overseas crime syndicate. And take a look at their workspaces. They have yellow stickies with bank passwords on them under keyboards. Does the red team get to come in late at night and go through people's desks? If so, we're going to lose our bonuses tonight! I have the team huddling today to come up with a list of improvements they need to make to secure employee and company data and money. I expect your full support in reducing the attack surface for my team and in improving security training. Oh, and if you don't migrate us from this weak password-based authentication to something like a PKI-based hardware token, there's going to be hell to pay!” And off he went.

One of the IT managers came in and sat down. “Let's dump all our corporate machines.”

“Excuse me?”

“Yeah, we have all these machines. Some are in the building, some in our colo. Let's get rid of all of them.”

“And do what? No mail, no wiki, no HR apps…”

“We need to move to the cloud. We shouldn't have any internally hosted services. Move all our apps to cloud providers.”

“You told me once that the cloud wasn't secure…”

“That's before I knew we were actually going to be attacked!”

“You mean you didn't think the company was a valid target for disgruntled ex-employees, competitors, pranksters, hacktivists, crime syndicates, or nation states? That NO ONE would want to break in?”

“Well I guess it was a possibility, but now it's a certainty and we need to take action, Bob! This is serious! Plus, think about it this way: I have exactly zero people looking at mail logs, applying patches, or anything else that might help further security. And remember when that cloud provider admitted last month that they had detected a breach using their internal security tools? The way I figure it, that shows that they are actually doing security right. They had actually put effort into Detection, not just Prevention. And they clearly had also invested in the Respond and Recover functions of the NIST framework. And even if they need to do better, what they have is already much more than I'll ever be staffed to do. If we move to the cloud, we'll get continuous upgrades, better security, and my team can focus on much more strategic projects. So please assign someone from your team to make sure my team can make this change quickly and securely.”

Finally, the CEO came in. She confided in me that she had just realized that her own bonus was on the line, and that it was a considerable sum of money. “Do you think the red team will come after me?” I told her that I honestly didn't know, but now that she mentioned it, probably. “It won't be hard for them, will it?” she asked. “Um, well, probably not,” I replied quietly.

“They won't come after my personal accounts. Right? Bob???”

I took a deep breath and explained that the red team was going to do what the bad guys do, and that it's common for the bad guys to do extensive research on targets, often lasting months, and to use personal information in furtherance of their attack on the company.

“That's hardly sporting of them!”

“Are you talking about the red team, or the criminals who want the data for which you are the top custodian?” She ignored the question.

“I heard you have a document on how to secure personal accounts so you never get hacked. Please forward a copy to me. Looks like I'm going to be up late tonight.” And with that, she left.

And then I woke up.

That was the dream I had. Or maybe you can classify it as a nightmare. Either way, it's a useful thought exercise. I like this thought exercise because it fixes, in one stroke, the underlying problem we have today in security.

The core problem with security today isn't about technology. It's about misaligned incentives. We are trying to push security onto people, teams, and processes that just don't want it. The push model of security hasn't worked yet. If we want security, we need a pull model of security.

We need to align incentives so that everyone demands security from the start, and so that we give them systems and networks that are secure by design. We need to have serious conversations about the relative priorities of customer data, employee preference, and perceptions of employee productivity. We need to be open about the hard economics and the soft costs (like reputational damage) of a breach. And if it's cheaper to clean up after a breach than to prevent it, let's say so.

Short of the extreme “all red team, all the time” thought exercise, I don't have any easy answers. But I do have some suggestions that might help nudge the incentives in the right direction so maybe we can get a little more “pull”, allowing us to “push” a little less. I'll describe some of those thoughts in an upcoming blog post.

Have a story or a dream for me about incentives that worked? Or went awry? Drop me a line on Twitter at @boblord. If you want to tell me confidentially, send me a DM! My settings allow you to DM me even if I don't follow you.