
Who Is Responsible for Cybersecurity?

Last year was a lot of things. Among them, 2021 was the year of cybersecurity threats. And this year seems on track to top it. The question is becoming not if, or when, but who. Who exactly is responsible for cybersecurity?

Part of the problem is that we’ve entered a world where zero trust and accountability butt heads with the aims of a safe-to-fail culture, making it unclear who should ultimately be in charge of cybersecurity. Yes, there are clear roles at larger organisations, but there’s also the argument that everyone should own some of the responsibility.

Today we won’t necessarily answer this question, but we will give you techniques for fostering a cybersecurity mindset across different roles and functions. And we’ll give you at least one really big reason everybody should care a whole lot more.

Is remote work the problem?

We’ve already written about the ups and downs of hybrid and remote work from a sociocultural perspective. But something we haven’t discussed is the security of it all—or lack thereof. Both contemporary ways of working are much more willy-nilly about device choice and connectivity. Employees at home may have sensitive documents on their laptops, or printed out and then tossed in the recycling. And while full-time employees are probably working on company-owned computers, they are also typically on less secure home and café Wi-Fi networks. Plus, with the lack of boundaries around working hours, they’re certainly connecting via their personal phones. And then there are the contractors and freelancers, who are inherently BYOD.

Let’s face it, March 2020 happened quite suddenly. We had to do the best we could to keep going—as much of the tech industry kept everything else going—but it meant that corners were cut as we rushed to work from home. We gave too much access to too many people too quickly, and most of it hasn’t been reviewed since. And it shows.

Forrester and Tenable co-published the results from a survey of 1,300 security leaders called Beyond Boundaries: The Future of Cybersecurity in the New World of Work. They found that 72% of U.K. organisations attribute recent cyberattacks to vulnerabilities in technology put in place during the pandemic. These “business-impacting” attacks and compromises included loss of customer and employee personally identifiable information, interruption of day-to-day operations, ransomware payouts, financial loss or theft, and even intellectual property theft.

Another 68% suffered attacks that specifically targeted remote workers. Most of those worked by compromising unpatched systems, but some simply exploited the lack of instant feedback that only synchronous, co-located communication provides: Hey, did you get that email? Should I open that attachment? Those quick gut checks don’t translate naturally to a Slack conversation. Human error and a general lack of awareness of your own risk vectors are normal.

So how can you create a culture of cybersecurity when you are struggling to create any remote culture at all? You start by communicating why you’re doing it.

Cybersecurity just got more urgent

Container Solutions colleague Anne Currie already wrote about how planning your response to the climate crisis is risk mitigation—as governments will eventually catch up to regulate and fine tech’s massive carbon footprint.

It’s much the same with cybersecurity, except a little further along. Last year, President Biden signed an extensive executive order dedicated to improving U.S. cybersecurity. It doesn’t have the teeth of a law yet, but that should come very soon via acquisition policy—basically, the U.S. federal government won’t grant contracts to services that don’t meet certain cybersecurity standards. And contractors’ employees are even incentivised to whistleblow if they don’t believe their employer is meeting these exhaustive federal security requirements.

These requirements include security considerations across the entire software supply chain, and they apply not just to software being built going forward but also to software that has already been built, released, and integrated with.

There’s also cautious optimism that this regulation won’t be drafted in the tech-illiterate vacuum that Congress often inhabits. At least, following the Log4j vulnerability, the White House reached out to Apple, Google, Amazon, and IBM for software security advice. Of course, it’s still unclear who will evaluate the security of software—proprietary, open source, and third-party integrations alike. It’s also not clear whether it will cover just the most recent version of a piece of software or all versions, including ones that may no longer be maintained or patched.

Only time will tell, but for many reasons, a holistic, cross-organisational focus on cybersecurity is essential to risk mitigation.

We spoke to Leslie Weinstein, a major in the U.S. Army Reserve concentrating on cyber-intelligence, and a specialist leader at Deloitte focusing on the U.S. government and public services. She says the purpose of the executive order is to direct NIST—the National Institute of Standards and Technology—to create software standards that organisations would then be required to meet, retroactively.

“This is a kind of warning shot across the bow to industry: You know what you need to do here, go ahead and do it before we make this official, because, by the time we make it official, it’s going to be too late to go back and redo it from the beginning”, she said.

Who’s to blame?

When things go wrong, who actually is responsible for cybersecurity? If a developer accidentally opens a back door for attackers, is that a fireable offence? If an employee willfully neglects security—whether by not updating their password quarterly or by going for a clever Obama-style “123457”—are they at fault for a breach? When leadership fails to act on cybersecurity recommendations and a massive ransomware attack takes down whole health services, where do the fingers point? And what if an employee “blindly executed a misguided loyalty” to their employer, as one of the convicted Volkswagen emissions scandal developers claimed—will they be held responsible?

Before you start talking about developing a culture of cybersecurity, you have to be clear about consequences and accountability. And honestly, that’s just not clear at most organisations.

Cybersecurity is a cross-functional endeavour. Just as with sexual harassment and data privacy training, HR needs to get involved in rolling out cybersecurity compliance education programmes. Organisations often have cybersecurity best practices built into their employment contracts.

Does the responsibility lie with whoever writes the code? “Developers don’t care about security—or at least they haven’t historically”, Weinstein said, not wrongly. The Linux Foundation’s 2020 FOSS Contributor Survey found only 3% of developers were interested in being responsible for security.

This creates constant friction, where the CISO’s department is seen as a bottleneck by under-pressure dev teams rushing to release faster and faster. Part of the answer is that software development teams need to get better at articulating the business consequences of their code. They have the domain knowledge, and they need to feel safe to speak up.

“If they’re being instructed to use a certain standard, potentially their senior managers may not be able to see the wrong that’s going on if they’re not doing it. So ultimately, everybody who’s been told to do it has that responsibility to do it. If they’ve not been told to do it that way, then they can’t be held responsible, because they wouldn’t know”, Weinstein said.

Ultimately, according to Weinstein, the buck stops with the c-suite. “It’s their responsibility to ensure that everybody in their company knows these standards, and why it’s important, so that if the developers don’t do what they’re being instructed to do, then it’s obviously the developer’s fault.”

The U.S. Department of Defense created the Cybersecurity Maturity Model Certification as a contractual requirement for vendors across its supply chain. It requires a C-suite executive to put their name on it. That person is then called out if there are any complaints. It’s their responsibility to cultivate a cyber mindset—to make clear that it’s not just about press and compliance, but about operations and revenue.

And this communication must be done continuously, not just once a year.

Everyone is accountable

After 15 years in the U.S. Army Reserve, Weinstein’s perspective on this is clear: We’re all in this together. “When you enlist in basic training, they tell you that everyone’s a safety officer. If you see something, say something. You’re responsible for everyone’s well being all the time. So, don’t let your battle buddy get fucked up because you don’t think that was your place” to speak up.

But everyone needs to better understand why to be motivated to comply. Weinstein gives the example of how we are forced to wear seatbelts and helmets as children, which at the time seemed like a nuisance because we didn’t understand their life-saving benefits until much later. “We need to educate employees not just what they need to do to be secure but why. Why do we have passwords? Why do we change our passwords? What happens if you don’t do it? There needs to be this culture shift from ‘Do it because I say’ or ‘because it’s a compliance requirement’, to explaining why we do it.” She continued that it’s important to explain that “circumventing these controls is dangerous. Here’s what happens when we don’t do this.”

Cybersecurity culture is about giving everyone a sense of ownership once they understand that urgency.

Of course, like all things in continuous delivery, there are still a few security best practices that all organisations should automate from the top down, like enforcing multi-factor authentication. Once that is done, it’s about examining your risks at a company level, an environmental level—Are you all on-site? All remote? A mix?—and an industry level. Some organisations or departments zoom too far in on a specific compliance area—PCI-DSS, ISO, GDPR, SOC 2—but there is a lot of overlap, which calls for a more holistic approach to an organisation’s risk management.
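To make “automate it from the top down” concrete, here is a minimal sketch of enforcing one such control, an organisation-wide multi-factor authentication requirement, through GitHub’s REST API. It assumes a GitHub organisation, an owner-level token in the GITHUB_TOKEN environment variable, and a hypothetical org name; treat it as an illustration rather than a drop-in tool.

```python
# Minimal sketch: require two-factor authentication across a GitHub
# organisation. Assumes an owner-level token with admin:org scope.
import os

import requests

GITHUB_API = "https://api.github.com"
ORG_NAME = "example-org"  # hypothetical organisation name


def enforce_org_mfa(org: str, token: str) -> None:
    """Turn on the org-wide 2FA requirement via PATCH /orgs/{org}."""
    resp = requests.patch(
        f"{GITHUB_API}/orgs/{org}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        json={"two_factor_requirement_enabled": True},
        timeout=30,
    )
    # GitHub rejects the change if some members don't yet have 2FA
    # enabled, so communicate the rollout before flipping the switch.
    resp.raise_for_status()


if __name__ == "__main__":
    enforce_org_mfa(ORG_NAME, os.environ["GITHUB_TOKEN"])
```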

When in doubt, Weinstein says, “Follow the information. So, wherever the information flows, we need to make sure it’s protected.” This means you’re responsible for considering not just what you’re building but the third-party APIs you’re integrating with, the devices your colleagues are using at home and so much more.

The force of Murphy’s Law is strong in the cybersecurity space, so have a plan for when you’re hacked or attacked, and make sure it includes cross-organisational communication. One of the most important things is to be open in discussing what happened, how you responded, and what can be changed—and automated—in the future. A blameless postmortem is essential to answering why things happened, not who is to blame. Usually. While that reflection may be performed only among those involved, it’s important to publish the results widely throughout the organisation—cybersecurity is a continuous learning process, and everyone should have access to those lessons.

With the expected updates to U.S. cyber requirements, “You’re responsible for whatever you’re using in your software. Whether it’s open-source code or it’s proprietary code, you’re still responsible for putting it into your software.” Weinstein continued that therefore, “you’re responsible for delivering secure products, doing your due diligence”, and being able to prove you have, even when things go wrong. That only happens with open communication.
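What might “doing your due diligence” on dependencies look like in practice? Here is a minimal, hedged sketch that checks the exactly pinned packages in a Python requirements.txt against the public OSV vulnerability database (https://osv.dev). The file path and the exact-pin assumption are illustrative; a real pipeline would generate a full SBOM and scan every ecosystem you ship.

```python
# Minimal sketch: query OSV for known vulnerabilities in each
# pinned PyPI dependency of a requirements.txt file.
import requests

OSV_QUERY_URL = "https://api.osv.dev/v1/query"


def check_package(name: str, version: str) -> list[str]:
    """Return IDs of known vulnerabilities for one PyPI package."""
    resp = requests.post(
        OSV_QUERY_URL,
        json={
            "package": {"name": name, "ecosystem": "PyPI"},
            "version": version,
        },
        timeout=30,
    )
    resp.raise_for_status()
    # OSV returns an empty object when no vulnerabilities are known.
    return [v["id"] for v in resp.json().get("vulns", [])]


def audit_requirements(path: str = "requirements.txt") -> None:
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "==" not in line:
                continue  # this sketch only audits exact pins
            name, version = line.split("==", 1)
            vulns = check_package(name, version)
            if vulns:
                print(f"{name}=={version}: {', '.join(vulns)}")


if __name__ == "__main__":
    audit_requirements()
```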

Who’s asking?

We started with a question: Who is responsible for cybersecurity? But frankly, we don’t have the answer. Because like most things with cross-functional strategy and tech ethics, it’s often about the process of posing the right questions. And creating psychological safety that enables everyone to feel comfortable enough to answer them.

A final reminder from Weinstein: “Cybersecurity is everybody in every company’s responsibility, not just the people writing the code. Because our whole world is on cyber, everything that the company does is probably on a computer somewhere. So, one weak link can be the downfall for everybody.”

