Unless your company is in the security business, it’s likely many employees view cybersecurity as someone else’s job.
And strictly speaking, they’re probably right. They have their own tasks and passions, and if you’re not in the field (or working to attack companies) you probably don’t like to think about it.
But with attacks rising steadily, personal data being breached regularly (I assume most readers have received multiple notices of exposure), and ransomware shutting down hospitals, colleges, hotels, oil companies, utilities, government services, and more, cybercrime has become an undeniable cost of living in our increasingly convenient, digital world.
The numbers are increasingly hard to wrap our minds around, too. US Deputy National Security Advisor for Cyber and Emerging Technology Anne Neuberger has put the estimated average annual cost of cybercrime at $23 trillion worldwide by 2027.
Okay, fine, that’s very troubling, but you may now be thinking: why does that involve everyone at every organization? Aren’t you exaggerating?
No.
Because 95% of all breaches are due to human error. A recent UK government survey found that among large organizations, 91% experienced phishing attacks and 80% dealt with impersonation.
And when it came to the most disruptive breaches, 61% identified phishing as the source.
Compare all this to the 17% of businesses that saw attacks originate from digital vulnerabilities, viruses, or other malware.
In other words: everyone at your organization is getting attacked.
And with phishing attempts (up a staggering 1,265% in 2023) radically improved by GenAI (websites, copy, translation, imagery, voice and video deepfakes, etc.), and with personal data ever more widely available, these attacks are getting harder and harder to separate from legitimate communications.
In today’s newsletter, I want to consider this challenge further: how do we do more than just shore up the world’s top cybersecurity risk (us)? How can we actually build a communal defense, a culture of security that’s incentivized, practiced, and effective?
Why We’re All Being Targeted
Social engineering attacks are aimed at breaking into our systems, and they often employ highly sophisticated means, like the multi-stage malware examples covered in this PTP Report.
But all that starts once they’re in.
At their core, though, these are often some of the oldest tricks in existence, built around fooling people with impersonation, taking advantage of workplace habits, and playing on our psychology to get us to act without proper caution.
Consider these psychological targets:
- Greed/Curiosity: Perhaps the easiest approaches to sniff out at work, these attacks aim to get us to gamble on authenticity in exchange for personal gain, playing on our desire to get ahead or to get the answer to a question.
- Fear and Urgency: More effective in the workplace are scams framed as emergencies. We’ve all seen popups claiming our machine is infected, but harder to suss out are phone calls (supposedly) from our own IT department, for example in response to a spam email we’ve just received. Aiming for our fight-flight-freeze response, many attackers hope to short-circuit our logic and caution by pushing us to protect either ourselves or the company we work for.
- Authority: Everyone has had a coworker ask them to bypass standard procedure for a special case, an emergency, or a big opportunity. And with the capacity to impersonate people more accurately than ever, more of these attacks are being launched than ever before, in the hope that we’ll trust the expertise or leadership of people we know.
The nature of hybrid work and distributed teams, of course, makes this easier to pull off, as we often work with colleagues we’ve never met in person, or at least deal with mostly through digital channels. For people out of direct, physical contact, quick verification can be harder to get than ever, one of the core challenges of remote work security.
Add to this the interconnection of systems, and it’s essential that everyone on a team understands that attackers don’t need to breach an administrator’s system to get the access they need.
Any point of access at a company can be enough to linger and steal credentials, eavesdrop, or move laterally to get to more protected resources.
Understanding all of this is the most basic kind of social engineering awareness a workforce needs.
Culture Is More Than Just Cybersecurity Training
Most employees who access a computer for work get some form of cybersecurity training, often at onboarding. These programs sometimes include annual signoffs, for example, and aim to increase overall cybersecurity awareness.
Any good employee cybersecurity education program should:
- Increase awareness of terms and risks, making clear the role of human error in cybersecurity incidents
- Encourage workers to take accountability for their own accounts
- Include a response plan (ideally with immediate steps) for the increasingly likely event of an attack
- Include contact and risk handling information
- Encourage ongoing vigilance and collaboration
Even so, most such programs are still likely lagging behind the sophistication of the attacks we’re seeing as we move into 2025.
For all of this to truly work, in my experience, it has to be followed up with practice.
Like the chaos engineering experiments run at Netflix and other companies, cybersecurity training should include regular, simulated attacks that come without warning. I’ve written this many times, but little brings home the practice of data breach prevention like being caught out directly.
Such catching is, of course, best done in-house and constructively. It can also include gamified learning and interactivity, and above all it should be regular, resulting in positive education and increased employee cybersecurity responsibility.
The best training programs are also tailored to roles. Accounting employees, for example, might receive fake financial reports with embedded malware, marketing might be lured toward a watering-hole attack, and engineers might be prompted to review infected code.
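To make the idea concrete, here is a minimal sketch of how role-tailored, unannounced simulations might be scheduled over a quarter. Everything in it (the lure catalog, the roles, the addresses, and the helper itself) is hypothetical and purely illustrative, not a description of any particular tool:

```python
import random
from datetime import date, timedelta

# Hypothetical role-tailored lure catalog; names are placeholders only.
SIMULATIONS_BY_ROLE = {
    "accounting":  ["fake invoice with embedded macro", "spoofed CFO wire-transfer request"],
    "marketing":   ["watering-hole link in an 'industry report'", "fake analytics login page"],
    "engineering": ["poisoned code-review link", "typosquatted package install prompt"],
}

def schedule_quarterly_simulations(employees, quarter_start: date):
    """Pick a random, unannounced day in the quarter for each employee
    and assign a lure that matches their role."""
    plan = []
    for person in employees:
        offset = random.randint(0, 89)  # any day in a ~90-day quarter
        plan.append({
            "email": person["email"],
            "send_on": quarter_start + timedelta(days=offset),
            "lure": random.choice(SIMULATIONS_BY_ROLE[person["role"]]),
        })
    return plan

if __name__ == "__main__":
    staff = [
        {"email": "a.lee@example.com", "role": "accounting"},
        {"email": "m.cruz@example.com", "role": "marketing"},
        {"email": "e.okafor@example.com", "role": "engineering"},
    ]
    for item in schedule_quarterly_simulations(staff, date(2025, 1, 1)):
        print(item)
```

The point is less the code than the pattern: randomness keeps the tests unannounced, and the role mapping keeps them relevant to the work each person actually does.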
I think it’s not idealistic to suggest that every employee should know the cybersecurity incident response plan: where it is located, and whom exactly to contact in the event of a suspected incident. They shouldn’t have to take time to look things up, or fear repercussions, as you want them to have no qualms about reporting it immediately.
Building a security culture includes incentives rewarding cybersecurity awareness, open communication, and the sharing of information.
Roundups of actual attacks that you and others in your industry are facing, for example, can go a long way in giving employees the awareness they need to deal with real-world threats (some of which they may not previously have considered likely, or even possible).
[PTP, for example, publishes a bi-monthly cybersecurity roundup, and though our coverage is broader than I’m recommending here, such reporting can help with keeping up awareness of risks and trends.]
Just this week I read an example of the power of such information sharing in action. Prequel, a software startup founded by Tony Meehan and Lyndon Brown (both of whom have security backgrounds from the NSA), aims to use the developer community to help proactively prevent software bugs.
Working from a database of Linux workload misconfigurations and failures, fed by a large community exchanging information, their software looks for issues pre-emptively in a company’s own configuration.
While this example is focused on a third-party solution and bug squashing, the same technique can be applied to cyberattack risk-sharing in-house, with reporting potentially even tied to cybersecurity incentives for employees.
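As a rough illustration of that idea (and only an illustration: the rule set, keys, and messages below are invented for the example, not drawn from Prequel or any real feed), an in-house version might amount to little more than checking local settings against a shared list of known-risky patterns reported by colleagues or industry peers:

```python
import fnmatch

# Hypothetical shared rule set; in practice this would be pulled from a
# community or company-wide feed of known-risky settings.
SHARED_RULES = [
    {"key": "ssh.password_authentication", "bad_value": "yes",
     "message": "Password auth enabled on SSH; prefer keys plus MFA."},
    {"key": "s3.*.public_read", "bad_value": "true",
     "message": "Publicly readable bucket; confirm this is intentional."},
    {"key": "tls.min_version", "bad_value": "1.0",
     "message": "TLS 1.0 allowed; raise the minimum to 1.2 or higher."},
]

def audit_config(config: dict) -> list[str]:
    """Compare a flat key/value config against the shared rule set."""
    findings = []
    for rule in SHARED_RULES:
        for key, value in config.items():
            if fnmatch.fnmatch(key, rule["key"]) and str(value).lower() == rule["bad_value"]:
                findings.append(f"{key}: {rule['message']}")
    return findings

if __name__ == "__main__":
    local_config = {
        "ssh.password_authentication": "yes",
        "s3.marketing-assets.public_read": "true",
        "tls.min_version": "1.2",
    }
    for finding in audit_config(local_config):
        print("RISK:", finding)
```

The value comes from the shared database, not the checker: the more people who report what bit them, the more issues everyone else catches before they do.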
What’s the Bottom Line?
For leadership, I can imagine the thinking here quickly gets to cost. Everyone agrees robust training programs are fantastic, but how do you earmark more money for them when, at their best, they show no visible results at all (because nothing bad happens)?
Proving a negative is challenging, but many estimates show that hands-on phishing awareness training, for example, can cut successful phishing attacks by 50%, while others point to reduced data breach costs when incidents do occur.
But compared with other cybersecurity measures, regular, quality training with effective follow-up can be an incredibly cost-effective way to reduce your vulnerability and the damages from breaches and ransomware.
There are also free resources that companies can leverage. CSO ran a great list of some of these in November:
- CISA has many trainings and exercises, some free for public use.
- The NIST cybersecurity division publishes its own list: Free and Low Cost Online Cybersecurity Learning Content.
- The SANS Institute, Cyber101, and SC Training all offer free cybersecurity best practices trainings.
- Gophish is one open-source option for simulating test phishing attacks.
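For a sense of how lightweight running your own unannounced tests can be, here is a rough sketch using Gophish’s open-source Python API client (pip install gophish). Treat it as an assumption-laden example: the API key, host, and the named group, email template, landing page, and sending profile are placeholders that would already need to exist in your Gophish instance.

```python
from gophish import Gophish
from gophish.models import Campaign, Group, Page, Template, SMTP

# Placeholder key and host; verify=False only because a default Gophish
# install uses a self-signed certificate.
api = Gophish("YOUR_API_KEY", host="https://localhost:3333", verify=False)

# All of these named objects are assumed to exist already in Gophish.
campaign = Campaign(
    name="Q3 Unannounced Phishing Drill",
    groups=[Group(name="All Employees")],          # existing target group
    template=Template(name="IT Password Reset")ample,   # existing email template
    page=Page(name="Credential Capture Page"),     # existing landing page
    smtp=SMTP(name="Internal Sending Profile"),    # existing sending profile
    url="https://phish-drill.example.com",         # where the landing page is served
)

created = api.campaigns.post(campaign)
print(f"Launched campaign {created.id}: {created.name}")

# Later, pull results to feed the constructive follow-up training.
for c in api.campaigns.get():
    print(c.name, c.status)
```

The follow-up matters more than the launch: results should flow back into education, never into punishment.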
Ultimately, your employees should both be aware of the human factor in cybersecurity and feel comfortable sharing concerns and asking questions.
They should be rewarded, not penalized, for speaking up. This can go hand in hand with tying broader company security goals to KPIs, for example, with potential financial rewards that, once results are demonstrated, provide an organization-wide incentive.
Where PTP Can Help
Cybersecurity is one of the core pillars of what we do at Peterson Technology Partners, and that often means helping organizations staff those hard-to-fill cybersecurity positions (sometimes niche) with quality professionals.
We’ve also helped companies in a multitude of ways over the years, both onsite and off, by securing their in-house systems, as well as designing custom solutions. We have a wealth of experience with upskilling and training and can also help with ongoing support, for assessments of your security posture and responses to evolving threats.
I’ve written before about SRE and the importance of having multiple layers in your defense, but arguably the most important of those layers is made of the human beings at all levels who make companies what they are.
Conclusion
I didn’t even get to regulations (though I wrote about them earlier this year), and with the increasing volume of attacks, governments everywhere are putting more and more responsibility on employers.
This means accountability, as well as being able to demonstrate one’s defensive posture in audits or annual statements. And if you’re in a field that deals with protected information (health data covered by HIPAA, for example), regulation can mandate specific training and levy heavy penalties for failure to comply.
Regardless, the numbers tell the story: the point of attack for most of the cybercrime we’re dealing with today is people, not systems.
The good news is that this is also an incredible opportunity. With better insight, training, and incentives to be part of the solution, we can turn this vulnerability into a strong point, just as the most secure organizations in the world continue to do.
References
Cyber Security Breaches Survey 2024, UK Gov
Prequel is building a community-driven approach to finding software bugs, TechCrunch
Security awareness training: Topics, best practices, costs, free options, CSO