
Human Vulnerability in Cybersecurity: Challenges & Opportunities

The human element of cybersecurity is simultaneously the most effective tool in your risk management toolbelt and the most complex to manage. Your staff can be your most robust early warning system, or they can be the entry point for a major cyberattack. How you manage your staff and their approach to cybersecurity dictates your organization’s resilience. 

In this article, we’ll address some challenges and opportunities that the human element of cybersecurity presents. Before that, we’ll cover why your staff is essential to organizational resilience. The article will conclude by pulling together prior discussion points into a cohesive set of recommendations. 

Why Humans Matter

Some cybersecurity practitioners approach security safeguards with the fallacious belief that defense-in-depth infrastructure alone is sufficient to protect an organization. Thankfully, that mindset is slowly being phased out because the data refutes it. 

Cybercrime is projected to cost businesses $10.5 trillion globally in 2025, with a large portion of that flowing to cybercriminals as income. Cybercriminals are strongly financially motivated to grow their criminal enterprises and wreak havoc. 

Humans are the number one attack vector for cyberattacks. They’re especially susceptible to social engineering attacks, which are tricks that play on human psychology to gain access to an environment to execute an attack. 

Technology alone is insufficient to address this most critical vulnerability. 

Organizations must address the human element of cybersecurity. Failing to do so is downright reckless in 2023. Addressing it effectively, however, is a much trickier proposition.

Challenges in Dealing with Human Vulnerabilities

If dealing with human vulnerabilities were easy, it wouldn't be a multi-billion-dollar industry projected to grow tenfold over the next four years. The difficulty lies in how human vulnerabilities are exploited.

Technical exploits are largely binary: if the conditions for the exploit exist, the exploit will occur; if not, it won't. The same principle applies to detection and response solutions and other mitigating infrastructure: if an attack can overcome them, it succeeds; if not, it fails. 

The primary challenge for an attacker is getting past the attack surface boundary. Most organizations substantially harden their attack surface, and an attacker who can't get in can't wreak havoc. 

Humans pose a unique gap in the attack surface. They have complex needs and motivations. They are susceptible to greed and blackmail. Some people have an innate desire to help, regardless of the apparent legitimacy of the request. People are susceptible to an instinctual fight or flight mechanism. All of the above contribute to what makes human vulnerabilities so challenging to mitigate. 

Social engineering attacks take advantage of the above factors. They’ve been developed over hundreds of years—confidence tricks and the con artists who take advantage of them substantially predate modern cybersecurity. 

Unfortunately, modern cybersecurity doesn’t have a magic solution for those issues. The answer hasn’t changed much throughout history: make people aware of confidence tricks leveraged in social engineering and teach them to be savvy against them. While that training is the most effective solution, it doesn’t eliminate the risk of attack; it just mitigates some of it. 

That training also suffers from another issue: situational awareness. Some people can look at abstract concepts and extrapolate from those to apply concepts in novel ways. For example, some people can hear about social engineering and its different approaches in the corporate environment and apply that situational awareness to their everyday work. 

Other people cannot and must have specific situations explained to them. For example, while phishing and vishing are very similar and differ only based on the delivery of the message (email versus voice), some people may need explicit training on both. 

Those differences in educational approach are significant and make selecting appropriate and effective training difficult. Add to that educational issues based on neuroatypicality, and the considerations for effective security awareness training become substantial. 

The same considerations apply to the technical components of cyberattacks. A system compromise may be largely opaque to end users. If end users notice anything at all, it may be an inability to access files or specific resources, system slowness, or other odd behavior. 

Some people may write that off as "technology problems." It's crucial to educate staff thoroughly on what distinguishes a technology problem from a cyberattack. It's also essential that staff not be dismissive of those issues: they should take them seriously and not assume someone else will report the problem. 

Opportunities in Dealing with Human Vulnerabilities

People with broad situational awareness may understand social engineering attacks in general but miss specific triggers. People who need additional targeted training may miss general social engineering attacks but spot those specific triggers. 

In that way, people can provide substantial defense in depth with respect to social engineering attacks. That needs to be cultivated, however. For most organizations, it requires a complete shift in the approach to external communications.

One of the main shifts is investing in a training program that caters to differences in educational approach. Security training needs to be effective. The difference between effective and ineffective training can be successfully mitigating an attack or suffering millions of dollars in easily preventable damage. 

That damage isn't speculative: major cyberattacks happen daily. Treat being attacked as a matter of "when," not "if." Over a long enough timeline, being hit by a cyberattack is all but guaranteed. 

Training efficacy is driven by accounting for different learning styles and the impacts of neuroatypicality. Phishing education can be textual, visually compelling, and engaging. The more options and variety, the more effective the training for more staff. 

While that training is suitable for developing the tools in staff to address vulnerabilities, there also needs to be a free exchange of information. As highlighted above, people learn in different ways and are observant in different ways. To maximize protection, organizations should build a defense in depth through the exchange of information.

That’s largely done through building a culture of security awareness. Inform people of the consequences of cyberattacks. Make sure they understand the importance of mitigating cyberattacks. Even better: incentivize the reporting and mitigation of cyberattacks. Give people a personal stake in exchanging information and improving their responsiveness. 

One way of doing that is by creating cyber-awareness competitions. There are a few flavors of this: phishing competitions where the best department wins, bug bounties, and other incentives for successful mitigation of issues. 
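A minimal sketch of how the scoring behind such a phishing competition might work, with entirely hypothetical department names and simulation results (the +1/-1 weighting is an assumption, not a standard):

```python
from collections import defaultdict

# Hypothetical phishing-simulation results:
# (department, clicked_lure, reported_lure)
results = [
    ("Finance", False, True),
    ("Finance", True, False),
    ("Engineering", False, True),
    ("Engineering", False, True),
    ("Sales", True, False),
    ("Sales", False, False),
]

def score_departments(results):
    """Tally each department: +1 per reported lure, -1 per clicked lure."""
    scores = defaultdict(int)
    for dept, clicked, reported in results:
        scores[dept] += (1 if reported else 0) - (1 if clicked else 0)
    return dict(scores)

scores = score_departments(results)
best = max(scores, key=scores.get)
print(best, scores[best])  # Engineering 2
```

Rewarding reports (rather than only penalizing clicks) matters: it reinforces the reporting behavior the competition is meant to cultivate.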

By creating a culture of cyber-awareness and promoting good behavior, people will be incentivized to share information with peers. The practice of "gut checks," informally validating potential threats with colleagues, supplements that perfectly by encouraging the identification of threats early and often.

That identification and awareness should be supplemented by clear, low-friction escalation pathways. Staff needs to be empowered to report issues and to find support for the issues they report. If staff is disincentivized from reporting potential threats, or if other friction is introduced into the reporting process, staff will be less apt to report them. 
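One way to picture a low-friction escalation pathway is an intake that accepts every report unconditionally and acknowledges it immediately, deferring all judgment to later triage. This is a hedged sketch with hypothetical field names, not a prescribed schema:

```python
from datetime import datetime, timezone

def submit_report(reporter, description, queue):
    """Accept every report unconditionally; triage happens later, never at intake."""
    report = {
        "reporter": reporter,
        "description": description,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "status": "pending_triage",
    }
    queue.append(report)
    # Acknowledge immediately so the reporter is never left wondering
    # whether the report went anywhere.
    return f"Thanks, {reporter}: your report has been logged for triage."

queue = []
ack = submit_report("alice", "Odd invoice email asking for gift cards", queue)
print(ack)
```

The design choice is the point: nothing at intake can reject or question a report, so the cost of reporting a false positive is as close to zero as possible.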

Supporting staff in their efforts is also key. Some reports will be false positives. It can be difficult to distinguish a legitimate business email compromise from a new staff member at a vendor who is unfamiliar with preexisting processes. Likewise, it can be difficult to distinguish a social engineering attack aimed at harvesting credentials from someone who legitimately forgot their credentials and the information needed to verify their identity. 

If staff is punished for reporting false positives, they will stop reporting altogether. If you want staff to alert you to threats, you have to accept that there will be false positives. Unfortunately, while humans aren't binary, the effects of punishment and reward are blunt. Where you use reinforcement, make sure it's appropriate for the goals you want to achieve. 

Recommendations for Effective Human Vulnerability Management

Based on the foregoing discussion, there are ways you can overcome challenges and leverage opportunities to create true defense-in-depth human vulnerability management. Here are some straightforward recommendations for how to do that:

  • Operate as if being attacked is a foregone conclusion. It is, and accepting that reality makes for a more robust security program. Your emphasis needs to be on ending threats quickly and cost-effectively, not on hoping to avoid an attack that could cost your organization millions. 

  • Play to people’s strengths: invest in a security training platform that provides many forms of engaging training. 

  • Understand that not everyone can catch everything: you need multiple checks in place for social engineering threats.

  • Cultivate a culture of awareness: make sure people understand what security awareness is and that they don’t need to operate in isolation.

  • Emphasize collaboration: more threats are identified, and identified sooner, when people compare notes. Make sure your staff feels comfortable sharing information and validating whether something is a threat. 

  • Escalation should be comfortable and easy: the more barriers you put in the way of security incident escalation, the less likely security incidents will be reported. Effective controls are functional controls, and in the human space, that means removing reporting barriers. 

  • Incentivize threat identification: reinforcement is a powerful tool and can be used to shape desired behavior as well as eliminate undesirable behavior. The more you do to incentivize accurately searching for threats, the greater likelihood that threats will be identified and eliminated. 

  • Empower staff: your organization's human attack surface is broad, and anyone can be impacted by a cyber threat. Make sure everyone is empowered to act on those threats. Nothing compromises the efficacy of your cybersecurity program more than preventing a line of defense from defending. 

  • Punitive measures should be reserved for egregious behavior. Some people won't adopt a culture of security; even worse, they may actively act as insider threats. In those cases, punishment should be leveraged to disincentivize the behavior. 

That list isn't comprehensive, but it should give you an idea of how to shape a cybersecurity awareness program that builds a culture of awareness and hardens your organization against social engineering and other attacks that target humans. 

Those safeguards will pay dividends elsewhere. Your technical security infrastructure may miss threats that people will pick up. Files may become inaccessible, the network or computers may slow down, or people may notice other atypical behavior. Where your systems miss a threat, people can catch it. 

By identifying problematic behavior and encouraging reporting, your staff can be a great early warning system. They will adaptably spot issues quickly and address them effectively if empowered to do so. 

Conclusion

Investing in your staff and building a culture of security awareness can pay dividends by averting costly and debilitating cyberattacks. Doing so requires acknowledging the challenges of working with people. They’re susceptible to attack and difficult to train effectively. 

However, if you look at those challenges as an opportunity to build a highly knowledgeable and adaptable defense-in-depth early warning system, your staff can be an excellent first line of defense. 

The best way to do that is to educate your staff about cyberattacks and social engineering schemes, empower them to discuss and report, and encourage the identification of incidents. Your staff may lack the accuracy of your technical infrastructure, but they’re significantly more adaptable and will address threats in a way your technical infrastructure cannot. That will mitigate crippling attacks. In my opinion, there’s no better way to do so.