Does Security Awareness Training Work? The ROI Research

“Does this actually work?”

Every CISO asking for budget, every HR leader evaluating vendors, every CFO signing the purchase order lands on the same question. Security awareness training eats time, attention, and money. What does the organization get back?

We dug through the research. The answer is messier than vendors want you to believe.

Yes. Security awareness training works. But most of it doesn’t.

The research draws a hard line between active, simulation-based training and the passive stuff (videos, slideshows, annual compliance modules). Active training produces real behavior change. Passive training produces completion certificates and not much else.

This isn’t a subtle difference. Organizations running simulation-based programs see phishing susceptibility drop 50-80%. Organizations relying on passive content see single-digit improvements that evaporate within weeks. Same budget line item. Wildly different outcomes.

The most surprising thing in the data isn’t that training works. It’s how badly most organizations are doing it.

The studies worth paying attention to aren’t surveys about whether people “feel more aware.” They tracked what employees actually did when a phishing email showed up.

Aberdeen followed 300+ organizations over two years, comparing those with formal security awareness programs against those without.

The results were stark. Organizations with training experienced 70% fewer security incidents and averaged 5x ROI on program cost. But the fine print matters: the biggest gains came from phishing simulations, not video content. Organizations running video-only programs barely outperformed those with no program at all.

Ponemon surveyed 1,200 IT and security professionals about training effectiveness and the results split cleanly by training type.

Interactive simulations: 72% reported measurable improvement. Gamified training: 68%. Traditional e-learning: 23%.

The gap gets worse over time. Annual-only training showed no lasting behavior change regardless of format. Generic content without role customization underperformed customized programs by 40%. If you’re buying off-the-shelf content and running it once a year, you’re essentially donating money.

The National Institute of Standards and Technology published phishing susceptibility data across federal agencies. Before training, the average click rate on simulated phishing was 33%. The average report rate was 11%.

After 12 months of simulation-based training, click rates dropped to 4% and report rates climbed to 67%.

Here’s the number that should bother every compliance-focused organization: agencies using only compliance-based training saw click rates dip to 28%, then climb back to 31% within three months. They spent the money. They checked the box. They got almost nothing back.

Carnegie Mellon’s research on psychological safety found that blame-free environments produced 3x higher incident reporting. IBM’s 2023 Cost of a Data Breach report confirmed the financial side: 74% of breaches involve the human element, and organizations with trained workforces save an average of $232,867 per incident.

Five sources, not fifty. But they point in the same direction, and the effect sizes are large enough to take seriously.

The research keeps circling back to the same problem: knowledge and behavior are not the same thing.

Employees who complete video-based training can ace a quiz about phishing indicators. But when a well-crafted phishing email lands during a stressful Tuesday afternoon, quiz knowledge doesn’t activate. The employee is thinking about a deadline, not about training they watched three months ago.

Recognizing a phishing email under pressure is a skill, like recognizing a counterfeit bill or spotting a pickpocket. Skills require repetition to develop. You wouldn’t hand someone a pamphlet about swimming and push them into the ocean. But that’s essentially what annual video training does for phishing defense.

There’s also the attention problem. Video completion rates look great because employees click through while doing other work. The LMS says they completed the module. Their brain was in a different meeting.

And memory decay is brutal. Without reinforcement, training content fades within 30-90 days. Annual training creates a brief spike of awareness followed by 11 months of open season.

What actually improves employee security behavior

Every major study lands here. Phishing simulation is the single most effective intervention. Not because it’s fancy, but because it creates real practice recognizing real-looking threats with immediate feedback when you get it wrong.

Monthly simulations at minimum. Quarterly is not enough. Progressive difficulty as employees improve. Non-punitive feedback at the moment of failure. And critically, track reporting metrics, not just click avoidance. An employee who doesn’t click but also doesn’t report has only protected themselves. An employee who reports protects the whole organization. More on this in our cybersecurity exercises breakdown.
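
To make those parameters concrete, here is a minimal sketch of how a monthly simulation loop with progressive difficulty and reporting-focused outcomes could be modeled. Everything here (Outcome, EmployeeState, record_result) is a hypothetical illustration under the assumptions above, not any vendor's API.

```python
# Minimal sketch of a monthly simulation program: escalate difficulty as an
# employee improves, track reporting (not just click avoidance), never punish.
from dataclasses import dataclass, field
from enum import Enum


class Outcome(Enum):
    CLICKED = "clicked"    # fell for the simulated lure
    IGNORED = "ignored"    # avoided the click, but protected only themselves
    REPORTED = "reported"  # reported it, protecting the whole organization


@dataclass
class EmployeeState:
    difficulty: int = 1                                   # 1 = obvious lure, 5 = highly targeted
    history: list[Outcome] = field(default_factory=list)  # per-simulation outcomes


def record_result(state: EmployeeState, outcome: Outcome) -> EmployeeState:
    """Apply one monthly result: raise difficulty after a report, step back to
    basics after a click, and record no punitive 'strikes' either way."""
    state.history.append(outcome)
    if outcome is Outcome.REPORTED and state.difficulty < 5:
        state.difficulty += 1
    elif outcome is Outcome.CLICKED and state.difficulty > 1:
        state.difficulty -= 1
    return state


if __name__ == "__main__":
    alice = EmployeeState()
    for result in (Outcome.CLICKED, Outcome.IGNORED, Outcome.REPORTED, Outcome.REPORTED):
        record_result(alice, result)
    print(alice.difficulty, [o.value for o in alice.history])
```

The point of the sketch is the outcome model: an "ignored" result is not a win, and a "reported" result is the behavior the program is actually trying to produce.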

Role-based customization (40% better outcomes)

Generic training wastes everyone’s time. Your finance team needs BEC recognition and wire transfer verification scenarios. Your executives need whaling awareness and authority exploitation recognition. Your IT staff need social engineering defense and privilege management practice.

Ponemon’s data showed customized content outperforms generic by 40%. That’s not marginal. If you’re running the same security activities for the marketing intern and the CFO, you’re leaving most of your risk unaddressed.

Even good training decays. The research shows awareness returns to baseline within 90 days without reinforcement. A single annual event, no matter how well designed, is a temporary fix.

Effective programs layer touchpoints across the year. Monthly phishing simulations. Short weekly security reminders. Quarterly scenario-based exercises. Annual comprehensive refreshers. This isn’t about more training hours. It’s about keeping the neural pathways active. Our security awareness training guide covers implementation cadence in detail.

This one surprised us. Organizations that punish employees for failing simulations actually get worse outcomes over time. Employees learn to hide mistakes instead of reporting them. They share simulation warnings with each other. They game the system.

Carnegie Mellon found that blame-free environments produced 3x higher incident reporting. Since early detection limits breach damage, a human firewall culture that encourages reporting directly reduces the cost of security incidents.

How do you calculate security training ROI?

This is the section that matters when you’re sitting across from the CFO. Skip the abstractions. Here are the numbers.

IBM’s 2023 Cost of a Data Breach report is the standard reference. Average breach cost: $4.45 million. Breaches involving the human element: 74%. Average cost reduction with a trained workforce: $232,867.

For a typical enterprise program with simulation capabilities:

Per-employee annual cost runs $15-50, depending on platform and features. Administrative time is 2-4 hours monthly for program management. Employee time is 2-4 hours annually for training completion.

Annual training investment: $15,000-50,000 (roughly a 1,000-employee organization at those per-seat rates). Expected breach probability reduction: 50-70%. Expected cost avoidance based on breach probability and cost data: $1.6-2.3 million.

That’s 30-150x return on investment. But only if the program includes active elements like phishing simulation. Passive-only programs don’t improve behavior enough to justify the cost. If your current vendor can’t show you behavior change data (not completion data), you might be in the passive category without realizing it.
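
To make the arithmetic transparent, here is the same estimate worked out in a few lines, assuming a 1,000-employee organization and the IBM figures quoted above; the variable names are ours, and the ranges are the ones from this section.

```python
# Worked version of the ROI estimate above, using the IBM 2023 figures cited in this article.
avg_breach_cost = 4_450_000           # IBM 2023 average breach cost ($)
human_element_share = 0.74            # share of breaches involving the human element
employees = 1_000                     # assumed organization size
per_employee_cost = (15, 50)          # annual per-seat cost range ($)
probability_reduction = (0.50, 0.70)  # expected breach probability reduction

annual_investment = tuple(employees * c for c in per_employee_cost)           # $15,000 - $50,000
expected_exposure = avg_breach_cost * human_element_share                     # ~$3.29M human-element exposure
cost_avoidance = tuple(expected_exposure * r for r in probability_reduction)  # ~$1.6M - $2.3M

roi_low = cost_avoidance[0] / annual_investment[1]   # worst case: ~33x
roi_high = cost_avoidance[1] / annual_investment[0]  # best case: ~150x
print(f"Investment: ${annual_investment[0]:,}-${annual_investment[1]:,}")
print(f"Cost avoidance: ${cost_avoidance[0]:,.0f}-${cost_avoidance[1]:,.0f}")
print(f"ROI: {roi_low:.0f}x-{roi_high:.0f}x")
```

Change the employee count or per-seat cost and the multiple shifts, but the asymmetry between a five-figure program and a seven-figure exposure is what the CFO conversation turns on.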

Training designed to check regulatory boxes without changing behavior produces high completion rates (the only metric anyone tracks), no measurable behavior change, false confidence, and continued vulnerability to basic attacks. These programs exist to satisfy auditors, not to protect organizations.

“Three strikes and you’re fired” policies for simulation failures look tough on paper. In practice, they produce reduced reporting of real incidents, gaming behavior, resentment toward the security function, and zero improvement in actual threat recognition. You’re training employees to fear the security team, not the attackers.

The research is unanimous here. Annual training produces a temporary awareness spike that decays within weeks. Organizations running annual programs see near-zero sustained improvement. It’s the security equivalent of going to the gym every January 2nd.

Training focused on how attacks work technically (packet inspection, malware analysis) misses the point entirely. Employees need to recognize threats, not reverse-engineer them. Save the technical deep dives for the security team.

What metrics should you track for training effectiveness?

Metric                 Poor      Average    Good      Excellent
Phishing click rate    >25%      15-25%     5-15%     <5%
Report rate            <10%      10-30%     30-60%    >60%
Time to report         >24h      4-24h      1-4h      <1h

Trend direction matters more than any single snapshot. A 20% click rate that’s been dropping steadily for six months tells a better story than a 10% rate that’s been climbing. Also track variance between departments, response to new attack types, and overall security incident volume.

Training completion rate measures compliance, not effectiveness. Quiz scores measure recall, not recognition under pressure. Satisfaction surveys tell you employees liked the training, not that it worked. If these are the only numbers your vendor reports, ask harder questions.

Run a baseline phishing simulation before any training. You need a starting point, or you’ll never prove improvement. Then implement monthly simulations with immediate feedback, add role-specific content for your highest-risk groups, and track click rates plus report rates monthly. Aim for 50% improvement in year one. If your vendor can’t commit to that target, explore free training options as a starting point and upgrade once you have budget justification.
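
One way to track that in practice is a small month-over-month calculation like the sketch below. The delivered/clicked/reported counts are invented to mirror the NIST before-and-after rates cited earlier, and the 50% year-one threshold comes from the target above; the function names are illustrative.

```python
# Illustrative tracker for the two metrics that matter:
# click rate (lower is better) and report rate (higher is better).
def rates(clicked: int, reported: int, delivered: int) -> tuple[float, float]:
    """Return (click_rate, report_rate) for one simulation campaign."""
    return clicked / delivered, reported / delivered


def year_one_target_met(baseline_click: float, current_click: float) -> bool:
    """Benchmark from this article: at least a 50% improvement in click rate in year one."""
    return current_click <= baseline_click * 0.5


baseline = rates(clicked=66, reported=22, delivered=200)   # 33% click, 11% report
month_12 = rates(clicked=8, reported=134, delivered=200)   # 4% click, 67% report
print(f"Baseline: {baseline[0]:.0%} click / {baseline[1]:.0%} report")
print(f"Month 12: {month_12[0]:.0%} click / {month_12[1]:.0%} report")
print("Year-one target met:", year_one_target_met(baseline[0], month_12[0]))
```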

If your current program isn’t delivering

The likely problem is one of three things: too passive, too infrequent, or too generic. Audit your current program against the research benchmarks above. If you’re missing a simulation component, add one. If you’re running quarterly or annual, increase to monthly. If everyone gets the same content regardless of role, customize.

When requesting investment, frame everything around breach probability reduction, not compliance checking. Lead with behavior change metrics, not completion rates. Bring peer organization benchmarks. Compare cost per employee to breach cost exposure. The CFO doesn’t care about awareness. The CFO cares about risk-adjusted cost.

Most security awareness training programs fail. Not because the concept is flawed, but because the execution is lazy. Video modules and annual quizzes exist to satisfy a compliance requirement, not to change how people behave when a phishing email hits their inbox.

The programs that work share three traits: they create practice instead of lectures, they run continuously instead of annually, and they customize instead of genericize. The research on this is not ambiguous.

If your training program is producing completion certificates but your phishing click rates haven’t budged, you don’t have a training problem. You have a training program problem. The fix isn’t more of the same. It’s a fundamentally different approach.


See the difference between passive content and active training. Try our interactive security exercises and experience simulation-based learning firsthand.