Extending into a security awareness programme

The purple team session is the beginning, not the programme. What it produces is a group of people who have felt both sides of a phishing attempt and who will, for some time afterwards, look at their inbox differently. The programme’s job is to keep that instinct alive, extend it to the threats the session did not cover, and do both in ways that remain grounded in what attackers are actually doing.

Realism is not a feature of this programme. It is the principle the programme is built on. A simulation that does not reflect current techniques does not train current recognition. A session that teaches abstract warning signs does not produce the same response as one where the participant experienced the warning sign being crafted in real time by a colleague who wanted it to be invisible. Every extension of the programme should ask the same question the phishing session asks: is this what a real attack looks like, right now?

The monthly tool test

The month after the session, participants receive a simulated phishing email. Every month after that, so does everyone else.

The framing matters. Before each monthly simulation, a brief message goes to all staff: this month we are testing our phishing reporting tool. You may receive a test email. If you see something suspicious, use the Report button. That is all the notice they get.

This framing is honest, which is why it works. Staff are not being told a test is coming in order to prime them to pass it. They are being told because the exercise is about the tool and the habit, not about catching people out. The effect is that people are alert to their inbox in a way that is exactly the alertness a real phishing campaign would benefit from encountering. The monthly announcement is itself part of the training.

What the test email uses is not announced. The technique is selected from current threat intelligence each month, tested against live defences, and deployed only if it lands cleanly in a real inbox without modification. A QR code campaign one month, a supplier impersonation the next, a legitimate service redirect after that. The variation is deliberate: recognition is a collection of pattern responses and each pattern needs its own exposure under realistic conditions.

Results go to the security function and, in aggregate by department, to management. They do not circulate as individual scores. The metric that matters is not the click rate in isolation but the reporting rate alongside it, and particularly the time between delivery and first report. A population that reports quickly is more valuable than one that simply does not click, because the same habit that drives reporting also drives calling the director to verify an unusual request before acting on it.

Cohorts and cadence

The afternoon session runs once per cohort. Cohorts follow department lines because the scenarios need to feel plausible to the people in the room. The impersonation that lands for the finance team does not land the same way for the night shift, and vice versa. Madame Zara’s behavioural support team will notice different signals than Kevin’s IT function. Running mixed seniority within a department is fine. Running mixed departments is not.

Monthly simulations run for the full staff population simultaneously. This ensures that a technique observed in the wild this month is tested across the organisation this month, not in the cohort rotation order. The session and the monthly simulation are separate programmes serving different purposes: the session builds the instinct, the simulation sustains it.

Volunteers receive the essentials at induction: what phishing looks like, what to do, who to contact. The detail is in the volunteer awareness section.

Further ideas for the programme

What follows are extensions that share the same philosophy as the phishing session but require their own setup. None of them are off-the-shelf. All of them work for the same reason the phishing session works: participants experience the attack from the attacker’s side before they are asked to defend against it.

The vishing session takes the phishing session’s format and moves it to the phone. The threat intelligence component covers what vishing calls targeting the sector currently sound like, the pretexts being used, and the social engineering techniques that make them effective. Participants then make the calls rather than just receiving a briefing about them. A small group takes turns calling a designated number answered by a colleague playing a staff member, working from a scenario and trying to extract a credential reset, a payment action, or physical access. The phishing session’s shared screen becomes a shared note visible to the room, logging what each caller tried and what response it drew. The debrief is identical in structure to the phishing debrief: what worked, what gave it away, and what the correct response is regardless of how convincing the call was.

The data handling exercise tests how staff respond to sensitive material arriving from an unfamiliar source. A confidential-looking file arrives by an unexpected route: an email from an address that is not in the directory, a shared link from a service the organisation does not officially use, or a USB drive left in the meeting room. Participants decide what to do with it. The exercise surfaces the gap between knowing the policy and acting on it when the document looks genuinely important and nobody is obviously watching.

The pretexting walkthrough is simpler to run than it sounds. A facilitator, or a colleague from another department, approaches the front desk or the main working area claiming a plausible false identity: a contractor sent by Fabulist Systems to check the server room, an auditor from the Consortium, a new volunteer on their first day who just needs someone to let them into the east wing. The exercise is announced in advance as “we are reviewing our visitor and access procedures” without specifying when or how. Participants who are approached are asked, afterwards, how they responded and why. The debrief covers physical access control with the same specificity that the phishing debrief covers email recognition: not abstract principles, but what happened in this building today.

All three of these require preparation, a scenario grounded in current real-world techniques, and a debrief that connects the exercise to practical behaviour. That is the template the phishing session established: experience the attack from the attacker’s side, then debrief it into practical behaviour. The subject changes. The principle does not.