
Why People Still Fall for Cyber Scams
“The brain is not designed to easily discover truth, but to easily jump to conclusions that are good enough.”
Introduction
A “short pause” can be the difference between a click and a hack
In our line of work, we help companies reduce their staff’s phishing click rates and change their cyber culture from one of awareness to one of vigilance. We do this by presenting course material and learning solutions to tens of thousands of people in online sessions. One comment made by delegates has always remained remarkably consistent.
Everyone, irrespective of acumen or experience, believed they would “spot a scam”, until they didn’t.
The reason for this is simple. The brain is not designed to easily discover truth, but to easily jump to conclusions that are good enough to keep us safe, conserve energy and allow us to function in busy, complex environments.
Our intellectual state optimises this jump while our experience pre-empts it, and the only way to force the brain out of idle mode and into vigilant, analytical mode is to physically and mentally stop, and pause long enough to think.
As Daniel Kahneman explains in “Thinking, Fast and Slow”, fast thinking relies on automatic, habit-driven decisions that allow us to act quickly with minimal effort. We rely on these habits to help us save energy, by reusing solutions that have worked before, rather than analysing the same problem again.
Slow thinking, responsible for deliberate analysis and calculation, can step in when something feels unfamiliar, when a situation does not quite fit, or when old solutions no longer work. However, because habits are efficient and comfortable, slow thinking often accepts the conclusions of fast thinking rather than intervening, even when the environment or context has changed.
Deepfakes, fake websites, phishing and social engineering scams exploit our psychological make-up, not gaps in our intelligence or experience. While experience can be a powerful defence against manipulation, and is often relied upon in compliance training, it can still be bypassed when familiar tricks are repackaged in new and convincing ways.
The evolution of everything, but our thinking
AI has industrialised deception, enabling attacks to evolve through the coordinated use of multiple methods and media across almost any platform.
From flawless grammar in realistic, urgent emails to highly convincing fake videos and images, modern attacks feel familiar and believable. They exploit emotional and mental shortcuts in a world where our attention is constantly pulled across devices, apps and platforms, making it easier for these threats to slip through unnoticed.
Imagine being shown a simple booby trap laid out on a table.
A wire stretched between two trees, a concealed pit, a crude trigger mechanism. With time and no pressure, it is obvious. You know exactly where the danger is and how to avoid it.
Now place that same trap in a forest.
You are moving quickly. Your heart rate is up. A bear is charging behind you. Your focus is on escape, not inspection. In that moment, you are not scanning the ground for thin wires or subtle signs. You are looking for the fastest, safest path forward. The trap does not need to be clever. It just needs to be there.
Social engineering scams and phishing emails work in the same way.
In a classroom, a training module, or a policy document, phishing signs are easy to spot. Poor grammar, suspicious links, unusual requests. Out of context and without pressure, they stand out clearly.
But in the real world, people are busy. Inboxes are full. Messages arrive alongside genuine requests, meetings are about to start, and decisions need to be made quickly. Add urgency, authority, or familiarity, and attention narrows even further. The “bear” might be a deadline, a senior executive request, or a financial issue that feels time critical.
Attackers do not rely on people being unaware. They rely on people being distracted, pressured, and focused on something else. Just like the booby trap in the forest, the danger is missed not because it is invisible, but because the moment does not allow for careful inspection and the trap is very well camouflaged.
This is why awareness alone is not enough. The risk appears not in isolation, but in motion, under pressure, and in the middle of real work.
Recent research and a practical solution
A study published in 2023 found that people who slow down and think analytically are significantly better at spotting fraudulent websites and online scams than those who rely on quick, intuitive judgement, and that they made fewer errors overall.
This effect is strongest when people are not under pressure and when they are encouraged to prioritise accuracy over speed.
Time pressure considerably reduces this ability: the shorter the deadline and the greater the pressure to complete a task quickly, the higher the probability that a person will fail to spot a fraudulent website or online scam.
The study highlights that neither advanced technical knowledge nor high intelligence is a prerequisite. Simply changing the mindset from “act quickly” to “be accurate” improves outcomes.
Another striking finding was that distraction and urgency suppress analytical thinking, and modern environments are full of interruptions.
Attackers often do their best to recreate these conditions, but a pause was found to counter this quickly as well.
A second study published in 2022 found that forcing a brief pause significantly reduced phishing clicks. Pauses of between three and seven seconds were effective in shifting the brain from automatic, fast thinking to more deliberate and analytical slow thinking.
A simple, yet powerful, takeaway
A moment of pause, between three and seven seconds, measurably improves our ability to detect deception and prevent phishing attempts.
Cyber risk lives within culture. It is embedded in everyday habits, routine decisions, and the way people operate across an organisation.
Traditional approaches often fail to reflect the reality of modern working environments, meaning learning remains knowledge based, with little practical reinforcement or experiential elements such as emotion, pressure, or context.
Knowing what to do is not the same as doing it under stress. Because our psychology and emotional responses can override established systems, building a deliberate pause into every meaningful or high-risk interaction, even if it takes slightly longer, helps to significantly reduce risk.

As leaders set the tone for risk and safety, we approach the problem from a top-down perspective, focusing on getting executives on board with a culture change that emphasises, among others, these three key points:
- Slowing down is a security skill
Many work environments are pressured, with speed often rewarded and encouraged, and many cyber incidents succeed because people rush and make bad decisions. When leaders normalise pausing and double-checking, and remove unnecessary urgency from work environments, they create space for better judgement and safer decisions.
- Questioning authority can be protective
Social engineering uses the fear of authority and urgency to exploit victims. When leaders encourage direct but respectful challenge and verification, they remove a key advantage attackers rely on and reduce organisational risk.
- Psychological awareness reduces risk
Recognising how stress, habit and bias affect decisions helps people identify when they are vulnerable. This shifts security from rule following to behavioural resilience embedded in everyday work.
Cybersecurity is not won through more rules, longer policies and tick box compliance training. It is shaped by habits, decisions and moments of pressure where people are required to act quickly with incomplete information.
The organisations that understand this are better equipped to reduce risk, adapt to modern threats, and embed security into the way people actually work.
This is why we focus on the reality of developing a cyber culture. Cyber Dexterity focuses on helping organisations, leaders and teams understand how cyber risk shows up in our daily work routines, and how psychological awareness, realistic scenarios, and behavioural design can shift culture from passive awareness to active vigilance.
Our present and immediate future is haunted by the threat of deepfakes, social engineering and constant distraction. Technology alone is not enough: the strongest defence is not just what people know, but how they think, pause and apply their knowledge when it matters most.
Basil Polydorou – Head of Learning Solutions | BSc Cyber Psychology Candidate