
How cognitive overload and multitasking undermine our ability to spot cyber deception
“When cognitive load exceeds working memory capacity, learning and performance are impaired.”
John Sweller, Cognitive Load Theory.
Cognitive Load Theory was first developed to explain why people struggle to learn in complex situations, but its implications extend well beyond education. Spotting cyber deception is a performance task. It relies on attention, working memory, and executive control, the same cognitive systems responsible for judgement, reasoning, and error detection. When these systems are overloaded by constant interruptions, multitasking, and information noise, performance suffers.
Cybercrime doesn’t happen just because people are careless. It also happens because modern work environments routinely demand more than the brain can handle.
When we tell people to “be more aware”, we assume they have the mental capacity to spot subtle signs, ask questions, and pause before acting. In reality, most of us operate under some degree of cognitive overload most of the time, so even obvious warning signs go unnoticed. Being aware of something doesn’t automatically mean you can act on it.
The myth of the attentive user
Before we delve into Cognitive Load Theory, it’s worth challenging a common assumption in cybersecurity: that users fail because they’re inattentive or careless. The data tells a different story:
By the time most people get to work, their brains have already been taxed by a steady stream of digital communication: emails, messaging apps, notifications, social media, and more. This is often called “information overload”: the point at which incoming information exceeds what we can process effectively.
Once you’re in the workplace, the digital demands start to ramp up. Studies show that the average knowledge worker changes tasks roughly every three minutes and, once distracted, can take almost half an hour to get back to the original task. This pattern of interruption fragments attention and increases cognitive effort.
Email remains a major source of workplace information. Studies suggest the average office worker receives around 100 to 120 work emails a day and can spend up to two hours a day just reading and responding to them.
Constant interruption and task switching are draining as well as unproductive. Interruption science consistently finds that people spend only brief stretches on a task before being interrupted, then need substantial time to recover their focus. This is a major driver of information overload, with well-documented costs to productivity, decision-making, and stress.
Employees report that the volume and frequency of digital messages leave them stressed and burned out. Studies show that a constant flood of digital and communication noise, from emails, messages, meetings, and notifications, reduces focus and increases stress and cognitive fatigue.
Under these conditions, the brain protects itself by filtering aggressively. It’s no surprise that most workers admit to deleting or ignoring work information without reading it. That isn’t negligence. It’s adaptation.
People are interrupted constantly. Email, chat tools, dashboards, and meetings make sustained focus almost impossible. Yet security depends on the same cognitive resources that this fragmentation consumes. If spotting threats depends on noticing small anomalies, information overload is a serious problem. It’s the attacker’s best friend.
Cognitive Load Theory and why it matters for security
Research has repeatedly shown that we can’t multitask the way we like to think we can. The brain cannot genuinely attend to several demanding things at once. What we call multitasking is actually rapid task switching.
Cognitive Load Theory rests on a well-established model of human information processing. In this model, memory comprises three main systems: sensory memory, working memory, and long-term memory. Sensory memory briefly registers huge amounts of incoming information and filters most of it out. Only a small portion is passed to working memory for conscious processing.
Unfortunately, working memory is severely limited. It can typically hold around five to nine separate pieces of information at once, and only for a short while. Information processed in working memory is either discarded or stored in long-term memory, where it is organised into structures called schemas. Schemas help us spot patterns, make decisions quickly, and work efficiently in familiar situations.
Cognitive load is the amount of mental effort working memory is handling at any given moment. When that load exceeds capacity, performance degrades sharply.
Cognitive Load Theory distinguishes three types of load. Intrinsic load reflects the inherent complexity of the task itself. Extraneous load comes from distractions, interruptions, poor system design, and unnecessary complexity. Germane load is the effort invested in forming and strengthening schemas through learning.
For security, the key point is that cyber deception works by piling on extraneous load and hiding malicious requests inside everyday tasks. Phishing emails aren’t that hard to spot in isolation. The environments they arrive in make them hard to spot.

Why multitasking kills threat detection
Multitasking means constantly switching between tasks, and each switch carries a measurable cognitive cost. Working memory must unload one context, reorient, and then reconstruct the original context when switching back. This consumes cognitive resources even when the switch feels effortless.
The more we switch, the less working memory we have available. Attention becomes fragmented. Executive control weakens. Threat detection, which relies on all three, suffers as a result.
Under heavy load, the brain shifts from slow, analytical thinking to faster, heuristic-based processing. People default to cues of familiarity, authority, urgency, and emotion. This isn’t laziness; it’s resource economics. When demand is high, the brain prioritises speed over accuracy.
This shift maps closely onto standard cyberattack techniques. Urgent subject lines exploit time pressure. Authority impersonation exploits social hierarchies. Emotional manipulation crowds out rational deliberation. Routine-looking messages blend seamlessly into background noise.
Attackers aren’t trying to trick unintelligent people. They’re targeting overloaded brains that have already defaulted to shortcuts.
What cognitive overload does to judgement
The prefrontal cortex is central to decision-making, error detection, impulse control, planning, and working memory. It integrates sensory information with prior knowledge, regulates emotion, and supports deliberate reasoning.
Under information overload, prefrontal activity decreases. The brain relies increasingly on faster, more automatic processing, which means weaker impulse control, less critical thinking, and a reduced ability to spot anomalies.
The impact on how we work is substantial. People under pressure click first and think later. Security warnings get lost in the noise. Overload genuinely narrows what people can take in. This isn’t a metaphor; it’s a well-documented effect known as perceptual narrowing.
Studies of inattentional blindness show that when we are intensely focused or overloaded, we can miss highly salient things right in front of us. Under overload, people don’t simply ignore threat cues; the cues often never register at all.
How cybercriminals design for overload
Cybercriminals don’t strike at random; they deliberately target moments of peak cognitive load: end-of-day fatigue, high-volume email periods, payroll cycles, holiday seasons, crisis events, and organisational disruptions. All of these create ideal conditions for exploitation.
The rise of AI-generated phishing has made this worse, increasing the volume, speed, and realism of attacks and making malicious messages far harder to distinguish from legitimate ones.
The attacker’s job isn’t simply to be convincing. It’s to arrive when the victim’s cognitive resources are depleted. This is why even experienced professionals can fall victim to well-timed attacks.
The problem isn’t a lack of knowledge. It’s a lack of available capacity.
What this means for security strategy
If overload degrades threat detection, then awareness of the problem alone isn’t enough. Training can’t compensate for a full working memory. Warning banners fail when people are distracted, and adding yet more information often works in the attacker’s favour.
Security design should make things easier to use, not harder.
In practice, this means organisations need to:
- Reduce the number of security decisions people face, rather than adding more.
- Make the secure option visually salient, so the intended action is obvious at a glance.
- Shape behaviour through habits and environment design, rather than relying on people to remember what to do under pressure.
But if cognitive overload is unavoidable, the real question is not how to reduce it, but how to operate safely despite it.
So what can actually be done under cognitive load?
If cognitive overload is unavoidable, then security advice built on the assumption of calm, focused decision-making is flawed at its core.
In high-pressure environments, people do not fail because they do not know what to do. They fail because the conditions under which decisions are made overwhelm the cognitive processes needed to make good decisions. Therefore, countering cognitive load is less about providing additional guidance and more about changing how judgement is supported in the moment.
Shift from decision-heavy behaviour to recognition-based behaviour
When under pressure, the brain is much better at recognising patterns than analysing new information. It is unrealistic to expect employees to evaluate every email, message or request from first principles. Therefore, security behaviours should be designed around recognition cues rather than rule recall.
This involves reinforcing a few high-signal indicators that automatically prompt a pause. Unusual requests involving urgency, payment, credentials or authority should prompt an automatic “pattern break”, rather than a checklist. When recognition becomes automatic, cognitive effort is reduced at the moment it matters most.
Separate speed from risk using a deliberate pause mechanism
Cognitive load distorts our perception of time. Everything feels urgent, even when it is not. One effective countermeasure is to introduce a micro-pause before taking high-risk actions. This is not to slow work down, but to create a brief assessment window.
This can be framed internally as ‘seize and assess’. Seize the moment when something feels slightly off, then ask one question: does this request affect risk, money, access or identity? If the answer is yes, the default response should be escalation rather than further analysis.
This approach is effective because it reduces cognitive demand. One decision replaces many.
Externalise judgement instead of relying on memory
Working memory becomes unreliable under load. It is unrealistic to expect people to remember policies, processes or training content during periods of high pressure. High-resilience systems externalise judgement by embedding it in tools, workflows and social norms.
Clear escalation paths, visible ‘safe to check’ signals and a culture that reinforces the idea that it is OK to pause reduce the need for individual judgement under pressure. When asking for help is frictionless and encouraged, cognitive load is shared rather than being borne alone.
Design for error, not perfection
Cognitive load makes errors inevitable. Resilient security environments do not assume perfect attention. Instead, they design for early detection, containment, and recovery, accepting that mistakes will happen.
This shifts the goal from ‘never click’ to ‘fail safely’. Fast reporting, low-blame responses and rapid containment minimise the impact of unavoidable errors. This approach is far more aligned with how the brain behaves under stress.
Reinforce behaviour through repetition, not awareness
Under overload, behaviour defaults to habit. One-off awareness campaigns have little impact on outcomes because they rely on conscious recall. Repeated exposure to the same behavioural cues, scenarios and responses builds automaticity.
This is where reinforcement, simulation and experiential learning come in. The aim is not to impart more information, but to ensure that the correct response becomes the most intuitive one when attention is compromised.
Reframing the solution
Cognitive load theory does not suggest that people should try harder or focus more intensely. Rather, it tells us that performance depends on the conditions. If these conditions remain unchanged, simply being aware of them will not be enough.
The organisations that perform best under cyber pressure are not necessarily those that provide the most training, but rather those that design systems, behaviours and cultures that facilitate sound decision-making when cognitive capacity is stretched.
In an overloaded world, resilience is not about perfect attention. It is about designing environments that work with the brain we have, rather than the one we wish we had.
Attention is a limited resource. In a world of constant cognitive overload, security is not just about knowing what to do. It’s about designing environments, systems and behaviours that make the right action the easiest one, even when attention is stretched.
Taryn-Lee Potgieter – Head of Brand Growth | Psychology student at SACAP