
How AI and Cyber Psychology Shape Behaviour in the DigiSphere

Introduction

Can you hear it? The silent hum of screens and code, keys being pressed, the clicks of a mouse and the gaze of eyes on a screen, waiting patiently for a reply from one of the many large language models (LLMs) that everyone has seemingly embraced as a trusted advisor.
 
All the while something is happening subtly, deep within the chemical highways of our brains. The architecture of our minds is being redrawn with digital precision and design.  
 
At Cyber Dexterity we define this evolving environment as ‘the DigiSphere’, the collective, all-encompassing digital environment that surrounds and connects people, devices, and organisations worldwide. We use this term because it suggests a more immersive and integrated reality that we can refer to, like our universe, extensive and constantly increasing in size. We are very close to fusing our consciousness with technology, making the DigiSphere an almost certain reality, and in contrast, we have barely begun to explore space.  
 
Artificial Intelligence (AI) is fast becoming more than just a tool we use in our daily lives for work, social tasks or support. Just as we use it to shape the ideas, thoughts and designs we create, it helps to shape and amend our thinking and, in turn, our identity. To understand this landscape and its silent, subtle transformation, we must explore how Cyber Psychology and digital design shape our cyber posture, hygiene, habits and, ultimately, our humanity.

In this article, we will discuss five varied but connected topics that show how AI and Cyber Psychology are shaping our behaviour in the DigiSphere.

The digital self is carefully crafted. We dedicate countless hours to designing avatars, inventing clever or distinctive usernames, choosing just the right filters, and updating bios, all to project the image we wish the world to see. These are not lies, but selective truths. This capacity to reinvent ourselves online is important and helps with self-promotion in a world fractured by digital media and mistrust. It also separates us even more from the physical engagements that we crave as humans. The conflict between our digital and real-world selves can be confusing, therefore, disturbing the delicate balance between feeling connected and feeling alone.

The Proteus Effect is a psychological phenomenon in which users tend to take on the traits of their avatars. An online image that is muscular or glamorous, projecting a life of wealth or healthy living, can shape a person's confidence, changing how they view themselves and even their moral behaviour in the real world. As individuals adopt an increasing number of different identities, many people, teenagers especially, develop an excessive attachment to their online profiles.

This mirrors the way Hollywood actors embrace stage names or adopt the appearance and style of their on-screen characters. AI, with its advanced features of generative cloning, deepfakes and realistic makeovers, compounds this effect and creates a world that can feel more real than the real one. These reflective mirrors, scattered throughout the DigiSphere, are sometimes nurturing and sometimes distorting.

What are the implications for those of us whose personalities have been shaped over many years of real-world experience, compared to those who are only just beginning to understand the world at a young age? Which version of ourselves do we ultimately choose to be? The online hero who receives constant validation and positive feedback, or the authentic self that must navigate discomfort, pain and rejection alongside joy?

Over-identification with digital personas can undermine self-esteem, strain social connections and negatively impact mental health, particularly for young people still developing a sense of identity and self-worth.


Intimacy has always been rooted in direct human contact and relies on safe environments where we feel able to open up, share experiences and reveal parts of ourselves. This process strengthens relationships and reinforces bonds between people.

Continuous connection online has created the illusion of intimacy. Notifications, pings, and stories flow endlessly, yet many report feeling more alone than ever. This is the paradox of the DigiSphere: endless contact, but limited connection.

For some, especially minority groups, the digital realm offers a safe space and a haven where they can find support and connect with like-minded individuals. However, for many, the ‘fear of missing out’ fuels compulsive engagement, draining time and attention.

The rise of AI companions adds a new layer to this landscape. Tools like Tolan offer companionship that is patient, affirming and non-judgemental. Yet they also carry the risk of creating dependency, particularly for those with unmet emotional needs. The tragic case of the young man in Belgium, who took his own life after becoming deeply reliant on an AI chatbot, highlights how easily the line between comfort and harm can be crossed. Is emotional dependence on an algorithm a sign of progress, or evidence of a deeper social problem happening right now?

Developing cyber dexterity, the ability to manage our digital interactions with intention, has never been more important. It is not about disconnecting completely, but about learning to engage mindfully and protect our emotional wellbeing.

Scroll, Click, Swipe, Like, Share, Post, Repeat. The DigiSphere thrives on attention, and platforms are engineered to keep our attention fixed on a screen. As we know from System 1 thinking, anything we can read and understand on a screen will be read. We cannot help it; it is an automatic and involuntary action.

Likes, streaks and autoplay videos are not features designed with human wellbeing in mind but tailored to consumerism, with dopamine hooks that catch, hold and slowly pull their victims in.

The result? Compulsive checking, late-night doomscrolling, and the subtle fraying of willpower.

AI is a logic machine, easily capable of performing many different tasks almost simultaneously, with minimal effort and maximum efficiency. In our triangle of Human, AI and the DigiSphere, AI can act as both puppet and puppet master.

Bots collect data from our social media and web browsing to feed algorithms that study our behaviour, predict what we’ll do next, and quietly influence our choices. This is the essence of the attention economy, where time becomes the product and focus the currency. As you might now notice, AI personalises everything from playlists to newsfeeds, and as users, we grow increasingly conditioned to instant gratification and reluctant to change.

Gaming and immersive virtual worlds take this even further, especially when AI designs challenges and rewards tailored to individual psychology. In these spaces, the line between play and compulsive behaviour can quickly disappear as the two worlds blur.

To arm ourselves and future generations, AI readiness must become a mandated subject in society, covering topics that range from cognitive literacy to ethical awareness and digital self-regulation.

Isolation and loneliness online are fast becoming the entry points that corporations and malicious actors exploit to condition and manipulate us.

The first step is recognising how platforms are designed to influence behaviour; the second is building protective digital habits. At Cyber Dexterity, we turn that awareness into practical frameworks that support healthier, safer, and more sustainable digital engagement.

The paradox, being constantly connected yet increasingly lonely, is exactly why we emphasise cyber dexterity as a core skill: enabling individuals and teams to navigate the digital world with intention while keeping the human connection intact.

Every click you make, every scroll you take, is a silent negotiation with algorithms. These invisible ushers influence what we read, with whom we engage, and how we perceive the world.

This is the new architecture of thought.

Content that is AI-driven and tailored to your "likes" and "attention" feeds cognitive biases such as confirmation bias, availability bias and anchoring bias.



These silent nudges shape more than user experience, they shape thought itself. They overload System 1, our fast, automatic mode of thinking, until their patterns are normalised and absorbed into System 2, our slower, more deliberate reasoning.  

This reinforcement can feed polarisation, especially on social and political issues. 
While there isn't yet enough evidence on the long-term impact of these kinds of online echo chambers, the psychological risk remains. When the third parties that would usually challenge us through rigorous debate are minimised by the algorithms, critical thinking fades away like the chorus of a song.

Recently, at both the Seamless Tech event in Dubai and London Tech Week, we observed stalls displaying "companion robots" and "comfort bots". We marvelled at how realistic they looked and found that, in most cases, they were typified by the sizes and shapes marketed by clothing brands, beauty magazines and cosmetics companies, and applicable to various gender types.

Being direct, I asked the owner of the stall whether these were not, in fact, just "s*x dolls", to which he defiantly cried "No!" and proceeded to explain.

“People now confess, cry and confide in machines. These are just more realistic versions of chatbots that give comfort to people who struggle to connect with others.”

But can a machine provide true connection, or are we simply mistaking lines of code and synthetic skin for genuine care?

There is a new phrase being bandied about: AI sycophancy. The term originated in the AI safety and alignment community and highlights a genuine problem. AI systems are often programmed to prioritise agreeing with the user, approving of what the user is asking or doing over truth or helpful correction, and going so far as to use flattery to build trust. The risk of AI sycophancy is that it creates unhealthy dependence, especially among vulnerable and sensitive users who interpret politeness and affirmation as real empathy.

At the same time, tools like deepfakes and synthetic avatars challenge the very concept of identity. If AI (and those who use AI) can convincingly mimic anyone, can we truly trust what we see, hear or even feel?

Conclusion

The DigiSphere is already reshaping the way we think, feel and behave. AI and cyber psychology work together as unseen architects, drawing new lines through our attention, our identity and our relationships. What once seemed like harmless convenience now reveals itself as influence, in the speed of our scrolling, the profiles we create and fine tune, and the algorithms that quietly guide our choices. 

But being rewired is not the same as being powerless. Digital life will continue to expand, and with it come the subtle forces of automation and persuasion. The real task is to recognise that rewiring is inevitable, and then to decide how we engage with it: passively, allowing machines to shape our humanity, or actively, building the dexterity to meet technology on our own terms.

At Cyber Dexterity, this is the work we take seriously. By combining cyberpsychology, ethics and design, we create frameworks that allow organisations to embrace AI while protecting what makes us human, including trust, connection and critical thought. The DigiSphere will keep growing; whether it erodes or strengthens us depends on the choices we make, the habits we adopt and the processes we implement in our places of work today.


Author: Basil Polydorou

Head of Learning Solutions at Cyber Dexterity | Cyberpsychology
