Some years ago, I went to a late-night concert in Paris. The venue was just a large, rather shabby room, but it was full of young people (and some older, like me), intensely absorbed in the music of the sublime Lebanese singer Yasmine Hamdan. She sang only in Arabic but introduced her songs in that lovely mix of Arabic and French you often hear in Lebanese communities. There was real power in her work. People cried and sang along, loving every minute of it. And I did, too, even though I couldn't follow a word. It's not just that the music (and, yes, the singer) was beautiful: I was swept up by the energy and feeling in the room.
This is emotional contagion.
Emotions can and do move from person to person. It's not just those profoundly moving moments, either. This process is happening all the time, subtly, in our everyday interactions. And it doesn't stop at human interactions; it's also happening in the digital world. And that is what I want to look at here.
Emotional Contagion and Digital Interactions
Research shows that emotional contagion isn't limited to face-to-face interactions. We are wired for facial mimicry: we instinctively mirror expressions we see, triggering emotional and physiological responses. When we see a happy face, in a small way, we start feeling those warm, fuzzy feelings ourselves.
However, emotions spread even in the absence of visual cues. Think about how you feel doom-scrolling through negative news or heated social media discussions. We absorb those emotions, even without direct human contact.
The Proteus Effect: When Our Avatars Shape Us
Here's where things get even more interesting. The Proteus Effect, first described by researchers Nick Yee and Jeremy Bailenson, is a phenomenon in which our digital self-representations, like avatars, influence our actual behavior.
For example, research has shown that if you have a more confident-looking avatar - or even just a taller one - you might start acting more confidently in real life. People assigned more attractive avatars in virtual environments are more open in their interactions. Our digital personas blur the lines between digital and physical realities.
What's fascinating about the Proteus Effect is how it highlights the brain's difficulty separating these digital self-representations from our authentic selves. We might think our avatars reflect us, but actually, our brains seem to internalize the attributes of our avatars.
This effect is well-known in immersive gaming or virtual reality environments. But I wonder what happens when we interact with more mundane AI chatbots, copilots, or virtual assistants with distinct personalities. Could these interactions shape our behavior and emotions in the workplace?
As we adopt more agentic AI in our work, will it change our corporate culture through emotional contagion? If our digital avatars can affect our confidence and behavior, AI personalities could have a similar impact.
Could AI Personalities Influence Us?
Imagine working with an AI assistant consistently exhibiting calm, neutral, and efficient behavior. Over time, could this affect how you approach tasks or interact with colleagues? Might we start to adopt some of these characteristics?
It might sound like a good thing if we all become calm and unruffled. However, I worry that if extended interactions with AI subtly influence our behavior, we will drift toward being emotionally neutral and indifferent.
What's more, if AI systems can influence our individual behavior, they might also impact corporate culture. For instance, if employees regularly interact with an AI that promotes a specific tone of communication, this might become more ingrained in the organizational ethos.
I have seen this in a small way with overseas friends and colleagues who, with English as a second language, have used ChatGPT to write emails, marketing content, presentations and even their résumés. They start sounding like ChatGPT when they speak with you on the phone or in person! You can hear the stock phrases, the mannered sentence structure, and the slightly-too-correct-to-be-real grammar.
Is this bad? I don't find it egregious, but it raises questions about how we want AI to shape our workplaces. Do we need AI personalities to mirror our desired culture, or will they lead us instead in unintended directions?
The widening gap
In my last newsletter, I introduced research showing that AI assistants affect workers of varied skill levels differently: highly skilled workers become even more creative with AI assistance, while less skilled workers feel threatened.
But now think about how those workers might be affected by emotional or imaginative contagion as they work closely with AI personalities. The highly skilled workers, who are most attuned to AI, could easily be the most influenced. The less skilled workers remain wary and less engaged. They may, therefore, be less affected by contagion, and thus more likely to retain their human qualities at work.
I am unsure about this, but I am fascinated.
Acting it out
There's another field where emotional contagion is already a well-understood issue: acting.
Actors must often navigate the complexities of emotional contagion as they immerse themselves in their characters. When portraying intense emotions on stage or screen, the character's feelings may start to blend with their own. This can be both a technique and a challenge. On one hand, drawing on genuine emotions can enhance the authenticity of their performance, making it more compelling for the audience. On the other hand, it can be mentally and emotionally taxing for the actor, especially if the role involves negative or distressing emotions.
To manage this, many actors employ various techniques to maintain a healthy separation between themselves and their characters. Methods like emotional recall help actors tap into past experiences to generate genuine emotions for their roles. However, they also use mindfulness, grounding exercises, and aftercare rituals to de-role or return to their own emotional state after a performance. This helps prevent the emotional states of their characters from lingering and affecting their personal lives. By developing a strong sense of self-awareness and emotional regulation, actors can harness the power of emotional contagion without being overwhelmed by it.
De-programming ourselves from AI
I wonder if those of us who become most immersed in AI - turning regularly, easily and comfortably to copilots, chatbots and intelligent assistants - will need to de-role ourselves. Will we have to consciously and deliberately detach our thinking and emotions from the AI mind-meld?
And, in the greatest irony, those less skilled in using AI may retain the most significant human advantage: the ability to be simply and unaffectedly human.