By Stuart Kerr, Technology Correspondent
🗓️ Published: 12 July 2025 | 🔄 Last updated: 12 July 2025
📩 Contact: liveaiwire@gmail.com | 📣 Follow @LiveAIWire
🔗 Author Bio: https://www.liveaiwire.com/p/to-liveaiwire-where-artificial.html
Can Machines Really Feel?
In 2025, the question isn’t whether AI can understand our words. It’s whether it can understand our feelings. Emotional intelligence, once considered the exclusive domain of humans, is now being simulated by machines, and in some assessments, surpassed. From virtual therapists to emotionally aware customer service bots, AI is learning to read between the lines.
A recent study from the University of Geneva found that LLMs trained on emotional cues are outperforming humans in some EQ assessments. These machines recognise tone, infer intent, and adapt their responses with unnerving accuracy. What was once a novelty is fast becoming a default.
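To ground that claim, here is a minimal sketch of the text-only piece of the puzzle: scoring the emotional tone of a message with the open-source Hugging Face transformers library. It illustrates the general technique, not the models or benchmarks the Geneva team actually evaluated.

```python
# Minimal tone-recognition sketch using the Hugging Face transformers library.
# The default pipeline model is a stand-in; this is not the Geneva study's setup.
from transformers import pipeline

# Loads a small pretrained sentiment model on first use.
classifier = pipeline("sentiment-analysis")

messages = [
    "Thanks, that actually helped a lot.",
    "Great. Another hour on hold. Fantastic.",
]

for message in messages:
    result = classifier(message)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    print(f"{message!r} -> {result['label']} ({result['score']:.2f})")

# Note: the second message is sarcastic, exactly the kind of input where a
# plain sentiment model can misfire, a risk this article returns to below.
```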
The New EQ Arms Race
Big Tech isn’t watching from the sidelines. Amazon, Meta, and Salesforce are investing heavily in emotionally responsive AI. These systems don’t just process speech—they monitor pauses, sentiment, and biometric feedback to shape interactions.
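Pauses are among the simpler of those signals to illustrate. The sketch below flags long gaps between messages in a chat transcript; the timestamps, the 20-second threshold, and the assumption that a pause carries emotional meaning are all invented for illustration, not drawn from any vendor’s implementation.

```python
# Hypothetical sketch: flag unusually long pauses in a chat transcript as a
# weak emotional signal. The threshold is invented for illustration; real
# systems combine many such signals, tuned on data.
from datetime import datetime

transcript = [
    ("2025-07-12T10:00:00", "How do I cancel my order?"),
    ("2025-07-12T10:00:41", "...never mind. forget it."),
]

PAUSE_THRESHOLD_SECONDS = 20

times = [datetime.fromisoformat(ts) for ts, _ in transcript]
for earlier, later, (_, text) in zip(times, times[1:], transcript[1:]):
    gap = (later - earlier).total_seconds()
    if gap > PAUSE_THRESHOLD_SECONDS:
        print(f"Long pause ({gap:.0f}s) before: {text!r}")
```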
As Forbes recently warned, however, outsourcing empathy may come at a cost. The more we lean on digital assistants to manage emotional labour, the more we risk letting our own capacity for it atrophy.
This tension is captured in the Financial Times’ call to action: "Humans must remain at the heart of the AI story". AI should amplify our humanity, not replace it.
Ghosts of the Algorithm
But what happens when emotionally intelligent AI gets it wrong? As explored in our article AI in Mental Health, even subtle misinterpretations can derail trust. A chatbot offering condolences for a breakup might be welcome, unless it misreads sarcasm or, worse, dispenses advice on situations it was never trained to handle.
This is where AI guardrails become critical. Emotional AI must be trained not just on text, but on context. Bias in interpreting emotion is as dangerous as bias in data.
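What a guardrail looks like in practice varies, but one common pattern is to screen input for high-risk content before an empathetic model is allowed to respond freely. The sketch below is a deliberately crude version of that pattern: the phrase list, threshold logic, and hand-off message are hypothetical placeholders, while production systems rely on trained classifiers and formal escalation paths.

```python
# Hypothetical guardrail sketch: route high-risk emotional content to a human
# instead of letting the model improvise. The phrase list and hand-off reply
# are illustrative placeholders, not a production safety system.
HIGH_RISK_PHRASES = ("want to hurt myself", "no reason to live", "end it all")

def guarded_reply(user_message: str, generate_reply) -> str:
    lowered = user_message.lower()
    if any(phrase in lowered for phrase in HIGH_RISK_PHRASES):
        # A real system would use a trained classifier, not string matching,
        # and would hand off to a crisis resource or human agent.
        return "I'm not equipped to help with this. Connecting you to a person now."
    return generate_reply(user_message)

# Usage with any reply generator, here a stubbed chatbot:
print(guarded_reply("I feel like there's no reason to live.", lambda m: "..."))
```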
And then there's the question of intent. As companies embed EQ simulations into marketing and UX flows, emotional AI becomes a tool of persuasion. It mirrors your mood. It nudges your behaviour. At what point does empathy become manipulation?
Love, Lies, and Algorithms
Romantic and social AI agents are also pushing boundaries. In The AI Love Algorithm, we examined how LLMs are being deployed to mediate relationships, offer dating advice, and even simulate companionship.
Some find comfort in the idea. Others see danger. If a chatbot can pass the Turing test for empathy, are we still choosing real connection, or opting for synthetic safety?
The Human Response
Critics argue that EQ can't be coded. It grows through lived experience, cultural nuance, and vulnerability. While AI may imitate the outward signs of emotional intelligence, it lacks the internal compass that guides human empathy.
Still, as Harvard Business Review points out, the rise of EQ-AI may ultimately elevate the importance of human EQ. In a world of increasingly responsive machines, authenticity becomes a differentiator.
Marc Benioff, CEO of Salesforce, summed it up: "If AI can show empathy, humans must double down on it."
Towards Emotional Infrastructure
The future may hold AI systems with embedded ethics, context-awareness, and emotional calibration. These will be more than tools. They’ll be emotional infrastructure—gateways between people and services, empathy and efficiency.
But for now, the challenge is balance. Let AI lighten emotional labour. Let it assist with care. But let it never replace what makes us, us.
About the Author
Stuart Kerr is the Technology Correspondent at LiveAIWire. He writes about AI’s impact on infrastructure, governance, creativity, and power.
📩 Contact: liveaiwire@gmail.com | 📣 @LiveAIWire