AI Doesn’t Need to Feel Your Pain. It Needs to Solve Your Problem

AI can provide accurate, personalized and speedy resolutions – and customers would rather have that than an AI agent that talks about their feelings.

Blair Pleasant

February 5, 2025

5 Min Read

Can AI be empathetic? More importantly, should it be? Empathy is generally seen as a crucial capability for customer service interactions, as it lets brands show that they understand customers’ needs and feelings, leading to greater trust and brand loyalty. Empathy helps nurture customer relationships, making customers feel respected and valued while de-escalating stressful situations.

Human agents are expected to be empathetic when interacting with customers, and there are long lists of “empathetic statements” that agents can use to help customers feel heard and understood. But in the new world of AI, should AI agents/virtual agents try to display empathy, and if so, how can they do this without coming across as fake and disingenuous?

I posed this question on LinkedIn recently, which led to an interesting and engaging discussion.

The consensus from the LinkedIn discussion was that while AI can’t actually feel empathy, a human emotion, it can be trained to display or portray empathy. AI can be trained to recognize how customers are feeling and respond in a way that appears thoughtful and caring.

However, just like with human agents, empathy needs to feel genuine rather than fake and scripted, or it defeats the purpose. As consumers, most of us roll our eyes when a customer service agent says, “I’m sorry you’re having a problem,” as we know it’s scripted and insincere. It’s even worse when an AI agent displays fake empathy.

Many commenters in the LinkedIn discussion noted that AI can provide a more helpful and personalized experience that makes the customer feel understood and comfortable, providing impactful responses to emotional or sensitive inquiries. Tod Famous, CPO at customer service full-stack provider Crescendo, pointed out that generative AI can conduct chat/messaging conversations in a way that appears more empathetic than human agents, creating a customer experience that’s defined by speed, personalization, and reliable problem-solving.

Brian Dawson, chief strategy officer at conversational AI platform NLX, noted that “AI can excel at operationalizing empathy by recognizing context, tone, and intent to respond in ways that feel genuinely helpful and human. For example, instead of generic, ‘I’m sorry’ statements, a well-designed AI can acknowledge the situation more specifically, such as ‘I see you’ve been waiting for a refund. Let’s resolve that right now.’ This shifts the focus from performative sympathy to actionable assistance, which is what customers truly value.”

Is Empathy the Wrong Focus?

We’re putting too much pressure on the concept of empathy. Instead, those of us using gen AI in customer experience should focus on delivering accurate, personalized, and useful responses and resolutions in a friendly tone that makes the customer feel comfortable.

Certainly, there are situations requiring empathy, such as medical, health, or financial issues. In those cases, a human touch is generally preferred. The AI should be able to detect when to hand over an interaction to a live agent who can solve the customer’s issue effectively and empathetically.

While some situations call for empathetic responses, other interactions are simply transactional or informational and require fast and accurate responses without emotion. Ideally, the agent – AI or human – would be able to judge whether a customer is open to an empathetic response, or simply wants a quick and accurate solution. I expect that the next generation of AI will be trained well enough to identify whether the customer prefers speed and accuracy or a more empathetic response, and will adjust the interaction based on user preference – but we’re not there yet.

Empathy Without Empowerment Can Be Disastrous

During a recent BCStrategies podcast on AI and empathy, we discussed the difference between empathy and empowerment. Whether AI or a human agent, providing empathy without the ability to solve a customer’s problem can make the situation even worse. While empathy is important, providing agents (AI and humans) with the tools, authority, and ability to resolve a situation is what really matters. Empathy plus empowerment is the best solution.

The Takeaways: Empathy Is Good; Sincerity and Solutions Are Better

We can draw several conclusions from the responses to my LinkedIn post and the BCStrategies podcast discussion:

  • Both human agents and AI agents can be trained to express empathy and respond to customers in a way that feels thoughtful and appropriate.

  • AI can display empathy in a way that makes customers feel heard and understood – in some cases even better than human agents, who may get distracted, feel stressed, or be having a bad day.

  • Most people are frustrated by insincere or non-genuine empathy, whether it comes from AI or human agents.

  • The goal is to make customers feel heard and understood while providing fast and accurate information and issue resolution.

  • It’s all about training – properly trained AI can be more effective than human agents, but poorly trained AI can create more customer frustration and dissatisfaction.

For now, while some vendors’ AI solutions can indeed engage empathetically, interactions that truly require empathy and a sympathetic ear may best be handled by human agents. As AI technologies improve, it will become more difficult to differentiate between humans and AI agents, and in many cases, AI agents will perform even better than human agents. Empathetic AI may seem like an oxymoron, but not for much longer.

This post is written on behalf of BCStrategies, an industry resource for enterprises, vendors, system integrators, and anyone interested in the growing business communications arena. A supplier of objective information on business communications, BCStrategies is supported by an alliance of leading communication industry advisors, analysts, and consultants who have worked in the various segments of the dynamic business communications market. 

About the Author

Blair Pleasant

Blair Pleasant is President & Principal Analyst of COMMfusion LLC and a co-founder of UCStrategies. She provides consulting and market analysis on business communication markets, applications, and technologies including Unified Communications and Collaboration, contact center, and social media, aimed at helping end-user and vendor clients both strategically and tactically. Prior to COMMfusion, Blair was Director of Communications Analysis for The PELORUS Group, a market research and consulting firm, and President of Lower Falls Consulting.

With over 20 years’ experience, Blair provides insights for companies of all sizes. She has authored many highly acclaimed multi-client market studies and white papers, as well as custom research reports, and provides market research analysis and consulting services to both end-user and vendor clients.

Blair received a BA in Communications from Albany State University, and an MBA in marketing and an MS in Broadcast Administration from Boston University.
