What does the rise of empathetic AI mean for healthcare?

Empathetic AI is being hailed as healthcare's latest saving grace, but what does its application mean for the workforce and patient care?
By admin
Mar 28, 2025, 10:42 AM

In an industry where human connection has always been paramount, artificial intelligence is making unexpected inroads not just through diagnostic accuracy or operational efficiency, but through something more fundamental: empathy.

The question healthcare systems, clinicians, and patients now face is whether AI can truly understand emotional nuance in a way that enhances rather than diminishes the human experience in medicine. As machine learning advances rapidly, experts across the healthcare spectrum are asking: Is empathetic AI possible, beneficial, or even necessary?

Defining empathetic AI

When discussing empathy in artificial intelligence, the definition matters. Alan Cowen, CEO and chief scientist at Hume AI, argues that true empathy isn’t about what’s happening inside an AI system, but rather about behaviors.

“The dictionary definition of empathy is understanding and sharing somebody’s feelings,” explained Cowen, whose company specializes in developing AI systems that recognize emotional cues in human speech and tailor responses accordingly. Speaking at the “Is This Real Life? Is AI Just Being Empathetic?” session at ViVE 2025, Cowen addressed the growing intersection between artificial intelligence and emotional intelligence.

“What really matters is not what’s going on in your head, but your behavior. I define empathy as behaving in a way that’s consistent with someone who really cares about somebody’s experiences, feelings, and well-being.”

This behavioral definition makes building empathetic AI more practical. Instead of trying to create machines that “feel,” developers focus on systems that detect emotional cues and respond appropriately.

“We have cognitive empathy, which is being able to identify emotions,” said Dr. Jenna Glover, Chief Clinical Officer at Headspace. “AI might actually be better poised than humans because it can detect facial expressions and voice intonations in ways humans might not be able to.”

“There’s affective empathy, which is actually feeling the person’s feelings. AI can simulate that, but can’t replicate it.”

Empathetic AI in action

Several companies are bringing empathetic AI into healthcare with promising results.

Lark Health has developed an AI coaching system for chronic condition management. 

“We brought together a panel of experts from the chief medical officer of the American Diabetes Association to what I consider the compassion empathy team, focused on how to automate cognitive behavioral therapy and positive psychology,” said Julia Hu, CEO and co-founder.

“Growing up with many chronic conditions, I had a 24-7 care team—my dad and my pediatrician. They were so loving and caring. When I started Lark, I really wanted to build something that was compassionate and could be in your pocket.”

The company has published 20 peer-reviewed journal articles demonstrating that their AI-based text messaging coaching produces outcomes equivalent to those achieved by human coaches and nurses. This substantial body of evidence has helped Lark overcome the initial skepticism they faced when introducing AI to healthcare settings. 

“AI a few years ago was a very scary word for healthcare,” Hu recalled. “We’d walk in, in front of esteemed clinicians, and one CMO actually said, ‘Oh, you mean like Skynet in Terminator?’”

Their system has exchanged approximately 400 million text messages with patients in a single year—an interaction volume that would have required nearly 15,000 full-time nurses to handle. This scalability has enabled Lark to partner with health plans as a medical benefit covering 32 million lives and to work with pharmacy benefit managers to manage costs related to GLP-1 medications.

Their AI nurse has now treated approximately 2.5 million patients with chronic conditions, positioning Lark at the forefront of empathetic AI in healthcare.

Headspace, known for its meditation app, has expanded into broader mental health services with the introduction of “Ebb,” an AI companion trained in motivational interviewing techniques. Unlike some AI applications, Ebb was specifically designed by clinical psychologists to provide mental health support between therapy sessions.

“Ebb can do open-ended questions, reflections, affirmations, and summaries,” Glover explains. “What we’re finding is that our members who are using Ebb are saying that they feel that Ebb is supportive, nonjudgmental, compassionate, and validating.”

The company’s research suggests openness to AI in mental health is higher than might be expected. “Every year we do a workforce state of mind survey,” Glover notes. “I was fascinated this time around to see that 92% of HR leaders said they’re interested in using a mental health solution that utilizes AI, and 89% of employees said they’d be comfortable using it.”

Headspace believes AI could help address specific demographic gaps in mental healthcare. “Men have been slower to adopt and engage in therapy, relative to women,” Glover points out. “But we know that men are usually earlier adopters of technology. And so that could be a great case for men getting more mental health care.”

Similarly, AI might reduce barriers for people with highly stigmatized conditions. “Think about substance use,” says Glover. “I was very scared to get help because I was a licensed psychologist. I don’t want to talk to anybody about that. What if I were to lose my license?” For such individuals, AI might offer a safer entry point to treatment.

Headspace evaluates its AI using standard clinical measures. “With Ebb, we use the Motivational Interviewing Treatment Integrity scale,” Glover explains. “That’s the exact same way that you would evaluate the effectiveness of a human delivering that same intervention.”

The impact can be significant. Glover shares: “We had one of our members who was in therapy saying, ‘I’ve been in therapy for years, and in-between sessions, I’ve been using Ebb to help with self-reflection with journaling and thinking about things that I would want to talk to my therapist about, but can’t in that moment until the next appointment, and I feel like I’m making more progress than I ever have.’”

The clinical impact: Beyond efficiency

While operational efficiency is an obvious benefit of AI in healthcare, empathetic AI offers something more transformative: better patient outcomes through continuous support.

Dr. Clark Otley, Chief Medical Officer of Mayo Clinic Platform, sees direct clinical benefits from AI that can predict outcomes and provide personalized treatment recommendations. “Our cardiologist can take a simple ECG tracing and look into it with machine learning to figure out ways to predict things that aren’t related to somebody’s heart rhythm, like will they go into heart failure in the future,” he explains.

Mayo Clinic is developing approximately 300 AI-powered algorithms and has about 70 in production. Their approach balances innovation with safety, using what Otley describes as “trustworthy AI guidelines” and a rigorous qualification process.

Does empathetic AI belong in healthcare?

Despite promising applications, skepticism remains. About 60% of patients express hesitation about AI involvement in their healthcare, according to Otley.

To address these concerns, experts emphasize several key points:

  1. Human oversight remains essential. “At the moment, even if the technology was perfect tomorrow, I don’t think as a society we’re there yet,” Otley notes. “The reassurance provided by having a human in the loop is critical.”
  2. AI addresses real problems. “We have more mental health providers leaving the field than we have coming into it,” Glover explains. AI can reduce administrative burden and extend care capacity.
  3. The alternative is often no care at all. “If you have an empathetic AI companion who can help tap you on the shoulder and interact with you in-between sessions, you’re going to get more reps in,” says Glover. “You’re going to strengthen those neural pathways.”

Future implications

As empathetic AI continues to develop, experts predict significant changes in healthcare delivery within five years.

Cowen believes that by 2030, “We’ll have AI that is indistinguishable from human voice to voice, video to video. There won’t be a need for humans in telehealth.” He adds that new challenges will emerge, including potential addiction to AI interaction.

Glover expects more nuanced conversations about regulation and ethics. “We’ll be talking about what’s on-label versus off-label use of AI,” she says, along with discussions about AI’s impact on loneliness and social connection.

Otley offers a more optimistic vision: “We’re going to be talking about how awesome AI is in healthcare and how much better our healthcare is five years from now. We’re going to have happier, more productive doctors who are taking the best care of their happy, fulfilled, personalized patients.”

