Can AI really be empathetic in healthcare?

Startup Ellipsis Health claims its AI Care Manager, Sage, can provide more personalized healthcare to patients by using vocal biomarkers.
By admin
Jul 25, 2025, 1:40 PM

Mainul Mondal has made a bold claim. The CEO and founder of Ellipsis Health boasts that his company’s AI-powered voice assistant is “the world’s most emotionally intelligent,” and can tell when you’re sad, anxious, or struggling with your health, responding to you with genuine empathy. There’s at least $45 million riding on Mondal being right. 

The CEO’s assertion is not out of place in an industry full of AI hype and promises, but Mondal’s San Francisco-based startup just convinced investors it might be onto something. Ellipsis Health’s newly announced Series A funding round, joined by Salesforce, Khosla Ventures, and CVS Health Ventures, is intended to scale Sage, the company’s “AI care manager.”

As an AI voice assistant, Sage can conduct the kinds of caring, nuanced conversations that healthcare workers increasingly can’t find time for, including health risk assessments, post-discharge follow-ups, and the “Friday ‘tuck-in’ calls” that many patients appreciate.

Pulling ahead with the empathy engine

Ellipsis trained Sage on millions of real clinical conversations, teaching it to parse not just words but vocal cues that signal emotional distress, confusion, or pain. The company calls this its “Empathy Engine,” a system built on what it describes as patented vocal biomarker technology. 

“It’s critical to understand that care management is not just about collecting information and performing tasks for the patient, but also listening to them, being supportive as they move through their care journey and understanding what are the most important priorities to them,” Mondal explains in the press release.

The AI care manager analyzes speech patterns, pitch variations, and acoustic characteristics that research suggests correlate with mental and physical health conditions. Voice tremors might indicate neurological issues, for example, and changes in speech pace may signal depression or anxiety.
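To make the idea of a "vocal biomarker" concrete, here is a minimal, self-contained sketch of one such acoustic feature: estimating a speaker's fundamental frequency (pitch) via autocorrelation. This is a textbook toy method applied to a synthetic tone, not Ellipsis Health's patented technology; production systems extract dozens of features (jitter, shimmer, pause rate, spectral measures) from real speech.

```python
import numpy as np

def estimate_pitch(signal, sr, fmin=50, fmax=500):
    """Estimate fundamental frequency (Hz) with a simple
    autocorrelation peak search, restricted to a plausible
    human voice range [fmin, fmax]."""
    # Autocorrelation for non-negative lags only
    corr = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lag_min = int(sr / fmax)  # smallest lag = highest pitch
    lag_max = int(sr / fmin)  # largest lag = lowest pitch
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sr / best_lag

# One second of a 220 Hz sine wave standing in for a voiced sound
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220.0 * t)

pitch = estimate_pitch(tone, sr)
print(f"estimated pitch: {pitch:.1f} Hz")
```

Tracking how a feature like this varies across a conversation (rather than its absolute value) is the kind of signal voice-analysis companies claim correlates with emotional or neurological state.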

Sage isn’t the only game in town when it comes to the vocal biomarker market, which was valued at $580 million in 2023 and is projected to reach $2.13 billion by 2032. Companies in the emerging field are racing to prove that voice analysis can detect everything from Parkinson’s disease to COVID-19, but translating these claims from white papers and boardroom meetings into real clinical care is proving difficult.

“I’ve seen how hard it is to deliver quality care between visits, especially amid staffing shortages and clinician burnout,” said Hal Paz, a former physician and health system CEO who’s now a partner at Khosla Ventures. “Voice-based care management has become essential, but it’s nearly impossible to scale cost-effectively or without significant loss of quality. Sage stands apart by training its AI on real clinical conversations, enabling emotionally intelligent, context-aware support, even for the most complex patients.”

Early users of Sage report a 60% reduction in administrative tasks, six times faster program enrollment, and a four-fold return on investment. The platform integrates with Salesforce Health Cloud, allowing it to slot into existing clinical workflows.

Does it even really care?

The metrics are impressive, but they don’t answer the most important question that Sage and similar technologies force us to ask: can AI really be empathetic?

Ellipsis claims Sage can adjust its tone and approach based on a patient’s emotional state. If someone sounds distressed, it might slow down and use more reassuring language. For patients who seem confused, Sage can repeat information or ask clarifying questions.

To back its hype, Ellipsis points to partnerships with major health systems, including UnitedHealth Group, Aetna, DukeHealth, and Highmark, as validation that its AI technology is performing as predicted. In 2021, Cigna International launched what it called the world’s first voice-activated stress test, developed in partnership with Ellipsis.

As with so many other AI-powered healthcare tools, concerns persist about the limitations of algorithmically calculated qualities such as empathy. Can a system trained solely on past conversations truly understand the nuanced needs of individual patients? What happens when Sage inevitably encounters situations outside the scope of its training data?

Patient acceptance is a potential obstacle, as well. Physician use of AI tools nearly doubled from 2023 to 2024, reaching 66% of doctors, but patient comfort with AI caregivers remains unestablished. Though some people might be comfortable sharing sensitive health issues with an AI chatbot, others could find such communication off-putting.

Red tape and an existential crisis

The regulatory path forward for AI-powered voice analysis is still murky. Currently, the FDA has only issued draft guidance for AI-enabled medical devices, including voice-based tools. Privacy concerns around training AI systems on sensitive health conversations also remain unresolved, and, at least for now, the clinical evidence base supporting this technology is relatively thin.

Debate over what the future of healthcare should look like is also weighing down the adoption of AI tools such as Sage. Proponents believe that having AI handle routine tasks frees human caregivers to focus on complex cases that AI alone cannot address. Skeptics worry that rushing to integrate AI into healthcare practice could push us to prioritize efficiency over empathy.

Despite these challenges, Ellipsis has no intention of slowing down. As Mondal sees it, “This funding validates our approach and enables us to continue scaling Sage to ensure every patient receives high-quality, compassionate care—at the right time and in the right way.”
