Women are building their own health AI tools

Rescripted’s Clara leads a wave of women-built AI models designed to provide trustworthy, inclusive health information.
Sep 16, 2025, 9:01 AM

On September 9, 2025, Rescripted launched Clara, the first large language model (LLM) built exclusively on science-backed women’s health content. Clara is a free, consumer-facing tool for women to conversationally ask health questions and receive trustworthy answers on everything from periods to menopause to postpartum health. It is a response to the misinformation, censorship, and bias that often plague women’s health discussions online. The launch of Clara is a clear signal that women are now building their own health AI tools to provide more accurate, accessible, and inclusive resources. 

Why women’s health needs its own AI tools

So, why would a company like Rescripted dedicate significant resources to building an AI tool focused exclusively on women’s health? The answer lies in the documented biases of existing generative AI models, which have historically been trained on datasets that may not accurately reflect women’s health needs, or, even worse, outright marginalize or misrepresent them.

Generative AI tools like ChatGPT, Perplexity, and Claude are based on massive datasets that scrape content from a variety of sources across the internet. Their responses can be shaped by the biases inherent in the data they were trained on, and when it comes to women’s health, this can have serious consequences. A 2023 study found that ChatGPT “is heavily gender-biased and these biases have performative effects, amplifying inequalities and putting women, men and gender-diverse people at a further disadvantage in society.”

A 2024 article in The Lancet Digital Health found that GPT-4 perpetuated bias across multiple clinical simulations: “differential diagnoses created by GPT-4 for standardized clinical vignettes were more likely to include diagnoses that stereotype certain races, ethnicities, and genders.” The study also found a significant association between demographic attributes and GPT-4’s recommendations for more expensive procedures, indicating socioeconomic bias as well.

Furthermore, when women try to find reliable health information online, they often encounter censorship and misinformation. Social platforms are known for censoring or blocking women’s health-related content, particularly around topics like menstrual health, menopause, and reproductive care. Generative AI models can inadvertently perpetuate these issues when scraping the internet if not specifically designed to prioritize women’s health needs. Clara addresses this gap by offering a resource that is built from a curated, science-backed library of women’s health content. 

Other efforts to improve health tech for women

Rescripted is not alone in recognizing the need for women-centric health AI tools. Several other companies and organizations have made strides to build tools that serve the unique health needs of women. 

  • Diem is building a social search engine focused on closing the gender information gap with privacy as one of its core values.  
  • FemTech Insider created the Femtech bot, an AI-powered femtech research tool that scours all of the articles and research conducted by the female-led media group.  
  • Her Health AI uses AI built on clinical, lab, and imaging data to reduce time to diagnosis of endometriosis.  
  • Ema is a multi-purpose women’s health AI model trained on 10 million conversations with women, reviewed by clinicians, and grounded in nursing and biosocial models. It can be used for patient engagement, scheduling, and clinical navigation.  

Governance and privacy: Navigating the ethical considerations

As with any AI tool, the development and deployment of generative AI models come with serious concerns around governance, transparency, and privacy. At the core of AI tools is the data fed into them, and the more data they process, the more their models evolve. The process of data gathering and model creation must be transparent and accountable. Rescripted’s Clara does this well, with clear links back to the sources of the content.  

When asked about the future of Clara’s content and models, CEO Abby Mercado shared, “Right now, Clara’s answers come from our own content and that of our seven launch partners: Brightside Health, Midi Health, Needed, Gaia, Teal Health, Proov, and MyReceptiva, but we’re actively looking to expand. The more partners we bring on, the bigger Clara gets – and the more women’s lives we can touch with reliable, science-backed information.”

Healthcare lacks clear, standardized guidelines for the ethical creation of health AI models. Independent groups like the Coalition for Health AI (CHAI) are building a robust, transparent certification process to evaluate health AI systems against responsible AI principles. Across the board, transparency could be improved by publishing the datasets used to train these models, allowing external audits and oversight to ensure that a model is not unintentionally amplifying misinformation or bias. Additionally, all models should incorporate an easy-to-use mechanism for reporting misinformation and hallucinations, so that any errors or problematic outputs can be flagged and corrected.

Alongside responsible governance, health AI companies must implement stringent privacy measures, such as data anonymization and opt-in consent. Users should have full control over the data they share and how it’s used. Transparency about data storage, usage, and retention policies should also be a core part of any health tech platform’s governance. 

Moving forward: A more inclusive and transparent future

The launch of Clara marks a pivotal moment in the evolution of women’s health technology, one that prioritizes accuracy, inclusivity, and trust at a time when generative AI’s blind spots are becoming increasingly clear. By grounding its responses in science-backed content and building a model that centers women’s unique health needs, Rescripted is helping to set a new standard for what responsible, women-focused health AI can look like. As more innovators, clinicians, and organizations step into this space, the future of health AI will depend on not only addressing bias and misinformation but also upholding transparency, privacy, and ethical governance.


Katie D. McMillan, MPH is the CEO of Well Made Health, LLC, a business strategy consulting firm for health technology companies. She is also a curious researcher and writer focusing on digital health evidence, healthcare innovation, and women’s health. Katie can be reached at [email protected] or LinkedIn.  

