
Generative AI will transform digital health literacy

Healthcare providers must adapt to this new digital health literacy through brand building, education, and responsible AI use.
By admin
Jul 8, 2024, 9:57 AM

Could Dr. Google be reaching retirement age? Take a look at the implications of generative AI for digital health literacy, and you’ll understand why.

For historical context, the 2020 Digital Health Consumer Adoption Report found that 89% of U.S. adults use the internet to look up health-related information. According to the Pew Research Center, around 80% of internet users have searched for health information online, a figure that includes searches for specific diseases, treatments, and general health tips.

None of this is shocking to anyone who has ever typed “rash on my face” into Google. But if you have already migrated from basic search to generative AI prompts, you understand that digital health literacy as we knew it is being upended for patients and their caretakers.

Patients have faced risks when interacting with search tools since the advent of the browser. However, generative AI has magnified those risks and, in fairness, many of the benefits as they relate to digital health literacy.

These risks can affect health outcomes, trust in healthcare systems, and overall well-being. They also further complicate the already strained patient experience with physicians, both direct and indirect.

Here are some of the key digital health literacy factors related to rapidly evolving generative models:

  • Cognitive overload: The sheer, instantaneous volume of health information output by generative AI platforms dwarfs what the average Google search delivers. One might rightfully argue that some Google searches return thousands of hits, but with the ability to modify prompts to expand or constrict generative results, the amount of data to consume and digest explodes. Doctors will inherit this overload as they feel obliged to clarify the increasingly detailed results their patients bring with them … or text to them!
  • Incorrect diagnosis or misinterpretation of information: Patients might misunderstand AI-generated health information, leading to poor health decisions due to lack of clarity or context. Generative AI has also been known to provide incorrect or incomplete diagnoses, leading patients to take inappropriate actions or delay seeking necessary medical care.
  • Inappropriate treatments: AI-generated recommendations for treatments might not be suitable for all patients, especially those with complex medical histories or conditions not well-represented in the training data.
  • Delay in professional medical advice: Relying on AI for health advice might cause patients to delay seeking professional medical care, potentially worsening their condition.
  • Psychological impact and dependency on technology: Patients may become overly reliant on AI tools, potentially neglecting aspects of healthcare management that require human intervention and judgment. Incorrect or alarming information provided by AI could cause undue stress, anxiety, or panic, and the effect will be even more profound when the original symptoms relate to possible mental health issues. The prevalence of what is now called algorithmic cyberchondria should not be underestimated. Even worse, patients might place too much trust in AI systems, neglecting the need for human oversight and professional medical consultation. This can also cause emotional battles between patients and family members, as witnessed during the COVID-19 treatment debates.
  • Overconfidence in AI and inequitable access: Patients from underserved or less digitally literate populations might not have equal access to AI health tools, exacerbating health disparities. In these remote and disadvantaged communities, health advice may be available only online, with clarifications happening in traditional inner-city meeting places like churches, barber shops, and hair salons.

What are healthcare providers and payers to do?

Build your brand – The most successful healthcare brands will be those that reliably establish themselves as the most trusted sources of digital health literacy. This is no small task, considering the noise in the health “illiteracy” arena. However, leveraging multichannel messaging and examples of improved outcomes for major ailments will help establish thought leadership.

Medical education – Most doctors will openly admit the shortcomings of their medical education and the need to supplement it in their practices. Beyond burnout prevention, physicians agree that there is a difference between possessing medical knowledge and applying it as a key factor in digital health literacy. By socializing their expertise, they become a personally trusted source of knowledge that can combat the natural enemy: inaccurate or politicized medical advice.

Generative AI education – No one will be able to keep patients and clinicians from using generative AI for medical advice. However, regardless of industry, everyone needs to understand how best to leverage AI to get the most accurate results in lieu of direct communication with a professional or clinical colleague. In just months, the field of prompt engineering has exploded. This is the practice of asking the right generative questions, with more precise follow-ups, to get the most accurate and detailed results possible, as the sketch below illustrates. It should be required for your generative AI “driving license”!
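To make that concrete, here is a minimal sketch of prompt refinement in practice. It assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY environment variable; the model name, prompts, and follow-up question are illustrative assumptions, not medical guidance.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A vague, search-style prompt vs. a structured prompt that adds
# context and constrains the output: the heart of prompt engineering.
vague_prompt = "rash on my face"
refined_prompt = (
    "I have a red, itchy rash on both cheeks that appeared two days ago. "
    "I have no known allergies and take no medications. "
    "List the most common non-emergency causes, the warning signs that "
    "would warrant seeing a doctor, and note any uncertainty in your answer."
)

first = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative choice; any chat-capable model works
    messages=[{"role": "user", "content": refined_prompt}],
)
print(first.choices[0].message.content)

# A precise follow-up narrows the earlier answer; passing the conversation
# history back in is what lets each question build on the last.
follow_up = "Which of those causes are most likely if the rash is not painful?"
second = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": refined_prompt},
        {"role": "assistant", "content": first.choices[0].message.content},
        {"role": "user", "content": follow_up},
    ],
)
print(second.choices[0].message.content)
```

The same refine-and-follow-up loop works in any chat interface; the code simply makes the structure of a well-engineered exchange explicit.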


Mitigating these risks involves ensuring that AI systems are rigorously tested, transparently developed, and continuously monitored for accuracy and bias. Educating patients about the appropriate use of AI in health and promoting the importance of professional medical advice are also crucial steps.

