4 steps for integrating AI into physician practices

The AMA has provided a four-step framework for bringing artificial intelligence into physician practices.
By admin
Mar 21, 2024, 1:49 PM

Artificial intelligence seems to be everywhere – except in the hands of front-line clinicians and staff in physician practices. At the moment, the level of excitement among technologists and executives isn’t matching the rate of adoption in the clinic, creating the risk of real-world patients getting left behind. 

Only a fraction of physicians are using AI for one or more clinical or administrative use cases, according to a 2023 survey by the American Medical Association (AMA). In the poll, only about 10% of respondents said they were leveraging AI for any given use case, and more than 60% of respondents across specialties said they aren't using any AI-powered tools at all.

However, the poll stressed that physicians are eager for those numbers to change and would like to be actively involved in the process of bringing AI into their daily workflows to streamline administrative tasks and assist with clinical decisions.

To support the infusion of AI – dubbed “augmented intelligence” by the AMA to emphasize its assistive role in decision-making – the AMA has provided a four-step framework to guide the adoption of advanced tools in the physician office environment. 

Step 1: Identify the challenge and use cases

AI isn’t a panacea for everything that ails a practice. Instead, it must be applied strategically to situations where it can bring the most benefit. Carefully identifying those use cases is the first step for success, the AMA says. 

Practice leaders should start by bringing together key stakeholders in the organization, including technology teams, clinical champions, and administrative leaders to define the challenges they are facing and consider how AI might assist.  

These teams should consider what problem has to be solved, such as excessive time devoted to prior authorizations or the need to engage patients between visits, and define what success would look like using a combination of process and outcome measures to gauge ROI. 

The AMA notes that leaders will also need to answer tricky questions of risk and liability when considering AI implementation. Consulting legal and ethical authorities on AI might be beneficial during this process. 

Step 2: Evaluate AI tools

Next, leaders will need to survey the market to discover whether AI-driven tools exist to assist with the problem and develop a comprehensive strategy for assessing the quality, value, reliability, and workflow impact of these tools. 

Members of the implementation team should work with product developers to understand what data was used to train the model in question, what evidence is available to back up the reliability and results of the tool, and if the tool has been externally validated through FDA channels, peer reviewed research, or other means. 

Potential adopters will also need to learn more about how the tool integrates into existing infrastructure and workflows, as well as what up-front and downstream costs may be associated with forming a partnership. 

Leaders should actively integrate the perspectives of clinical and administrative end-users during this part of the process, especially since close to 90% of physicians in the 2023 poll stated they would like to take part in evaluating and implementing AI tools. 

Step 3: Implement AI tools

The implementation phase requires close attention to organizational change management, including end-user education and training.  

A clear and comprehensive training plan will be required to ensure users are confident interacting with the tool and fully understand how their workflows will change to leverage new capabilities to the fullest. 

Training should include discussions of the ethical use of AI and how to avoid unintentional bias that may have an impact on health equity or the accuracy of model outputs.  

In addition, users should be given direct, non-punitive methods of reporting errors or concerns and access to ongoing tech support and coaching to ensure compliance with new procedures over time. 

Step 4: Monitor and manage AI tools

Since many AI models learn and adjust over time, it is essential to have a strong ongoing management plan in place to course-correct if necessary. Users should be made aware of when updates or data corrections will occur and how to manage any workflow changes that may result.

At the same time, leaders should monitor the clinical and administrative environments for positive or negative impacts using well-defined metrics that map back to the original use case.  

For example, to gauge the impact of an AI tool designed to reduce documentation burdens, leaders may wish to look at quantitative factors such as documentation time per user before and after implementation, as well as qualitative metrics such as physician satisfaction scores or self-reported levels of burnout.   
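For practice leaders who want to see what that before-and-after comparison looks like in concrete terms, the sketch below computes the percent change in average documentation time per note. All figures and names here are hypothetical illustrations, not data from the AMA survey or any real deployment.

```python
# Illustrative sketch with hypothetical numbers: comparing average
# documentation time per note before and after an AI tool rollout.

def percent_change(before: float, after: float) -> float:
    """Return the percent change from before to after (negative = reduction)."""
    return (after - before) / before * 100

# Hypothetical per-note documentation times in minutes, one sample per physician
before_times = [16.0, 14.5, 18.2, 15.3]
after_times = [11.8, 10.9, 13.4, 12.1]

avg_before = sum(before_times) / len(before_times)
avg_after = sum(after_times) / len(after_times)

print(f"Average before: {avg_before:.1f} min")
print(f"Average after:  {avg_after:.1f} min")
print(f"Change: {percent_change(avg_before, avg_after):.1f}%")
```

A negative result indicates a reduction in documentation burden; pairing this quantitative figure with qualitative measures such as satisfaction or burnout scores gives a fuller picture of ROI.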

As AI evolves in the healthcare environment, embracing the notion of continuous improvement and ongoing watchfulness will be crucial for maximizing the value of these tools in the real-world setting. 

Following this four-step framework could give physician practices a head start on converting enthusiasm into impact for patients and staff who stand to benefit from this innovative class of data tools. 


Jennifer Bresnick is a journalist and freelance content creator with a decade of experience in the health IT industry.  Her work has focused on leveraging innovative technology tools to create value, improve health equity, and achieve the promises of the learning health system.  She can be reached at jennifer@inklesscreative.com.

