Sponsored

Realizing AI’s potential: 5 considerations for assessing AI-powered clinical tools

Many health systems are embracing AI across various applications, including documentation, the revenue cycle and claims denials. While this technology shows tremendous potential in improving the delivery of healthcare, more work is needed to realize AI’s full potential in clinical use cases.

To better understand the current state of AI in clinical workflows, what health systems should prioritize and how to ensure trust when adopting AI, Becker’s Healthcare recently spoke with Ed Lee, MD, chief medical officer at Nabla.

Editor’s note: Responses have been lightly edited for length and clarity.

Question: AI is transforming clinical workflows, from documentation to decision support. In your view, what does an ideal AI-powered clinical environment look like?

Dr. Ed Lee: Despite advancements in modern medicine, healthcare is still too expensive; there is a physician shortage; too many medical errors occur; and physicians spend too much time on non-clinical tasks. As a result, more than 50% of clinicians are burned out.

I see an AI-powered clinical environment improving each of these areas. In the not-too-distant future, AI agents will analyze large amounts of data and work with clinicians or other agents to perform tasks that will accelerate and improve the efficiency of care.

AI applications are already making a difference in changing the environment of care delivery. AI scribes and assistants like Nabla are reducing administrative burdens and decreasing burnout. Clinicians who use these AI tools can’t imagine seeing patients without them. Physicians are postponing retirement because they’re reminded why they went into medicine — these tools are allowing them to enjoy providing care again.

Q: Health systems evaluating AI solutions often conduct rigorous pilots to determine the best fit. From your perspective, what key factors should organizations prioritize when assessing AI-powered clinical tools?

EL: Here are the top things I think about.

First, does a tool do what it’s supposed to do? Does it solve the problem it’s intended to solve? Second, is the tool easy to learn and intuitive to use? You can have the best tool in the world, but if it’s too hard to learn, people won’t use it. Third, how customizable is it? People use solutions in different ways, making flexibility essential. Fourth, does a tool fit seamlessly within an organization’s workflow?

Nabla excels at integrating into existing EHR workflows. For example, in one click, a clinician using Epic can create an AI-generated note that is automatically entered into the patient’s chart. We support a number of workflows, including the use of Epic’s Haiku app, Nabla’s app, desktop, and mobile web.

The last consideration is value. Resources are limited. Margins are razor thin. Health systems are looking to get the best value out of AI tools. Value is the highest-quality product and best service at the most reasonable cost. When looking to scale an AI tool across an enterprise, it only makes sense to look for a tool that provides the best value.

Q: As AI adoption grows in healthcare, privacy, security and governance remain critical concerns. What best practices should health systems follow to ensure compliance and maintain clinician and patient trust?

EL: Trust is incredibly important. There is a saying, “Trust takes years to build, seconds to break and forever to repair.” Trust is foundational, especially in the patient-physician relationship and the health system-vendor partner relationship.

One key to ensuring trust is investing to be compliant with the latest cybersecurity standards. It’s also critical to have an AI governance structure that enables innovation but is grounded in safe, sound practices so organizations are not at risk.

Have conversations with vendor partners to ensure they are keeping up with the latest standards. Ask vendors what they do to mitigate risk. What are their retention policies? What are they doing with data?

These are areas where Nabla shines. Within the Nabla AI assistant, audio recordings of patient encounters are not stored. By default, the note Nabla creates is retained for only 14 days, though health systems can configure a shorter or longer retention period. In addition, the note is never used to train AI models unless there is express consent.

Q: With AI-driven tools like voice recognition and real-time documentation gaining traction, how do you see these technologies evolving — and what impact will they have on clinician efficiency and patient care?

EL: AI tools will dramatically improve clinician efficiency and patient care.

Today’s tools are just the tip of the iceberg. Functionality will continue to expand. New use cases will be tackled. Deeper integration with EHRs will occur.

AI tools will become even more embedded into how clinicians practice medicine and how the entire healthcare ecosystem functions. AI tools will become full-service platforms. That’s where Nabla is headed.

Examples of additional AI functionality that Nabla is offering or developing include straight dictation, clinical documentation improvement, pre-visit summaries of the patient's chart, and tools for CPT coding and clinical decision support.

These enhancements will have a profound impact on the quality and efficiency of care, much like the shift from paper charts to EHRs. There will be bumps along the way, but clinicians will remember how things used to be and will never want to go back, because things will be so much better with AI tools at their fingertips.
