'AI will happen either with us or to us' — Here's what to know

Artificial intelligence and machine learning have captured the imagination of healthcare IT stakeholders in the last two years. Amid the excitement and interest, IT experts are debating AI's role in direct patient care.

This content is sponsored by Intel

Having a thoughtful vision about AI implementation in the hospital setting is integral to realizing many of AI's opportunities in patient care, said John Sotos, MD, CMO of Industry Sales Group at Intel Corp., during a workshop April 18 at Becker's Hospital Review's 8th Annual Meeting in Chicago.

Dr. Sotos was joined by a panel of three healthcare IT experts to discuss their perceptions of AI and what needs to happen for the healthcare industry to embrace AI systems. Panelists discussed the potential problem of physician resistance to AI in patient care and the opportunity for AI to serve physicians' professional needs. They also considered how regulations could encourage providers to adopt AI.

Defining artificial intelligence

Artificial intelligence is a nebulous term that can mean different things depending on whom you ask, Dr. Sotos said. He breaks AI into two basic categories: rules-based systems, and systems that use machine learning and deep learning.

The simplest form of artificial intelligence is the rules-based system created in the 1960s. This system consists of a series of if-then statements translated into code. Many physicians and healthcare professionals are already comfortable using this type of AI because it can show the step-by-step process of how it came to a certain conclusion, said Lyle Berkowitz, MD, director of innovation at Northwestern Memorial HealthCare in Chicago. In other words, physicians can corroborate the AI findings and trust the conclusions.
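To make that concrete, a rules-based system amounts to a chain of explicit conditionals over patient data. The following minimal Python sketch is hypothetical (the rule names and thresholds are invented for illustration, not clinical guidance); it shows why such systems are easy to corroborate, since every conclusion can be traced to the exact rules that fired.

# Minimal sketch of a 1960s-style rules-based system: explicit if-then
# statements over patient vitals. All thresholds are illustrative only,
# not clinical guidance.
def evaluate_screening_rules(vitals: dict) -> list[str]:
    """Return every rule that fired, so the reasoning is fully auditable."""
    fired = []
    if vitals["temp_f"] > 100.4:
        fired.append("Rule 1: elevated temperature")
    if vitals["heart_rate"] > 90:
        fired.append("Rule 2: elevated heart rate")
    if vitals["resp_rate"] > 20:
        fired.append("Rule 3: elevated respiratory rate")
    if len(fired) >= 2:
        fired.append("Conclusion: two or more rules fired, flag for review")
    return fired

# A clinician can check the output step by step against the rules above:
print(evaluate_screening_rules({"temp_f": 101.2, "heart_rate": 96, "resp_rate": 18}))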

The other types of artificial intelligence are machine learning and deep learning, the latter of which has grown exponentially since 2015. In both cases, programmers must pass immense amounts of data through the system for it to "learn" how to identify patterns. Unlike their rules-based cousin, machine learning and deep learning systems cannot give a straightforward explanation for how they arrived at a conclusion.
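By contrast, a learned model produces answers without an inspectable chain of rules. The short sketch below trains a small neural network on synthetic data (the dataset and task are invented purely for illustration, and the example assumes the scikit-learn library is available); the trained model predicts, but its internals are weight matrices rather than a rationale a clinician can audit.

# Minimal sketch of the contrasting approach: a model that learns patterns
# from examples instead of following hand-written rules. The data here is
# synthetic and the task is purely illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))  # 1,000 synthetic "patients", 10 numeric features
y = ((X[:, 0] * X[:, 3] + np.sin(X[:, 7])) > 0).astype(int)  # nonlinear label

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
model.fit(X, y)

# The model answers, but its "reasoning" is just learned weight matrices,
# not a step-by-step rationale like the rules-based example above.
print(model.predict(X[:5]))
print([w.shape for w in model.coefs_])  # raw weights, not an explanation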

"The new AI systems have one Achilles heel — they can't explain themselves [to end-users]," Dr. Berkowitz said. Physicians ultimately have to decide whether to put their faith and their trust in the system, "which many physicians are uncomfortable doing," he added. 

Directing patient care according to "blind faith" runs contrary to physicians' training in evidence-based medicine, especially when the physician is fully accountable for the final patient outcome. Moreover, physicians' foundational promise to "first, do no harm" means patient safety is the highest priority, and handing over even partial control of the outcome can make clinicians uncomfortable.

Christus St. Michael Health System in Texarkana, Texas, faced this tension firsthand when it received a CMS Innovation Grant to implement an AI-based nurse training program. Nurses were trained to recognize early warning signs of congestive heart failure and sepsis in Medicare beneficiaries in post-acute facilities, aiming to reduce hospital readmissions. The program included a computerized clinical decision support system that guided nurses through evidence-based protocols after they detected symptoms.

"[The program] has an opportunity to make a difference in community members' lives," said Chris Karam, president and CEO of Christus St. Michael Health System and CEO of Christus Health in Irving, Texas. "But the AI tool is purely educational and isn't used on the clinical floor."

The clinical decision support system trained nurses only to recognize early warning signs. The AI was not deployed to help nurses actively spot symptoms on the floor by calling their attention to certain vitals or evidence in the medical records. "There's just too much reluctance around using [AI] on patient care," Mr. Karam said.

Physician perceptions of AI will determine its usability

Physician perceptions of artificial intelligence could be one of the greatest obstacles to AI adoption, panelists agreed. As the end-users of clinical technology, physicians play a central role in driving innovation efforts. The panelists debated what needs to happen for physicians to trust AI technology and routinely use it in their daily workflow.

Mark Gridley, president and CEO of Family Health Network in Chicago, suggested AI adoption will occur gradually as physicians and caregivers become increasingly familiar with AI software. "Think about IV pumps and defibrillators: clinical processes that used to be manual. Automated IV pumps and defibrillators were met with the same skepticism and reluctance physicians today express about AI," he said. When it comes down to it, he added, providers will learn to trust AI by using AI systems.

However, AI platforms must feature certain characteristics for physicians to use them at all, Dr. Berkowitz said. The painful adoption of EHRs taught IT developers the importance of designing technology around clinicians' preferred workflow. Technologies that introduce additional administrative steps or interrupt a physician's routine are more likely to fail, Dr. Berkowitz said.

"I continue to stress the importance of thinking about user-centered design and workflow, and how that's going to determine your [AI system] adoption rates and physician response," Dr. Berkowitz said.

Physicians benefit from using AI as a clinical assistant

The importance of reducing administrative burdens and improving physician efficiency cannot be overstated, especially as the Association of American Medical Colleges projects a shortage of between 34,600 and 88,000 physicians by 2025. Deploying AI systems as "clinical assistants" to improve physician efficiency and productivity emerged as a key theme throughout the panel discussion.

Each panelist said they were excited by AI's potential to improve clinician workflow by relieving physicians of tedious, clerical or manual processes, such as data entry or reporting. AI systems can accomplish such processes faster and with better accuracy than humans, helping physicians work more efficiently, Dr. Sotos said.

"How AI is going to play out in healthcare in the next few years will have a lot to do with serving physicians," Mr. Gridley added. "The initial AI systems will be used as physician servants, until physicians are comfortable with deep learning and more mature AI."

Get involved in lobbying regulatory bodies about AI early

Another factor influencing AI adoption rates in healthcare is the regulatory environment.

Healthcare moves at a slower regulatory pace than other industries. Current law does not explicitly define the FDA's authority to regulate medical devices that rely on artificial intelligence and machine learning. Although healthcare professionals generally regard federal regulations as burdensome, and therefore as inhibiting progress, regulations can also be necessary to support and encourage innovation, Mr. Gridley said.

Consider the passage of HIPAA in 1996. Scientists developing sophisticated algorithms in the 1980s did not have access to medical data because health records were held and controlled by the hospital, not the patient. HIPAA changed that model: if the patient gives consent, the hospital is required to send the record to the patient or a designee. Regulations giving organizations legal means to exchange patient health information have spurred innovation, including the application of artificial intelligence in healthcare.

"Providers need to get involved and lobby political and governing bodies to adapt new parameters that allow us to embrace AI as an industry," Mr. Gridley said. "AI is coming, and if we don't stay involved in directing it, AI will happen to us rather than with us."   
