4 ways Cleveland Clinic, Saint Luke’s & Texas Children’s are easing AI anxiety

AI is no longer a distant promise in healthcare. It’s here, reshaping how care is delivered, decisions are made and data is used. Yet as innovation accelerates, so do the questions: How can health systems safeguard privacy, uphold clinical judgment and support — not replace — their workforce? During Becker’s 10th Annual Health IT + Digital Health + RCM Meeting, a panel of healthcare leaders shared how their organizations are moving beyond the hype to build trust, transparency and tangible value with AI.

The session, “AI Anxieties: Strategies to Reduce and Educate,” was sponsored by AKASA and featured insights from three leaders across informatics, analytics and finance:

  • Anoop Vijayan, director of data, analytics, governance, AI and data science at Texas Children’s Hospital (Houston)
  • Jeffrey Sattler, PharmD, DO, hospitalist, Epic system physician builder and physician advisor at Saint Luke’s Health System (Kansas City, Mo.)
  • Bob Gross, executive director of financial decision support and analysis at Cleveland Clinic

Here are four key takeaways from their discussion.

1. Educate early and often to reduce resistance.

Panelists agreed that staff anxiety often stems from a lack of understanding. Whether it’s ambient AI or generative tools, team members are more open to adoption when they’re informed early.

Texas Children’s Hospital holds monthly literacy sessions to set organizational expectations and demystify AI. “They may not like what they hear, but that’s OK,” Mr. Vijayan said. “It is important for them to understand this is where we stand as an organization. This is what we think we should or should not do. It’s always good for us to tell them, rather than them having to go and find some of these things on their own.”

The education-first approach has paid off for the health system; Mr. Vijayan noted that staff interest in AI innovation continues to grow.

“When we started this journey in 2016–2017, we had to go out and ask for proof-of-concept ideas. We hardly got five ideas for a proof of concept,” Mr. Vijayan said. “Today, we are having to say ‘no’ because of the number of ideas that we are getting … It’s almost like they know that AI is part of the organization — that it’s not going away.”

2. Start with real problems, not technology.

Panelists agreed that trust builds when AI is framed as a solution to well-defined problems, not as a shiny new tool.

“The problem always comes first, and the tech comes after,” Mr. Vijayan said. “It is essential that we have a strategic imperative of why we are doing AI. Is it for improving patient experience, to be more productive, for financial ROI?” He advised developing a framework that shows how the AI project falls into the organization’s strategic imperatives.

“It is easier to explain why we’re doing it, and we get more trust and engagement from the stakeholders as to, this is a problem we’re trying to solve, and this is a solution that best fits and it happens to be an AI solution,” he added.

At Cleveland Clinic, early AI efforts focused on revenue cycle improvements. “It’s the gold mine,” Mr. Gross said. “Not only can you make improvements that impact revenue yield, but there is so much inefficiency that it makes for a wonderful use case to apply any AI — whether we talk about traditional problem-solving all the way up through large language models.”

Despite myriad opportunities for applying AI in the revenue cycle, Mr. Gross clarified that effective use cases must be carefully mined.

“You have to tunnel for it,” he said. “It’s in there, but you have to go harvest it … There’s no getting around it that you have to make investment in technology — in people, in time and resources — to find opportunities within the revenue cycle. But the benefit, the revenue yield, the cost efficiencies that can be gained generally pay for any of those investments several times over.”

3. Upskill clinicians with a human-first approach.

Rather than replacing clinical staff, leaders are using AI to give time back to caregivers. At Saint Luke’s Health System, that means an intentional focus on freeing clinicians to reconnect with their “why” — the purpose that drew them to healthcare in the first place.

“The goal is to get clinicians back to human relationships and higher critical thinking, to try to decrease friction,” Dr. Sattler said.

Saint Luke’s is deploying an ambient voice tool to draft notes in real time. One of Dr. Sattler’s initially skeptical physician colleagues became an advocate within a week, despite some of the upfront process changes.

“There was a little bit of give. He had to change his workflow, make sure before he walked in the room that he had the chart pulled up, and he had to get patient consent,” Dr. Sattler said. “Initially, he started doing it with new patient visits. By the end of the week, he was using it on everybody.”

4. Stay agile with governance and partnerships.

Governance is essential, but panelists noted that rigid frameworks can’t keep pace with rapid AI evolution.

“There was a big push in healthcare to have AI governance and AI guidance, and now, with agentic AI, we had to throw all the framework out because agentic AI just accelerated it to a different level,” Mr. Vijayan said. “So you have to have your governance and guidance adapt to the technology as well.”

To achieve this, Mr. Vijayan said Texas Children’s leverages two approaches. “We have a t-shirt size for what is the problem we’re trying to solve, and for a risk versus benefit profile,” he added. “That’s how we try to manage safety and agility.”

Cleveland Clinic partners with outside vendors such as AKASA to navigate the complexities of technology innovation. For any internal development effort, Mr. Gross said he generally cautions against trying to “go it alone” in the large language model space, or at least advises recognizing that what is developed in-house has a short shelf life.

“That led us to partnering with outside organizations who could not only demonstrate an expertise in the technology but also an understanding of healthcare operations — I think that’s really important,” Mr. Gross said.

Ultimately, the panelists emphasized that AI is a tool, and communicating this effectively to healthcare teams is paramount.

“You have to put AI in its place,” Mr. Gross said. “It’s created by people, for people, and it’s not as scary as we make it out to be … The reality is that even large language models, as sophisticated as they are, are still just tools and they will change the way we do our work — but they’ll change it for the better.”