Hospital and health system CIOs are reevaluating their artificial intelligence governance and compliance strategies. The shift follows a Dec. 11 executive order signed by President Donald Trump that seeks to preempt state AI regulations falling outside existing federal frameworks.
The order directs federal agencies to challenge certain state AI laws and ties future federal funding eligibility to compliance with federal policy. As federal standards evolve, CIOs say the move is prompting closer scrutiny of compliance mapping, governance accountability and vendor oversight.
For Muhammad Siddiqui, CIO at Reid Health in Richmond, Ind., the first area under review is how existing compliance efforts align with state-level AI requirements that could now be challenged or preempted.
“The first thing I’m reassessing is how our compliance work maps to state-specific AI laws. I’m especially looking at things like bias assessments, transparency for high-risk tools like clinical decision support, and data privacy safeguards,” Mr. Siddiqui told Becker’s. “I want to understand which of these might be challenged or preempted by the new executive order and the upcoming DOJ litigation.”
He said the order is also triggering deeper conversations with vendors about risk allocation and preparedness as regulatory oversight shifts.
“This shift brings both opportunity and risk. On the opportunity side, a clearer federal framework could make vendor contracts simpler and help us move faster with AI that improves outcomes and efficiency,” Mr. Siddiqui said. “At the same time, the risk is real. If strong federal standards don’t come in quickly, we could see gaps in protections. That opens the door to more liability and potential safety concerns.”
At Sky Lakes Medical Center in Klamath Falls, Ore., CIO and CISO Rick Leesmann said the executive order reinforces the organization’s existing governance strategy but raises expectations around execution.
“As we formalize the first phase of AI governance, the first thing we are reassessing is how we anchor decisions in clinical safety, data security and measurable outcomes rather than reacting to a fragmented regulatory landscape,” he told Becker’s. “Clearer federal signals reduce uncertainty, but they raise the bar on execution.”
At Tacoma, Wash.-based MultiCare Health System, interim CIO Scott Waters said the executive order is prompting renewed focus on how federal frameworks intersect with internal policies around high-risk AI use cases and patient data protection.
“At MultiCare, alignment with the NIST AI Risk Management Framework for trustworthy AI is foundational to our approach,” Mr. Waters told Becker’s. “NIST’s emphasis on transparency and risk evaluation in high-risk AI use cases aligns well with our organization’s risk tolerance and with healthcare’s mission and commitment to patient safety.”
He added that MultiCare’s AI governance committee does not currently permit AI models to be trained on protected health information without significant safeguards and controls in place.
“If federal frameworks evolve, we will need to reevaluate this policy to ensure continued compliance and patient safety,” Mr. Waters said.
For Darrell Bodnar, CIO at Whitefield, N.H.-based North Country Healthcare, the executive order is shifting attention away from compliance checklists and toward clearer accountability and risk ownership within health systems.
“Reducing state-level involvement is a positive step, as inconsistent and politically driven approaches across states have historically created fragmentation and hesitation rather than clarity for health systems,” he told Becker’s. “A unified federal direction creates real opportunity by enabling more confident vendor engagement and responsible deployment, particularly for rural and multistate organizations.”
As federal agencies begin reviewing state AI laws and outlining enforcement mechanisms, CIOs said health systems will be watching closely to see whether the order delivers on its promise of clarity — or introduces new complexities in AI governance.