The challenges ChatGPT poses to healthcare, according to CIOs

A lack of real-time data, absent regulations, bias and accuracy concerns are among the obstacles hospital and health system CIOs cited as the greatest challenges to working with ChatGPT in healthcare.

No real-time data

"ChatGPT is a powerful language model and holds great promise for transforming healthcare in areas such as customer service, education and overall healthcare experiences; however, it is important to be aware of the limitations when utilized in healthcare settings, like lack of real-time data," Sunil Dadlani, executive vice president and chief information and digital transformation officer of Morristown, N.J.-based Atlantic Health System, told Becker's.

ChatGPT is trained on preexisting knowledge and data, but its training data currently extends only through September 2021, meaning the technology may "not have access to the latest research or medical advancements, limiting its ability to provide up-to-date information," according to Mr. Dadlani.

Robert Eardley, CIO of University Hospitals in Cleveland, also said that "the syntax of a response can come from an LLM, but the content of the message is limited to trusted data and knowledge held by the organization."

That lack of real-time data could lead to "unintended consequences" when healthcare professionals use the technology in clinical settings, according to Laura Smith, CIO of West Des Moines, Iowa-based UnityPoint Health.

"If the data used to train it [ChatGPT] has bias, is incomplete or is inaccurate, the ramifications of using that data to augment or even perform decision-making could lead to unintended consequences," Ms. Smith told Becker's

The risk of a biased AI bot

Biased information is still an issue for ChatGPT, which was introduced in November 2022 by its developer, OpenAI.

"[ChatGPT] is trained on a dataset of text that reflects the biases of the real world, which means that it can sometimes generate biased or discriminatory content," said Zafar Chaudry, MD, senior vice president and CIO at Seattle Children's. 

Raymond Lowe, senior vice president and CIO of Los Angeles-based AltaMed, echoed Dr. Chaudry's concerns, stating that healthcare professionals must be "respectful of cultural sensitivities and ethical approaches to care" to ensure that diverse patient populations are not marginalized or disadvantaged in any way by the new technology.

Currently, UC San Diego Health is piloting ChatGPT's ability to answer patient inquiries in the EHR and is deploying a team to examine the AI for potential bias and any worsening of health inequities.

Not enough privacy 

"As ChatGPT can access and process personal health information, privacy concerns arise," Mr. Dadlani said. 

Mr. Dadlani said ChatGPT could be misused or mishandled, potentially compromising patient privacy and confidentiality.

According to OpenAI's privacy policy, the company can collect "personal information relating to you" when a person creates an account to use its services, interacts with its social media accounts or communicates with the company.

No regulations on ChatGPT

In addition, there is no regulation governing the use of ChatGPT in healthcare.

"This absence of clear guidelines raises questions about the appropriate and ethical use of the technology in healthcare settings," Mr. Dadlani said. 

The lack of guidelines has prompted some hospital and health system CIOs to take on the responsibility of developing their own policies around the use of ChatGPT at their organizations.

Darrell Bodnar, CIO of North Country Healthcare, based in Whitefield, N.H., told Becker's that "CIOs must take a position and develop policies around the appropriate use of ChatGPT and all AI language models and services."

Senate Majority Leader Chuck Schumer is among the first lawmakers to release a framework for potential legislation regulating artificial intelligence, Politico reported June 21, although the publication called any specific legislative details "murky."

Despite limitations, there is potential 

Despite the new technology's limitations, hospital and health system leaders remain optimistic about how ChatGPT can be used to free up some of their providers' time. 

Many have already begun piloting it to answer patient inquiries in patient portals, as well as using it to build custom chatbots that respond to questions specific to their health systems.

"In general, ChatGPT can help with the overall patient experience with basic queries, as well as leveraging AI portions to help with repetitive tasks, such as claim processing," Mr. Lowe said. "We are intrigued by the potential integration of ChatGPT and how physicians and healthcare providers can best utilize its technologies to further support care of patients with the continued evolution to ChatGPT 4.0 and beyond."
