AHA warns 'deepfakes' could impersonate hospital execs

The American Hospital Association is warning hospitals and health systems to be on the lookout for cybercriminals who may impersonate their executives via deepfakes.

The National Security Agency, FBI and Cybersecurity and Infrastructure Security Agency issued a notice Sept. 12 about the rise in state-sponsored hackers using artificial intelligence and machine learning to create highly realistic fake media. The agencies said hackers impersonate company executives to damage their brands or gain access to their networks.

"The U.S. government has warned in general of an increasing threat from 'deepfakes,' AI-generated synthetic audio, video or image media that cyber actors may use for deceptive purposes or as part of social engineering techniques," said John Riggi, the American Hospital Association's national adviser for cybersecurity and risk, in a Sept. 13 news release. "At this time, there does not appear to be widespread use of deepfakes targeting healthcare, but we should maintain vigilance and promote awareness in the workforce."
