Healthcare workers sue Amazon for allegedly recording PHI: 5 things to know

Four healthcare workers have filed a class-action lawsuit against Amazon, alleging that their Alexa devices recorded conversations containing HIPAA-protected health information, according to a July 2 Newsweek report.

Five things to know:

  1. The suit, filed in a Washington district court, claims that "Amazon's conduct in surreptitiously recording consumers has violated federal and state wiretapping, privacy, and consumer protection laws."

  2. The lawsuit alleges that, at the time the plaintiffs bought their devices, Amazon failed to disclose that it records and stores Alexa interactions. The devices may have recorded and stored conversations the healthcare workers had with patients that contained protected health information, the suit claims.

  3. The lawsuit cites an Evanston-based Northwestern University study demonstrating that certain phrases can wake Alexa and trigger it to start recording. Alexa may also be triggered if speakers are talking in Spanish or speak with an accent not typical of native-born American speakers.

  4. A spokesperson from Amazon told Newsweek that "Alexa and Echo devices are designed to only detect your chosen wake word (Alexa, Amazon, Computer, or Echo). No audio is stored or sent to the cloud unless the device detects the wake word (or Alexa is activated by pressing a button)."

  5. Alexa's blue light turns on to indicate that the device has been awoken, the spokesperson said. Alexa users can opt out of having their voice recordings included in those that are reviewed. Users also can choose not to have their recordings saved at all or to have them deleted every few months, the spokesperson said.







