Can affective computing fill the empathy gap?

Paul Balagot, chief experience officer, precisioneffect

As the healthcare system comes under increasing pressure to maximize throughput and treat patients more efficiently, a growing body of work in artificial intelligence (AI) aims to optimize everything from diagnosing a patient to designing treatment plans.

DeepMind, a leading AI company that is now part of the Alphabet group, is looking for ways to accelerate the path from diagnosis to treatment.

Babylon Health, a digital health startup, is looking to make healthcare more accessible to everyone by using medically accredited AI to provide medical advice through a patient's mobile phone.

These are just a couple of examples of how AI is transforming the healthcare landscape. However, the push to make healthcare ever more efficient can come at the expense of empathy.

With the ever-growing demands on our healthcare professionals, the time they have to make an empathetic connection with patients is increasingly squeezed. Much has been written about this concern, and it does not seem to be going away anytime soon. Most recently, the AMA co-sponsored the 8th annual Patient Experience Summit, whose theme, "Empathy by Design," had a distinct focus on "exploring innovative ways to create and sustain a human-centric environment."

This leads me to a slightly different branch of AI: affective computing, the study and development of systems and devices that can recognize, interpret, process, and simulate human affect. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.

This growing field of study aims to develop emotionally intelligent machines with the potential to demonstrate empathy. Researchers are applying affective computing to detect human emotion and nuance in order to create more natural human experiences and interactions. Could the technology get good enough that an AI application could make a patient feel real empathy?

A machine's ability to connect empathetically was depicted in the 2013 film "Her," in which Joaquin Phoenix's character, Theodore Twombly, develops feelings for an intelligent computer operating system. In the 2014 animated film "Big Hero 6," Baymax, a personal healthcare companion robot, can recognize emotion and provide care.

When you see these depictions, it can be easy to dismiss them as science fiction, but companies are already researching ways for machines and robots to detect human emotion by analyzing facial expressions, non-verbal cues, inflections in voice and other stimuli.
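To make that idea a little more concrete, here is a minimal, purely illustrative sketch in Python of the kind of pipeline such systems rest on: extract crude prosodic features from a voice recording (loudness and pitch variability) and map them to a coarse mood label. The features, thresholds and labels below are hypothetical stand-ins; real affective-computing systems rely on trained models over facial, vocal and physiological signals, not hand-written rules like these.

```python
# Illustrative sketch only: crude voice features -> coarse "mood" label.
import numpy as np

def voice_features(waveform: np.ndarray, sample_rate: int) -> dict:
    """Extract two rough prosodic features: loudness and pitch variability."""
    energy = float(np.mean(waveform ** 2))  # overall loudness
    # Zero-crossing rate per short frame, a crude stand-in for pitch tracking.
    frame = sample_rate // 50
    zcr = [np.mean(np.abs(np.diff(np.sign(waveform[i:i + frame])))) / 2
           for i in range(0, len(waveform) - frame, frame)]
    return {"energy": energy, "pitch_variability": float(np.std(zcr))}

def coarse_mood(features: dict) -> str:
    """Map features to a coarse label -- a placeholder for a trained classifier."""
    if features["energy"] > 0.1 and features["pitch_variability"] > 0.05:
        return "agitated"
    if features["energy"] < 0.01:
        return "flat / withdrawn"
    return "calm"

if __name__ == "__main__":
    sr = 16_000
    t = np.linspace(0.0, 1.0, sr)
    sample = 0.5 * np.sin(2 * np.pi * 220 * t)  # stand-in for a recorded utterance
    print(coarse_mood(voice_features(sample, sr)))
```

The point of the sketch is simply that emotional signals can be quantified; the hard part, and the focus of current research, is doing so reliably enough that the response a machine gives back actually feels empathetic to the patient.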

SoftBank Robotics has developed a human-shaped robot named Pepper, which the company describes as "the first humanoid robot capable of recognizing the principal human emotions and adapting his behavior to the mood of his interlocutor."

We are still only scratching the surface of how AI and affective computing could transform healthcare, but a strong body of evidence shows that empathy in the patient care setting translates to better outcomes. If the empathy gap continues to widen, the healthcare industry could be increasingly motivated to bring affective computing from the movies to the bedside.

