Is your system underestimating the work behind AI?


Artificial intelligence is often viewed as an innovative technology that simplifies hard or tedious tasks. Yet the work of the people who support AI by doing the behind-the-scenes labor is often overlooked, according to a May 4 report by the Cambridge, Mass.-based MIT Sloan School of Management.

For AI to be ethical, it needs to account for the people who make digital innovation possible. At MIT Technology Review's EmTech Digital conference, Madeleine Clare Elish, PhD, a senior research scientist at Google, offered insight into how to make AI both more effective and more ethical.

Three things to know:

1. Consider how AI will be integrated into a workplace.

When new technology is introduced, healthcare staff often have to weave it into existing work practices, power dynamics and cultural contexts. Developers frequently overlook the people who have to make the technology work. Those creating AI systems need to allocate resources toward supporting the workforce using the new product, from initial rollout through ongoing use.

2. Don't forget about "ghost workers" behind the scenes.

AI requires behind-the-scenes workers who perform tasks such as transcribing audio or labeling images yet are often invisible to the end user. These employees are called ghost workers. Dr. Elish's research shows that ghost workers often earn below minimum wage and have limited opportunities for career growth and development.

To take these workers into account, healthcare leaders need to understand the systemic challenges they face and how AI programs can be used to help them. For example, an AI program could detect when a manager is treating an employee unfairly by giving excessive negative feedback, then nudge the manager to reconsider.

3. Ask who's not at the table, and whom AI might harm.

Abeba Birhane, a PhD candidate in cognitive science at University College Dublin, said a recurring theme across AI tools is that "individuals in communities that are at the margins of society, people who are the most vulnerable, are always the ones who pay the highest price."

Facial recognition systems, healthcare algorithms and privacy violations tend to disproportionately affect people of color, immigrants and the LGBT community. The people creating AI systems tend to come from privileged backgrounds and may not understand the potential problems their systems create. Ethical AI should be an integral part of the creation process, from ideation to deployment.

Copyright © 2021 Becker's Healthcare. All Rights Reserved.
