Sex disparities exist in AI diagnostics, study suggests

When patients of one sex are underrepresented in the training data used to develop a machine learning model, the algorithm yields less accurate diagnoses for patients of that sex, according to a new study published May 25 in the Proceedings of the National Academy of Sciences.


The study, led by researchers in Argentina, focused on artificial intelligence’s ability to analyze chest images to detect 14 medical conditions, including pneumonia and heart enlargement. The researchers examined three open-source algorithms widely used in experimental medical research.

The researchers found that algorithms performed worse on patients whose sex was underrepresented in the training data. Their results highlight how critical it is for AI researchers to use training data that represents the characteristics of all the patients on whom the model will be tested.
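
The effect the researchers describe can be illustrated with a small experiment: train the same classifier on training sets with different sex mixes, then score it separately on held-out female and male patients. The sketch below is not the study's code; it uses synthetic tabular data and a logistic regression rather than chest-image models, and the data generator, the sex-specific signal, and the split sizes are all assumptions made purely for demonstration.

```python
# Illustrative sketch only: how per-sex test performance can shift as the
# sex composition of the training data changes. Synthetic data stands in
# for image-derived features; numbers are not comparable to the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def make_patients(n, sex):
    # Hypothetical generator: the informative feature deliberately differs
    # by sex so the effect of training-set composition is easy to see.
    X = rng.normal(size=(n, 20))
    signal = X[:, 0] if sex == "F" else X[:, 1]
    y = (signal + rng.normal(scale=0.8, size=n) > 0).astype(int)
    return X, y

# Fixed, sex-balanced test sets
X_test_f, y_test_f = make_patients(2000, "F")
X_test_m, y_test_m = make_patients(2000, "M")

for female_share in (0.0, 0.25, 0.5, 0.75, 1.0):
    n_f = int(4000 * female_share)
    n_m = 4000 - n_f
    parts = [make_patients(n, s) for n, s in ((n_f, "F"), (n_m, "M")) if n > 0]
    X_train = np.vstack([p[0] for p in parts])
    y_train = np.concatenate([p[1] for p in parts])

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    auc_f = roc_auc_score(y_test_f, model.predict_proba(X_test_f)[:, 1])
    auc_m = roc_auc_score(y_test_m, model.predict_proba(X_test_m)[:, 1])
    print(f"train set {female_share:.0%} female -> "
          f"test AUC female {auc_f:.3f}, male {auc_m:.3f}")
```

In this toy setup, a model trained almost entirely on one sex scores well on that group and near chance on the other; the study reports a milder but consistent version of the same pattern on real chest imaging data.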

“It’s such a valuable cautionary tale about how bias gets into algorithms. The combination of their results, with the fact that the datasets that these algorithms are trained on often don’t pay attention to these measures of diversity, feels really important,” said Ziad Obermeyer, MD, a machine learning researcher. 

