
AI’s Ability To Predict Race From X-Rays Alone Sparks Concerns
Featured Image Credit: Alamy


A new study has found that artificial intelligence (AI) diagnostic systems could produce racially biased medical results, potentially leading to adverse health outcomes for patients.

Researchers at Harvard University have found that AI programs can determine a person’s race from an X-ray with more than 90% accuracy, a finding that is concerning scientists.

The concern stems from the fact that no one can explain how the AI programs are doing this.


The study’s authors wrote that AI systems are using race as part of their process for medical diagnosis and subsequent treatment, potentially to the detriment of patients’ health.

Marzyeh Ghassemi, an assistant professor at the Massachusetts Institute of Technology and co-author of the study, said: “I honestly thought my students were crazy when they told me,” according to Boston.com.

The aim of the study was to determine the degree to which AI systems could detect race from X-rays, and to investigate how they are able to do so.

The research team trained the AI programs using standard X-rays and CT scans of different parts of the body. Each image was labelled with the race self-reported by the patient, but contained no traces of racial markers such as skin colour, bone density or hair texture.

The team found that, astonishingly, the AI systems were able to determine the patient’s race with an accuracy of more than 90%.
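The study’s code and data aren’t reproduced here, but a minimal sketch gives a sense of the kind of experiment described: fine-tune a standard image classifier on scans labelled with self-reported race, then score it on held-out images. The sketch below assumes a PyTorch/torchvision setup, a ResNet-18 backbone and an `xrays/` folder layout, all of which are illustrative choices rather than the study’s actual methods.

```python
# Minimal sketch (assumed setup, not the study's code): fine-tune a standard
# image classifier on X-rays labelled with self-reported race, then measure
# how often it predicts that label correctly on held-out scans.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Greyscale scans converted to 3 channels so an ImageNet backbone applies.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical folder layout: xrays/train/<self-reported-label>/image.png
train_set = datasets.ImageFolder("xrays/train", transform=preprocess)
test_set = datasets.ImageFolder("xrays/test", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a few epochs, purely for illustration
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# Accuracy on held-out scans: the study reports figures above 90%.
model.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)
print(f"race-prediction accuracy: {correct / total:.1%}")
```

In practice the researchers worked with far larger clinical datasets and more careful evaluation, but the basic shape of the experiment is the same: labelled images in, a prediction accuracy out.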


What is unsettling about this discovery isn’t that the AI systems can detect race so accurately, but rather that systems with this capability have been found to perform poorly as a result of racial bias.

The authors of the study commented: “We emphasise that the ability of AI to predict racial identity is itself not the issue of importance, but rather that this capability is readily learned and therefore is likely to be present in many medical image analysis models.”

Humans are unable to identify which features in an image allow the AI programs to detect a patient’s race, and the systems remain just as accurate when the images are extremely degraded.
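One way to picture the degraded-image check: blur and downsample the test scans until most fine detail is gone, then re-run the same accuracy measurement. The corruptions below are illustrative assumptions, not the study’s actual protocol.

```python
# Illustrative degradation pipeline (assumed, not the study's): blur and
# shrink each scan so that fine anatomical detail is largely destroyed.
from torchvision import transforms

degrade = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((32, 32)),                        # discard most detail
    transforms.GaussianBlur(kernel_size=5, sigma=2.0),
    transforms.Resize((224, 224)),                      # back to model input size
    transforms.ToTensor(),
])

# Re-running the evaluation loop from the earlier sketch with `degrade` in
# place of `preprocess` would show whether accuracy survives the corruption.
```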

This poses a dilemma: how can humans create a medical-imaging AI system that is free of racial bias, and is that even possible?

Ghassemi added that her best guess is that the medical images are somehow recording the level of melanin in a patient’s skin, in a way that humans have never picked up on before.


Topics: Science, Technology