AI Cancer Tools Show Bias Against Minority Patients

Artificial intelligence systems designed to diagnose cancer are introducing new racial disparities into medical care, according to a study from Harvard Medical School researchers. The AI models can infer demographic details directly from tissue slides and perform less accurately for certain racial groups.

Researchers evaluated four commonly used pathology AI models and found consistent performance gaps. The systems struggled to distinguish lung cancer subtypes in African American patients and showed reduced accuracy when detecting breast, renal, thyroid, and stomach cancers in some demographic groups. Overall, these disparities appeared in roughly 29 percent of the diagnostic tasks analyzed.

Kun-Hsing Yu, senior author and associate professor of biomedical informatics at Harvard Medical School, called the findings surprising, since reading a patient’s demographics from a pathology slide is considered “mission impossible” for human pathologists. The research team identified three main contributors to the bias.

Training data are often uneven because tissue samples are easier to obtain from some demographic groups than others, making it harder for AI models to accurately diagnose cancers in underrepresented populations. The problem runs deeper than simple data imbalance, however. Models performed worse for certain demographic groups even when sample sizes were similar.
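Gaps of this kind are typically surfaced by stratifying an evaluation set by demographic group and comparing per-group accuracy. The study’s own evaluation protocol is not detailed here, but a minimal sketch of such an audit, using made-up records and hypothetical column names, might look like this:

```python
import pandas as pd

# Hypothetical evaluation records: one row per slide with the model's
# prediction, the ground-truth label, and the patient's reported group.
results = pd.DataFrame({
    "group":   ["A", "A", "A", "B", "B", "B", "B", "A"],
    "label":   [1, 0, 1, 1, 0, 1, 0, 0],
    "predict": [1, 0, 1, 0, 0, 1, 1, 0],
})

# Accuracy computed separately for each demographic group.
per_group = (
    results.assign(correct=results["label"] == results["predict"])
           .groupby("group")["correct"]
           .mean()
)

# Disparity: the gap between the best- and worst-served groups.
gap = per_group.max() - per_group.min()
print(per_group)
print(f"accuracy gap between groups: {gap:.2%}")
```

Running this kind of audit across many diagnostic tasks is what yields a figure like the 29 percent of tasks with disparities reported above.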

AI systems can also detect subtle molecular differences across demographic groups, such as mutations in cancer driver genes, and use them as shortcuts to classify cancer type. This reduces accuracy in populations where those mutations are less prevalent.
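One common way to check whether a model has latched onto group-correlated signals is to train a simple probe that tries to predict the demographic group from the model’s own features: if the probe succeeds well above chance, the features encode group information the model could be exploiting as a shortcut. The following is a rough illustration under that assumption, with placeholder random data standing in for real slide embeddings:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins: feature vectors extracted by a pathology model
# for each slide, plus each patient's demographic group (0 or 1).
features = rng.normal(size=(200, 64))   # 200 slides, 64-dim embeddings
groups = rng.integers(0, 2, size=200)   # demographic group labels

# A linear probe trained to predict group from the embeddings.
probe = LogisticRegression(max_iter=1000)
scores = cross_val_score(probe, features, groups, cv=5)

# Accuracy far above 50% would mean the embeddings encode demographic
# information, which the diagnostic model could exploit as a shortcut.
print(f"probe accuracy: {scores.mean():.2%} (chance is ~50%)")
```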

To address the problem, the researchers developed FAIR-Path, a framework that reduced the observed diagnostic disparities by about 88 percent. Yu emphasized that recognizing and correcting bias in medical AI is critical because it can directly influence diagnostic accuracy and patient outcomes.

See: “AI detects cancer but it’s also reading who you are” (December 18, 2025)
