AI Systems Show Deep Racial Bias Across Critical Sectors

A sweeping review of AI performance across industries reveals that racial and gender disparities remain deeply embedded in the algorithms increasingly used to make decisions about employment, medical care, and even law enforcement.

Facial recognition technology shows some of the starkest inequities. Error rates range from just 0.8% for light-skinned men to 34.7% for dark-skinned women, a difference of more than forty-fold. Meta-analyses cited in the report confirm that error rates for dark-skinned populations often fall between 25% and 35%. These gaps stem from training datasets dominated by lighter-skinned faces and from camera hardware historically calibrated for light skin tones.
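
For readers who want to check the comparison, a minimal sketch in Python reproduces the roughly forty-three-fold ratio from the two error rates quoted above; the group labels and all other details are illustrative assumptions, not material from the report.

```python
# Illustrative sketch only: the two error rates are the figures quoted above;
# the group labels and everything else are assumptions for demonstration.
error_rates = {
    "light-skinned men": 0.008,   # 0.8% error rate
    "dark-skinned women": 0.347,  # 34.7% error rate
}

best = min(error_rates, key=error_rates.get)
worst = max(error_rates, key=error_rates.get)
disparity_ratio = error_rates[worst] / error_rates[best]

print(f"{worst} vs. {best}: {disparity_ratio:.1f}x higher error rate")  # ~43.4x
```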

Hiring algorithms also exhibit sharp racial bias. In a University of Washington study, AI models preferred white-associated names 85% of the time, compared with only 9% for Black-associated names. At no point did Black male-associated names outrank white male-associated names. Intersectional patterns revealed further disparities: among Black-associated names, female names were preferred 67% of the time, compared to just 15% for Black male names.
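
Audits of this kind boil down to tallying which name group an AI ranker prefers across paired comparisons. The sketch below is a hypothetical illustration of that bookkeeping, not the University of Washington study's actual code; the sample outcomes and the preference_rates helper are assumptions.

```python
from collections import Counter

# Hypothetical bookkeeping for a name-substitution audit: each entry records
# which name group an AI ranker preferred when shown otherwise-identical
# resumes. The sample outcomes are invented for illustration and do not
# reproduce the study's data.
pairwise_outcomes = [
    "white-associated", "white-associated", "Black-associated",
    "white-associated", "no preference", "white-associated",
]

def preference_rates(outcomes):
    """Share of comparisons in which each name group was preferred."""
    counts = Counter(outcomes)
    total = len(outcomes)
    return {group: count / total for group, count in counts.items()}

for group, rate in sorted(preference_rates(pairwise_outcomes).items()):
    print(f"{group}: preferred in {rate:.0%} of comparisons")
```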

Medical AI systems show subtler but clinically significant disparities, with a 15–25% performance drop for disadvantaged racial and socioeconomic groups. The report notes that these gaps can lead to missed diagnoses and poorer prognostic accuracy. Bias also emerges when clinical data reflect existing racial inequities, such as lower testing rates for Black patients with conditions like sepsis, embedding unequal care into algorithmic recommendations.

Even large language models display uneven empathy. GPT-4 showed 2–15% lower empathy for Black posters seeking mental-health support and 5–17% lower empathy for Asian posters.
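
The "X% lower empathy" framing can be read as a relative gap between group-average scores. The short sketch below shows that calculation with invented ratings, since the report does not publish raw scores here; the numbers are assumptions chosen only to land inside the reported ranges.

```python
# Hypothetical relative-gap calculation: how much lower one group's average
# empathy score is compared with a reference group. All scores below are
# invented for illustration; the report does not publish raw ratings here.

def relative_gap(reference_score: float, group_score: float) -> float:
    """Fraction by which group_score falls below reference_score."""
    return (reference_score - group_score) / reference_score

reference = 4.0       # assumed average empathy rating, reference group
black_posters = 3.6   # assumed average rating for Black posters
asian_posters = 3.5   # assumed average rating for Asian posters

print(f"Black posters: {relative_gap(reference, black_posters):.0%} lower")  # 10%
print(f"Asian posters: {relative_gap(reference, asian_posters):.0%} lower")  # 12%
```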

Together, these findings illustrate how, without intentional safeguards, AI can amplify long-standing inequities.

See: “AI Algorithm Bias Detection Rates By Demographics 2025–2026” (October 1, 2025)