AI algorithms have anti-Muslim biases

Published March 13, 2026 Updated March 13, 2026 09:06am

I AM a scholar and researcher at the University of California, Berkeley, working at the intersection of technology, law and discrimination. I recently conducted an empirical experiment involving commercial artificial intelligence (AI) image-generation tools. In structured testing across 25 widely used paid AI image-generation and headshot platforms, I found a consistent pattern.

When photographs of hijab-wearing women were uploaded, 22 out of the 25 platforms removed the hijab entirely and replaced it with AI-generated hair. The remaining three produced inconsistent results, sometimes retaining a distorted or partial version of the head covering. The pattern appeared across multiple services, suggesting a systemic design issue rather than an isolated glitch.

The harms caused by AI systems are already well documented, from Amazon’s hiring algorithm discriminating against women to Grok generating harmful images that triggered international criticism. But one dimension has received far less attention: the erasure of religious identity. In these tests, the hijab was not mis-rendered or inaccurately drawn. It was removed altogether.

These findings raise concerns that go beyond technical errors. The users did not request the removal of the religious head covering, and during image processing there was no option to retain the hijab or to consent to its removal. The alteration occurred automatically. The consistency of the outputs also suggests possible exclusion or underrepresentation in the training data used to build these systems.

For Muslim women who wear it, the hijab is a matter of dignity and religious identity, not simply a stylistic feature. When identity markers are silently erased, the issue is not only one of bias, but also of structural exclusion, consent and accountability. If models trained on limited data are then deployed worldwide, the consequences extend far beyond a single image. What obligations do AI companies have when their products systematically reshape how certain groups appear? And what remedies are available when those systems are built in one jurisdiction and used worldwide?

Mahwish Moazzam
Berkeley, USA

Published in Dawn, March 13th, 2026
