
I AM a scholar and researcher at the University of California, Berkeley, working at the intersection of technology, law and discrimination. I recently performed an empirical experiment involving commercial artificial intelligence (AI) image-generation tools. In structured testing across 25 widely used paid AI image-generation and headshot platforms, I found a strikingly consistent pattern.
When photographs of hijab-wearing women were uploaded, 22 out of the 25 platforms removed the hijab entirely and replaced it with AI-generated hair. The remaining three produced inconsistent results, sometimes retaining a distorted or partial version of the head covering. The pattern appeared across multiple services, suggesting a systemic design issue rather than an isolated glitch.
The harms caused by AI systems are already well documented, from Amazon’s hiring algorithm discriminating against women to Grok generating harmful images that triggered international criticism. But one dimension has received far less attention: the erasure of religious identity. In these tests, the hijab was not mis-rendered or inaccurately drawn. It was removed altogether.
These findings raise concerns that go beyond technical errors. Users did not request the removal of the religious head covering, and during image processing there was no option to retain the hijab or to consent to its removal. The alteration occurred automatically. The consistency of the outputs also suggests possible exclusion or underrepresentation of hijab-wearing women in the training data used to build these systems.
For Muslim women who wear it, the hijab is a matter of dignity and religious identity, not simply a stylistic feature. When identity markers are silently erased, the issue is not only one of bias, but also of structural exclusion, consent and accountability. If models trained on limited data are then deployed worldwide, the consequences extend far beyond a single image. What obligations do AI companies have when their products systematically reshape how certain groups appear? And what remedies are available when those systems are built in one jurisdiction and used worldwide?
Mahwish Moazzam
Berkeley, USA
Published in Dawn, March 13th, 2026