The Uks Research Centre on Wednesday announced the launch of its Uks AI (Beta) platform, describing it as an artificial intelligence tool that calls out gender bias in media content, “whether it’s hiding in plain sight or lurking between the lines”.
Uks, a media monitoring and advocacy organisation, has for over two decades engaged with the media on how to report women’s issues in general, and cases of violence against women in particular.
In a press release issued today, seen by Dawn.com, the organisation said the tool was built particularly for journalists, editors and reporters to address a “persistent challenge” in media creation: the unconscious perpetuation of gender stereotypes that shaped how society viewed and valued women.
Uks said the tool scanned text for both glaring and subtle biases, offering “immediate, actionable feedback” to media professionals.
It said the tool was built on the Global Media Monitoring Project (GMMP) framework and dissected content through four critical lenses: clear stereotypes, such as portraying women solely as caregivers; subtle biases, such as consistently presenting men as experts; neutral portrayals; and content that actively challenged gender stereotypes.
“The system helps journalists, editors and content creators spot problematic patterns in their writing from obvious stereotypes to those subtle word choices that can unintentionally reinforce gender biases. It’s designed to be practical, offering specific suggestions for more balanced language and representation,” the press release said.
Uks said it had spent years documenting how media representation shaped public perception of gender roles, adding that the AI tool transformed that research into actionable insights for newsrooms and content creators.
The press release said the beta version was now available for testing, and the organisation welcomed feedback from media professionals to help refine and improve its product.
“Most of us think we’re writing without bias, but the reality is different,” the press release quoted the organisation’s chief Tasneem Ahmer as saying. “This tool acts like a friend looking over your shoulder, pointing out the biases we might miss in our rush to meet deadlines or tell a story.”
She further said: “We want this to be something that actually helps people write better, more inclusive content,” adding that “the more people use it and tell us what works and what doesn’t, the better it will become.”