ARTIFICIAL intelligence is no longer a distant promise whispered in tech conferences. It is here, in our phones, our banks, our classrooms, in the decisions that quietly shape our lives. It is often described as the new electricity. But electricity lights homes equally. AI does not.
Across the world, governments are moving from experimenting with AI to embedding it in the architecture of governance. Pakistan is no exception, and rightly so. At Indus AI Week 2026, policymakers, entrepreneurs and technologists gathered with a sense of optimism. The National Artificial Intelligence Policy 2025 sets out an ambitious roadmap for skills development, research ecosystems and responsible adoption.
The future is arriving fast. But the future does not arrive evenly.
Consider Shazia. She lives in a small village. She shares a mobile phone with her husband. She checks her messages when he is home. She has never appeared in a dataset as an independent digital citizen. Yet automated systems in taxation, social protection, employment screening and public services are increasingly being designed to ‘serve’ citizens like her.
I often think about women like Shazia when we speak about AI readiness and digital transformation. Because AI does not operate in isolation. It learns from the society it reflects. And when society is unequal, algorithms absorb that inequality quietly, and at scale.
The numbers tell a sobering story. According to UNDP’s National Human Development Report 2023, only 26 per cent of women in Pakistan use the internet, compared to 47pc of men. In rural areas, just 7pc of women are online; 83pc say male family members influence or control their access to mobile phones.
These are not just statistics. They determine who becomes visible to digital systems and who does not. AI models are trained on digital footprints: mobility patterns, transaction histories, search behaviour, service usage. When women’s economic participation and daily realities are underrepresented in data, algorithms treat those absences as normal. Bias does not need malicious intent to become structural. It only needs to be encoded into the system.
Pakistan’s AI policy speaks the language of inclusion. It commits to expanding AI training for women and underserved communities. It promotes ethical principles. These commitments matter. But attending training is not the same as shaping governance.
Responsible AI cannot remain aspirational. Without binding safeguards such as bias audits, gender impact assessments, structured representation of women, and gender expertise in oversight bodies, inclusion risks becoming rhetorical. As eligibility scoring and administrative decisions move into automated systems, institutional power shifts into code. If gender perspectives are absent at the design stage, exclusion becomes systematic rather than accidental.
This is not a technical glitch. It is a governance choice.
And the consequences are both economic and social. Globally, generative AI skills command wage premiums of up to 36pc, according to the World Economic Forum. Yet women hold only around 22pc of AI-related roles, according to Unesco. If Pakistan’s AI transition expands high-paying digital sectors without correcting participation gaps, inequality will widen rather than narrow.
We have seen this story before. MIT research found that commercial facial recognition systems were significantly less accurate for darker-skinned women than for lighter-skinned men, not because programmers intended discrimination, but because training data reflected existing bias. Technology scales what it is fed.
Pakistan already faces governance and data coordination challenges reflected in global AI readiness assessments. If gender is not embedded into these frameworks from the outset, it will become an afterthought, and afterthoughts in code are costly.
But there is still a window: before procurement rules solidify, before datasets are locked into automated workflows, and before institutional design becomes path dependent. Embedding gender considerations into AI readiness assessments is not about optics. It is about structural safeguards: examining dataset composition, oversight mechanisms, audit capacity, and disaggregated data standards before systems scale nationally.
Shazia may never attend an AI Week. She may never write a line of code. But automated systems designed today will shape the services she receives, the benefits she qualifies for, and the risks she is scored against. Pakistan will surely adopt AI. The question is whether it will automate inequality or design governance strong enough to prevent it. Because once inequality is written into code, it becomes far harder to erase.
The writer is deputy resident representative, UNDP Pakistan.
Published in Dawn, March 8th, 2026