Promoting AI

The writer is a political and integrity risk analyst.

THE launch last week of a task force on Artificial Intelligence to spur national development is welcome news. Its goal is to develop a roadmap for AI adoption in governance, healthcare, education and business.

It should be more ambitious, considering the role of AI in energy, housing, transport, etc. One assumes the task force will consider both opportunities and risks. But in its findings it should also recognise that successful AI adoption is intertwined with Pakistan’s broader political trajectory.

The PML-N has been beating the AI drum for some time, having set up the National Centre for AI in 2018, which trains students in AI, robotics, cybersecurity, etc. Its narrative has somehow leapfrogged the AI-as-job-killer story, turning instead into a pitch for harnessing youth-led innovation and boosting economic competitiveness.

Planning Minister Ahsan Iqbal projects a fantasy vision, in which the government hands out laptops, and young people develop AI programmes and bring in dollars.

To be fair, the fantasy has some roots in reality: 25,000 IT graduates are added to our workforce annually, and 85 million Pakistanis subscribe to 3G/4G cellular service. According to Tracxn, there are 92 AI startups in Pakistan, ranging from companies supporting precision agriculture to SME lending and women’s reproductive health awareness.

There’s no doubt that Pakistan must embrace AI to tackle the multidimensional crises its economy and bureaucracy face. Done right, AI improves efficiency and productivity, and allows emerging economies to bypass clunkier technologies.

Interestingly, the task force was launched days after over 1,000 tech leaders and researchers signed an open letter calling for a moratorium on developing advanced AI systems because — in an unregulated form — they present “profound risks to society and humanity”.

Those supporting the moratorium until “shared safety protocols” are agreed depict a world in which AI systems destroy the global financial order, spark nuclear war, or remotely program labs to develop deadly viruses. Short-term concerns are arguably more relevant, including the implications of AI algorithms for individual rights, equality and political polarisation.


When designed poorly (or nefariously) or fed bad data, AI systems can develop discriminatory, coercive or manipulative behaviour. For example, facial recognition technologies have demonstrated ethnic biases, while a test version of the AI chatbot GPT-4 could be swayed to feed users information about how to buy illegal guns. The role of AI algorithms in pushing disinformation on social media is well known.

The moratorium idea has met with criticism, primarily because it isn’t enforceable. Few in the West would trust tech companies to self-report, and fewer would believe that China would cease all AI development, voluntarily surrendering a competitive edge.

There are growing calls for government regulation instead (despite the acknowledgement that hapless regulators are playing catch up, with many governments — our own included — still struggling to pass adequate data privacy and protection laws).

The debate is a reminder that tech is only as good as the societies and political systems in which it is developed and deployed. And this is where the plan to make Pakistan AI-enabled comes up against the current political turmoil.

Safe and ethical AI requires the basic ingredients of a sound democracy: transparency, rule of law, accountability, respect for human rights, equality and inclusion. In our current context, these are hard to come by. The main pitfalls of AI have been highlighted in the political arena, where algorithms have been used to manipulate swing voters, spread deep fakes and generate extreme political arguments to drive polarisation. Our leadership is willing to manipulate the Constitution to retain power — can you imagine what they would do with algorithms?

The media regulator’s approach to the airwaves — crude censorship; arbitrarily rewriting the rules to benefit the sitting government’s agenda; opaque decision-making — also rings alarm bells for how AI oversight would play out in Pakistan, but with far more devastating effect (one can imagine service delivery algorithms excluding marginalised populations to benefit incumbents’ constituents).

Pakistan must prepare for a world in which AI is the norm. But we must understand that to reap the benefits of these technologies, and not just suffer their harms, we need to build the resilience of our democracy. That also includes improving citizen awareness, both through boosting information rights, and prioritising critical thinking in education — all issues currently anathema to our de facto authoritarian state.

In that spirit, I invite discerning readers to guess whether I wrote this column or asked ChatGPT to generate the text.


Twitter: @humayusuf

Published in Dawn, April 17th, 2023
