WE live, move and have our being in technology. Formally, technology is the application of our understanding of the physical world to achieve practical goals. Informally, it can be defined as anything that helps us become lazy. It is often Janus-faced. It gives and it takes. Its power to connect often comes at the price of privacy; its speed may be paid for by the environment. Technology’s ubiquitous and double-edged nature should alert us to the need to think about it, yet such discussions are rare. This may be because technology reveals itself to us in use, like magic, forcing us to evaluate it practically, here and now. AI’s rapid adoption owes mainly to its capacity to show benefits across a range of fields. Technology’s spell induces a flight from thinking, which in fact makes it all the more important to step back and reflect.
Technologies are not just an assemblage of buttons, dials, screens and cables, the things we see. More importantly, they are social practices, because the use of any technology affects human life: relationships, communication, finances and power dynamics. Email, for example, has not simply carried messages from point A to point B; it has fundamentally altered our ways of knowing and communicating, our relationships, our expectations and even our values.
Social practices, in turn, are shaped by the broader socioeconomic context. The use of computers in education, for instance, is rooted in an ecosystem of a society’s wealth, infrastructure, know-how, laws and energy supply. There are places where children learn in a hi-tech environment, and there are millions of children who have never used a computer at all. The likely life trajectories of the two groups could not be more dissimilar. In short, a technology’s potential (will it serve the few or the majority? will it bring more flourishing or more alienation?) is rooted in this ecosystem. Unless deliberately directed towards the common good, a powerful technology born in an unequal and warring society is more likely to exacerbate its problems than to mitigate them.
Hence, when thinking about the impact of a technology, considerations of efficiency (time and space saved, speed gained, cost reduced) are not enough. We must also weigh its social and psychological impact, particularly with ubiquitous technologies such as TV and, more recently, AI. What will be the effect on human labour and relationships? On wealth inequalities? On liberties? On citizens’ empowerment? On surveillance? On weapons and violence? On mental health? These are some of the questions that should be asked. This is why task forces and commissions created to assess technologies should include people from the humanities and social sciences.
Another strand in thinking about technology is to assess whether it is confined to a particular field or has a broader scope that touches almost all aspects of life. The stethoscope was a revolutionary invention, but its impact was limited to medicine; it did not change the way we drive or cook or write. The internet, on the other hand, is a general-purpose technology that has reshaped almost everything. AI, too, is a general-purpose technology. The more omnipresent a technology, the greater the need to grasp its impact on the human condition, not just on a particular field. Currently, there is a tendency to think about AI’s use only within a particular domain, for example education, health or finance, without asking about its broader impact on society. This is like judging a medicine only by its taste without asking what it does to the body.
To all this, a new dimension has to be added when thinking about AI: agentic capacity. AI is often seen as just a tool, but it is not; it can respond and make decisions without immediate human input. From washing machines to atom bombs, no earlier technology has decided anything on its own; each action could be predicted. Not so with AI. That AI is agentic and not a passive tool is perhaps the most important feature to grasp when thinking about it. It means that we have more or less created a new being, a new species, which, like other species, has the capacity to evolve and move in directions we cannot control. Such technologies need to be assessed by imagining their future capacities, not just their present performance.
Many of those who have thought hard about the issue have concluded that the optimal use of any technology requires regulation. Power, natural or mechanical, demands regulation to ensure it does not do colossal damage. A storm is the unregulated power of wind; a flood is the unregulated power of water. If AI is a very powerful technology, it requires regulation that at the very least bans certain uses, such as warfare and clandestine surveillance. Public democratic oversight and transparency of algorithms and their moral specifications are needed. Job losses should carry consequences, and harmful environmental impact must be proscribed.
The need for regulating AI is heightened by the fact that the ecosystem in which it is born is deeply concerning. Growing wealth inequalities, weakening democracies, increasing social fragmentation, rising autocracies, unchecked corporate power, a loneliness epidemic and mental illnesses all create an environment in which AI’s unregulated proliferation is likely to worsen matters.
Regulating AI won’t be easy. Critics often claim that such moves restrict innovation and the expansion of knowledge. This objection conflates technology with science: regulating how a technology is deployed does not bar scientific inquiry. Even proponents of regulation can lose heart and think that it is too late to do anything, that the genie is out of the bottle. To them we offer the words of David Graeber: “The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.” A bigger bottle of regulations might still keep the genie in check.
The writer is dean, Institute for Educational Development, Aga Khan University.
Published in Dawn, October 4th, 2025