RECENTLY, in order to raise tax awareness, the FBR and Nadra launched the Tax Profiling System (TFP). According to its website, the “FBR Tax Profiling System allows the citizens to view their profile created by correlating their data from multiple data sources of assets, expenses, and lifestyles available with the Government of Pakistan”. It contains the data of 53 million citizens.
The state’s accumulation of personal data online on a massive scale, with little regard for citizens’ privacy, needs review. It can lead to the loss or manipulation of private information, and to inequality. A digital space that disregards privacy casts the impression of a surveillance regime. The book Code: Version 2.0 by Lawrence Lessig, a law professor at Harvard, provides some key insights.
According to Lessig, citizens’ behaviour can be regulated using four constraints: laws, the market, architecture, and social norms. Take robbery as an example. Laws regulate it by penalising people who commit robbery. The market regulates it through, for example, pricing mechanisms or subsidies for companies that create security products. As for architecture, communities and buildings are built following architectural guidelines that protect against robberies. Lastly, because of social norms, robbery is not considered respectable. Mayhem may ensue if these constraints are not in place in fair proportion.
These constraints can be applied to see how behaviour around privacy is being regulated in our country. Let us start with laws. We do not have digital data protection laws in place, so the right to digital privacy is not established. This incentivises individuals, companies, and governments to treat privacy as an afterthought. The absence of enforced laws can lead to data theft, the buying and selling of data, manipulation, and disregard of privacy policies.
Data theft and the sale or purchase of data are problematic on several levels. The data stored by the TFP is a goldmine: an exposed wallet is a salesman’s dream. If such a rich data set were stolen, companies could use it to manipulate consumer behaviour.
In the absence of suitable laws, the data stored by the TFP can also be used to increase inequality. TFP information can be used to rank citizens in a social hierarchy. Based on such information, certain services or transactions can be denied to some social classes while privileged access is given to others, further widening inequality.
Privacy is not regulated through the market, either. Pricing mechanisms and subsidies for companies that take digital privacy seriously are not being used. Privacy is not a concern for vendors competing in the market.
Social norms around privacy in our country leave much to be desired. This is observed daily on social media. No educational programmes around privacy are in place. Even computer science graduates from local institutes lack a strong technical grounding in privacy.
Now to architecture. In the digital world, architecture is the code (software and hardware) of which a digital application is built. This constraint therefore raises the most pertinent privacy questions for us. How has the software been written, reviewed, tested, hosted and deployed? What hardware is being used? How and where is personal data stored, how does it flow through the application, who has access to it, and is it encrypted or kept in plain text? What happens in case of a system breach? Has a security audit been conducted? What is being logged by the software? With regard to the TFP, we do not know much of this.
Out of the four constraints, it appears that, right now, privacy is only being regulated using code — giving much regulatory power to the government. This gives an impression of a surveillance regime.
Our government must make use of all four constraints, at minimum cost to citizens. Policymakers should answer these privacy questions to dispel the impression of a surveillance regime. Otherwise, millions will remain vulnerable to theft, manipulation, inequality, and fear.
Other countries have passed regulations that establish citizens’ right to privacy, to access their personal data, and to be forgotten (so they can ask vendors to remove their data); that require organisations to appoint data protection officers (apparently Nadra does not have one); and that educate citizens. Those countries regulate privacy through the fair use of laws, the market, social norms and architecture. We should do the same before it is too late.
The writer is a freelance contributor based in Lahore.
Published in Dawn, June 30th, 2019