Rehashing GDPR

Published August 7, 2023

DESPITE numerous attempts to introduce a robust data protection law that values the privacy of individuals and encourages growth, the proposed law on data protection fails to reconcile the expectations of the persons providing data with those of the persons processing or controlling it.

There are many significant issues with the proposed law: the requirement for data localisation, the powers of the Commission for Data Protection, and government access to sensitive personal data on the pretext of national security, among others. But another problem is that it is modelled on the European data protection law, the GDPR (General Data Protection Regulation).

Pakistan is not the only country to use the GDPR as a blueprint for its data protection law, but it may well be among those adversely affected by it, especially because of the impact on start-ups. In the EU, for example, research shows that the GDPR has severely affected small companies through surging surveillance and compliance costs and restrictions that reduce the scalability of data-driven businesses.

The proposed law also offers rights similar to the GDPR — rights to access data, correction, withdrawal of consent, etc. The problem lies not in what rights have been granted but in their extent and the consequences of non-compliance. The process for compliance is long and burdensome, and makes transactions between data subjects and data controllers unnecessarily expensive.


Take the example of the right to correction. It empowers the person to whom the data relates (the data subject) to require the data processor/controller to rectify personal data that is inaccurate, incomplete, misleading, or outdated. The proposed law does not distinguish between the types of activity for which the data is required, nor does it specify what constitutes personal data. Though the definition turns on whether the data can identify a person, it is silent on how much data is needed to identify the data subject, or what threshold must be met for data to be categorised as personal. Nor does it answer what counts as complete or misleading data. If you see an advertisement for chocolates based on the previous night's search but begin a diet the next morning, is that data incomplete or misleading insofar as it no longer reflects your preferences as expressed through your choices on a search engine? These questions, though basic, make the cracks in the law obvious.

Another example is the processing of children's data. The law requires that the data controller/processor obtain the permission of the parent or authorised guardian of any person below the age of 18. The debate about the age limit aside, how would data controllers be expected to reach out to parents or confirm guardianship? In what cases would the costs outweigh the benefits of paternalistic oversight? For instance, would it be feasible in the case of a 17-year-old signing up for an Instagram account?

These are only some issues with the structure of the law. The primary issue is that it fails to keep up with developments in the sector. The law should have regulated the mechanism for obtaining consent, given how the concept has evolved in the tech space due to the use of dark patterns. Instead, it continues with archaic ways of taking consent and allows data subjects to opt out of rights provided their consent is "free, specific, informed, and unambiguous". This was always the case under the law of contract; the proposed law adds nothing new, nor does it clarify how consent is to be obtained and considered 'informed'. Would a lengthy document of legalese meet the requirement, or would a short, concise statement at every step suffice?

Dark patterns are user interfaces that manipulate users into selecting options the data processor wants them to take, rather than what they actually want, by exploiting individuals' heuristics (rules of thumb) or biases. An example would be obtaining consent to track a person by repeatedly showing the same screen until the person is worn down into consenting, or by pre-selecting the option the controller wants the user to take.

Many such practices are tricks to manipulate the way consent is obtained. While this consent may satisfy the provisions of the law, it creates a divergence between the actual and manipulated preferences of users, and sets off a cycle in which the more data controllers collect, the more exploitative they become.

The ethical implications of data exploitation aside, these shortcomings should be considered before a proposal that fails to address the concerns around the collection, storage, and processing of data by companies, and that imposes huge costs for needless compliance, becomes an act of parliament.

The writer is a lawyer.

samar.masood2@gmail.com

Published in Dawn, August 7th, 2023
