Rehashing GDPR

Published August 7, 2023

DESPITE numerous attempts to introduce a robust data protection law that values the privacy of individuals and encourages growth, the proposed law on data protection fails to align the expectations of persons providing data with those of persons processing/ controlling it.

There are many significant issues with the proposed law; for example, the requirement for data localisation, the powers of the Commission for Data Protection, government access to sensitive personal data on the pretext of national security, etc. But another problem is that it is modelled on the European data protection law, the GDPR (General Data Protection Regulation).

Pakistan is not the only country that has used the GDPR as the blueprint for its data protection laws, but it may well be among those most adversely affected by it, especially because of the impact on startups. In the EU, for example, there is research showing how the GDPR has severely affected small companies owing to surging surveillance and compliance costs and to limitations that reduce the scalability of data-driven businesses.

The proposed law also offers rights similar to the GDPR — rights to access data, correction, withdrawal of consent, etc. The problem lies not in what rights have been granted but in their extent and the consequences for non-compliance. The process for compliance is long and burdensome, and makes transactions between data subjects and data controllers unnecessarily expensive.


Take the example of the right to correction. The right empowers the person to whom the data relates (the data subject) to require the data processor/ controller to rectify personal data if it is inaccurate, incomplete, misleading, or outdated. The proposed law does not make any distinction for the type of activity for which the data is required, nor does it specify what constitutes personal data. Though the definition requires that the data be able to identify a person, it is silent on how much data is needed to identify the data subject, or what threshold must be met for data to be categorised as personal. Nor does it answer the question of what is complete data or misleading data. If you see an advertisement for chocolates based on your search from the previous night, and you begin a diet the next morning, is that incomplete or misleading data insofar as it does not show your preference as reflected in your choices on a search engine? These questions, though basic, make the cracks in the law obvious.

Another example is the processing of children's data. The law requires that the data controller/ processor obtain the permission of the parent/ authorised guardian of any person below the age of 18. The debate about the age limit aside, how would data controllers be expected to reach out to parents or confirm guardianship? In what cases would the costs outweigh the benefits of paternalistic surveillance? For instance, would it be feasible in the case of a 17-year-old signing up for an Instagram account?

These are only some of the issues with the structure of the law. The primary issue is that it fails to keep up with developments in the sector. The law should have regulated the mechanism for obtaining consent, given how the concept has evolved in the tech space due to the use of dark patterns. Instead, it continues with archaic ways of taking consent and allows data subjects to opt out of rights provided their consent is "free, specific, informed, and unambiguous". This was always the case under the law of contract, and the proposed law has not added anything new, nor has it clarified how consent is to be obtained and considered ‘informed’. Would a lengthy document of legalese meet the requirement, or would a short, concise statement at every step of the way be sufficient?

Dark patterns are user interfaces that manipulate users into selecting options the data processor wants them to take, rather than what they would actually want, by exploiting individuals’ heuristics (rules of thumb) or biases. An example of a dark pattern is obtaining consent to track a person by repeatedly showing the same screen until the person is tired into giving consent, or by pre-selecting the option the controller would want the user to choose.

Many such practices have been seen as tricks to manipulate the way consent is obtained. While this consent may satisfy the provisions of the law, it creates a divergence between the actual preferences and the manipulated preferences of users, and initiates a cycle where the more data the controllers collect, the more exploitative they become.

The ethical implications of data exploitation aside, these practical implications of data protection should also be considered before a proposal that does not address the concerns of data collection, storage, and processing by companies, and that imposes a huge cost for needless compliance, becomes an act of parliament.

The writer is a lawyer.

samar.masood2@gmail.com

Published in Dawn, August 7th, 2023
