Audio leaks purportedly from the Prime Minister’s Office have set tongues wagging across the political divide and, quite reasonably, raised questions about the security of the office that arguably determines the country’s fate.

The recordings, which contain what sound very much like the voices of high-ranking current and former government officials, are seen by many as a peek into the political machinations that take place away from the public eye.

The PML-N has yet to deny the authenticity of the audios, instead terming the leak a “serious lapse”. But the most recent recordings, which feature the alleged voices of Imran Khan, Asad Umar and Shireen Mazari, have been dismissed by the PTI chief as a fake developed by the PML-N, and by Mazari as a “cut and paste” job.

So could the audios be deepfake?

“There is always a possibility,” said Rafay Baloch, a cybersecurity researcher and white hat hacker, adding that it could be determined through digital forensics.

“However, it hasn’t reached a point where deepfake audios in all languages can be produced without a margin of error,” he told Dawn.com.

“Hence, all the famous deepfake videos you see will have a voice actor mimicking the voice of the individual being impersonated.”

In the video above, Dawn.com takes a deep dive into what deepfakes are, their potential for danger and how they can be combated.

So what is a deepfake, anyway?

A combination of “deep learning” and “fake”, deepfakes are hyper-realistic videos digitally manipulated to show people saying and doing things that they never actually did.

They are difficult to detect, since they use real footage and can have authentic-sounding audio, and they are optimised to spread quickly on social media. It is therefore easy for viewers to assume that the video they’re watching is genuine.

And how dangerous are they?

Criminals have already used deepfake voices to imitate executives and dupe employees into transferring money to them.

A study describes it as a “major threat to our society”, warning that, “various political players, including political agitators, hacktivists, terrorists, and foreign states can use deepfakes in disinformation campaigns to manipulate public opinion and undermine confidence in a given country’s institutions”.

Another threat comes from this tech’s potential to harass and blackmail women, whose likenesses are frequently used in such videos. As of last year, there were 85,000 deepfakes circulating online, 90 per cent of which depicted non-consensual porn featuring women.

But there are also legitimate uses of deepfake tech. It is employed in educational media and digital communications, games and entertainment, social and healthcare applications, material science, and business fields such as fashion and personalised e-commerce.

More than anything, a healthy level of scepticism can help. Cyberattack investigator Giacopuzzi says his work has ultimately left him convinced that in today’s world, “we need to question everything”.

It’s good advice. We should question everything, particularly when it comes from social media.
