Audio leaks purportedly from the Prime Minister’s Office have set tongues wagging across the political divide and, quite reasonably, raised questions about the security of the place that arguably determines the country’s fate.

The recordings, which contain voices that sound very much like those of high-ranking current and former government officials, are seen by many as a peek into the political machinations that take place away from the public eye.

The PML-N has yet to deny the audios, instead terming the leak a “serious lapse”. The most recent recordings, which feature the alleged voices of Imran Khan, Asad Umar and Shireen Mazari, have been dismissed by the PTI chief as a PML-N-developed fake and by Mazari as “cut and paste”.

So could the audios be deepfakes?

“There is always a possibility,” said Rafay Baloch, a cybersecurity researcher and white hat hacker, adding that it could be determined through digital forensics.

“However, it hasn’t reached a point where deepfake audios in all languages can be produced without a margin of error,” he told Dawn.com.

“Hence, all the famous deepfake videos you see will have a voice actor mimicking the voice of the individual being impersonated.”

In the video above, Dawn.com takes a deep dive into what deepfakes are, their potential for danger and how they can be combated.

So what is a deepfake, anyway?

A combination of “deep learning” and “fake”, deepfakes are hyper-realistic videos digitally manipulated to show people saying and doing things that they never actually did.

They are difficult to detect, as they use real footage, can have authentic-sounding audio, and are optimised to spread quickly on social media. It is easy for viewers to assume that the video they’re watching is genuine.

And how dangerous are they?

Deepfake voices have been used by criminals imitating executives to dupe employees into transferring money to them.

A study describes it as a “major threat to our society”, warning that, “various political players, including political agitators, hacktivists, terrorists, and foreign states can use deepfakes in disinformation campaigns to manipulate public opinion and undermine confidence in a given country’s institutions”.

Another threat comes from the tech’s potential to be used to harass and blackmail women, as the likenesses of women are frequently used in such videos. As of last year, there were 85,000 deepfakes circulating online, 90 per cent of which depicted non-consensual porn featuring women.

But there are also legitimate uses of deepfake tech. It is used in educational media and digital communications, games and entertainment, social and healthcare applications, material science, and various business fields such as fashion and personalised e-commerce.

More than anything, a healthy level of scepticism can help. Cyberattack investigator Giacopuzzi says his work has ultimately left him convinced that in today’s world, “we need to question everything”.

It’s good advice. We should question everything, particularly when it comes from social media.
