Audio leaks purportedly from the Prime Minister’s Office have set tongues wagging across the political divide and, quite reasonably, raised questions about the security of the place that arguably determines the country’s fate.

The recordings, which contain what sound very much like the voices of high-ranking current and former government officials, are seen by many as a peek into the political machinations that take place away from the public eye.

The PML-N has yet to deny the audios, instead terming the leak a “serious lapse”. But the most recent recordings, which feature the alleged voices of Imran Khan, Asad Umar and Shireen Mazari, have been dismissed by the PTI chief as a fake developed by the PML-N, and by Mazari as a “cut and paste”.

So could the audios be deepfakes?

“There is always a possibility,” said Rafay Baloch, a cybersecurity researcher and white hat hacker, adding that it could be determined through digital forensics.

“However, it hasn’t reached a point where deepfake audios in all languages can be produced without a margin of error,” he told Dawn.com.

“Hence, all the famous deepfake videos you see will have a voice actor mimicking the voice of the individual being impersonated.”

In the video above, Dawn.com takes a deep dive into what deepfakes are, their potential for danger and how they can be combated.

So what is a deepfake, anyway?

A combination of “deep learning” and “fake”, deepfakes are hyper-realistic videos digitally manipulated to show people saying and doing things that they never actually did.

They are difficult to detect, as they use real footage, can have authentic-sounding audio, and are optimised to spread quickly on social media. It is easy for viewers to assume that the video they are watching is genuine.

And how dangerous is it?

Deepfake voices have already been used by criminals who, imitating executives, duped employees into transferring money to them.

A study describes it as a “major threat to our society”, warning that, “various political players, including political agitators, hacktivists, terrorists, and foreign states can use deepfakes in disinformation campaigns to manipulate public opinion and undermine confidence in a given country’s institutions”.

Another threat lies in the technology’s potential to harass and blackmail women, whose likenesses are frequently used in such videos. As of last year, there were 85,000 deepfakes circulating online, 90 per cent of which depicted non-consensual pornography featuring women.

But there are also legitimate uses of deepfake technology. It is used in educational media and digital communications, games and entertainment, social media and healthcare, material science, and various business fields such as fashion and personalised e-commerce.

More than anything, a healthy level of scepticism can help. Cyberattack investigator Giacopuzzi says his work has ultimately convinced him that in today’s world, “we need to question everything”.

It’s good advice. We should question everything, particularly when it comes from social media.
