Dance videos of Modi, rival turn up AI heat in Indian election

Published May 16, 2024
India’s Prime Minister and leader of the ruling Bharatiya Janata Party Narendra Modi (C), with Maharashtra chief minister Eknath Shinde (L) and deputy chief minister Devendra Fadnavis (R), waves to the crowd during his roadshow in Mumbai, India on May 15. — AFP

An AI video shows an ecstatic Narendra Modi sporting a trendy jacket and trousers, grooving on a stage to a Bollywood song as the crowd cheers. The Indian prime minister reshared the video on X, saying “such creativity in peak poll season is truly a delight.”

Another video, with the same stage setting, shows Modi’s rival Mamata Banerjee dancing in a saree-like outfit, but the soundtrack is drawn from parts of a speech in which she criticises those who quit her party to join Modi’s. State police have launched an investigation, saying the video can “affect law and order”.

The different reactions to videos created using artificial intelligence (AI) tools underscore how the use and abuse of the technology are increasing, creating worries for regulators and security officials as the world’s most populous nation holds a mammoth general election.

Easy-to-make AI videos, with near-perfect shadows and hand movements, can at times mislead even digitally literate people. But the risks are higher in a country where many of the 1.4 billion people are tech-challenged and where manipulated content can easily stir sectarian tensions, especially at election time.

According to a World Economic Forum survey published in January, the risk to India from misinformation is seen as higher than the risk from infectious diseases or illicit economic activity over the next two years.

“India is already at a great risk of misinformation — with AI in picture, it can spread at the speed of 100x,” said New Delhi-based consultant Sagar Vishnoi, who is advising some political parties on AI use in India’s election.

“Elderly people, often not a tech savvy group, increasingly fall for fake narratives aided by AI videos. This could have serious consequences like triggering hatred against a community, caste or religion.”

The 2024 national election, being held over six weeks and ending on June 1, is the first in which AI is being deployed. Initial examples were innocuous, restricted to some politicians using the technology to create videos and audio to personalise their campaigns.

But major cases of misuse hit the headlines in April, including deepfakes of Bollywood actors criticising Modi and fake clips involving two of Modi’s top aides that led to the arrest of nine people.

Difficult to counter

India’s Election Commission last week warned political parties against using AI to spread misinformation and shared seven provisions of information technology and other laws that carry jail terms of up to three years for offences including forgery, promoting rumours and enmity.

A senior national security official in New Delhi said authorities are concerned about the possibility of fake news leading to unrest. The easy availability of AI tools makes it possible to manufacture such fake news, especially during elections, and it’s difficult to counter, the official said.

“We don’t have an (adequate monitoring) capacity … the ever-evolving AI environment is difficult to keep track of,” said the official.

A senior election official said: “We aren’t able to fully monitor social media, forget about controlling content.”

They declined to be identified because they were not authorised to speak to the media.

AI and deepfakes are being increasingly used in elections elsewhere in the world, including in the US, Pakistan and Indonesia. The latest spread of the videos in India shows the challenges faced by authorities.

For years, an Indian IT ministry panel has been in place to order the blocking of content that it feels can harm public order, at its own discretion or upon receiving complaints. During this election, the poll watchdog and police across the nation deployed hundreds of officials to detect and seek removal of problematic content.

While Modi’s reaction to his AI dancing video — “I also enjoyed seeing myself dance” — was light-hearted, the Kolkata city police in West Bengal state launched an investigation against an X user, SoldierSaffron7, for sharing the Banerjee video.

Kolkata cybercrime officer Dulal Saha Roy shared a typed notice on X asking the user to delete the video or “be liable for strict penal action”.

“I am not deleting that, no matter what happens,” the user told Reuters via X direct messaging, declining to share their number or real name as they feared police action. “They can’t trace [me].”

Election officers told Reuters that authorities can only tell social media platforms to remove content and are left scrambling if the platforms say the posts don’t violate their internal policies.

Viggle

The Modi and Banerjee dancing videos, with 30 million and 1.1 million views respectively on X, were created using a free website, Viggle. From a photograph and a few basic prompts, detailed in a tutorial, the site generates videos within minutes that show the person in the photograph dancing or making other lifelike movements.

Viggle co-founder Hang Chu and Banerjee’s office did not respond to Reuters queries.

Beyond the two dancing AI videos, another 25-second Viggle video spreading online shows Banerjee appearing in front of a burning hospital and blowing it up with a remote control.

It is an AI-altered clip of a scene from the 2008 movie The Dark Knight in which Batman’s foe, the Joker, wreaks havoc. The post has 420,000 views.

The West Bengal police believe it violates Indian IT laws, but X has not taken any action as it “strongly believes in defending and respecting the voice of our users”, according to an email notice sent by X to the user, which Reuters reviewed.

“They can’t do anything to me. I didn’t take that [notice] seriously,” the user told Reuters via X direct messaging.
