Shares details on "separate, unconnected" takedowns in Pakistan and India.
Facebook has removed 712 accounts and 390 pages in India and Pakistan because of “inauthentic behaviour” and spamming, it said on Monday.
The social media giant — which shared details on "four separate, distinct and unconnected" takedowns linked to both Pakistan and India — said it had removed the pages, accounts and groups set up by the networks "for violating Facebook's policies on coordinated inauthentic behaviour or spam".
"Today we removed 103 pages, groups and accounts for engaging in coordinated inauthentic behaviour on Facebook and Instagram as part of a network that originated in Pakistan," said a statement issued by Nathaniel Gleicher, the company’s head of Cybersecurity Policy, on the investigation.
"Although the people behind this activity attempted to conceal their identities, our investigation found that it was linked to employees of the ISPR of the Pakistani military," said the statement.
Dawn.com has reached out to ISPR for comment.
"The takedown is because there is this network of fake accounts that they are using to conceal their identity and make these pages look independent, when in fact they are not," Gleicher told Dawn.com. "These pages, groups and accounts represent themselves as independent but in fact, are part of a coordinated operation."
He added that Facebook could not say whether the activity was directed by the organisation or the employees were acting on their own.
"There were multiple employees engaged in this," he said, adding that Facebook is "highly confident" of the identity of the people involved.
Gleicher clarified that Facebook was removing accounts based on their behaviour, not the content they posted.
The investigation found that the network in Pakistan was spread across 24 pages on Facebook and Instagram, 57 Facebook accounts, seven Facebook groups, and 15 Instagram accounts.
"The individuals behind this activity used fake accounts to operate military fan pages; general Pakistani interest pages; Kashmir community pages; and hobby and news pages. They also frequently posted about local and political news including topics like the Indian government, political leaders, and military," the statement said, sharing the following details:
Among these pages were Pakistan Cyber Defence News, Kashmir News, Gilgit Baltistan Times, Kashmir for Kashmiris, Painter's Palette, and PakistaN Army — the BEST. The names of the pages were shared by the Atlantic Council think tank's Digital Forensic Research (DFR) Lab, which reviewed the material taken down by Facebook from the Pakistani network.
While talking to Dawn.com, Gleicher did not specify the number of individuals identified as being part of the network, nor did he elaborate on how the links between them and the blocked pages and accounts were established.
"For security purposes we cannot get too specific about how we make these links," he said, because this sort of monitoring is an ongoing activity. "One of the ways we make these links is when we see someone operating one of these fake accounts, and then they log into their own account," he added.
"We do not generally inform the individuals involved but we are in touch with the policy makers [of the countries]," said Gleicher. When asked about which policy makers they had reached out to in Pakistan, he named the Prime Minister's Office and "social media adviser".
Ahead of the Indian elections, starting April 11, Facebook said it had removed 687 pages and accounts linked to India’s main opposition Congress party because of “coordinated inauthentic behaviour” on the social media platform.
Facebook said its investigation found that individuals used fake accounts and joined various groups to disseminate their content and increase engagement. Their posts included local news and criticism of political opponents such as Prime Minister Narendra Modi’s Bharatiya Janata Party (BJP), Facebook said.
"While the people behind this activity attempted to conceal their identities, our review found that it was connected to individuals associated with an INC (Indian National Congress) IT Cell," it said, sharing the following details:
Two of the samples shared by Facebook were of posts that criticised Modi’s initiatives and called for supporting the Congress party and its president, Rahul Gandhi.
Separately, Facebook removed 15 pages, groups and accounts for engaging in coordinated inauthentic behaviour on Facebook and Instagram in India.
"A small number of page admins and account owners used a combination of authentic and fake accounts to share their content across a variety of pages. They posted about local news and political events, including topics like the Indian government, the upcoming elections, the BJP and alleged misconduct of political opponents including the INC," said the statement. "Although the people behind this activity attempted to conceal their identities, our investigation found that this activity was linked to individuals associated with an Indian IT firm, Silver Touch."
Separately, Facebook said it had also removed another 227 pages and 94 accounts in India for violating its policies against spam and misrepresentation.
Taken from a video of Nathaniel Gleicher, Head of Cybersecurity Policy
At Facebook, we are working to root out all forms of abuse, including 'coordinated inauthentic behaviour'.
Coordinated inauthentic behaviour is when groups of pages or people work together to mislead others about who they are or what they are doing.
Coordinated inauthentic behaviour isn't unique to Facebook or social media. People have been working together to mislead others for centuries and they continue to do so.
When we take down one of these networks, it's because of their deceptive behaviour, not because of the content they are sharing.
The posts themselves may not be false and may not go against our community standards. We might take a network down for making it look like it is being run from one part of the world, when in fact it is being run from another.
This could be done for ideological purposes or it could be financially motivated. For example, spammers might seek to convince people to click on a link, to visit their page or to read their post.
We go after this kind of behaviour in two ways: using people and using technology together.
First, our expert investigators look for and take down the most sophisticated networks manually. This is like looking for a needle in a very large haystack.
Second, we build technology to automatically detect and remove the most common threats. This is like shrinking that haystack. A good example of this is how we automatically detect and stop millions of attempts to create fake accounts every day.
Our people detect and stop the most sophisticated bad actors, and we improve our technology based on what we learn. That technology then stops the less sophisticated threats, helping our investigators focus on what matters most.
That's how we're keeping coordinated inauthentic behaviour off Facebook: look for the needle, shrink the haystack.
Source: Facebook Newsroom