The design and policy choices of X created fertile ground for inflammatory and racist narratives targeting Muslims and migrants following last year’s deadly Southport attack in the United Kingdom, a new analysis showed on Wednesday.

According to research published by Amnesty International, social media platform X played a “central role” in the spread of false narratives and harmful content, which contributed to riots against Muslim and migrant communities in Britain.

The technical analysis of X’s publicly available open-source code showed that its recommender system, the content-ranking algorithms that decide which posts users see, “systematically prioritises” content that sparks outrage and provokes heated exchanges, reactions and engagement, without adequate safeguards to prevent or mitigate harm.

“Our analysis shows that X’s algorithmic design and policy choices contributed to heightened risks amid a wave of anti-Muslim and anti-migrant violence observed in several locations across the UK last year, and which continues to present a serious human rights risk today,” said Pat de Brun, head of Big Tech Accountability at Amnesty International.

Far-right riots broke out across the UK following the stabbing attack by Axel Rudakubana in Southport on July 29 last year. The violence was fuelled by false online claims that the suspect, who is a British citizen born in Cardiff, Wales, was a Muslim asylum seeker.

Amid false claims circulating on social media platforms, many mosques, Islamic buildings, and hotels housing migrants were targeted across the country.

Algorithm appears to have no mechanism for assessing potential for causing harm

According to the research, as long as a post drives engagement, the algorithm appears to have no mechanism for assessing the potential for causing harm, “at least not until enough users themselves report it”.

“These design features provided fertile ground for inflammatory racist narratives to thrive on X in the wake of the Southport attack,” it added.

The study also noted that an account on X called ‘Europe Invasion’, known for publishing anti-immigrant and Islamophobic content, posted shortly after news of the attack emerged that the suspect was “alleged to be a Muslim immigrant”.

It noted that the post garnered over four million views, and that within 24 hours, X posts speculating that the perpetrator was Muslim, a refugee, a foreign national, or had arrived by boat had accumulated an estimated 27 million impressions.

Noting that the Southport tragedy occurred in the context of “major policy and personnel changes” at X, the study pointed out that since Elon Musk’s takeover in late 2022, the platform has laid off content moderation staff, reinstated previously banned accounts, disbanded Twitter’s Trust and Safety Advisory Council, and fired trust and safety engineers.

Numerous accounts that had been previously banned for hate or harassment, including that of Stephen Yaxley-Lennon, a far-right figure better known as Tommy Robinson, were also restored.
