THE recent surge in digital manipulation in Bangladesh has justifiably raised concerns about its likely impact on the upcoming national election. This threat, emerging in an already charged sociopolitical atmosphere, could prove more complex and consequential than anything witnessed in recent electoral cycles.
Speakers at a roundtable organised by this newspaper and the Tech Global Institute therefore emphasised the need for rigorous scrutiny and policy interventions, warning that, without these, the playing field could be tilted not only against certain political parties and actors but also against voters themselves.
The speakers — political leaders, digital experts, and civil society representatives — identified online misinformation, data misuse, and AI-driven propaganda as key challenges.
They also highlighted how defamation and religiously charged cases are increasingly being filed to intimidate opponents or silence dissent. Concerns were also raised that the judiciary, under pressure and fearful of backlash, is allowing the misuse of vague provisions under laws such as the Anti-Terrorism Act and the Penal Code.
This cascade of legal and digital manipulation indicates that electoral integrity cannot be protected by state institutions alone. Political parties and social media platforms must also be held accountable when they fail to prevent abuse. In this context, several speakers underscored the vulnerability of citizens’ data. With 183 institutions having access to the National ID database, and a large pool of candidates likely to receive constituency voter lists, how do we ensure this access is not abused?
Clearly, safeguarding personal data is no longer merely a privacy concern; it is also a matter of electoral fairness. Without strict oversight, voter information could be exploited for targeted intimidation, profiling, or micro-manipulation. Compounding these risks is the spread of AI-generated content. Fabricated videos, synthetic images, and doctored materials are already circulating — often produced by politically aligned actors — with women candidates and minority communities particularly exposed.
In the absence of digital literacy and a strong fact-checking infrastructure, such attacks may not only damage reputations but also discourage participation, suppress votes, and inflame communal tensions.
Bangladesh’s nearly 80 million Facebook users, along with its substantial presence on YouTube, give the government some leverage to demand responsible behaviour from these platforms.
Yet, they continue to operate with minimal accountability. As one expert noted, the government should negotiate election-time protocols with social media platforms — protocols that many countries have already implemented. Their absence here represents a policy blind spot, much like the lack of a robust regulatory framework for online political campaigns.
These gaps must be addressed by the Election Commission. It must regulate online campaigning and coordinate with relevant state bodies, media organisations, fact-checkers, and global tech giants to establish clear rules, rapid-response mechanisms, and meaningful accountability. Without such measures, the election risks being compromised.
Published in Dawn, December 1st, 2025