“I’M sorry for everything you have been through. No one should endure what your families have suffered, and this is why we will continue industry-wide efforts to make sure it never happens again,” said Mark Zuckerberg, standing before grieving parents in a Senate hearing room heavy with silent anger. The hearing, held by the Senate Judiciary Committee on online child exploitation, brought together tech CEOs and parents who had lost children to online abuse.
Zuckerberg’s apology became the defining moment of a session that forced the world to confront how social media has failed to protect its youngest users. The hearing came amid lawsuits and tragedies that revealed how social media platforms are being used to harm children. One case involved a teenager coerced on Instagram into sharing intimate images, later blackmailed, and driven to suicide. Stories like this are becoming increasingly common, and they show how child predators exploit the reach and anonymity of digital platforms to groom, manipulate, and destroy.
Online child grooming isn’t new, but it now appears to be reaching epidemic proportions. What began in early chatrooms has, sadly, spread across every major platform, fuelled by algorithms designed to connect but indifferent to harm. In many cases, predators find, target, and exploit minors faster than platforms can react. The result is a crisis hidden in plain sight and built into the very systems that reward engagement over safety.
A recent example has made the problem painfully clear: Meta used photos of schoolgirls to promote its Threads app. The pictures were taken from parents’ social media posts celebrating the first day of school, then shown to adult men without consent. What began as proud parents sharing photos of their daughters returning to school was quietly turned into marketing material. Images meant to capture pride and innocence were repurposed to sell an app. Parents called it deeply violating. Meta defended itself by pointing out that the images in question were ‘public’, and thus fair game.
In addition to revealing the vacuum at the heart of its design, Meta’s response calls into question the trust that keeps people sharing their lives online. When private expressions become corporate assets, the boundary between community and exploitation disappears. Families who thought they were connecting with friends instead saw their children’s images transformed into advertising, viewed by strangers. This betrayal corrodes the very idea of social connection.
The deeper issue is monopoly. Meta’s control across Facebook, Instagram, and Threads allows it to repurpose content across platforms without genuine consent. What is framed as ‘connectivity’ is, in truth, a system where data flows seamlessly between products, creating a surveillance web too vast to escape. This is precisely why regulation has failed. These companies aren’t just platforms; they are ecosystems, coded, opaque, and faster than any law.
Unlike traditional monopolies, social media giants do not set prices; they control attention instead. They decide what billions see and believe while answering mainly to advertisers and investors rather than to the people whose lives and data keep their platforms alive. Contrary to popular belief, boycotting the platforms achieves little. When users have nowhere else to go, consent becomes meaningless. Walking away from a platform is not an act of freedom but a kind of exile from the digital world we have all been made to depend on.
This concentration of power also distorts accountability. Platforms insist they are neutral intermediaries, not publishers. But when one company owns the network, the ads, and the algorithms, neutrality becomes fiction. They curate what we read, reward what we share, and quietly profit from every emotional reaction. Regulation isn’t about policing speech but about restoring accountability. Yet in much of the world, ‘safety’ laws have become a pretext for censorship. Measures meant to protect users from harm often end up silencing critics and shrinking the space for free expression.
When governments pursue political interests instead of real digital safety, meaningful regulation is lost. Rather than demanding transparency, they legislate for control. Unsurprisingly, platforms play along, appeasing authorities through selective compliance while preserving the opaque systems that fuel their profits. The solution lies in balance: laws that demand transparency and limit exploitation without suffocating speech. Without proper checks, corporations reduce users to data, and regulation risks becoming another means of control rather than protection.
The Threads episode is a reminder that the internet belongs to the public only when power is shared back with the people who keep it alive.
The writer is the founder of Media Matters for Democracy.
Published in Dawn, October 18th, 2025