Meta Faces Fresh Backlash as Schoolgirl Photos Used in Threads Ads Spark Safety Concerns
September 20, 2025
Meta is under fresh pressure after Instagram users reported being served Threads promotions that featured images of schoolgirls as young as 13, raising questions not only about child safety but also about algorithmic bias in the company’s advertising systems.
A London father said he repeatedly received Threads adverts on his Instagram feed that used parents’ back-to-school photos of their daughters. No equivalent pictures of boys were shown, fuelling suspicion that Meta’s recommendation engine was skewed towards content that risked sexualising young girls.
Meta, valued at more than $2 trillion, insists the practice does not breach its policies. A spokesperson said the images were drawn from publicly shared accounts and were being surfaced through ‘recommendation tools’ to encourage Threads sign-ups. The company stressed that posts originating from teenage accounts were not included.
Critics argue that explanation misses the point. ‘If Meta’s system is only pushing out pictures of girls in school uniforms to adult men, that reveals algorithmic bias, and bias with dangerous consequences,’ said one UK digital safety expert.
Ofcom Under Pressure
The row comes just months after Ofcom introduced new codes under the Online Safety Act designed to prevent platforms from enabling grooming and other forms of exploitation.
Crossbench peer Beeban Kidron said Meta’s actions were ‘a new low even for a company that has consistently put growth above safety,’ calling on the regulator to investigate whether the system breaches the law’s illegal-harms provisions.
The rules require firms to ensure children’s profiles, images and connections are not surfaced in ways that can expose them to unknown adults. Campaigners say that by recycling parental posts for marketing to strangers, Meta risks flouting both the spirit and the letter of the regulations.
Business Model vs. Privacy
Beyond child-safety concerns, privacy advocates note that Meta’s business model relies on harvesting and amplifying user-generated content without meaningful consent. In this case, even Instagram accounts set to private were cross-posting to Threads, making posts more widely visible than parents realised.
‘This isn’t just about one algorithm gone wrong,’ said a technology researcher. ‘It is about a structural approach that treats people’s private moments as free ad inventory. That raises wider ethical and regulatory questions.’
A Pattern of Missteps
The controversy is the latest in a series of clashes between Meta and regulators over data use, advertising transparency and child protection. The company has previously faced fines in the UK and EU over failures to protect minors from harmful content and from exploitative recommendation systems.
For parents caught up in the incident, the explanations ring hollow. ‘Not for any money in the world would I have allowed my daughter’s photo in uniform to be used as marketing bait,’ one mother said.
As Meta doubles down on expanding Threads in competition with Elon Musk’s X, the episode has revived old debates: whether Silicon Valley’s algorithm-driven growth can ever be squared with the duty to protect children online, and whether UK regulators have the tools and the will to hold Big Tech to account.