Meta Didn’t React When Adults Contacted Underage Kids?

November 22, 2025

Leading social media platforms have recently come under growing scrutiny over how they operate.


Leading social media platforms have recently come under growing scrutiny over how they operate. Now, the leading global media outlet Time has published a shocking report on the operations of Meta, the parent company of Facebook, WhatsApp, and other platforms.

According to the brief, Meta was fully aware that its platforms were exposing minors to disturbing levels of risk, yet the company allegedly failed to take meaningful corrective action.

Internal findings highlighted that millions of adult strangers were reaching out to underage users across its apps, creating a serious safety concern that demanded stronger intervention.

The document also outlined how Meta’s own research suggested its products were worsening mental health problems among teenagers, especially in areas like body image, anxiety, and self-worth.

Instead of prompting decisive measures, these issues reportedly persisted with minimal improvement. The brief further stated that harmful content involving eating disorders, self-harm, and child sexual exploitation was repeatedly flagged but rarely removed at scale. Algorithms continued to recommend or surface such material, making young users even more vulnerable.

The concern was not only about the presence of this content but also about the company’s delayed response in fixing systemic moderation gaps.

Overall, the picture presented was of a platform that understood the dangers faced by minors but struggled or hesitated to enforce consistent protections. The brief suggested that this pattern reflected broader failures in safeguarding young users who relied on Meta’s services every day.
