FTC Uses Meta Emails to Show Instagram’s Role in Enabling Online Predators
May 26, 2025
Newly unsealed documents presented during the Federal Trade Commission’s antitrust trial have revealed that Meta, the parent company of Instagram, was aware that its platform was exposing underage users to online predators—yet failed to take sufficient action to prevent it. The disturbing findings were reported by The New York Post, which examined the cache of internal emails, safety reports, and executive communications disclosed during the proceedings.
Among the most troubling revelations is a 2019 internal report titled “Inappropriate Interactions with Children on Instagram,” which indicated that nearly two million accounts belonging to minors were recommended to adult users flagged for grooming behavior over a three-month period. The report found that “27% of all follow recommendations to groomers were minors,” and added that the app was recommending minors to such accounts at nearly four times the rate of adults. Of those recommendations, 22% resulted in actual follow requests.
The data also revealed that Instagram received 3.7 million user reports of inappropriate comments during that same time frame—one-third of which came from minors. According to The New York Post, over half of these cases involved minors reporting inappropriate interactions with adults. Internal tests further demonstrated that Instagram’s algorithm directly recommended minors to accounts that had exhibited what the company itself described as “groomer-esque behavior.”
Despite these findings, Meta leadership reportedly resisted internal calls—led in part by Instagram co-founder Kevin Systrom—to commit more resources to safety efforts. The documents suggest that executives at Meta, including CEO Mark Zuckerberg, were aware of the risks but prioritized other initiatives instead of bolstering protective measures.
Per The New York Post, the FTC has argued that these documents support its broader claim that Facebook, now Meta, intentionally withheld crucial support and safety infrastructure from Instagram after acquiring the platform. The agency alleges that this strategy allowed Meta to consolidate its power while stifling competition and neglecting user welfare, particularly that of vulnerable youth.
A Meta spokesperson, addressing the 2019 report, stated, “This six-year-old research shows just one piece of the work to improve our child safety efforts, which we’ve significantly expanded since then.” The company emphasized its efforts to enhance detection tools and safety mechanisms, such as the recent rollout of Teen Accounts, which automatically apply stricter privacy and interaction settings for users under 18.
Still, the revelations cast a shadow over Meta’s commitment to user safety, especially regarding minors. The company insists that the term “groomers” in its internal documents referred to accounts that were ultimately removed for violating policies, not evidence that Instagram was intentionally connecting minors with predatory users.
Source: The New York Post