Op-ed | A jury told Meta its platforms harm children. Here’s what New York parents need to know.

April 20, 2026

On March 25, a Santa Fe jury ordered Meta to pay $375 million after finding that the company behind Instagram and Facebook knowingly harmed children’s mental health and failed to stop child sexual exploitation on its platforms.

The jury found Meta liable on every single count—willfully engaging in unfair, deceptive, and unconscionable trade practices—and imposed the maximum penalty allowed under New Mexico law: $5,000 per violation, across 75,000 documented violations.

That verdict came just days after a separate jury in Los Angeles awarded $6 million to a young woman who proved that the addictive designs of Meta's and YouTube's platforms caused her lasting psychological harm. TikTok and Snapchat, also named as defendants in that case, chose to settle before the trial even began.

These aren’t isolated cases. They are bellwether trials, the first cases to go to verdict in a massive wave of litigation that now includes 2,465 lawsuits consolidated in a federal multidistrict litigation (MDL 3047) before Judge Yvonne Gonzalez Rogers in Oakland, California. And the outcomes so far are sending a clear message: the era of Big Tech operating without accountability for what its products do to children is ending.

What the bellwether trials revealed

The term “bellwether” comes from the practice of putting a bell on the lead sheep in a flock. In litigation, bellwether trials are test cases selected from a larger group of lawsuits to gauge how juries respond to the evidence and arguments. The outcomes often set the tone for settlements and future trials across the entire MDL.

What came out in these trials was damning. In the New Mexico case, the state's attorneys presented internal Meta communications showing that employees discussed how CEO Mark Zuckerberg's push for end-to-end encryption on Facebook Messenger would eliminate the company's ability to report approximately 7.5 million instances of child sexual abuse material to law enforcement. The state's investigation began when undercover agents created a fake profile for a 13-year-old girl and found that the account was, in the attorney general's words, "simply inundated with images and targeted solicitations" from predators.

In the Los Angeles case, the plaintiff—now 20 years old—testified about how she became addicted to Instagram and YouTube as a child and developed severe depression, anxiety, and an eating disorder as a result. The jury found Meta 70 percent liable and Google’s YouTube 30 percent liable for her injuries. The fact that TikTok and Snapchat settled before trial rather than face a jury tells you everything about how strong the evidence is.

Why this matters for New York families

New York has been at the forefront of this fight. In 2024, Governor Hochul signed the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, which requires social media companies to disable algorithmically driven addictive feeds and nighttime notifications for users under 18 unless a parent specifically opts in. The companion New York Child Data Protection Act prohibits online platforms from collecting, using, sharing, or selling the personal data of anyone under 18 without informed consent.

In January 2026, Governor Hochul went further, proposing legislation that would expand age verification requirements to online gaming platforms, set privacy settings to their highest levels by default for minors, block strangers from contacting kids, and disable AI chatbots on social media platforms used by children. New York City itself has joined the federal MDL, filing suit against the major platforms in federal court.

These legislative moves reflect a growing legal consensus: social media platforms were deliberately designed to be addictive, and the companies that built them knew children were being harmed. The SAFE for Kids Act and the bellwether verdicts are two sides of the same coin—the law is catching up to what parents have known for years.

This is a personal injury case

As a personal injury attorney, what strikes me about these cases is how familiar the legal framework is. At its core, this litigation is built on the same principles that govern any products liability or negligence claim: a company designed a product, the product was defective or unreasonably dangerous, the company knew about the danger, and people were hurt.

Replace “social media algorithm” with “defective brake system” or “toxic building material,” and the structure is identical. The difference is that the product in question is in your child’s pocket right now, and the injuries—depression, anxiety, self-harm, eating disorders, suicidal ideation—are ones that many parents have been watching unfold in their own homes without knowing they had legal options.

The bellwether verdicts have changed that. Juries are now saying, on the record, that these platforms cause harm and that the companies behind them are financially responsible. With more than 2,400 cases still pending and additional bellwether trials scheduled for later this year, including a federal trial in June 2026 brought on behalf of school districts, the litigation is only gaining momentum.

What parents can do right now

If your child has suffered mental health consequences that you believe are connected to social media use, there are concrete steps you can take.

First, document everything. Save screenshots of your child’s usage data, any communications with the platforms, and records of mental health treatment—therapy sessions, psychiatric evaluations, prescriptions, hospitalizations. This documentation matters just as much in a social media case as it does in a car accident or a slip-and-fall.

Second, talk to a personal injury attorney who understands this litigation. The MDL is complex, the deadlines vary, and the legal landscape is shifting fast. An experienced attorney can evaluate your child's situation, explain whether a claim is viable, and walk you through the process. Most personal injury firms, including mine, offer free consultations and work on contingency, meaning you pay nothing unless there's a recovery.

Third, know your rights under New York law. The SAFE for Kids Act gives parents new tools to limit how platforms interact with their children. If a platform is violating those protections, that’s not just a regulatory issue—it could strengthen a personal injury claim.

The tide is turning

For years, social media companies hid behind Section 230 of the Communications Decency Act, arguing that they couldn’t be held liable for content posted by users. But these bellwether cases have cracked that shield wide open. Courts are increasingly ruling—and juries are increasingly agreeing—that the claims against these companies aren’t about user-generated content. They’re about how the products were designed. Addictive algorithms, autoplay features, dopamine-driven notification systems, and recommendation engines that push harmful content to children are product design choices, and the companies that made those choices are being held responsible.

The Massachusetts Supreme Judicial Court ruled just this month that Meta must face the state attorney general's lawsuit, marking the first time a state high court has allowed these claims to proceed past Section 230 defenses. A phase-two hearing in the New Mexico case begins May 4, at which a judge will decide whether Meta must fund public programs to address the harm it caused. And the federal bellwether trial this June will test whether school districts can recover the costs of dealing with a youth mental health crisis that social media companies helped create.

This is not a future issue. It is happening right now, in courtrooms across the country, and the results are vindicating what millions of parents already knew. If your family has been affected, the law may be on your side—and the window to act is open.

Mark Shirian, Esq. is the founder of Mark David Shirian PC, a New York City personal injury firm representing individuals and families harmed by negligence, defective products, and corporate misconduct. Mark’s practice spans premises liability, construction accidents, motor vehicle collisions, and emerging areas of personal injury law including social media harm litigation. He is committed to holding powerful institutions accountable on behalf of everyday New Yorkers.
