New Mexico Jury Awards $375 Million Against Meta in First State Consumer Protection Verdict
Verdict circumvents Section 230 immunity by targeting platform design rather than content, establishing a litigation blueprint for 42 remaining state cases.
A New Mexico jury awarded the state $375 million on March 24, 2026, finding Meta liable for violating consumer protection laws through deceptive practices around child safety—the first successful state-level jury verdict establishing that platform design choices fall outside federal immunity protections.
The verdict marks a strategic inflection point in social media regulation. By framing claims around Meta’s own safety representations and algorithmic design rather than third-party content moderation, New Mexico Attorney General Raúl Torrez circumvented Section 230 of the Communications Decency Act—the federal statute that has shielded platforms from liability for user-generated content since 1996. The jury deliberated for one day after a seven-week trial, awarding the maximum penalty of $5,000 per violation across 37,500 affected users, according to Source New Mexico.
The Section 230 Workaround
New Mexico’s legal strategy avoided content moderation claims entirely. The state argued Meta violated the Unfair Practices Act through knowingly false safety disclosures, failure to disclose mental health impacts, and engagement algorithms designed to amplify harmful content to minors. Prosecutors documented these claims through a 2023 undercover operation in which investigators created fake accounts posing as 13-year-olds and recorded child sexual exploitation and predatory solicitations, per CNBC.
The distinction proved decisive. Federal courts have consistently ruled that Section 230 immunises platforms from liability for how they moderate or distribute third-party content. But as MultiState reported, claims focused on a platform’s own deceptive statements and design practices fall outside that protection. The jury found Meta executives knew their products harmed children, disregarded internal warnings, and misrepresented platform safety to users and regulators.
“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”
— Raúl Torrez, New Mexico Attorney General
Meta’s defence centred on disclosures and enforcement efforts. “What the evidence shows is Meta’s robust disclosures and tireless efforts to prevent harmful content,” argued Meta attorney Kevin Huff during closing statements, according to Reuters. The jury rejected that framing, determining the company engaged in “unconscionable” trade practices.
Cascading Litigation Exposure
The New Mexico verdict is the first of 43 state attorney general lawsuits to reach a jury decision. As of March 2026, Meta faces thousands of individual lawsuits, roughly 1,700 school district complaints alleging platform-driven mental health harms, and the broader multi-state litigation wave, according to Live Insurance News. The consumer protection approach tested in New Mexico now provides a replicable template for prosecutors in the remaining jurisdictions.
The financial exposure compounds beyond jury awards. A Delaware court ruled on February 27, 2026, that more than 20 insurers—including Hartford and Chubb—have no duty to defend Meta in the broader social media addiction litigation, treating deliberate design choices as intentional acts excluded from coverage. That decision, detailed by Insurance Journal, eliminates a key cost buffer and forces Meta to self-fund its legal defence across thousands of cases.
Operational Implications
Meta announced plans to appeal the verdict, stating it “respectfully disagree[s]” with the jury’s findings, according to NBC News. But the company is already adjusting platform features in response to litigation pressure. Mid-trial, Meta removed end-to-end encryption options from Instagram direct messages, citing low user adoption—a decision that doubles as a transparency concession to prosecutors concerned about predator communications.
The verdict also tees up a May 4 bench trial where New Mexico will seek court-mandated platform changes including mandatory age verification, enhanced predator removal protocols, and restrictions on algorithmic content distribution to minors. Unlike the jury phase, which addressed past violations and damages, the upcoming proceedings could impose structural operating requirements that alter Meta’s engagement mechanics across all state markets.
Section 230 of the Communications Decency Act (1996) immunises online platforms from liability for third-party content. Courts have historically interpreted this broadly, dismissing claims that platforms should be liable for how they display, recommend, or moderate user posts. The New Mexico strategy bypasses this by targeting platform design features—infinite scroll, engagement algorithms, notification patterns—as inherently deceptive products rather than content curation tools. Harvard Gazette legal analysis suggests this distinction may survive appellate review because it regulates Meta’s conduct, not users’ speech.
What to Watch
Meta’s appeal will test whether state consumer protection statutes can regulate platform design without violating federal preemption or First Amendment protections for algorithmic curation. A ruling upholding the verdict would greenlight the 42 pending state cases and likely trigger settlement negotiations to cap total exposure. Meanwhile, the May bench trial will establish whether courts can impose prospective operational changes—age gates, content restrictions, algorithmic transparency—without triggering constitutional challenges. If sustained, the New Mexico precedent transforms social media litigation from a content moderation question into a product liability framework, where engagement-maximising design itself becomes the actionable harm.
For Meta, the immediate financial hit is manageable—the company generated $53.7 billion in Q4 2025 revenue. But the litigation vector is not. Forty-three state attorneys general now have a jury-validated playbook that sidesteps Section 230 and converts platform economics into consumer fraud claims. The question is no longer whether states can regulate social media design, but how much it will cost.