A New Mexico jury decided on Tuesday that Meta has been lying to parents about how safe its platforms are for children. The penalty: $375 million, the maximum allowed under state law. The real cost for Meta may be far higher.
This is the first time Meta has lost a jury trial over child safety. Not the first time it's been accused, not the first time a state attorney general has filed a complaint, but the first time twelve people heard the evidence and said: you did this, and you knew you were doing it. More than 40 states have similar lawsuits pending. Every one of them just got more dangerous for the company.
The verdict came after a six-week trial that featured undercover investigations, internal company documents, and testimony from Mark Zuckerberg himself. Here's what happened, what the jury saw, and why this is likely just the beginning.
The Undercover Operation
The case started not with lawyers but with cops. In 2023, investigators from the New Mexico attorney general's office created decoy accounts on Facebook and Instagram. The accounts were designed to look like they belonged to children under 14. No tricks, no hacks, just profiles with ages that should have been caught by the platforms' own policies. Instagram's minimum age is 13. Facebook's is the same.
What happened next was the prosecution's most damning evidence. The accounts were quickly recommended adult content. They received sexually explicit messages. Multiple adults in New Mexico contacted the fake profiles seeking sex. Two men were arrested at a motel where they believed they'd be meeting a 12-year-old girl. Both face criminal charges that are still pending.
Attorney General Raúl Torrez called it proof that Meta "allowed predators unfettered access to underage users" and connected them with victims, "often leading to real-world abuse and human trafficking." The investigation wasn't a hypothetical scenario. It was a documented, real-time demonstration of what happens when a child creates an account on Meta's platforms.

What Meta Knew and When
The prosecution's second pillar was internal documentation. Over the course of the trial, state attorneys introduced company emails, memos, and reports showing that Meta employees had flagged child safety problems repeatedly over more than a decade.
One set of documents proved particularly damaging. When CEO Mark Zuckerberg announced in 2019 that Facebook Messenger would move to default end-to-end encryption, internal messages showed employees calculating the impact: the change would prevent Meta from detecting and reporting approximately 7.5 million instances of child sexual abuse material per year to law enforcement. Meta went ahead with the encryption rollout anyway.
Zuckerberg took the stand during the trial, a rarity for a Fortune 500 CEO in a state-level case. Under questioning, he acknowledged that enforcing Instagram's age restrictions is "very difficult." He downplayed the importance of teen users to Meta's revenue, though prosecutors presented data suggesting the opposite.
Linda Singer, an attorney representing the state, summarized the prosecution's argument bluntly: "Over the course of a decade, Meta has failed over and over again to act honestly and transparently."
The jury agreed. It found Meta liable on all counts, ruling that the company engaged in "unfair and deceptive" and "unconscionable" trade practices under New Mexico's consumer protection statute. The $375 million penalty reflects the statutory cap of $5,000 per violation. Prosecutors had originally sought $2.1 billion.

The Algorithm on Trial
Beyond the undercover operation and internal documents, the trial put Meta's design philosophy on display. Prosecutors argued that Meta built its platforms to maximize engagement at any cost, using infinite scroll, auto-play videos, and algorithmic recommendations that actively pushed harmful content toward young users.
This isn't an abstract complaint about tech design. The state presented evidence that Meta's recommendation algorithms connected accounts that appeared to belong to minors with accounts sharing sexually explicit content. The algorithm didn't distinguish between a 30-year-old looking for news and a 13-year-old who had just signed up. It optimized for engagement, and nothing else.
This design argument connects to a broader pattern of tech companies facing accountability for how their platforms behave. The question is no longer just whether platforms host harmful content. It's whether the systems that distribute content are themselves the problem. New Mexico's case argued yes, and the jury agreed.
Meta's defense attorney Kevin Huff pushed back throughout the trial, arguing that Meta maintains "extensive safeguards" and has demonstrated "robust disclosures and tireless efforts to prevent harmful content." The jury was not persuaded.
Why $375 Million Is Just the Beginning
Meta says it disagrees with the verdict and plans to appeal. That appeal could take years. But the real story isn't the appellate process. It's what comes next, both in New Mexico and across the country.
A second phase of the New Mexico trial begins on May 4. This time, a judge will hear arguments on public nuisance claims without a jury. If the state prevails, the court could order Meta to implement specific platform changes: effective age verification, proactive removal of predatory accounts, and modifications to how encrypted messaging works for accounts belonging to minors.
Those mandated changes would matter more than the money. Meta generated $164 billion in revenue last year. A $375 million fine is a rounding error, roughly what the company earns in 20 hours. Court-ordered changes to how the platform actually operates would hit the business model itself.
Then there's the cascade. More than 40 state attorneys general have pending lawsuits against Meta over child safety. Tuesday's verdict provides them with something invaluable: proof that a jury will hold Meta accountable. The legal theories, the evidence strategies, and the expert witnesses that worked in New Mexico are now a blueprint.
At the federal level, Congress has been debating children's online safety legislation for years without passing anything. The TikTok saga showed how slowly Washington moves on social media regulation. New Mexico's jury just accomplished in six weeks what Congress hasn't managed in six years. Whether that embarrasses federal legislators into action or simply confirms that state courts are the real venue for tech accountability remains to be seen, but the dynamic has shifted.
What Changes
The $375 million verdict is symbolically significant and financially trivial for Meta. The company will appeal, and the process could take years. None of this resolves quickly.
But the precedent is real. Before Tuesday, no jury had ever held Meta accountable for child safety failures. The company could credibly argue that accusations were just accusations, that no finder of fact had examined the evidence and ruled against it. That argument is gone now.
The second phase in May could prove even more consequential. If a judge orders platform changes, Meta would face a choice: comply with court-mandated safety modifications in New Mexico, or implement them everywhere. Companies typically choose consistency over a patchwork of state-specific rules. That means a single judge in Santa Fe could effectively set safety standards for billions of users worldwide.
For parents, the verdict is validation. For tech companies facing increasingly aggressive state regulators, it's a warning. And for the 40-plus states with their own cases still pending, it's an invitation to press harder.
Sources
- New Mexico just handed Meta its first courtroom defeat over child safety - TechCrunch
- Meta ordered to pay $375 million in New Mexico trial over child exploitation - NBC News
- US jury orders Meta to pay $375m for endangering children - Al Jazeera
- Jury finds Meta liable in case over child sexual exploitation on its platforms - CNN
