By Jeff Horwitz and Diana Novak Jones
NEW YORK, March 26 - Talk about regulating social media to make it safer for children often gravitates toward the U.S. Capitol and the EU headquarters in Brussels. But after a New Mexico jury verdict, a district court in Santa Fe carries some heft, too.
On Tuesday, a jury found Meta liable for violating New Mexico’s consumer protection laws and endangering children by enabling child sexual exploitation on its platforms, imposing a $375 million penalty. The verdict sets up a second phase of the case in May, when Judge Bryan Biedscheid is scheduled to hold a bench trial on the state’s claims that Meta created a “public nuisance” that harmed residents’ health and safety. That proceeding could result in the court ordering changes to the design of Facebook, Instagram and other apps used by teenagers.
This power to compel product changes sets New Mexico’s case apart from thousands of private suits filed on behalf of plaintiffs alleging Meta’s services harmed them, like another landmark social media addiction case in Los Angeles that Meta and Google lost this week.
New Mexico's win also bolsters states seeking to compel change at tech companies amid paralysis in Washington, including through legislation requiring tougher age-checking measures and restricting algorithmic feeds for young people.
NEW MEXICO EYES OPTIONS FOR META CHANGES
In an interview, New Mexico Attorney General Raúl Torrez laid out a wide range of prospective changes to Meta’s products that the state may pursue. They include asking the court to restrict the types of content recommended to minors, to limit the frequency and timing of social media notifications prodding teenagers to log on, to restrict the “infinite scroll” of content for children and to tighten age verification procedures.
The state will also propose a plan to mitigate harm already done by Meta's products to New Mexico residents.
“It’s not out of the realm of possibility that we ask for and receive an even greater award" at the second stage of the trial than the first, Torrez said. “But my perspective has been to focus on the changes of the product itself.”
Torrez, a Democrat, said the state would likely ask Biedscheid to appoint an independent monitor or special master who would oversee Meta’s compliance with New Mexico consumer protection law over the course of years.
“I’m not sure at the initial stage we’re going to be articulating a super specific path in terms of what the court would do,” he said.
Attorneys general have increasingly turned to public nuisance law, a legal doctrine that allows governments to sue over conduct they say unreasonably interferes with public health or safety, to target industries accused of causing widespread social harm, including opioid manufacturers.
LIABILITY SHIELD IN FOCUS
Even with the jury verdict in hand, New Mexico's effort faces a long road. Meta spokesperson Andy Stone said the company would appeal the verdict and that “we will continue to defend ourselves vigorously.”
The appeal is expected to raise questions about Section 230 of the Communications Decency Act, the federal law that has long shielded tech companies from liability over user‑generated content.
Stone noted that Meta has made numerous safety upgrades to its platforms since the suit was filed, some of which overlap with features sought by Torrez. The company has launched dedicated accounts for teen users with notifications turned off by default at night, added age verification features and announced its intention to filter out age-inappropriate content.
Meta recently said it was removing end-to-end encryption from Instagram’s messaging feature, citing lack of use. The change was celebrated by child safety advocates.
The company indicated it would continue offering encrypted messaging on WhatsApp, without addressing its plans for Facebook’s Messenger.
Max Willens, an analyst for eMarketer, said he was skeptical New Mexico would be able to force changes to the content recommendation systems at the core of Facebook and Instagram.
“Algorithm modification is not a likely remedy, but it is among the list of possible changes that could be required,” he said. “The second phase of this trial may be more consequential to social media platforms than the first.”
Court-ordered relief is even more difficult to secure for individual plaintiffs, noted Matthew Bergman of the Social Media Victims Law Center, one of the attorneys representing the plaintiff in the Los Angeles case that alleged Meta, YouTube and other social media companies negligently designed their products in ways that harmed users’ mental health.
On Wednesday, a jury awarded the plaintiff in that case a $6 million combined judgment against Meta and Google, in what is widely regarded as a test case for thousands of suits alleging similar harm.
Torrez acknowledged that state courts were “probably not the most efficient” venue for regulating how global social media platforms design products for young people, but said he did not want to “wait any longer for a system to deliver what it should have 15 years ago.”
He added that while New Mexico’s case is focused on child predation and grooming, the dozens of state attorneys general pursuing cases against Meta for damaging youth mental health more broadly also aim to force changes to products. Since the verdict, Torrez said, his office has fielded questions from other states and from regulatory bodies overseas.
“I have an expectation that Meta is in for a wave of litigation,” he said. “I’ve been real clear with colleagues that they could set up undercover investigations on these platforms right now and yield the same results.”