New Mexico seeks child safety restrictions on Meta apps and algorithms in trial's 2nd phase

FILE - A recording of Meta founder and CEO Mark Zuckerberg's deposition is played for the jurors on March 4, 2026, in Santa Fe, N.M. (Jim Weber/Santa Fe New Mexican via AP, Pool, File)

SANTA FE, N.M. — New Mexico state prosecutors are seeking fundamental changes to Meta's social media apps and algorithms to safeguard children in the second phase of a landmark trial on allegations that platforms such as Instagram have created a public safety hazard.

Opening statements are scheduled Monday in the three-week bench trial to decide whether the platforms of Meta, which also owns Facebook and WhatsApp, pose a public nuisance under state law.

In the first phase, jurors assessed $375 million in civil penalties against Meta, determining that it knowingly harmed children's mental health and concealed what it knew about child sexual exploitation on its platforms.

Prosecutors are now asking a judge to impose fundamental changes aimed at reining in addictive features, improving age verification and preventing child sexual exploitation through default privacy settings and closer oversight.

Meta has vowed to appeal the jury verdict and warned that it could eliminate Instagram and Facebook service in New Mexico if forced to comply with mandates it considers impractical.

“The fact that we’re having a trial on nuisance is itself a remarkable outcome,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law in California. “That theory is not well accepted as applied to the internet, and that theory doesn’t really fit the internet.”

Trial could alter algorithms that define social media

New Mexico Attorney General Raúl Torrez said the jury verdict punctured the aura of invincibility protecting tech companies from liability for material on their platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act.

A Los Angeles jury separately found both Meta and YouTube liable for harms to children, validating long-standing concerns about dangers of social media.

New Mexico prosecutors are demanding that Meta help remedy a mental health crisis among children through a series of safeguards and changes, including a redesign of algorithms that make content recommendations so they no longer prioritize constant engagement.

Prosecutors are also targeting other features linked to compulsive use such as "infinite scroll," which continuously loads content; push notifications; and default settings that show tallies for "likes" and sharing. Their lawsuit also seeks improvements to age verification and other steps aimed at curbing child sexual exploitation.

And New Mexico wants child accounts on Meta platforms to have an associated parent or guardian, as well as a court-supervised child safety monitor to track improvements over time.

Meta asserts free speech protections

Executives have said the company continuously improves child safety and addresses compulsive use, and that many of the prosecutors' demands are redundant.

Meta plans to call an array of technical experts as witnesses in arguing that the demands are impractical if not impossible and would force it to “disregard the realities of the internet.”

The company also argues that its platforms are being singled out among hundreds of apps that teens use, leaving children vulnerable on platforms with less robust protections.

The company is invoking free speech protections that have shielded social media for decades.

“The state’s proposed mandates infringe on parental rights and stifle free expression for all New Mexicans,” Meta said last week in a statement.

Influence could be far-reaching

The case is the first to reach trial among lawsuits filed by more than 40 state attorneys general on allegations that Meta contributes to a youth mental health crisis. Most are pursuing remedies in U.S. federal court.

Torrez, the state attorney general, said that puts the case in a unique position not only “to try and change the paradigm of how this company does business, but also how Big Tech generally is expected to do business going forward.”

Goldman said prosecutors may be venturing into uncertain legal waters just in seeking age verification mandates.

“In practice a court order saying that Facebook had to impose age authentication would have no Supreme Court textual support,” he said. “The Supreme Court might bless it. We don’t know.”

The first phase of the trial saw six weeks of testimony from witnesses including teachers, psychiatric experts, state investigators, top Meta officials and whistleblowers who left the company.

Copyright 2026 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.