SANTA FE, N.M. – A significant ruling by a New Mexico jury has determined that Meta, the parent company of popular social media platforms like Instagram and Facebook, knowingly harmed the mental health of children while concealing information regarding child sexual exploitation on its platforms.
This landmark decision follows a nearly seven-week trial, coinciding with ongoing deliberations in a federal court in California regarding the liability of Meta and YouTube in a similar case.
Jurors sided with state prosecutors who contended that Meta prioritized profits over the safety of young users. The jury found that Meta had violated sections of New Mexico’s Unfair Practices Act by hiding crucial information about the dangers of child sexual exploitation and its effects on children’s mental health.
The jury agreed with claims that Meta made false or misleading statements and engaged in “unconscionable” trade practices that exploited the vulnerabilities and inexperience of children. It found thousands of violations, with penalties totaling $375 million.
In response to the verdict, a spokesperson for Meta expressed disagreement and announced plans to appeal. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” they stated, emphasizing the company’s commitment to protecting teens online.
Meta’s attorneys argued that the company is transparent about risks and actively works to remove harmful content, although they acknowledged that some inappropriate material may slip through their filters.
The New Mexico case marks a pivotal moment in a wave of lawsuits against social media platforms concerning their impacts on children. This trial commenced on February 9 and is part of a larger movement among school districts and legislators seeking stricter regulations on smartphone use in educational settings.
Over 40 state attorneys general have filed lawsuits against Meta, alleging that the company contributes to a mental health crisis among young people by deliberately designing addictive features on Instagram and Facebook.
The New Mexico lawsuit was based on an undercover investigation where agents created social media accounts posing as children to document sexual solicitations and Meta’s responses.
Filed in 2023 by New Mexico Attorney General Raúl Torrez, the lawsuit also claims that Meta has not adequately disclosed or addressed the dangers of social media addiction. While Meta does not acknowledge that social media addiction exists, executives have referred to “problematic use” and say the company wants users to enjoy their time on its platforms.
“Evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business,” argued Meta attorney Kevin Huff during closing statements. He emphasized that Meta designs its apps to help people connect with friends and family, rather than to facilitate predatory behavior.
Historically, tech companies have enjoyed protections from liability for user-generated content under Section 230 of the U.S. Communications Decency Act, as well as First Amendment protections. However, New Mexico prosecutors assert that Meta should still be held accountable for disseminating harmful content through its complex algorithms.
Prosecution attorney Linda Singer remarked, “We know the output is meant to be engagement and time spent for kids. That choice that Meta made has profound negative impacts on kids.”
A second phase of the trial is expected, potentially in May, in which a judge sitting without a jury will determine whether Meta created a public nuisance; the company could be ordered to make changes and pay for remedies.
Throughout the trial, jurors examined a wealth of Meta’s internal communications and safety reports. They also heard from Meta executives, engineers, whistleblowers, psychiatric experts, and tech-safety consultants.
Local educators testified about the challenges they face due to social media-related disruptions, including sextortion schemes targeting children.
“This case is about one of the biggest tech companies in the world taking advantage of New Mexico teens,” stated Chief Deputy Attorney General James Grayson during closing arguments.
The jury, composed of residents from Santa Fe County, considered whether statements made by Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta global head of safety Antigone Davis misled users regarding platform safety.
During deliberations, the jury used a checklist of allegations that included Meta’s failure to disclose what it knew about gaps in enforcing its minimum-age requirement of 13, the prevalence of content related to teen suicide, and the role of algorithms in promoting sensational or harmful material.

