For years, parents, teenagers, pediatricians, educators and whistleblowers have pushed the idea that social media is detrimental to young people’s mental health and can lead to addiction, eating disorders, sexual exploitation and suicide.

For the first time, juries in two states took their side.

In Los Angeles on Wednesday, a jury found both Meta and YouTube liable for harms to children using their services. In New Mexico, a jury determined that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its platforms.

Tech watchdog groups, families and children’s advocates cheered the jury decisions.

“The era of Big Tech invincibility is over,” said Sacha Haworth, executive director of The Tech Oversight Project. “After years of gaslighting from companies like Google and Meta, new evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years.”

While it’s too soon to tell whether this week’s outcomes will lead to fundamental changes in how social media platforms treat their young users, the dual verdicts signal a turning tide of public perception against tech companies that is likely to lead to more lawsuits and regulation. For years, the companies have argued that the harms their platforms cause to children are mere byproducts: unintentional and inevitable consequences of broader societal issues, or of bad actors circumventing safeguards. They pushed back against the notion that psychological harms could result from social media use and downplayed research that showed otherwise.

Asked during his testimony in the Los Angeles trial whether people tend to use a platform or product more if it’s addictive, Meta CEO Mark Zuckerberg said, “I’m not sure what to say to that. I don’t think that applies here.”

The verdicts show the public’s growing willingness to hold the companies responsible for harms and demand meaningful changes in how they operate. What’s not apparent, at least not yet, is whether the companies will take heed. Both Meta and Google said they disagree with the verdicts and are exploring legal options, including appeals.

Arturo BĂ©jar, a former Meta engineering director who raised alarms about Instagram’s harms inside the company for years before testifying in Congress in 2023, said jury trials “level the playing field” for these trillion-dollar companies. But he cautioned that it will take actual regulation to rein them in.

“One thing that I saw working inside the company that effectively led to behavior change was when an attorney general or the FTC stepped in and required things of the company,” he said. “Both New Mexico and Los Angeles and all the attorneys general that are part of this process have really an extraordinary opportunity and the ability to ask for meaningful change.”

While both cases focused on harms to children, there are key differences between the two. New Mexico’s lawsuit was filed by state Attorney General RaĂşl Torrez in 2023. State investigators built their case by posing as children on social media, then documenting sexual solicitations they received as well as Meta’s response. The jury was asked to determine if Meta violated New Mexico’s consumer protection law.

The Los Angeles case had a single plaintiff, who goes by the initials KGM, against Meta, Google’s YouTube, TikTok and Snap. TikTok and Snap settled before trial. The plaintiff argued that the design features of the two remaining defendants’ platforms, Meta and YouTube, were engineered to be addictive, especially for young users. Because thousands of families have filed similar lawsuits, KGM and a handful of other plaintiffs were selected for bellwether trials: essentially test cases for both sides to see how their arguments play out before a jury, potentially leading to a broader settlement reminiscent of the Big Tobacco and opioid litigation.

By focusing on deliberate design choices and product liability, the lawsuits were able to sidestep Section 230, which generally exempts internet companies from liability for the material users post on their services. Past lawsuits, which have focused on how the platforms distributed content, often failed on these grounds.

“For the first time, courts have held social media platforms accountable for how their product design can harm users,” said Nikolas Guggenberger, an assistant professor of law at the University of Houston Law Center. “This is a new legal territory that could reshape an industry long shielded by Section 230. Platforms will have to rethink their focus on engagement at any cost, which has outlived itself.”

The final outcome of the cases could take years to resolve pending appeals and settlement agreements, but experts say the shift in the public’s sentiment and understanding of social media’s dangers is already happening. In a 2025 Pew Research Center poll, for instance, 48% of teens said social media harms people their age. In 2022, only 32% said the same.

Amid social media’s reckoning, however, artificial intelligence chatbots are emerging as the next frontier in the fight to make technology safer for young people.

“You can ban today’s harm, but how do you know what tomorrow is going to bring?” said Sarah Kreps, a professor and director of Cornell University’s Tech Policy Institute. Whether it’s another social media app, AI or some other new technology, she added, new things will crop up.

“And people will flock to those because where there’s demand you will see a supply come to meet that demand,” she said.


