A historic lawsuit accusing several prominent social media companies of intentionally designing their platforms to addict children is set to head to trial in Los Angeles on Tuesday.

It marks the first in a series of high-profile trials against Meta, TikTok and YouTube that are poised to proceed in the coming months. Snap, the parent company of Snapchat, was also part of the case but has since reached a settlement.

Thousands of individuals, school districts and states have filed lawsuits against the social media giants that have since been consolidated into two massive cases — one in California state court and another in federal court.

The proceedings in Los Angeles Superior Court mark the first of several “bellwether” cases to go to trial, serving as test trials for the broader litigation.

“The simple fact that a social media company is going to have to stand trial before a jury and account for its design decisions is unprecedented in American jurisprudence,” said Matthew Bergman, founding attorney of the Social Media Victims Law Center, which is representing numerous plaintiffs.

The cases mark a key test for social media companies, which have been shielded from many lawsuits by Section 230, the provision of the Communications Decency Act that prevents online platforms from being held liable for user-generated content.

However, judges in both the federal and state cases have rejected efforts to dismiss the lawsuits over Section 230, finding the protections do not block challenges over the companies’ design choices.

“[The lawsuits are] trying to make the claim that this isn’t about the content that ultimately got delivered to the individuals or to the plaintiffs because that’s a fairly clear Section 230 question,” said Mary Anne Franks, a professor at George Washington University Law School.

“Their argument is, ‘No, the way that you have designed your tool, your product, is that you have accelerated or augmented the accessibility of those harmful things to children,’” she added.

Because of the sprawling nature of the cases, the judges have selected several individual lawsuits known as “bellwether” cases that will go to trial.

The case headed before a jury Tuesday stems from a complaint filed by K.G.M., a 20-year-old who began using social media as a kid and says she became addicted to the platforms. This resulted in or worsened her depression, anxiety, body dysmorphia and suicidality, explained Mariana McConnell, who serves as co-lead counsel for the plaintiffs.

The plaintiffs’ lawyers broadly allege that the social media companies have made “a studied effort to induce” kids into using their products to drive up advertising revenue despite research linking social media to youth mental health problems.

“Over the past decade, Defendants have relentlessly pursued a strategy of growth-at-all-costs, recklessly ignoring the impact of their products on children’s mental and physical health and well-being,” a central complaint alleges.

Kids and teens, they argue, are key to the firms’ business models, leading them to “deliberately” tweak the design of their products to “exploit” the psychology of young users, whose brains are still developing.

The social media companies have vigorously pushed back against these allegations. Meta, the parent company of Instagram and Facebook, argues the lawsuits are attempting to oversimplify a complex issue and place the blame for teens’ mental health struggles “squarely on social media companies.”

“Despite this complexity, plaintiffs’ lawyers have selectively cited Meta’s internal documents to construct a misleading narrative, suggesting our platforms have harmed teens and that Meta has prioritized growth over their well-being,” the company wrote in a press release earlier this month. “These claims don’t reflect reality.”

YouTube, for its part, has sought to distinguish itself from the other firms, arguing it is a streaming platform rather than a social media platform at its core. It contends the site was built for people to be able to share videos for educational purposes, not for social interactions.

All three companies also point to the various kids’ safety features they have rolled out over the years in an effort to address concerns.

Meta launched Teen Accounts in 2024, limiting the content that teens can see and who they can interact with on Instagram, Facebook and Messenger. YouTube has created YouTube Kids as a standalone app for young users, as well as “supervised” experiences for pre-teens and teens on its traditional platform.

On TikTok, users under 18 and under 16 face various restrictions, including limits on daily screen time and in-app messaging. Snapchat also announced several new tools for parents Thursday, including insights into screen time and who their teens are adding as friends, just days after revealing it had reached a settlement.

TikTok declined to comment.

Tuesday’s trial comes as concerns about social media have steadily increased over the years.

The first time the companies’ decision-making came under significant public scrutiny was in 2021, when Facebook whistleblower Frances Haugen came forward with allegations that the social media giant knew Instagram had a negative impact on the mental health of teenage girls but prioritized profits over safety.

This has spurred efforts to pass kids’ online safety legislation and repeal Section 230, although such measures have repeatedly stalled in Congress.

Those efforts have gained new momentum in recent months, as the rise of AI chatbots, many of them developed by the same companies, has raised fresh concerns about whether those products are harming children.

“I think we’re at a moment in the larger public consciousness where there’s been a kind of breaking point or a sense of frustration and anger and maybe even rage at the tech industry, and so all of these things are kind of converging and will be a spectacle for that reason,” Franks said.

No matter how the litigation ultimately turns out, she suggested the tech industry may feel less “invincible” going forward.

“It’s one of the first times we’ve seen a serious challenge to this vast immunity and this deferential treatment that the industry has gotten,” she noted.

“It would be good if this was a moment where it was a sign of practices going forward,” she added. “What will industry do in light of the fact that litigation can be actually successfully brought and to hope that that encourages some real incentives on the part of the industry to act better.”