Federal judge hints that Big Tech companies may have to face consumer allegations of mental health harm
A federal judge in California suggested Friday that Google, Meta, Snap, and TikTok may well have to face consumer allegations that the social media companies harmed young Americans' mental health with addictive features built into their platforms, and that Section 230, Big Tech's signature liability shield, may not be enough to defeat those claims.
The judge overseeing the litigation, which consolidates nearly 200 individual lawsuits against social media companies, has repeatedly suggested that tech companies could be held liable for the harms that algorithmic rabbit holes, or image filters that promote eating disorders, may cause America's kids.
If the claims are allowed to go forward, it could be a major blow to the tech sector, which is fighting off a wave of litigation targeting its services over alleged mental health harms. It could also mark a turning point in how courts interpret Section 230, the 1996 law that shields websites from a wide range of lawsuits targeting their content moderation decisions.
Just last week, dozens of states filed a federal lawsuit against Meta alleging the company knew about the harms caused by the design of its social media platforms. Eight other states filed similar suits in their own state courts. Meta responded by saying it is committed to providing a safe online experience.
US District Judge Yvonne Gonzalez Rogers of the Northern District of California, addressing lawyers for both the consumer plaintiffs and the tech companies, said she was unpersuaded by arguments that either all of the claims should be dismissed or none of them.
She was equally skeptical of industry lawyers' claims that the tech companies have no legal duty to make their platforms safe for children.
Gonzalez Rogers also pushed back on the consumer plaintiffs' sprawling collection of claims, faulting them for seeming to complain about the content on social media platforms rather than focusing on the design decisions that deliver that content to users.
Still, she said, the burden is on the tech platforms to convince her that the cases should be dismissed at this early stage.
In two exchanges, she emphasized the limits of Section 230. In the first, she said platforms make "more objective functional decisions" that go beyond the simple content moderation decisions that would be protected under Section 230.
Gonzalez Rogers stated, "I don't think you can avoid that."
She then said it was not clear to her that all of the claims should be thrown out because of Section 230, implying that some could be dismissed while others survive.
Over the course of the more than four-hour hearing, attorneys sparred over a range of legal theories, and it remains possible that Gonzalez Rogers will dismiss some claims on grounds other than Section 230.
One thing Gonzalez Rogers did say definitively: "Your billing fees exceed my annual salary."