Landmark Trials Target Big Tech’s Impact on Children’s Health and Wellbeing
Theodora Scarato also posted this on her Substack, found here.
As of early 2026, landmark lawsuits are being heard in U.S. courts across the country that seek to hold major tech companies accountable for harms to children allegedly caused by their social media platforms, including depression, eating disorders, suicide, and sexual exploitation. These cases are being compared to Big Tobacco litigation, and their outcomes could reshape how social media is regulated.
Who is being sued
- Meta (Instagram & Facebook)
- Google (YouTube)
- TikTok and Snap, which have already settled in some cases.
What the lawsuits allege
Plaintiffs, including hundreds of families and school districts, argue that the digital platforms were specifically designed to be addictive and that this intentional design has resulted in a youth mental health crisis, injuries, and deaths.
The complaints include claims of:
- Algorithmic features like infinite scrolling, autoplay, and recommendation feeds that maximize engagement
- Use of notifications and reward mechanics that exploit young users’ developing brains
- Resulting harms such as anxiety, depression, body image disorders, self-harm, and suicidality among children
These lawsuits characterize the platforms’ design choices as defective products, similar to claims made in historic litigation against tobacco companies.
Lawsuits claim that social media use contributed to self-harm and suicide.
Several lawsuits involve the deaths of teenagers whose families allege that intensive social media use contributed to self-harm or suicide. Plaintiffs claim the teens were repeatedly exposed to algorithm-driven content related to depression, self-harm, or suicide, and that such material appeared more frequently once a child interacted with similar content. They also argue that design features such as infinite scroll, autoplay, and notifications prolonged exposure and reinforced compulsive use. Families allege that the companies knew internally that their platforms could harm teen mental health, yet chose to design algorithms that prioritize engagement and advertising revenue over safety.
Families allege, for example:
- Instagram amplified eating-disorder and self-harm content.
- Snapchat’s design features contributed to compulsive use tied to emotional distress.
- TikTok’s algorithms pushed self-harm content to vulnerable minors.
- Tech companies failed to prevent contact with sexual predators.
In response, the tech companies counter that:
- There is no recognized medical diagnosis of social media addiction.
- Correlation is not causation, and there is no clear scientific consensus that social media “causes” addiction.
- The problem is not the product design but the content itself, which is protected under the First Amendment and Section 230. Section 230 shields online platforms from liability for content generated by third parties.
- Responsibility for children’s use lies with parents and guardians, not platforms.
- Mental health outcomes are influenced by many complex factors, including family environment, preexisting conditions, and societal trends, and causation cannot be attributed solely to platform design.
Numerous efforts to educate the public on holding Big Tech accountable
In addition to lawsuits, numerous initiatives are educating the public about the harms of Big Tech and screens to children. Some excellent sites include Fairplay, Design It For Us, the Heat Initiative, Scrolling to Death, Children and Screens, and the Social Media Victims Law Center.
Watch this PSA on AI and Child Safety about young lives lost after interacting with AI chatbots.
Why might social media companies be liable?
The Social Media Victims Law Center, a law firm representing more than 1,200 parties in social media addiction lawsuits, has an excellent webpage breaking the lawsuits down and explaining why they believe liability lies with the technology companies.
“Plaintiffs allege that the social media companies owed a heightened duty of care because the complaints involve minors. According to the complaint, the social media companies knew or should have known that their products could cause harm, yet they failed to mitigate the risk of harm or warn users about the risk. The complaints in the current lawsuits seek to hold social media companies liable on the basis of strict liability and negligence for the following:
- Algorithms that promote compulsive use
- Never-ending feeds
- Lack of warnings when users are signing up
- Lack of any method to monitor and self-restrict length and frequency of use
- Barriers to voluntarily deleting or deactivating accounts
- Lack of meaningful age verification processes
- Lack of effective parental controls or monitoring mechanisms
- Lack of labels on filtered images and videos
- Intrusive notification timing designed to lure users back to the platforms”
– Social Media Addiction Lawsuit – 2026 Update
The lawsuits moving forward
These lawsuits are being compared to Big Tobacco-era litigation, and favorable rulings could lead to landmark corporate accountability.
In Los Angeles County Superior Court in California, a bellwether state case is moving forward in which plaintiffs argue that Instagram’s parent company Meta and Google’s YouTube deliberately designed their platforms to addict and harm children. Plaintiffs state that algorithm-driven features, including infinite scroll, autoplay, and push notifications, were engineered to maximize engagement at the expense of youth mental health, leading to depression, anxiety, self-harm, and suicidality. TikTok and Snap, which were originally named in the lawsuit, reportedly settled for undisclosed sums before trial.
Opening arguments began in early February 2026. As detailed in the AP article “Google, Meta push back on addiction claims in landmark social media trial,” the plaintiff’s lawyer W. Mark Lanier stated the case would be as “easy as ABC,” shorthand for “addicting the brains of children.” He said Meta and Google, “two of the richest corporations in history,” have “engineered addiction in children’s brains,” creating an addictive “digital casino.” In response, Meta attorney Paul Schmidt spent most of his opening focused on the plaintiff’s personal troubles, including body image issues, bullying, and a volatile home life, particularly a troubled relationship with her mother. Schmidt pointed to 10,000 pages of medical records, saying that within all of those records jurors would not see a “single example” of KGM being addicted to YouTube.
In the U.S. District Court for the Northern District of California, hundreds of related lawsuits have been consolidated into MDL No. 3047 – Social Media Adolescent Addiction/Personal Injury Products Liability Litigation. The lawsuits, brought on behalf of children and adolescents as well as school districts, target five major social media platforms: Meta’s Facebook and Instagram, Google’s YouTube, ByteDance’s TikTok, and Snap’s Snapchat. Plaintiffs argue that the companies defectively designed their platforms to create compulsive use in minors and failed to warn users of the risks.
Separately, more than 40 state attorneys general have filed lawsuits alleging that Meta harmed youth mental health through addictive design choices on Instagram and Facebook. The states argue that Meta misled the public about risks to teens while prioritizing engagement and profit. As detailed in a KCRA 3 report, the majority of the state AG lawsuits are now consolidated in federal court in Oakland, California, and in 2024 a judge ruled that these states could proceed with most of their claims, specifically those involving misleading statements made by Meta regarding the safety of its platforms. A major federal trial on the school district claims is scheduled to begin in June 2026.
In the First Judicial District Court of New Mexico, the State of New Mexico filed a lawsuit alleging that Meta failed to adequately protect children from sexual exploitation. As detailed by AP News in “New Mexico lawsuit accuses Meta of failing to protect children from sexual exploitation online,” the state’s attorney Donald Migliori said in his opening statement that Meta misrepresented the safety of its platforms and engineered its algorithms to keep young people online while knowing that children were at risk of sexual exploitation on social media.
Here are a few PSAs regarding social media and child sexual exploitation.
Watch the Senate Judiciary hearing on Online Child Sexual Exploitation from January 2024.
So what about the risk to children’s health from cell phone radiation?
Theodora Scarato, MSW, Director of the Wireless and EMF Program at Environmental Health Sciences, stated:
“Parents posit that companies have created a harmful product by design. It is not a stretch to see how companies are also failing children when it comes to protecting them from the risks of cell phone radiation.
Technology companies are using a playbook similar to Big Tobacco’s to deflect responsibility from their products. The current social media litigation centers on algorithms and the failure to protect children, but it raises a question: what other health and safety risks are associated with prolonged cell phone and digital device use?
Scientific research into the health effects of chronic cell phone and cell tower radiation exposure continues to mount. Safety is not assured, especially for children. Children absorb higher levels of cell phone and cell tower radiation into their brains and bodies than adults do, and they are far more vulnerable to its effects. Studies have reported increased cancer risk, genetic effects, the induction of oxidative stress, memory damage, and impacts on brain development and the reproductive system.
We expect future lawsuits to expand beyond screen time and platform design to include injuries to children from wireless radiation exposure. It is only a matter of time.”
