The Big Tech Social Media Addiction and Sexual Exploitation Lawsuits: What You Need to Know

Landmark Trials Target Big Tech’s Impact on Children’s Health and Wellbeing

As of early 2026, landmark lawsuits are being heard in U.S. courts across the country that seek to hold major tech companies accountable for harms to children allegedly caused by their social media platforms, including depression, eating disorders, suicide, and sexual exploitation. These cases are being compared to Big Tobacco litigation, and the outcome could reshape how social media is regulated.

Who is being sued

  • Meta (Instagram & Facebook)
  • Google (YouTube)
  • TikTok and Snap, which have reportedly already settled in some cases

What the allegations claim

Plaintiffs, including hundreds of families and school districts, argue that the digital platforms were specifically designed to be addictive and that this intentional design has resulted in a youth mental health crisis, injury, and deaths.

The complaints include claims of:

  • Algorithmic features like infinite scrolling, autoplay, and recommendation feeds that maximize engagement
  • Use of notifications and reward mechanics that exploit young users’ developing brains
  • Resulting harms such as anxiety, depression, body image disorders, self-harm, and suicidality among children

These lawsuits characterize the platforms’ design choices as defective products, similar to claims made in historic litigation against tobacco companies.

Lawsuits claim that social media use contributed to self-harm and suicide.

Families allege, for example:

  • Instagram amplified eating-disorder and self-harm content.
  • Snapchat’s design features contributed to compulsive use tied to emotional distress.
  • TikTok’s algorithms pushed self-harm content to vulnerable minors.
  • Tech companies failed to prevent contact with sexual predators.

In response, the tech companies counter that:

  • There is no recognized medical diagnosis of "social media addiction."
  • Correlation is not causation; there is no clear scientific consensus that social media "causes" addiction.
  • The problem lies not in the product design but in the content produced, which is protected under the First Amendment and Section 230, the law that shields online platforms from liability for content generated by third parties.
  • Responsibility for children’s use lies with parents and guardians, not platforms.
  • Mental health outcomes are influenced by many complex factors, including family environment, preexisting conditions, and societal trends, and causation cannot be attributed solely to platform design.

Numerous efforts to educate the public on holding Big Tech accountable

Watch this PSA on AI and child safety, about young lives lost after interactions with AI chatbots.

Why are social media companies liable?

“Plaintiffs allege that the social media companies owed a heightened duty of care because the complaints involve minors. According to the complaints, the social media companies knew or should have known that their products could cause harm, yet they failed to mitigate the risk of harm or to warn users about it. The complaints in the current lawsuits seek to hold social media companies liable on the basis of strict liability and negligence for the following:

  • Algorithms that promote compulsive use
  • Never-ending feeds
  • Lack of warnings when users are signing up
  • Lack of any method to monitor and self-restrict length and frequency of use
  • Barriers to voluntarily deleting or deactivating accounts
  • Lack of meaningful age verification processes
  • Lack of effective parental controls or monitoring mechanisms
  • Lack of labels on filtered images and videos
  • Intrusive notification timing designed to lure users back to the platforms”

The lawsuits moving forward

These lawsuits are being compared to Big Tobacco-era litigation, and favorable rulings could lead to landmark corporate accountability.

In Los Angeles County Superior Court in California, a bellwether case is moving forward in which plaintiffs argue that Instagram’s parent company Meta and Google’s YouTube deliberately designed their platforms to addict and harm children. They contend that algorithm-driven features, including infinite scroll, autoplay, and push notifications, were engineered to maximize engagement at the expense of youth mental health, leading to depression, anxiety, self-harm, and suicidality. TikTok and Snap, which were originally named in the lawsuit, reportedly settled for undisclosed sums before trial.

Here are a few PSAs regarding social media and child sexual exploitation.

Watch the Senate Judiciary hearing on Online Child Sexual Exploitation from January 2024.

So what about the risk to children’s health from cell phone radiation?

“Parents posit that companies have created a product that is harmful by design. It is not a stretch to see how companies are also failing children when it comes to protecting them from the risk of cell phone radiation.

We expect future lawsuits to expand beyond screen time and platform design to include injuries to children from wireless radiation exposure. It is only a matter of time.”