However, K.G.M. v. Meta & YouTube has shaken the world of social media because it establishes a precedent that social media platforms can be treated as dangerous and defective products. K.G.M. is the first bellwether case to go to trial in the California Judicial Council Coordinated Proceedings (JCCP 5255) and sets a significant precedent for nearly 2,500 other lawsuits pending in California courts. Both Meta and YouTube have announced their intention to appeal.
But social media addiction lawsuits are exploding throughout the U.S., citing a variety of claims beyond those in K.G.M. Some countries, including Australia, Denmark, and Greece, have already banned the use of social media for children under a certain age. Others, particularly E.U. member states, are considering similar measures.
When a six-year-old is left alone to scroll
K.G.M., referred to as Kayle at trial, is now 20. She started using YouTube at 6 years old and Instagram at 9. Her lawsuit claimed that features like infinite scroll and algorithmic recommendations were addictive and caused her to suffer from depression, anxiety, body dysmorphia, and suicidal ideation. The California jury found that the platforms' attention-grabbing design made them dangerous and addictive to young users.
The jury determined that the platforms were liable for two reasons. They had:
- negligently designed the platforms; and
- failed to warn consumers of the dangers.
TikTok and Snapchat, which were originally named as defendants, reached settlements with the plaintiff before the trial began. The details of the settlements were not disclosed.
Eight more bellwether trials are scheduled, and the verdict is expected to spur settlement negotiations.
Tobacco lawsuits are the model
U.S. law strongly protects social media companies from liability for what is on their platforms. Social media companies have long argued that lawsuits, like K.G.M., are barred by Section 230 of the 1996 Communications Decency Act (CDA).
The CDA, an arguably well-meaning statute, was designed to regulate online pornography. However, its indecency provisions were struck down by the Supreme Court in 1997 in Reno v. ACLU. The Court found that the law, as written, was overly broad and violated the First Amendment. All that remained was Section 230, which shields online platforms from liability for content posted by users.
Plaintiffs in the latest round of lawsuits have taken a different tack, using as their template the product liability tobacco lawsuits of the 1990s. In their telling, "Social Media Isn't Just Speech. It's Also a Defective, Hazardous Product." Reduced to its basics, their argument is that the dopamine hit young users (aka children) receive from likes, comments, and infinite scroll is like the stimulation smokers get from nicotine.
Exploding lawsuits, international action
In addition to the California state court cases, federal litigation has been consolidated in the Northern District of California. The MDL, In re: Social Media Adolescent Addiction/Personal Injury Products Litigation, names Meta Platforms, Instagram, Snap, TikTok, ByteDance, YouTube, Google, and Alphabet.
Like K.G.M., the MDL plaintiffs allege that the defendants’ social media platforms are defective because they are designed to maximize screen time, which can encourage addictive behavior in adolescents. This conduct, they claim, causes various emotional and physical harms, including death.
The MDL includes lawsuits brought by state Attorneys General alleging harm to their states, along with claims from more than 1,000 school districts, teenagers, and families who allege that social media use has contributed to increased anxiety, depression, self-harm, and classroom disruption among minors, imposing costly burdens on schools.
In a separate lawsuit, a New Mexico jury found that Meta Platforms violated state consumer protection law and ordered the company to pay $375 million in civil penalties. The jury determined that the platforms enabled child sexual exploitation on Facebook, Instagram and WhatsApp and misled users about their safety.
At least 20 states enacted laws last year on social media usage and children, according to the National Conference of State Legislatures.
It’s not just the U.S.
In 2024, Australia passed the Social Media Minimum Age Bill 2024, banning children under 16 from social media, with potential penalties for platforms. The European Union’s Digital Services Act (DSA) compels large platforms to mitigate risks of addictive design (e.g., autoplay, streaks) and protect minors, with TikTok already under scrutiny. Denmark has enacted legislation to ban social media usage for children under 15. Even China requires platforms to include a “minor mode” to manage screen time and combat addiction, particularly in children.