Addicted by Design: Reassessing Section 230 in the New Era of Social Media Addiction Litigation

by Mofe Koya, Associate Member, University of Cincinnati Law Review Vol. 94

I. Introduction

Across the country, parents, students, and entire school districts are taking social media giants to court, alleging that the architecture of these platforms is fueling a youth mental health crisis.1Gregg Goldfarb, Digital Addiction Litigation Tests Product-Liability Limits, Bloomberg Law (Nov. 5, 2025, 4:30 EST), https://www.bloomberglaw.com/bloomberglawnews/artificial-intelligence/BNA%200000019a0756d391a1bf377e32760001?bna_news_filter=artificial-intelligence [https://perma.cc/DPE2-FUHK]. Attorneys representing the harmed parties in these cases often argue that social media platforms like TikTok, Instagram, and Snapchat exploit addictive algorithms and design features.2Maria Curi, Inside the massive social media addiction case, Axios (Aug. 21, 2025), https://www.axios.com/pro/tech-policy/2025/08/21/inside-the-massive-social-media-addiction-cases [https://perma.cc/ZM5Y-45EL]. By using features “such as endless scrolling, algorithmic alerts, and reward-based engagement systems,” social media platforms have discovered how to keep users online longer, resulting in serious mental health issues in school-aged users.3Goldfarb, supra note 1. Historically, Section 230 of the Communications Decency Act has provided general immunity to internet service providers for content generated by their users.4Danny Tobey et al., Navigating the digital dilemma: Court addresses social media liability in adolescent addiction litigation, DLA Piper (Jan. 11, 2024), https://www.dlapiper.com/en-us/insights/publications/2024/01/navigating-the-digital-dilemma-court-addresses-social-media-liability-in-adolescent-addiction [https://perma.cc/KQ3D-D2LS]. However, as social media technologies and usage evolve, courtrooms across the country are facing a novel issue: are social media platforms generating tangible harm through algorithm design choices? And if so, can they be held legally liable?5Goldfarb, supra note 1.

This Article will examine how social media addiction litigation is reshaping the boundaries of product liability and Section 230 immunity. Part II will first trace the development of this emerging area of tort law and then explain how plaintiffs have reframed traditional product-liability principles to challenge social media platforms for harmful design practices. Part III will then analyze the distinction between conduct and content that courts have begun to draw, the medical community’s ongoing debate over whether social media addiction constitutes a legitimate injury, and the broader policy implications of holding technology companies accountable for the psychological harms their products may cause. Finally, this Article will argue that by treating algorithmic design as actionable conduct rather than protected speech, courts are responding appropriately to the evolving nature of social media, not overstepping into scientific territory.

II. Background

The evolution of social media can be traced back to the 1970s with the introduction of Bulletin Board Systems (“BBS”).6Jessica Jenkins, The Rise of Social Media Addiction, Mental Health Wellness (Nov. 25, 2023), https://www.mentalhealthwellnessmhw.com/blog/social-media-addiction [https://perma.cc/XM28-CLLU]. BBS allowed users to “connect via modems, post messages, and share files” and “marked the birth of online communities.”7Id. By 1997, the website “SixDegrees.com,” regarded as the first social media platform, introduced the concept of creating profiles, listing friends, and cultivating an online persona.8Id. When Facebook, launched in 2004, introduced the “news feed,” it transformed social media from a static profile-based network into a constantly updating ecosystem designed to capture attention.9Id. Twitter and Instagram, launched in 2006 and 2010 respectively, expanded the social media ecosystem by popularizing “microblogging” and the instantaneous sharing of real-time updates and visual content.10Id. TikTok, first introduced in China in 2016 as Douyin, entered the global market and redefined user engagement by combining short-form video creation with a highly personalized, algorithm-driven recommendation system.11Id. The journey from BBS to TikTok reflects the remarkable evolution of digital spaces but has also opened the door to a newly recognized form of addiction among vulnerable users.

A. The Rise of Social Media Addiction and Litigation

1. Social Media Addiction

Social media addiction is “a behavioral addiction that is characterized as being overly concerned about social media, driven by an uncontrollable urge to log on to or use social media, and devoting so much time and effort to social media that it impairs other important life areas.”12Jenkins, supra note 7; Jena Hilliard, Social Media Addiction, Addiction Ctr. (July 28, 2025), https://www.addictioncenter.com/behavioral-addictions/social-media-addiction/ [https://perma.cc/C6PH-SVJ5]. Although this type of addiction is not an officially recognized clinical disorder, addictive social media use can resemble other addictive disorders and may involve mood modification, tolerance, withdrawal symptoms, and relapse.13Hilliard, supra note 13.

The phenomenon of social media addiction is often attributed to the dopamine response that social networking sites provide.14Id. Studies have shown that social media websites “produce the same neural circuitry that is seen in those with a gambling addiction and recreational drug users.”15Id. The “constant stream of retweets, likes, and shares” from social media platforms causes the brain’s reward center to trigger the “same kind of chemical reaction seen with drugs like cocaine.”16Id. A study by Harvard University found:

The reward area in the brain and its chemical messenger pathways affect decisions and sensations. When someone experiences something rewarding or uses an addictive substance, neurons in the principal dopamine-producing areas in the brain are activated and dopamine levels rise. Therefore, the brain receives a “reward” and associates the drug or activity with positive reinforcement.17Id.

Thus, social media usage provides an endless supply of immediate rewards, in the form of attention from others, for relatively minimal effort.18Id. The creators of social media platforms use this to their advantage, designing algorithms that purposely entice users to return for more hits of dopamine.19Social Media Addiction – A Growing Problem Among Children & Teens, Soc. Media Victims L. Ctr. (Oct. 2, 2025), https://socialmediavictims.org/social-media-addiction/ [https://perma.cc/5V38-ZZX7]. This kind of usage becomes problematic when an individual begins to rely on the “reward system” as a coping mechanism to relieve stress, loneliness, or depression.20Hilliard, supra note 13. Individuals who rely on social media to replace real-world rewards often enter a cycle of mood regulation that strengthens psychological dependence. The neurological effects of addiction make it difficult to break this pattern, even when the behavior produces harmful outcomes.21Id. Misuse of social media in this way is especially problematic in children and young adults because their brains are still developing.22Id. Research shows that adolescents who use social media regularly from a young age have severely stunted social skills.23Id. These children exhibit “social anxiety in groups, higher rates of depression, negative body-image, and lowered levels of empathy and compassion towards others.”24Id.

2. Section 230 Concerns

Opposing parties in digital addiction claims often cite Section 230 of the Communications Decency Act of 1996 (“CDA”). Section 230 of the CDA generally provides immunity to providers of interactive computer services for third-party content generated by their users.25Tobey et al., supra note 5. The law was originally intended to protect moderation and free expression online and to “promote the free development of the internet, while also ‘remov[ing] disincentives’ to implement ‘blocking and filtering technologies’ that restrict ‘children’s access to . . . inappropriate online material.’”26Valerie C. Brannon & Eric N. Holmes, Cong. Rsch. Serv., R46751, Section 230: An Overview (2023). However, critics of the statute argue that courts have interpreted Section 230 immunity too broadly.27Id. This debate has now extended into the realm of digital addiction litigation. In November 2023, Judge Yvonne Gonzalez Rogers dismissed several claims in the Social Media Adolescent Addiction Multidistrict Litigation (“MDL”) under Section 230 but allowed those specifically alleging defective design and failure to warn to proceed.

In a Bloomberg Law article published in November 2025, personal injury attorney Gregg Goldfarb noted that while Section 230 continues to provide broad immunity, its scope has narrowed.28Goldfarb, supra note 1. Goldfarb observes that although claims targeting a platform’s failure to verify users or moderate posts still often fail under Section 230, those targeting “inherently addictive product design … or reward mechanics, are moving forward, especially when evidence shows intentional or reckless disregard for child safety.”29Id. Moreover, “[a]ttempts by Meta, Google, and Snap to force interlocutory appeals have met resistance, with courts arguing the public interest demands speedy trials and factual discovery.”30Id.

3. Social Media Addiction Litigation

Digital addiction litigation, also known as social media addiction litigation, gained national recognition in October 2022 with the formation of In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation. The litigation, consisting of 28 actions pending in 17 districts, included plaintiffs who argued that several social media platforms, including Meta Platforms, Snap Inc., TikTok Inc., and YouTube LLC, are “defective because they are designed to maximize user screen time, which can encourage addictive behavior in adolescents.”31In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, 637 F. Supp. 3d 1377, 1378 (J.P.M.L. 2022). The moving parties alleged further that “defendants were aware, but failed to warn the public, that their platforms were harmful to minors.”32Id. The Judicial Panel on Multidistrict Litigation found that the actions involved common factual questions and that centralization would “eliminate duplicative discovery; prevent inconsistent pretrial rulings … and conserve the resources of the parties, their counsel, and the judiciary.”33Id. The Panel selected the Northern District of California as the transferee district and assigned Judge Yvonne Gonzalez Rogers to preside over the action.34Id. at 1379. Since then, Judge Gonzalez Rogers has consistently held that social media companies may not claim broad immunity under Section 230.35Edvard Pettersson, Meta fails to knock out states’ claims that Facebook, Instagram are addictive for young users, Courthouse News Serv. (Oct. 15, 2024), https://www.courthousenews.com/meta-fails-to-knock-out-states-claims-that-facebook-instagram-are-addictive-for-young-users/ [https://perma.cc/4F5P-23TH]. This Article argues that Judge Gonzalez Rogers’s interpretation is not only correct but essential in light of society’s rapidly evolving relationship with technology.

Digital addiction litigation has since developed into a coherent area of emerging tort law. Attorneys frame these claims within classic product-liability doctrine, contending that social media’s addictive features amount to a product defect that companies failed to warn consumers about.36Goldfarb, supra note 1. The Social Media Victims Law Center (“SMVLC”), founded by plaintiffs’ attorney Matthew Bergman in 2021, uses the concept of product liability to hold companies responsible for “how they design their social media platforms, how they put their algorithms together, and how these algorithms and platforms addict children to their … social media platforms.”

III. Discussion

The ongoing wave of digital-addiction litigation challenges courts to reconsider the boundaries of both Section 230 immunity and traditional product-liability doctrine. As plaintiffs seek to hold social media companies accountable for the psychological harms associated with their design choices, courts are being asked to decide whether algorithmic architecture should be treated as protected speech or actionable conduct. This Part examines how judicial interpretation of the conduct-versus-content distinction has begun to narrow Section 230’s reach; how the unsettled science surrounding social media addiction complicates the recognition of a compensable injury; and, finally, what policy reforms may be necessary to balance innovation with accountability.

A. Conduct vs. Content

Attorneys arguing in favor of holding social media companies liable for harm caused by their platforms have a strong argument that is not barred by Section 230. Their successful claims focus not on content posted by users but on the conduct of the platforms themselves. The order issued by Judge Gonzalez Rogers, discussed above, found that “allegations targeting defendants’ role as publishers of third-party content fell within Section 230’s immunity provisions … includ[ing] features such as providing endless content, distributing ephemeral content, and the timing and clustering of third-party content.”37Tobey et al., supra note 5. In contrast, claims grounded in the platforms’ own conduct or in their creation or development of content were “not protected by Section 230” and “did not escape plaintiffs’ statutory negligence per se claims”; such claims included “defendants’ failure to offer robust parental controls and the timing and clustering of notifications of the defendants’ own content, among others.”38Id.

This reasoning was echoed two years later by Los Angeles Superior Court Judge Carolyn B. Kuhl, who clarified the principle, stating that “Section 230 does not apply as long as the plaintiffs refrain from seeking to hold the provider liable for allowing that content to exist.”39Madlin Mekelburg, Social Media Giants Lose Challenge to Experts Testifying on Harm, Bloomberg Law (Sept. 22, 2025, 20:29 EDT), https://www.bloomberglaw.com/bloomberglawnews/artificial-intelligence/BNA%200000019973fcde8aa7f9f3fc5b4e0001?bna_news_filter=artificial-intelligence [https://perma.cc/ECV7-9GU9]. The distinction between conduct and content acknowledged by the courts suggests a judicial willingness to hold social media companies accountable for the consequences of their design decisions. This interpretation not only narrows the reach of Section 230 but also recognizes the real effects of exploitative design in the modern digital world. By treating algorithmic features as actionable conduct rather than protected speech, courts acknowledge that legal liability must keep pace with the quickly changing nature of harm in the digital age. This shift signals a growing consensus that legal accountability must evolve as technology advances, ensuring that companies cannot rely on broadly applied and sometimes outdated laws to escape responsibility for harmful design practices that have foreseeable impacts on vulnerable users, particularly children and adolescents.

Understanding the distinction between conduct and content at the core of social media addiction litigation clarifies how these claims fit within the traditional product-liability framework. In the Social Media Adolescent Addiction MDL, Judge Gonzalez Rogers held that several alleged design defects, such as inadequate age verification, insufficient parental controls, failure to label filtered images, and needlessly complex account deactivation processes, could render social media platforms products whose design decisions give rise to potential legal liability under traditional theories of product liability, such as negligence.40Steven M. Selna & Brinson Elliot, The Intersection of Social Media, AI, and Product Liability, Benesch Law (July 11, 2025), https://www.beneschlaw.com/resources/the-intersection-of-social-media-ai-and-product-liability.html [https://perma.cc/GG3U-D4C2]. This recognition marks a pivotal step in extending product-liability principles to digital platforms, yet it also raises a fundamental question: can social media addiction itself be treated as a legitimate, compensable harm?

B. The Science Behind the Harm

Critics of the Social Media Adolescent Addiction MDL point out that the court skipped an important foundational inquiry: whether frequent use of social media is an actual addiction, and if so, whether such an addiction is a compensable injury. The medical community has raised concerns about overdiagnosing social media addiction, a condition not recognized by the Diagnostic and Statistical Manual of Mental Disorders (“DSM”), and about the flawed comparison between social media use and substance-based addictions.41Christopher Gismondi & Allen Waxman, Courts Must Stick To The Science On Digital Addiction Claims, DLA Piper (Jan. 28, 2025), https://www.dlapiper.com/en-us/insights/publications/2025/01/courts-must-stick-to-the-science-on-digital-addiction-claims [https://perma.cc/V978-22CR]. Physicians caution that courts should proceed carefully before allowing digital addiction claims to move forward, reminding us of former U.S. Circuit Judge Richard Posner’s observation that “the courtroom is not the place for scientific guesswork, even of the inspired sort. Law lags science; it does not lead it.”42Id.

Although the medical community’s concerns are well-founded, this Article contends that in recognizing liability for harmful design decisions, courts are not attempting to lead scientific inquiry but to respond to it. In permitting relevant research to be introduced as evidence, courts are encouraging reasoned and informed decision-making in litigation. While the science underlying social media addiction is still developing, the existing evidence demonstrates a significant correlation between usage and adverse mental health effects that courts have been willing to acknowledge.

Across the country, courts have become increasingly receptive to expert testimony from pediatricians, neuroscientists, and behavioral psychologists. Recently admitted evidence has included statistical analyses linking school absenteeism and clinical referrals to algorithmic changes.43Goldfarb, supra note 1. For example, New York’s largest school district reported record-high levels of digital-related counseling referrals in the six months following an Instagram algorithm update that amplified viral challenge videos.44Id. Additionally, attorneys have been permitted to bring in experts from multiple fields, including human factors, addiction medicine, and child psychology, not only to showcase causation but also to “teach the jury,” as one trial attorney put it, “how a platform can become a compulsive tool.”45Id. Although progress remains to be made before social media addiction is formally recognized as a compensable harm caused by social media companies’ negligence, the evidence supports the inference that social media use can produce real and measurable harm, even if that harm has not yet been officially classified as a manifestation of addiction. Accordingly, courts act well within reason in permitting such claims to move forward, recognizing that the evolving scientific record warrants continued judicial consideration.

C. Policy Implications

As courts and legislators confront the continually changing nature of social media platforms and the resulting overuse by their users, the central policy challenge lies in finding a balance between innovation and accountability. Overly expansive liability could chill innovation by discouraging social media platforms from developing features that foster engagement. There may also be First Amendment implications if liability blurs the line between regulating platform design and regulating speech itself. In contrast, continued reliance on the broad immunity offered under Section 230 risks leaving minors unprotected from the foreseeable psychological harms of exploitative design choices.

In light of these competing considerations, several policy reforms are plausible. One proposal is a narrow statutory carve-out to Section 230 that limits immunity for design conduct while preserving protection for traditional editorial functions. This would clarify that platforms are not liable for what users post but may be accountable for how their products operate.

Notably, the rise in digital addiction litigation has already inspired positive reactions from social media platforms, which have begun rolling out “usage nudge features,” “expanding parental controls,” and “launching public awareness campaigns about digital wellness.”46Id. Laws that require, or even strongly encourage, this type of social responsibility would integrate concern for user safety and mental health into platform design from the outset, rather than leaving it to be addressed only after harm occurs.

IV. Conclusion

Digital-addiction litigation marks a turning point in how the law understands both social media and corporate responsibility in the digital age. As courts confront claims that focus not on what users post but on how platforms are designed, Section 230’s once sweeping immunity is being tested against the realities of algorithmic influence and psychological harm. Although the science surrounding social media addiction remains unsettled, mounting evidence demonstrates that platform design can produce measurable, adverse effects, particularly among minors. By distinguishing conduct from content, courts have begun to adapt long-standing doctrines of product liability to a new technological context, acknowledging that as technology advances, courts must adapt alongside it to ensure that parties cannot conceal harmful conduct behind legal doctrines that no longer align with today’s realities. Ultimately, recognizing design-based liability does not stifle innovation; it ensures that technological progress proceeds responsibly. As society becomes increasingly shaped by digital environments, the law must continue to evolve, not to punish creativity, but to safeguard the well-being of those who engage with it.


