
For more than a decade, children and their families have struggled to seek justice for the detrimental effects of social media. From anxiety and depression to eating disorders, substance abuse, and in heartbreaking cases, even death, the consequences of social media use have become increasingly evident. Starting this week, these families are finally gaining a voice in the courtroom.
This past Tuesday marked the commencement of the first bellwether trial against prominent social media companies in Los Angeles, a pivotal moment that serves as an early test for over 5,000 lawsuits currently lodged against companies like Meta, TikTok, Snap, and YouTube. These lawsuits encompass more than 3,000 cases in California alone, not to mention thousands of additional cases at the federal level.
Significant evidence is surfacing in this monumental case. An internal message exchange among Meta employees has drawn comparisons between Instagram and addictive substances. One employee remarked, “Oh my gosh y’all IG is a drug,” while another quipped, “Lol, I mean, all social media. We’re basically pushers.” Such candid communications expose the troubling awareness within these companies regarding their products’ effects on youth.
This trial is unprecedented. It provides an opportunity to scrutinize the internal decisions made by social media companies that prioritize profit over the welfare of minors. The long-standing protections enjoyed by these companies under Section 230 of the Communications Decency Act, which shields them from liability for content hosted on their platforms, are now facing a critical challenge.
The current wave of litigation takes a novel approach. Rather than blaming specific content, excessive screen time, or parents' supervision decisions, these lawsuits argue that the platforms' design features themselves are intentionally addictive, and that the companies failed to adequately warn users about the potential harms.
The features being scrutinized include infinite scrolling, autoplay, recommendation algorithms that entrap minors in lengthy usage sessions, push notifications, and “likes.” These elements work together to create dopamine-driven feedback loops that keep users engaged, often leading to negative mental health outcomes.
As in earlier lawsuits against the tobacco industry and opioid manufacturers, the essential question in this trial is straightforward: Did these companies knowingly design their platforms to be highly addictive for children while failing to warn about the associated dangers?
Critics of these lawsuits insist that proving causation is fraught with difficulties. They argue that it’s challenging for victims to establish a direct link between their harms and social media use, given the complex factors at play, including personal experiences and personality traits.
However, similar causation arguments were historically made by Big Tobacco and opioid manufacturers, which claimed that addiction arises from a myriad of external influences. Despite those arguments, the litigation ultimately produced substantial settlements for countless individuals harmed by these industries. Observers speculate that the social media lawsuits may follow a similar trajectory.
Recent unsealed documents have disclosed shocking details about how social media companies, including Meta, Google, Snap, and TikTok, designed their products with the deliberate intention of capturing and retaining the attention of youth. Internal communications, strategic presentations, and private studies reveal a troubling focus on the financial benefits of young users. One Meta report noted, “the lifetime value of a 13-year-old is roughly $270 per individual,” emphasizing how crucial this demographic is for sustained engagement.
Additionally, a Meta internal study on teen mental health echoed alarming sentiments, stating, "Teens can't switch off from Instagram even if they want to." The study describes compulsive usage patterns linked to the platform.
The revelations emerging from this trial have prompted public concern. Snap and TikTok settled shortly before the trial commenced, likely in an effort to keep potentially explosive information from becoming public. The settlements signal a reckoning for how these platforms have operated.
Parents are stepping in where lawmakers have faltered. Despite the absence of comprehensive child online safety legislation in the U.S. since 1998, other countries like Australia have enacted social media bans for minors, while France and the UK are exploring similar measures. In the U.S., it appears that parents, backed by legal action, will try to hold social media companies accountable.
This trial stands as a crucial moment of truth for the tech industry, echoing the historic battles against Big Tobacco. The evidence is mounting that social media platforms have an obligation to prioritize the safety and mental health of young users. As the trial unfolds, the public awaits the outcome, hopeful for a safer digital landscape for future generations.