
When my 16-year-old son Mason faced a painful breakup, he turned to TikTok, as many young people do. Like countless others in his generation, he sought positive affirmations and uplifting quotes for solace.
Unfortunately, TikTok’s algorithm directed him toward distressing content that encouraged self-harm and suicide. This led to a tragic event in November 2022 when Mason took his own life.
The chilling discoveries I made on his phone continue to haunt me. It is essential that TikTok and all social media companies face consequences for the harm inflicted by their algorithms and designs.
Recently, I learned of a disturbing development in Congress. Without public debate, lawmakers are quietly paving the way for a sweeping federal moratorium. This moratorium would block states from enacting laws aimed at protecting children from harmful, addictive algorithms.
This legislative measure is concealed within a massive technology package, progressing swiftly and with minimal scrutiny. As a result, most Americans remain unaware of the serious implications at stake. If this moratorium passes, it would essentially prevent states from enforcing any laws related to AI, including those aimed at regulating automated decision-making.
Such a move would obstruct bills designed to empower parents, enforce safe default settings, require age verification, and limit addictive design features. It would effectively suspend laws like Florida’s Online Protections for Minors Act, which aims to restrict minors’ access to harmful materials online.
In addition, it would impede New York’s Stop Addictive Feeds Exploitation for Kids Act, which prevents companies from creating addictive data-driven algorithmic feeds for children and teenagers without parental consent.
Likewise, it would halt progress on Utah’s Minor Protection in Social Media Act, which prohibits companies from recommending minors’ social media accounts to, or sharing their data with, users to whom they are not connected.
The dangers these technologies pose are all too real. For instance, consider Megan Garcia, a grieving mother whose teenage son died after being targeted by AI-generated content. Countless families deal with the aftermath of deepfakes used in sextortion, or concern over children forming isolating relationships with AI companions that promote self-harm.
Currently, AI is utilized to manipulate, addict, and deceive young people. Yet, there are no adequate guardrails in place to protect them.
Proponents of a federal AI framework may assert that it is on the horizon. Yet we have heard similar claims regarding privacy, data protection, and content moderation for two decades. After 20 years of promises that the solution is just around the corner, our children continue to suffer. Another such promise is not a justification for stripping states of their rights.
It is alarming that Congress has not enacted a single law aimed at protecting children online over the past 25 years. Now, legislators want to remove the rights of states that have taken steps to secure children’s safety, all while portraying AI development as a global race that supersedes the welfare of our youth.
Concerned voices argue that a patchwork of state laws will stifle innovation. Yet, the core issue here is accountability. Big Tech companies largely resist having to follow 50 different sets of rules because they desire no rules at all. At the same time, they have found allies within Congress who support silencing states that are willing to take action on behalf of families.
This moratorium is not merely an attempt to preempt sensible legislation aimed at securing the safety of children. It represents an extensive concession to Big Tech, justified in the name of innovation and profit while disregarding the toll on human lives.
Importantly, there is a bipartisan amendment in discussion that seeks to eliminate this moratorium. Such a measure would halt this reckless overreach and allow states to act when federal lawmakers fail. It is crucial for all members of Congress to support this amendment. The urgency is clear: we must prevent another avoidable loss of life due to online threats. Protecting children is not a political issue; it is fundamentally a moral and humanitarian concern.
History has shown what happens when Congress permits Big Tech to self-regulate. The result is inaction. Our children continue to suffer, and families like mine are left grappling with the aftermath.
We demand real protections for our children, rather than backroom maneuvers that undermine the efforts of dedicated individuals who genuinely seek to protect kids—namely, parents, advocates, and state leaders. This proposed moratorium is nothing more than a smokescreen for Big Tech, allowing them to operate unchecked. We must mobilize against this legislation and ensure it is not enacted.