Policy Deep Dive: Online Safety for Children
Mar 13, 2025

In this special policy series, the Caveat team is taking a deep dive into key topic areas that are likely to generate notable conversations and political actions throughout the next administration. This limited series focuses each month on a different policy topic area, providing our analysis of how the issue is currently being addressed, how existing policies may change, and thought-provoking insights along the way.

For this month's conversation, we're focusing on the online safety of children. We review the substantial changes of the past several years and assess which key topic areas still need to be addressed.

To listen to the full conversation, head over to the Caveat Podcast for additional compelling insights. 

Key insights.

  1. KOSA and COPPA 2.0. Though neither has formally passed, both bills would significantly overhaul how children are treated online.
  2. Government Overreach. Despite growing support for reining in social media platforms, critics have cited how various measures could infringe on free speech and minors' rights.
  3. State Efforts. Due to federal inaction, states have begun to establish precedents for handling minors online.
  4. Securing Online Safety. While some efforts have been made at the state level, online child safety will likely remain a hotly contested issue throughout 2025.

Managing Children’s Online Presence.

For several years, pressure has continued to mount, calling on lawmakers to increase protections for children and regulate social media companies.

Last year, the pressure on lawmakers to address the dangers of social media came to a head after Senate members met with the Chief Executive Officers (CEOs) of several major platforms. During this hearing, lawmakers on the Senate Judiciary Committee heard troubling testimony from parents and children alike about how these platforms had directly harmed them, with senators across the political spectrum calling for reforms. These reforms covered a variety of issues, such as limiting data collection on minors, implementing stronger parental controls, and increasing the duty of care these companies owe when protecting children.

While troubling, these testimonies reflect a growing movement to rein in social media companies that many feel are exploiting children for increased engagement and profits, regardless of the harm children are exposed to. Furthermore, these pressures have already begun to yield tangible results, with notable support backing both the Kids Online Safety Act (KOSA) and the Children’s Online Privacy Protection Act 2.0 (COPPA 2.0).

While each of these bills is complex and has numerous intricacies, it is critical to understand their core aspects. KOSA was introduced in 2022 by Senators Blackburn and Blumenthal and has five key components:

  1. Creating a duty of care under which internet companies can be held liable for recommending harmful content to children.
  2. Requiring social media platforms to introduce more options to protect children’s information, disable addictive features, and opt out of personalized algorithmic recommendations.
  3. Requiring social media companies to limit others from communicating with minors.
  4. Ensuring that minor accounts are set to have the safest possible settings by default.
  5. Allowing the Federal Trade Commission (FTC) to sue tech companies over misconduct rather than leaving that responsibility to state attorneys general.

Outside of KOSA, COPPA 2.0 functions as an update to a 1998 law and would amend and introduce the following components:

  1. Expanding COPPA’s protections to cover minors under seventeen rather than under thirteen.
  2. Mandating a “data minimization” requirement that directs companies to collect only the information they need to ensure services are functional.
  3. Banning targeted advertising to minors unless explicit parental consent is given.
  4. Granting parents a right to erasure, giving them the ability to request the deletion of their child’s collected personal information.
  5. Expanding parental rights, allowing parents to obtain information about how their children have interacted with social media sites.

While slightly different, these two measures represent the two most comprehensive federal attempts to date to address children’s safety on the internet. However, despite this momentum and the fact that both of these bills were passed by the Senate last June in a ninety-one to three vote, these bills have seen remarkably little traction since entering the House of Representatives.

Thinking Ahead: 

What potential issues would critics have with these measures and how could Congressional representatives address these concerns?

The Roadblocks to Child Safety.

Despite significant bipartisan support, critics have raised notable concerns about government overreach and censorship.

Since reaching the House, KOSA and COPPA 2.0 have remained stagnant, with Speaker of the House Mike Johnson expressing notable concerns. For example, Speaker Johnson commented on how “when you’re… dealing with the regulation of free speech, you can’t go too far and it will be overbroad.” Speaker Johnson’s views echo similar concerns found across the political spectrum from those who believe these bills could impact free speech and result in the federal government overstepping its legal authority. While these concerns are multi-faceted, they can be summarized at a high level as relating to:

  • Impacts on free speech
  • Real-world effectiveness
  • Potential for government overreach

Outside of Speaker Johnson, the American Civil Liberties Union (ACLU) echoed similar views. Jenna Leventoff, a senior policy counsel, commented on these measures, emphasizing how the bills “compound nationwide attacks on young people’s right to learn and access information, on and offline.” Leventoff continued her criticism, stating that “the last thing students and parents need is another act of government censorship deciding which educational resources are appropriate for their families.”

The Electronic Frontier Foundation (EFF), a non-profit digital rights group, took a similar stance. In its criticism, the EFF emphasized how KOSA’s core component, its duty of care provision, would likely “lead to broad online censorship of lawful speech.” Moreover, the EFF cited two notable Supreme Court cases, Smith v. California and Bantam Books, Inc. v. Sullivan, in which the court struck down efforts that would have prevented booksellers from distributing certain speech, to demonstrate how KOSA would most likely be ruled unconstitutional.

Outside of censorship concerns, the EFF also questioned the real-world effectiveness of KOSA. The EFF argued that “in practice, KOSA is likely to exacerbate the risks of children being harmed online because it will place barriers on their ability to access lawful speech about addiction, eating disorders, bullying, and other important topics.” Furthermore, the EFF compared KOSA to another bill, known as FOSTA/SESTA, which aimed to prevent harms related to human trafficking but in practice proved ineffective and created additional dangers for victims.

One of the last key points of contention revolves around enforcement. As it stands, KOSA and COPPA 2.0 would empower the FTC to regulate online platforms and manage harmful content. However, critics have pointed out how this dynamic could create significant imbalances regarding enforcement. The Yale Journal on Regulation analyzed this dynamic, noting how the FTC could not be trusted to be a dispassionate and scholarly enforcer. Furthermore, the vague language in these bills makes what is considered harmful to a minor highly subjective. When this ambiguous language is coupled with the FTC’s inherent biases, it should not be surprising that critics have concerns about how different administrations could broadly censor different forms of online speech.

While it is uncertain if KOSA and COPPA 2.0 can be revised to better address these concerns, it is clear that these criticisms are not unfounded nor are they exclusive to one side of the debate. Given this federal gridlock, states have begun to take the initiative when it comes to solving this problem by implementing legislation that aims to tackle many of the problems that these federal efforts have failed to address. 

Thinking Ahead: 

What changes can be made to either of these laws to help reduce critics’ concerns?

States Establishing Precedent.

With the lack of any substantial federal policies, various states have begun to pave the way by imposing their own versions of online child safety measures.

Given how both KOSA and COPPA 2.0 have hit a standstill at the federal level, state legislatures have taken the initiative to lead these regulatory efforts. Of all the state bills, perhaps the most impactful are Florida’s HB 3 and California’s SB 976.

HB 3, otherwise known as the “Online Protection for Minors Act,” was signed into law last year in Florida and is considered one of the strictest online child safety laws. While it is unclear how the state will enforce the bill, the new law mandates several major regulations for social media companies and minors online. These efforts include the following:

  • Banning all children under fourteen from creating social media accounts.
  • Requiring children under sixteen to have parental consent to create social media accounts.
  • Requiring age verification systems to be in place to prevent minors from accessing harmful materials.
  • Requiring age verification systems on platforms with explicit content for those under eighteen.

Like KOSA and COPPA 2.0, this bill’s creation was motivated by efforts to reduce the dangers to minors online and empower parents to better protect their children. When signing the bill, Florida’s Governor Ron DeSantis stated how “ultimately, [we’re] trying to help parents navigate this very difficult terrain that we have now with raising kids.”

In conjunction with HB 3, California passed a similar bill, SB 976, otherwise known as the “Protecting Our Kids from Social Media Addiction Act.” While the two bills are rooted in similar premises, they approach the problem differently. Through SB 976, California implemented the following measures:

  • Mandating parental consent to allow minors to use addictive feeds.
  • Implementing time restrictions for when notifications can be sent to minors.
  • Ensuring that minors’ accounts are set to private by default.

Outside of HB 3 and SB 976, another major move involved a bill that was passed just last week in Utah. Utah’s SB 142, better known as the App Store Accountability Act, is set to take effect on May 7 of this year. Under this law, Utah will require app stores to verify users’ ages and obtain parental consent before minors can download applications. After the bill was signed into law, Utah state Senator Todd Weiler stated that he was “pleased that the majority of [his] colleagues…voted to protect children from accepting sometimes predatory terms and conditions when downloading apps that may collect and sell their personal data.”

Alongside these three bills, there have been several other notable state initiatives. These include the following state legislation:

  • SB 351: Georgia’s “Protecting Georgia’s Children on Social Media Act” requires education boards to design and implement policies that better address cyberbullying and how social media companies adversely impact a minor’s mental health. Additionally, these measures mandate that children under sixteen have parental permission to create a social media account.
  • LB 1092: Nebraska’s “Online Age Verification Liability Act” requires commercial entities to use reasonable age verification methods before showing content to minors that could be considered harmful. 
  • SB 396: Arkansas’s “Social Media Safety Act” requires social media companies to implement age verification measures before allowing people to create accounts and mandate users under eighteen to have parental consent to create an account.

While each of these state laws is limited in scope and jurisdiction, each has helped create a legal precedent that could spur greater efforts at the federal level. Which state laws prove the most effective and which survive the inevitable legal challenges will likely influence which measures are considered at the federal level.

Thinking Ahead: 

How can these state measures influence federal efforts?

Protecting Minors or Legal Rights?

Balancing children’s online safety against the right to free speech is a difficult problem that will likely not be fully resolved anytime soon.

When assessing the current state of children’s online safety, it is clear that states have attempted to address some gaps, but more comprehensive momentum is still needed at the federal level. However, despite these obvious needs, the hurdles involved in addressing critical issues like free speech and enforcement are substantial and unlikely to be easily rectified.

Nonetheless, it is unlikely that this issue will be neglected for long, as Republican leadership has signaled that it is a priority for 2025. For example, despite his criticism of the bills, Speaker Johnson emphasized his willingness to continue working to improve them. Alongside Speaker Johnson, other prominent figures on the right have also voiced their support for the initiative, including Elon Musk and Donald Trump Jr., who emphasized their belief that the bills’ goals could be achieved without hindering free speech if the bills were revised. If these bills were amended to resolve the existing criticisms, the following provisions would most likely need to be addressed:

  • Providing stronger and more narrowly defined definitions of harmful content.
  • Removing the FTC’s power to force platforms to moderate specific forms of content.
  • Ensuring that harmful content is monitored and addressed equally across all online platforms rather than solely targeting social media platforms.

Given these challenges, it is clear that President Trump will have an uphill battle when it comes to finding solutions to these concerns while still ensuring that KOSA and COPPA 2.0 can be effective once implemented. 

Thinking Ahead: 

What changes could a Trump-led government implement to pass comprehensive online child safety laws?