The Kids Online Safety Act: A Critical Overview
The Kids Online Safety Act (KOSA) is proposed federal legislation that would require social media and other online platforms to protect minors from digital harms. In essence, KOSA would impose a “duty of care” on covered platforms — including social media, messaging apps, online games, and streaming services — requiring them to implement safeguards for users under 17. Platforms would be expected to exercise reasonable care in the design and use of features that increase minors’ engagement, in order to prevent and mitigate potential harms such as severe harassment and mental health risks.
Background
The bill would mandate default privacy settings and parental control tools for young users, while also restricting platforms from conducting market research on children under 13 without parental consent. Another major provision focuses on algorithm transparency: platforms that use personalized recommendation systems would have to notify users and provide an option to switch to a non-personalized feed. Enforcement would be handled by the Federal Trade Commission and state attorneys general.

KOSA has bipartisan roots but a turbulent history in Congress. The proposal gained traction following public concern over the mental health effects of social media on teens, particularly after whistleblower disclosures in 2021 highlighted internal research linking social media use to anxiety, depression, and self-harm. Senators Richard Blumenthal of Connecticut and Marsha Blackburn of Tennessee first introduced the bill in early 2022. Momentum grew during the 118th Congress, when the Senate Commerce Committee advanced the bill and the Senate later approved it by an overwhelming margin. Despite bipartisan Senate support and signals from the White House backing child-safety legislation, the House of Representatives did not bring the bill to a vote before adjourning, leaving it stalled.

In the 119th Congress, KOSA was revived with revised versions introduced in both chambers. By late 2025, a House subcommittee had advanced a new draft alongside several other online-safety bills. Negotiations continue, but the procedural requirement is unchanged: KOSA must pass both the House and Senate and be signed by the President before it becomes law.
Key Players
KOSA’s primary Senate sponsors are Senator Marsha Blackburn (R-TN) and Senator Richard Blumenthal (D-CT). In the House, the bill has been championed by Representatives Gus Bilirakis (R-FL) and Kathy Castor (D-FL). The legislation has drawn broad bipartisan support, though disagreements remain over its scope and constitutional implications. During House committee consideration in late 2025, Democrats raised concerns about weakened protections and federal preemption of state laws, while Republicans argued for a narrower bill designed to withstand legal challenges.

Child-Safety Advocates (Supporters): A coalition of parents, public-health advocates, and child-safety organizations has strongly supported KOSA. Groups representing families affected by online harms have framed the bill as a necessary response to rising rates of anxiety, depression, and online exploitation among young people. Supporters argue that KOSA represents the most serious congressional effort in decades to hold tech platforms accountable for their impact on youth mental health.

Tech Industry: The technology sector’s response has been mixed. Some major companies and industry leaders have publicly endorsed KOSA, positioning it as a reasonable framework for improving online safety. Others, including industry trade groups, have expressed concern about vague standards and the risk of excessive regulation. Lawmakers from both parties have criticized the industry — some arguing platforms have failed to protect children, others warning that companies may exert undue influence over how the law is written.

Civil Liberties and Tech Advocacy Groups (Opponents): Privacy advocates, free-speech organizations, and digital rights groups have emerged as some of KOSA’s most vocal critics. They argue the bill’s broad definitions could pressure platforms to over-censor lawful content in order to avoid liability. Critics warn that this could disproportionately affect marginalized communities and suppress information related to mental health, sexual education, or identity. Conservative and libertarian organizations have echoed these concerns, arguing that the bill creates a de facto censorship regime by imposing an unrealistic liability standard on platforms.
What Happens Next
As of late 2025, KOSA remains under active consideration but is far from becoming law. In the Senate, the latest version of the bill sits with the Commerce Committee, where advocates are pushing for another vote. In the House, an Energy and Commerce subcommittee advanced a revised version of the bill, though it must still clear the full committee and survive a floor vote.

Significant differences remain between the House and Senate versions. The Senate bill retains a broad duty-of-care standard covering a wide range of mental and emotional harms, while the House version narrows its focus to more explicit dangers such as physical violence, sexual exploitation, and drug-related activity. Supporters of the narrower approach argue it avoids constitutional issues, while critics say it undermines the bill’s original purpose.

Political dynamics will play a decisive role. House leadership has shown skepticism toward sweeping online-content regulation, and internal GOP divisions could delay or derail a floor vote. If the bill advances in both chambers, a conference committee would likely be needed to reconcile the differences in 2026. If it stalls, supporters may attempt to revive its provisions through other legislation.
Why It Matters
The debate over KOSA goes beyond child safety alone. At its core, the bill raises fundamental questions about free speech, privacy, and who should decide what content is appropriate for young people online. While supporters argue the law would finally force platforms to take responsibility for known harms, critics warn it could have sweeping unintended consequences.

By defining harmful design features broadly, KOSA could encourage platforms to censor or suppress lawful content that merely risks engaging minors too heavily. Civil liberties advocates argue this creates strong incentives for overreach, chilling speech that is protected but controversial. Concerns are especially pronounced for vulnerable groups. LGBTQ+ advocates and mental health organizations worry that content related to identity, sexuality, or emotional well-being could be restricted if platforms view it as legally risky. Past experience with online regulation suggests that companies often err on the side of removal rather than nuance, leading to the loss of valuable support resources.

Privacy is another major concern. While the bill avoids explicit age-verification requirements, enforcing its provisions would require platforms to collect more data about users’ ages and behavior. Critics argue this contradicts the bill’s stated goal of protecting minors by expanding data collection and reducing anonymity online.

Finally, there are questions about effectiveness. History shows that children frequently bypass age restrictions, and sweeping regulations may simply push risky behavior into less visible spaces. While there is broad agreement that online harms are real, opponents argue that KOSA prioritizes political signaling over practical solutions.

Ultimately, KOSA sits at the intersection of child welfare and constitutional rights. Its outcome could reshape how social media platforms operate and set a lasting precedent for internet regulation. As lawmakers continue debating the bill, the challenge remains balancing genuine concerns for child safety with the preservation of free expression and privacy in the digital age.