A new proposal backed by bipartisan senators would require social media platforms to take additional measures to keep children under 16 safe online.
The Kids Online Safety Act, which will be introduced by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) on Wednesday, builds on months of momentum in Congress to hold social media platforms accountable for risks posed to children — especially after a Facebook whistleblower released internal company documents.
The legislation calls for social media platforms to provide families with tools and additional transparency aimed at protecting children online.
For example, the legislation would require platforms to limit the ability of individuals to contact or find a minor, as well as prevent individuals from viewing a minor’s personal data collected by or shared on the platform.
It would also require companies to limit features that “increase, sustain or extend the use” of the platform by a minor, such as automatic playing options or "rewards" for time spent on the platform, and to allow minors to opt out of algorithmic recommendation systems.
In some cases, platforms already offer options to control these features. The bill, however, would require that the “strongest settings” be enabled by default for minors.
The bill would also establish a duty of care for platforms to prevent and mitigate harm to minors, including the promotion of self-harm, suicide, eating disorders and substance abuse. The bill would require platforms to perform an annual audit assessing risks to minors and whether the platform is taking steps to prevent those harms.
Lawmakers have also bashed social media companies for a lack of transparency, especially surrounding their internal data on the impact on teens and kids. The proposal would provide academic researchers and nonprofit organizations with access to data from platforms to research harms to the safety of minors.
“The Kids Online Safety Act would finally give kids and their parents the tools and safeguards they need to protect against toxic content—and hold Big Tech accountable for deeply dangerous algorithms. Algorithms driven by eyeballs and dollars will no longer hold sway. I will fight for swift passage alongside Senator Blackburn, my partner in this effort,” Blumenthal said in a statement.
Blumenthal has backed other bills in the past aimed at increasing safety for kids online, such as the Kids Internet Design and Safety Act he reintroduced with Sen. Ed Markey (D-Mass.) last year. But that legislation, unlike the new bill with Blackburn, does not have support across the aisle.
“In hearings over the last year, Senator Blumenthal and I have heard countless stories of physical and emotional damage affecting young users, and Big Tech’s unwillingness to change. The Kids Online Safety Act will address those harms by setting necessary safety guiderails for online platforms to follow that will require transparency and give parents more peace of mind,” Blackburn said in a statement.
Lawmakers have at times targeted Meta, the parent company of Facebook and Instagram, in the push to revamp kids' online safety laws, particularly through the Senate Commerce consumer protection subcommittee, chaired by Blumenthal. The panel held a series of hearings last year with the Facebook whistleblower, as well as hearings solely with executives from Facebook and Instagram.
But the panel has also taken a broader, industry-wide look at concerns around kids' safety, even pulling in executives from apps not often called in to testify before Congress, including TikTok and Snapchat.
The legislation would seemingly apply to those apps in addition to Meta’s dominant social media platforms. It defines a “covered platform” as a commercial software application or electronic service that connects to the internet and that is used, “or is reasonably likely to be used,” by a minor.