Today, the Senate Commerce Committee voted unanimously to advance the Kids Online Safety Act (KOSA), setting the stage for a vote before the full Senate. Issue One applauds this critical step and released the following statement from Alix Fraser, director of the Council for Responsible Social Media:
“For far too long, families have been left alone to fight back against multibillion-dollar social media companies that have designed and marketed addictive products to our children. But they can’t make social media safer by themselves, and parents and young people across the country are taking a stand and demanding action. Today’s advancement of the Kids Online Safety Act by the Senate Commerce Committee is a crucial step toward putting responsible safeguards in place to keep our children safe and giving parents much-needed support by bringing greater accountability to the platforms.
“We thank the senators for working in a bipartisan way to address the harmful outcomes caused by social media’s business model. Now we urge the full Senate to swiftly schedule a floor vote on KOSA and work with the House on final passage so this important piece of legislation can be sent to the president’s desk.”
The Kids Online Safety Act was advanced out of markup today in a unanimous, bipartisan vote by the Senate Commerce Committee. The bill was reintroduced in May by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) and currently has 41 co-sponsors from both parties in the Senate, making it one of the strongest bipartisan proposals before Congress right now.
KOSA has also been endorsed by hundreds of advocacy and technology groups, including Council members and partners like Common Sense Media, Fairplay, Design It For Us, Accountable Tech, Eating Disorders Coalition, American Psychological Association, and the American Academy of Pediatrics.
KOSA requires platforms to disable addictive product features, gives children, teens, and families the ability to opt out of algorithmic recommendations, and enables the strongest safety settings by default. The bill also gives policymakers and the American public more visibility into how these platforms work by requiring annual, independent audits and giving researchers access to critical datasets about usage by, and harms to, kids.