This week, the Federal Trade Commission (FTC) proposed blanket prohibitions that would prevent Meta — the parent company of Facebook, Instagram, and WhatsApp — from profiting from data it collects from users under the age of 18. In practice, this action means that these platforms would no longer be able to direct data-driven advertising to minors.
“This is a landmark action by the FTC,” said Council for Responsible Social Media Co-Chair and former U.S. House Majority Leader Dick Gephardt. “Meta’s business model relies on the pervasive and often imperceptible collection of user data, which is then used to hook our attention and keep us on these platforms longer. For kids, this model has had disastrous consequences, from widespread addiction to a growing loneliness epidemic. Meta has repeatedly shown that it cannot be trusted to act in the best interest of minors. I applaud the FTC for acting accordingly and ensuring that America’s children are protected from predatory practices.”
Kerry Healey, former Lt. Governor of Massachusetts and Co-Chair of the Council, added: “Today’s FTC proposal is a major step forward for oversight and accountability of Big Tech’s experiment on our children for profit. But we know that Meta is not alone in its extractive and manipulative data practices. That’s why Congress must establish comprehensive data privacy protections and meaningful children’s safety provisions, like the American Data Privacy and Protection Act (ADPPA) and the Kids Online Safety Act (KOSA), that apply to social media platforms in all 50 states, creating a stronger foundation on which we can build a safe and healthy online environment.”
According to the FTC, this proposed action comes after repeated violations by Meta of a 2020 privacy order. This is the third time the FTC has taken action against Meta (previously Facebook) for allegedly failing to protect users’ privacy. A document created by Fairplay, a member organization of the Council for Responsible Social Media, details Meta’s long history of failing to protect the safety and privacy of children and teens.
The FTC’s proposed action against Meta would also limit the company’s use of facial recognition technology, extend existing privacy compliance requirements to companies Meta merges with or acquires, and pause the launch of new products, services, and features until Meta confirms that these changes comply with the Meta-FTC privacy agreement.
The crosspartisan Council for Responsible Social Media, a project of Issue One, came together to advance reforms that will make social media platforms safer for children, more protective of users’ privacy, and more transparent to policymakers and regulators. Learn more about the Council for Responsible Social Media.