Council for Responsible Social Media co-chairs applaud Meta whistleblower; call for congressional action

By bravely coming forward, Arturo Bejar has once again demonstrated that Meta intentionally and knowingly puts profits over kids’ safety, and why Congress must hold the company accountable


Media Contact

Cory Combs

Director of Media Relations

Last night, former Facebook Engineering Director Arturo Bejar revealed internal documents and decision points to the Wall Street Journal, demonstrating that, once again, the tech giant has ignored the wellbeing of children and teens in pursuit of profits.

During his time at the company, Bejar discovered that Meta uses harm detection and classification systems that dramatically underestimate the rates at which Instagram and Facebook users, especially children and teens, experience harms like bullying, sexual harassment and predation, and exposure to eating disorder or self-harm content. While Meta’s internal systems made the company look like a leader in ensuring user safety and satisfaction, Bejar uncovered the truth simply by asking users worldwide about their real experiences.

These disclosures further emphasize why Big Tech companies like Meta cannot be trusted to act in the best interest of users — including and especially children — and why Congress must take action by passing the Kids Online Safety Act (KOSA, S. 1409), the Platform Accountability and Transparency Act (PATA, S. 1876), and other laws to hold these companies accountable. KOSA would make kids’ safety by design a requirement, rather than an afterthought, and PATA would give the public access to the kinds of internal data and decision points that Bejar risked his career to expose.

“While the bravery of Arturo Bejar cannot be diminished, these disclosures shouldn’t come as a surprise to anyone,” said Kerry Healey, the former Republican lieutenant governor of Massachusetts and co-chair of Issue One’s Council for Responsible Social Media (CRSM). “What more evidence do we need that Big Tech platforms like Meta exploit our children’s developmental vulnerabilities for commercial gain? They will not change unless Congress forces them to. That is why we must pass KOSA.”

Spurred by his own daughter’s experience on Instagram, Bejar brought these revelations to the attention of Meta leadership, including CEO Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, and Instagram head Adam Mosseri. Instead of acting on the findings, Meta ignored his proposed design-focused fixes, shut down his research, and fired most of the team behind it.

Bejar’s concerns echo the revelations that emerged from the tens of thousands of internal Meta documents released in 2021 by Frances Haugen, a member of the CRSM. Those documents, which formed the basis for the Wall Street Journal’s Facebook Files, revealed that Meta targeted children and teens with troubling ads and damaging content, spurring what has become a national youth mental health crisis. Meta’s own researchers found that Instagram’s business model and algorithmic feed created “a perfect storm” of eating disorders, body dissatisfaction, loneliness, and depression in teenage girls.

“It shouldn’t take a brave employee risking their career for the American public to learn how these platforms operate, and what impact they’re having on our kids, communities, and democracy,” said Dick Gephardt, former House Majority Leader and co-chair of the CRSM. “For the last two decades, Congress has thrown up its hands and said ‘we trust you’ to Big Tech, despite overwhelming evidence that these companies have never acted in our best interests. Now is the time for Congress to put kids before Big Tech profits. Now is the time for fundamental transparency and accountability measures to ensure these companies are no longer allowed to grade their own homework.”

According to a new poll commissioned by Issue One, 67% of Americans believe Congress needs to do more to hold Big Tech companies accountable for the harms caused by their social media platforms. The poll also found that Americans from both parties overwhelmingly agree (76%) that social media companies have a responsibility to design their platforms in a way that protects the mental health of children, even if those practices limit profits.

Background:

The Kids Online Safety Act (KOSA) is the leading, bipartisan bill to protect children and teens online. It has 47 cosponsors, split nearly evenly between Republicans and Democrats, making it an extremely rare case of bipartisan leadership. KOSA is currently awaiting a full vote by the Senate.

KOSA requires platforms to disable addictive product features, gives children, teens, and families the ability to opt out of algorithmic recommendations, and enables the strongest settings by default. The bill also gives policymakers and the American public more visibility into how these platforms work by requiring annual, independent audits and giving researchers access to critical datasets about usage by, and harms to, kids.

KOSA has also been endorsed by hundreds of advocacy and technology groups, including Issue One, the Council for Responsible Social Media, Fairplay, Common Sense Media, Design It For Us, Accountable Tech, the Eating Disorders Coalition, the American Psychological Association, and the American Academy of Pediatrics.

The Platform Accountability and Transparency Act — proposed by Senators Chris Coons (D-DE) and Bill Cassidy (R-LA) — would create a framework for independent, vetted researchers to analyze the design and operations of the major social media and AI platforms.