Executive Summary
With Congress deadlocked on privacy legislation, statehouses have become the front line of data privacy policymaking. But instead of producing strong protections, many states have adopted laws shaped heavily by the technology industry. These laws appear protective while preserving companies’ ability to collect, infer, and monetize personal data at scale.
This report examines seven states — Washington, Virginia, Connecticut, Alaska, Utah, Montana, and Maine — and shows how major tech firms and their affiliated coalitions have used a consistent strategy to steer privacy laws toward a weak national standard.
The first part of that strategy is drafting the bills themselves. Lobbyists routinely supply full bill text and “technical” amendments that closely resemble the Virginia Consumer Data Protection Act, an industry-friendly model with no private right of action, narrow definitions of sensitive data, and attorney-general-only enforcement. When lawmakers introduce stronger, more protective proposals, the industry works behind the scenes to reshape them; when those bills gain momentum anyway, multiple competing alternatives suddenly appear, overwhelming limited staff capacity and dividing political support.

Alongside these procedural tactics, the industry deploys a network of proxy groups that present corporate positions as neutral expertise: trade associations, think tanks, multi-industry coalitions, and “small business” alliances. State chambers of commerce and associations often echo talking points handed to them by the tech-funded U.S. Chamber of Commerce, giving national lobbying campaigns a local voice.
The seven case studies reveal how this playbook operates in practice. In Virginia, an Amazon lobbyist drafted the bill that became the industry’s preferred template. In Connecticut, a strong bill was steadily weakened after coordinated opposition from tech and healthcare coalitions. In Alaska and Utah, lobbyists overwhelmed small, part-time legislatures with minimal staff. Similar tactics were deployed in Washington, Montana, and Maine, with varying degrees of success.
The negative consequences are significant. Weak state privacy laws leave children vulnerable to algorithmic profiling, allow data brokers to continue selling sensitive information to foreign nations, and fuel opaque AI systems that shape what people see, know, and believe. These are not just privacy harms — they affect public health, national security, and democratic participation.
Yet this trajectory is not inevitable. Lawmakers in several states have begun to resist the industry’s model, and public awareness is growing. This report offers recommendations for various stakeholders, arguing that strong privacy laws require strong processes: transparency about who writes the bills, limits on industry’s role in drafting them, and meaningful enforcement mechanisms. The rules governing the data economy must be written in public, for the public — not by the companies that profit from our personal lives.