More and more countries are passing data protection laws, yet empirical studies show that these laws rarely deliver on their promises. A new paper explains why and proposes three principles Congress should consider as it debates the proposed federal privacy bill.


The proposed American Data Privacy and Protection Act (ADPPA), currently being debated in Congress, has brought discussions of a potential federal data protection law back into the spotlight. If the bill passes, the US would join the large number of countries that have enacted similar legislation—by one account, 157 countries have now passed comprehensive data protection laws, most of them following the tenets of the European Union’s General Data Protection Regulation (GDPR).

Yet, while many of these laws are strong on paper, their track record is poor (to say the least). In a recent article titled “Narrowing Data Protection’s Enforcement Gap,” I comprehensively reviewed empirical studies that measured the real-world impacts of the GDPR and the California Consumer Privacy Act (CCPA). Out of twenty-five analyses, none found meaningful legal compliance. Examples range from a 2019 academic survey finding that 85 percent of Europe’s most accessed websites maintained or increased tracking even after users opted out, to a recent survey finding that only 0.001 percent of California consumers made use of the access rights the law grants them.

The reason for this somewhat dismal performance is that laws like the GDPR and the CCPA contain severe flaws in the design of their enforcement mechanisms. Many rightly criticize the Irish Data Protection Commission—the body that supervises much of the GDPR’s enforcement in Europe. Indeed, the authority is reluctant to use its oversight powers against Facebook, Google, and other large tech companies that contribute significantly to the Irish economy. But the problem is much bigger: the enforcement systems of many data protection laws consistently ignore the ways in which information asymmetries and market power undermine the ability of markets, tort lawsuits, and public regulation to ensure legal compliance.

Many personal data markets are deeply opaque and concentrated. Consumers have no idea how companies collect and process their data, and many of these markets are controlled by one or a handful of large tech companies. These characteristics pose a serious problem for enforcement: markets only discipline firms if consumers can understand the relative price and quality of the products that collect their personal data. Otherwise, consumers cannot use the traditional options of exit (switching suppliers) and voice (complaining to management) to push companies to deliver products that match their preferences. The same can be said of tort lawsuits: if consumers and their lawyers cannot identify problems in products and services or link them to a recognizable legal harm, torts will not be an effective tool for forcing companies to comply with the law.

While public enforcement is another important mechanism, it cannot ensure legal compliance on its own. The expansive jurisdiction of data protection laws—which company does not collect personal data nowadays?—means that privacy regulators face an almost endless workload. A simple comparison shows the magnitude of the task: it took European data protection regulators only 18 months to issue the same number of EU-wide cooperation requests (around 2,500) that their antitrust counterparts issued over more than fourteen years. Between 2016 and 2019, the same data protection agencies received 275,000 individual complaints—by then, their backlog had already grown to around a hundred thousand unresolved cases.

As Congress debates the American Data Privacy and Protection Act, legislators must keep in mind that the priority should be to pass a data protection law that actually delivers. In my paper, I argue that the flaws in the design of data protection laws are not insurmountable. Rather, they demonstrate why those who design online privacy regimes must acknowledge that data markets suffer from severe information asymmetries and market power problems, and must build enforcement systems that anticipate those problems.

Three design principles can help increase the effectiveness of data protection laws.

For starters, the system must multiply monitoring and enforcement resources, ensuring that private parties complement regulators in market oversight. Sophisticated civil-society intermediaries, such as privacy NGOs, independent think tanks, investigative journalism outlets, and class-action plaintiffs, play an outsized role in protecting consumers in opaque and complex markets. These organizations have both the incentives and the capacity to understand the complexity of data collection and to expose violations—let’s use them to our advantage.

Empowering civil-society intermediaries starts with the enactment of a strong private right of action, but it goes further. For the system to work, it must be adequately funded. Yet most privacy NGOs and similar organizations are currently supported by donations, which are irregular and limited in scope. We need to complement these donations with public grants that enable these intermediaries to invest the time and resources needed to hire technical personnel and launch complex investigations. These grants can be funded by the fines agencies collect—indeed, the California Privacy Rights Act foresees exactly that, but the amount awarded is quite limited (3 percent of total fines). Even relatively straightforward data privacy litigation can cost millions of euros or dollars, and civil-society representatives will be tasked with investigating some of the most sophisticated and richest companies in the world. These grants should amount to tens of millions of dollars per year.

Further, data protection laws should establish a dedicated whistleblowing regime that pays whistleblowers for successful reports. Antitrust and corporate-fraud enforcement have long relied on whistleblower programs to encourage insiders to reveal wrongdoing, yet such programs barely exist in data privacy. Insiders revealed many data privacy scandals—from Facebook’s Cambridge Analytica to Amazon’s security shortfalls—and did so even though they are regularly punished for taking such brave action. Research has shown that paying for well-founded reports increases their quality and leads to better detection of fraud. We need to ensure that whistleblowers are rewarded for their courage rather than living in constant fear of punishment.

Finally, data privacy regulators must be much more accountable to civil society. The market power of large digital platforms, the complexity and opacity of many data markets, and the importance of personal data for national security and industrial policy (among other factors) all increase the risk that regulators end up promoting the interests of the industry rather than those of consumers—as the ineffective Irish Data Protection Commission well exemplifies. Most actions of data protection agencies, including opening and closing investigations, should be transparent by default, with narrow exceptions for cases where secrecy is paramount. Data protection agencies should be examples of transparency, not laggards.

More and more governments around the world are deciding that data protection laws are here to stay, and hopefully the United States will soon join the club. But first, we need to improve the design of these laws to ensure that they lead to meaningful improvements in people’s data privacy.
