
Why Competition Alone Won’t Bring About a More Inclusive Digital Economy

Illustration by LazingBee, via Getty Images

The current reforms being debated in the US and Europe to tackle the challenges posed by tech giants tend to see more competition as the ultimate cure. But more competition will not help when the competition itself is toxic, when rivals compete to exploit us by discovering better ways to addict us, degrade our privacy, manipulate our behavior, and capture the surplus.


At a 2018 congressional hearing, Facebook’s CEO was asked a simple yet revealing question:

“Would you be comfortable sharing with us the name of the hotel you stayed in last night?”

“Um,” Mark Zuckerberg said before a long pause, “No.” 

The point, of course, is that Facebook (now Meta) and a few other powerful firms know a lot about all of us. According to one study, within a few minutes Facebook could learn more about any of its 2.9 billion users (including their personalities, political attitudes, physical health, and any substance abuse) than their coworkers, friends, parents, or even spouses know. Yet we know relatively little about what personal data Facebook collects, how it uses that data, and with whom it shares it.

We are at the frontiers of the Panopticon, an architectural design conceived by the father of utilitarianism, the 18th-century English philosopher Jeremy Bentham. Imagine a round tower lined with cells. In its center stands the watchman. While the cells have transparent glass, the watchtower's glass is tinted, so a single guard can watch any factory worker or inmate without their knowing they are being monitored. Today, those guards are the data-opolies, which track us across the web, collect data about us, profile us, and manipulate us, all to hold our attention and induce us to buy things we otherwise wouldn't, at the highest price we are willing to pay.

Is this simply paranoia? Consider a conversation Alastair Mactaggart had among friends at a social outing. The San Francisco real estate developer asked an engineer working for Google whether we should be worried about privacy. 

“Wasn’t ‘privacy’ just a bunch of hype?” Mactaggart asked. 

The Google engineer’s reply was chilling: “If people just understood how much we knew about them, they’d be really worried.”

Enforcers, policymakers, scholars, and the public are increasingly concerned about Google, Apple, Facebook, and Amazon and their influence. That influence comes in part from personal data. They're "data-opolies": powerful firms that control our data. That data comes from their sprawling ecosystems of interlocking online platforms and services, which attract users, sellers, advertisers, website publishers, and software, app, and accessory developers.

The public sentiment is that a few companies, in holding so much data, hold too much power. Something is amiss. In a 2020 survey, most Americans were concerned

  • about the amount of data online platforms store about them (85 percent); and
  • that platforms were collecting and holding this data about consumers to build out more comprehensive consumer profiles (81 percent). 

But data is only part of the story. Data-opolies use the data to find better ways to addict us, and to predict and manipulate our behavior.

While much has been written about these four companies' power, less has been said about how to rein them in effectively. Cutting across political lines, many Americans (65 percent) in another survey think Big Tech's economic power is a problem facing the US economy, and many (59 percent) support breaking up Big Tech. Other jurisdictions, including the EU, propose regulating these gatekeepers. Only a few argue that nothing should be done.

In looking at the proposals to date, including Europe's Digital Markets Act and Congress's five bipartisan anti-monopoly bills, policymakers and scholars have not fully addressed three fundamental issues:

  • First, will more competition necessarily promote our privacy and well-being?
  • Second, who owns the personal data, and is that even the right question?
  • Third, what are the policy implications if personal data is non-rivalrous?

As for the first question, the belief among policymakers is that we just need more competition. Although Google's and Facebook's business models differ from Amazon's, which differs from Apple's, all four companies have been accused of abusing their dominant positions using similar tactics, and all four derive substantial revenues from behavioral advertising, whether directly or, in Apple's case, indirectly.

So, the cure is more competition. But, as my new book Breaking Away: How To Regain Control Over Our Data, Privacy, and Autonomy explores, more competition will not help when the competition itself is toxic: rivals compete to exploit us, discovering better ways to addict us, degrade our privacy, manipulate our behavior, and capture the surplus.

As for the second question, there has been a long debate about whether to frame privacy as a fundamental, inalienable right or in terms of market-based solutions (relying on property, contract, or licensing principles). Some argue for laws that provide us with an ownership interest in our data. Others argue for taking nationwide the California privacy law that Mactaggart spearheaded, or for adopting regulations similar to Europe's General Data Protection Regulation (GDPR). But as my book explains, we should reorient the debate from "Who owns the data?" to "How can we better control our data, privacy, and autonomy?"

Easy labels do not provide ready answers. Providing individuals with an ownership interest in their data will not address the privacy and antitrust risks posed by the data-opolies, nor will it give individuals greater control over their data and autonomy. Even if we view privacy as a fundamental human right and rely on well-recognized data minimization principles, data-opolies will still game the system. To illustrate, my book explores the significant shortcomings of the earlier California Consumer Privacy Act of 2018 and Europe's GDPR in curbing the data-opolies' privacy and competition violations.

As for the policy implications of personal data being non-rivalrous (the same data can be used by many firms at once without being depleted), policymakers in the EU and US currently propose a win-win: promote both privacy and competition. The thinking is that with more competition, privacy and well-being will be restored. But that is true only when firms compete to protect privacy. In crucial digital markets, where the prevailing business model relies on behavioral advertising, privacy and competition often conflict. As a result, policymakers can fall into several traps, such as opting for greater competition whenever in doubt.

Thus, we are left with a market failure where the traditional policy responses—define ownership interests, lower transaction costs, and rely on competition—will not necessarily work. Instead, we need new tools to tackle the myriad risks posed by these data-opolies and the toxic competition engendered by behavioral advertising. 

With so many issues competing for our attention, why should we care about data-opolies?

Power! The data-opolies have refined their anticompetitive playbook, and as they extend their prediction and manipulation tools into financial services, health care, insurance, and the metaverse, they'll hold all the cards.

Another reason to care about data-opolies: manipulation. The game here isn't simply to serve us relevant ads. Facebook's patented "emotion detection" tools, for instance, would tap into your computer's or phone's camera to decipher your emotions and better determine your interests. The ultimate aim is to detect and appeal to your fears and anger, and to pinpoint your children when they feel "worthless," "insecure," "defeated," "anxious," "silly," "useless," "stupid," "overwhelmed," "stressed," and "a failure." In one "massive experiment," Facebook seeded the newsfeeds of 689,003 users with depressing or uplifting stories, without those users' knowledge, to see whether users' emotional states could be "transferred to others via emotional contagion, leading people to experience the same emotions without their awareness." That wasn't an isolated case. As my book explores, we are the lab rats in a marketplace of behavioral discrimination: data-opolies already know our personality, whether we have an internal or external locus of control, our willingness to pay, and our impulsivity. And we have little choice but to enter this ecosystem, which they have largely designed and now control.

Third is the toll of addicting us and manipulating our behavior, which is simply too great to ignore. Congress, in an extensive inquiry into the market power of Google, Apple, Facebook, and Amazon, found "significant evidence that these firms wield their dominance in ways that erode entrepreneurship, degrade Americans' privacy online, and undermine the vibrancy of the free and diverse press." The stakes, as then FTC Commissioner (and current head of the Consumer Financial Protection Bureau) Rohit Chopra noted in 2019, are huge:

“The case against Facebook is about more than just privacy—it is also about the power to control and manipulate. Global regulators and policymakers need to confront the dangers associated with mass surveillance and the resulting ability to control and influence us. The behavioral advertising business incentives of technology platforms spur practices that are dividing our society. The harm from this conduct is immeasurable, and regulators and policymakers must confront it.”

If we continue along the current course, the result is less privacy, less innovation, less autonomy, greater division and rancor, and a threatened democracy. In short, as an influential 2020 congressional report observed, "[o]ur economy and democracy are at stake." We cannot afford remedies that, while well-intentioned, do not address the root of the problem. Nor can we ignore the looming privacy/competition clash, which some of the data-opolies are already exploiting.

The challenge, then, is to enact a privacy framework that attacks the source of the problem: the surveillance economy that a few powerful companies have designed for their benefit, at our expense.


The good news, as the book explores, is that there are solutions that can promote our privacy, deter the toxic competition caused by behavioral advertising, and balance privacy and healthy competition when they conflict. The law should allow us to avoid being profiled, to avoid having our data amalgamated, and to avoid personalized recommendations. We should have the right to decide, without penalty and at the outset, what data is collected about us and for what purpose. A revitalized, updated legal framework can promote an inclusive digital economy that advances our privacy, well-being, and democracy. Our lives need not devolve into monetization opportunities.

Once we dismantle the Panopticon, where almost every aspect of our lives (where we are, with whom we spend our time, how we spend that time, and whether we are in a romantic relationship) is tracked, predicted, and manipulated, we can harness the value of data to promote an inclusive economy that protects our autonomy, well-being, and democracy. In short, we can foster a nobler form of competition, one that brings out our best rather than preying on our worst.

Disclosure: The author thanks the University of Tennessee College of Law and the Institute for New Economic Thinking for the research grants for the book.

