Meher Sethi argues that a little-noticed provision in the federal budget recently passed by the House will gut state laws protecting consumers from algorithmic price-fixing.
Tucked deep within Congressional Republicans’ sweeping federal budget package—the so-called “One Big Beautiful Bill Act”—lies a small but explosive provision that could dramatically reshape the landscape of artificial intelligence (AI) regulation in the United States. The original measure authored in the House of Representatives effectively nullifies all state and local laws governing AI for the next ten years under the guise of promoting a pro-innovation, laissez-faire approach to AI development. The Senate parliamentarian has ruled that a modified version of the provision qualifies for inclusion in the budget reconciliation process, meaning it can pass with a simple majority. If enacted, it would dismantle hard-won progress at the state and municipal level on some of the most pressing risks posed by AI—from algorithmic discrimination in hiring and healthcare to restrictions on deceptive deepfakes.
One significant yet largely unnoticed consequence of the provision is that it would tear down some of the only effective fortifications erected thus far against algorithmic collusion, whereby competing firms coordinate their prices not in the proverbial “smoke-filled backroom” but through a shared software system. Five large cities have enacted ordinances banning collusive rent-setting software in the rental housing industry, and similar measures have been introduced in virtually every state legislature across the country. Meanwhile, case-by-case adjudication has faltered and federal legislation has failed to rally substantial political support.
Americans are facing an affordability crisis. In 2025, 57% of Americans are living paycheck-to-paycheck. Meanwhile, the Republican budget bill handicaps state initiatives to protect citizens from a phenomenon credibly linked to rising rents and higher grocery bills. For an administration that purports to champion American workers and consumers, this gift to code cartels is disappointing. Lawmakers who claim to stand for working families and free markets should strike the provision before its consequences become entrenched and irreversible.
The pervasiveness of code cartels
Algorithmic collusion manifests in a few different ways. It can be as straightforward as firms explicitly agreeing to use a software algorithm to calculate and implement a price-fixing scheme. In the 2015 case United States v. Topkins, the Department of Justice secured a guilty plea from an online poster seller who’d agreed with competitors to use software to determine and set collusive prices in an otherwise naked conspiracy.
Or it can take the form of more subtle, noncommunicative conduct. In its 2023 lawsuit Federal Trade Commission v. Amazon, the FTC accused Amazon of deploying an algorithm called Project Nessie to predict exactly when its rivals would match price hikes on its platform. Project Nessie effectively facilitated precise and sustainable tacit collusion: Amazon could raise its prices knowing that rivals would match rather than undercut them.
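To make that mechanism concrete, consider a minimal sketch of predict-then-hike pricing logic. This is purely illustrative (it is not Amazon’s actual algorithm), and every identifier and parameter in it (`predict_follow_probability`, `set_price`, the 0.8 threshold) is a hypothetical stand-in:

```python
# Hypothetical sketch of predict-then-hike tacit collusion (illustrative
# only; not Amazon's actual Project Nessie). The seller raises its price
# only when a model predicts rivals will match rather than undercut.

def predict_follow_probability(rival_price_history: list[float],
                               proposed_price: float) -> float:
    """Toy stand-in for a rival-response model: estimate the chance
    rivals will match a price at or above proposed_price, based on how
    often their past prices reached that level."""
    if not rival_price_history:
        return 0.0
    matches = sum(1 for p in rival_price_history if p >= proposed_price)
    return matches / len(rival_price_history)

def set_price(current_price: float,
              rival_price_history: list[float],
              hike: float = 0.05,
              threshold: float = 0.8) -> float:
    """Raise price by `hike` only if rivals are predicted to follow;
    otherwise hold. No communication with rivals ever takes place."""
    proposed = current_price * (1 + hike)
    if predict_follow_probability(rival_price_history, proposed) >= threshold:
        return proposed       # rivals expected to match: the hike "sticks"
    return current_price      # rivals expected to undercut: hold the line

# Rivals have historically priced at or above $10.50, so a hike from
# $10.00 to $10.50 is predicted to stick and goes through.
history = [10.5, 10.6, 10.5, 10.7, 10.5]
print(set_price(10.0, history))  # -> 10.5
```

Nothing in the sketch involves communication between competitors; the anticompetitive work is done entirely by the prediction.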
Yet the most widespread and potent form of algorithmic collusion is “hub-and-spoke” algorithmic collusion, where competing firms (the “spokes”) rely on a common decision-making algorithm (the “hub”) to set their strategies and terms of trade—most notably, prices. RealPage, a provider of rental property management software, faces antitrust lawsuits from private plaintiffs, state attorneys general, and the DOJ for facilitating collusion among landlords. Its flagship software, “AI Revenue Management,” ingests real-time, competitively sensitive, nonpublic data on leases, tenants, and rental unit characteristics by making over 50,000 monthly calls to surveil over 11 million rental units. It then uses that data to generate daily rent pricing recommendations for rival landlords operating in the same metropolitan housing markets.
These pricing instructions are not merely suggestive. By default, they’re auto-implemented via an “auto-accept” feature. Property managers who want to override them are required to submit detailed business justifications, which are reviewed by “pricing advisors” at RealPage who monitor adherence and escalate noncompliance to regional leadership. Unsurprisingly, RealPage’s price “suggestions” have a 90% adoption rate. The result: landlords comfortably implement rent hikes engineered by a central authority, knowing that their competitors—also RealPage clients—won’t undercut them.
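The structure is easy to see in stylized form. The sketch below is a deliberately simplified, hypothetical model of a hub-and-spoke pricing system with an auto-accept default. It is not RealPage’s software, and every name and number in it (`Hub`, `Spoke`, the 3% markup rule) is an assumption made for illustration:

```python
# Hypothetical hub-and-spoke pricing sketch (illustrative only; not
# RealPage's actual software). A "hub" pools nonpublic data from rival
# landlords (the "spokes") and pushes back coordinated rent
# recommendations that are auto-accepted unless a manager overrides.

from dataclasses import dataclass, field

@dataclass
class Spoke:
    landlord: str
    current_rent: float
    nonpublic_lease_data: dict      # e.g., occupancy, lease terms
    auto_accept: bool = True        # the default setting

@dataclass
class Hub:
    spokes: list[Spoke] = field(default_factory=list)
    escalations: list[str] = field(default_factory=list)

    def recommend(self) -> dict[str, float]:
        # Toy "joint profit" rule: push every rival toward the highest
        # rent observed in the pooled confidential data, plus a margin.
        ceiling = max(s.current_rent for s in self.spokes)
        return {s.landlord: round(ceiling * 1.03, 2) for s in self.spokes}

    def apply(self) -> None:
        for landlord, rent in self.recommend().items():
            spoke = next(s for s in self.spokes if s.landlord == landlord)
            if spoke.auto_accept:
                spoke.current_rent = rent   # implemented by default
            else:
                # Overrides require justifications reviewed by the hub,
                # which escalates "noncompliance" to leadership.
                self.escalations.append(f"{landlord} declined ${rent}")

hub = Hub([Spoke("Landlord A", 1800.0, {}),
           Spoke("Landlord B", 1950.0, {}),
           Spoke("Landlord C", 1875.0, {})])
hub.apply()
print([s.current_rent for s in hub.spokes])  # all converge to 2008.5
```

The point is structural: no landlord ever speaks to another, yet rents converge upward because a single hub sets them from pooled confidential data, and the auto-accept default makes deviation the exception rather than the rule.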
RealPage executives are refreshingly candid about their goal: “driving every possible opportunity to increase price.” As one vice president put it: “There is greater good in everybody succeeding versus essentially trying to compete against one another.” White House economists estimated that RealPage inflates rents by $100 to $200 per month in high-demand cities like Atlanta, Denver, Dallas, and Washington, D.C., costing renters across the United States at least $3.8 billion in 2023. When asked what role he thinks RealPage has played in rent increases of almost 14.5% in some markets, RealPage VP Andrew Bowen responded, “I think it’s driving it, quite honestly.”
This phenomenon isn’t confined to rental housing. The DOJ recently filed a Statement of Interest supporting a private lawsuit against a RealPage doppelganger in the health insurance industry: MultiPlan—recently rebranded as Claritev amid mounting legal scrutiny. This code cartel is the modern reincarnation of a prior conspiracy that operated from 1996 to 2008, when major insurers used a shared pricing database run by UnitedHealthcare, known as Ingenix, to suppress payments for out-of-network care. That scheme resulted in provider underpayments of up to 28% until it was exposed and dismantled through a hefty settlement and an agreement to abandon the database—for just five years.
Once that injunction expired, MultiPlan hired many of the same executives who ran the Ingenix operation and launched an algorithmic hub-and-spoke pricing system called Data iSight. Like its predecessor, the system centralizes pricing decisions to suppress out-of-network reimbursement rates on behalf of the majority of U.S. insurers, including the fifteen largest in the country. In 2023 alone, Data iSight suppressed reimbursements to providers by an estimated 61% to 81%, amounting to $22.9 billion in underpayments.
These distortions leave patients to shoulder the difference, driving up out-of-pocket costs for out-of-network care. The result has been the closure of many medical practices and reduced access to life-saving treatment. Even in-network providers are harmed: with out-of-network options weakened, their bargaining leverage erodes, pushing their own rates downward.
Code cartels are replicating this playbook across markets for essential goods and services. In agriculture, Agri Stats—the second-most-sued company in the industry—aggregates confidential, competitively sensitive data from 90% of U.S. pork processors, 95% of turkey processors, and 97% of chicken processors. It then distributes hundreds of pages of detailed reports, often paired with pricing “consulting” that one processor summarized in four simple words: “Just raise your price.” For consumers, that means inflated meat prices at the grocery store: During the alleged conspiracy, profit margins in turkey processing tripled within three years and wholesale pork prices surged by more than 50% over a five-year period following nearly a decade of stable pricing. The vehicle tire and potato industries face similar allegations, each relying on centralized algorithmic tools to coordinate pricing across competitors without the need for a traditional agreement.
From housing to healthcare to groceries, code cartels are at the helm of the affordability crisis in America.
Algorithmic collusion in court
Litigation against code cartels has largely come up short. Over a century of antitrust doctrine has trained courts to look for an explicit “agreement” among competitors—think wiretaps capturing shady meetings in smoke-filled backrooms, like the hundreds of hours of cartel talk recorded by Mark Whitacre, the executive-turned-FBI-whistleblower (played by Matt Damon in The Informant!) who helped expose a global price-fixing conspiracy in the lysine industry in the 1990s.
However, third-party algorithms make such communication obsolete, blurring the traditional meaning of “agreement” and complicating its application. In the first case of its kind, Meyer v. Kalanick, the plaintiff alleged that Uber engaged in hub-and-spoke price-fixing through its algorithm, which sets prices for drivers, who are technically independent contractors. The case was sent to arbitration, where the arbitrator dismissed the claim on the almost comically thin grounds that Uber drivers “are a diverse lot, not generally knowing each other’s names or identities.” Common pricing by a platform like Uber may have redeeming efficiency justifications, but the arbitrator made no inquiry into those or any other competitive effects, instead relying on the formalistic markers of a traditional conspiracy. Two recent private lawsuits, involving casino hotels in Las Vegas and Atlantic City that allegedly used a common algorithmic price-setting tool called Rainmaker, were dismissed on the same logic. Despite factual allegations that Rainmaker incorporated competitor pricing to jointly maximize profits for rival firms, the presiding judges held that the plaintiffs failed to adequately allege how competitors came to an explicit “agreement”: “who, did what, to whom (or with whom)… and when.”
These decisions read into the Sherman Antitrust Act a myopic interpretation of “agreement” drawn from the traditional common law of conspiracies. The word “agreement” is written nowhere in any antitrust statute, and the Supreme Court held in American Tobacco Co. v. United States that “no formal agreement is necessary to constitute an unlawful conspiracy.” Nonetheless, the traditional insistence on proving explicit agreement has been jurisprudentially sound. But that calculus shifts in the context of algorithmic collusion. Here’s why.
The legal command of the Sherman Act is clear: protect competition. Judge Robert Bork helpfully defined competition as “any state of affairs in which consumer welfare cannot be increased by judicial decree” (emphasis added). Therefore, the traditional judicial approach to supracompetitive parallel conduct absent evidence of agreement—or tacit collusion—is sensible: while an undesirable outcome from the point of view of consumer welfare, no practicably enforceable injunctive relief can remedy that harm. Is the court to police the innocuous act of observing a rival’s prices and responding strategically? Is it to impose judicial price controls? Either would be more objectionable and distortionary than the anticompetitive harm created by traditional tacit collusion. Hence, courts have traditionally been best off relying on evidence of an explicit agreement to find firms liable for collusion.
Notice, though, that algorithmic collusion breaks this logic. It introduces a tractable mechanism for enforcement and remediation against collusive outcomes. When an algorithm orchestrates collusive pricing, the algorithm itself becomes a focal point for remedial intervention—not through controlling prices directly, but by regulating the underlying software and structural relationships that enable collusion.
Clarifying the law through legislation
Given the courts’ inertia in applying the Sherman Act’s core logic, as authored by Congress, to this novel phenomenon, Congress should clarify the law on code cartels to guarantee consistency with the principles of competition. In 1914, Congress passed the Clayton Act and the Federal Trade Commission Act to strengthen the antitrust laws in response to what it viewed as enfeebling judicial interpretations of the Sherman Act. A similar legislative intervention is warranted today.
That is precisely the aim of the Preventing Algorithmic Collusion Act, introduced by Senator Amy Klobuchar in 2024. The bill would presume the existence of an “agreement” when an algorithm, trained on nonpublic data, sets prices on behalf of multiple competitors in a given market. Rather than adhering to a formalistic conception of “agreement,” the bill embraces a functional definition—one articulated by the Supreme Court in Copperweld Corp. v. Independence Tube Corp., which held that an “agreement” exists when conduct “deprives the marketplace of the independent centers of decision making that competition assumes and demands.”
A centralized entity—whether an AI algorithm or a guy named Bob—that directly sets prices for rival firms to maximize joint profits tautologically satisfies that definition. Hence, the bill rightly presumes “agreement” in treating such hub-and-spoke code cartels as illegal per se—that is, illegal regardless of any purported efficiency justifications, as all naked cartels are. This approach restores the core logic of the antitrust laws to meet the realities of today’s markets.
As the Supreme Court emphasized when it crystallized the per se rule against cartels in United States v. Socony-Vacuum Oil Co.: “Any combination which tampers with price structures is engaged in an unlawful activity… to the extent that they raised, lowered, or stabilized prices they would be directly interfering with the free play of market forces.” The very essence of competition is that firms devise strategies by exercising independent judgment in response to market forces. Hub-and-spoke code cartels subvert that competitive process.
Moreover, Senator Klobuchar’s bill promises a consistent standard for a pervasive species of conduct that is anticompetitive across industries. Yet despite these overwhelming benefits, the bill lingers in the Judiciary Committee—backed by just eight cosponsors.
The merits of state and local legislation
Until Congress acts, state and local laws following Klobuchar’s model can showcase its effectiveness and build momentum for federal adoption. More importantly, they must, for while Congress stalls, renters, patients, and grocery shoppers pay the price. States and municipalities are our most effective line of defense in the interim.
In August 2024, the San Francisco Board of Supervisors unanimously passed the first municipal ordinance banning algorithms trained on nonpublic competitor data—such as RealPage’s AI Revenue Management software—from setting rents for competing landlords: a model of the Preventing Algorithmic Collusion Act tailored to the rental housing industry. The White House has estimated that RealPage’s anticompetitive coordination inflated rents by an average of $62 per month for over 20% of renters in San Francisco—renters who can finally keep that money in their pockets. Similar ordinances have been approved in Jersey City, Minneapolis, Philadelphia, and San Diego and are under consideration elsewhere. RealPage is estimated to have hiked rents an average of $27 per month for over 15% of renters in Minneapolis; $47 per month for over 20% of renters in Philadelphia; and $99 per month for over 20% of renters in San Diego. Colorado’s state legislature also passed a similar measure, but Governor Jared Polis vetoed it, a veto that costs over half of Denver’s renters a staggering $136 per month attributable to RealPage’s presence alone.
These measures largely target algorithmic rent-fixing in the rental housing industry alone, despite the pervasiveness of the broader problem. Not so with the most recent bill passed by the California legislature: a statewide ban on all algorithmic price-setting software systems that utilize nonpublic data. The bill mirrors Senator Klobuchar’s Preventing Algorithmic Collusion Act and now awaits action from the Governor.
The Republican budget bill: rolling back the clock
By precluding these state-level initiatives, the Republican budget bill throws code cartels a “legal lifeline.” The original provision passed by the House of Representatives barred states and cities from enacting any regulations on “AI models” or “automated decision systems” for ten years, with the exception of laws principally meant to remove legal barriers to AI development. To bypass the filibuster and clear the Senate with a simple majority via the budget reconciliation process, the provision has been modified to condition states’ access to federal infrastructure funding on their compliance with the AI law moratorium—functionally achieving the same result. Recognizing the grave danger the provision poses, a bipartisan coalition of 40 state attorneys general wrote a letter urging Congress to strike the AI state law moratorium from the bill.
By stripping away protections—and even the long-term possibility for protections—against blatantly anticompetitive behavior facilitated through algorithms, Congressional Republicans’ bill paves the way for higher prices on housing, healthcare, food, and other essential goods and services. Any lawmaker who claims to stand with working families should vote no.
Author Disclosure: The author reports no conflicts of interest. You can read our disclosure policy here.
Articles represent the opinions of their writers, not necessarily those of the University of Chicago, the Booth School of Business, or its faculty.