Nancy L. Rose and Jonathan Sallet respond to a recent article by Herbert Hovenkamp, in which he argues that the merger-efficiencies defense, which requires merging parties to demonstrate competitive benefits of a merger in order to rebut a prima facie case of harm presented by plaintiffs, is too burdensome and runs contrary to empirical evidence.
In a new working paper, Jakob Beuschlein, Jósef Sigurdsson, and Horng Chern Wong find that workers at acquired firms in Sweden experience wage cuts. These cuts stem not from the increased monopsony power of employers but from the redistribution of rents toward higher CEO pay.
Matt Lucky reviews two new books exploring why digital platforms are failing users and how to rediscover the internet’s original promises of an abundance of high-quality and cheap services: Cory Doctorow’s Enshittification: Why Everything Suddenly Got Worse and What to Do About It and Tim Wu’s The Age of Extraction: How Tech Platforms Conquered the Economy and Threaten Our Future Prosperity.
In the second of two articles, Jeff Alvares analyzes the competing arguments around Pix under World Trade Organization rules—a debate involving broader questions about how international trade rules need to reflect the complexity of public services in the digital economy.
In the first of two articles, Jeff Alvares explores how Brazil’s public digital payments system achieved transformative financial inclusion through vertically integrated infrastructure, creating a model now facing scrutiny under international trade law and raising questions about the boundaries of legitimate public infrastructure provision.
In new research, Norman Bishara and Lorenzo Luisetto analyze the nature and proliferation of state legislative activity to regulate noncompete agreements since 2009. In the absence of a federal rule, these developments represent a promising step toward curbing the abuse of noncompete agreements.
Jasper van den Boom provides a synopsis of his new book, Regulating Competition in the Digital Network Industry, which will be published by Cambridge University Press in December. The book can be pre-ordered here.
The largest artificial intelligence firms are able to afford access to quality data from content producers like the New York Times, while smaller startups are being left out. This dynamic risks concentrating markets and creating unassailable barriers to entry. Compulsory licenses offer one solution to lower barriers to entry for nascent AI firms without harming content producers and consumers, writes Kristelia García.
Christian Peukert argues that the market for licensing content from copyright owners such as newspapers and online forums requires a standardized regime if access to this data, which is used to train artificial intelligence models, is to remain available to more than just the largest AI firms. A failure to maintain non-discriminatory access will result in consolidation of both the AI and content production markets.
Is there a world where AI developers could get the training data they need through content licensing deals? Matthew Sag argues that licensing deals between artificial intelligence developers and content owners are only feasible for large content owners and cannot practically extend to the bulk of those who produce and own content on the internet.