Category Archives: Uncategorized

Consultation Response: Joint EC/EDPB Guidelines on DMA-GDPR Interplay

Over the past years, the EU has been bold enough to take a global lead in regulating digital markets and ecosystems. One of the key legal instruments is the Digital Markets Act (DMA). Now, the tricky part for policy makers, most notably the European Commission’s DG CNECT and DG Competition, is to enforce these laws in a way that fulfils their purpose. For the DMA, that goal is to (re)establish contestable markets: “real competition,” you could say, where today we see highly dominant big tech firms in most digital sectors, including search engines, social media, and other platform markets.

Enforcement gets even more complicated because a number of other regulations already exist, including the General Data Protection Regulation (GDPR). Now the European Data Protection Board and the European Commission have jointly called for responses to a consultation on how best to enforce both regulations.

My contribution is here.

There I argue that the draft Guidelines provide a valuable and largely convincing framework for a coherent application of the DMA and the GDPR, in particular by clarifying the role of consent under Article 5(2) DMA, the relationship between the DMA’s data portability and access rights and Article 20 GDPR, and the standard of anonymization required under Article 6(11) DMA (EC/EDPB, 2025). At the same time, both the theoretical and empirical literature on data-driven markets and search quality, and the policy experience that motivated the DMA, suggest that the Guidelines should go further in two directions.

First, they should more clearly acknowledge the centrality of user-generated data for search quality and market contestability and therefore treat the DMA’s data-sharing obligations as a structural remedy to data-driven incumbency advantages, rather than as a narrow access right.

Second, they should clarify that the anonymization and safeguard standards under Article 6(11) DMA must be interpreted in a way that preserves the utility of the data necessary for effective competition, avoids over-reliance on techniques such as differential privacy that can unduly degrade data quality, and prevents dominant gatekeepers from using over-cautious interpretations of privacy to frustrate the DMA’s objectives.

Video talk: From Economic Power to Political Power

My recent working paper, “From Economic Power to Political Power” (joint with Ivan Khomyanin), has met with significant interest among several audiences, as seven scheduled talks within five months suggest. Here is a 60-minute video (incl. Q&A) of the presentation at the Centre for Competition Policy at the University of East Anglia.

Chair of Economics, Governance, and Technology at Tilburg University

Since 2022, I have been Professor in Economics at the University of East Anglia’s School of Economics. Now, Tilburg University, my main affiliation since 2023, has drawn level and established a new Chair in Economics, Governance, and Technology.

The official press release is here.

New Working Paper: From Economic Power to Political Power

In 2024, a highly unusual conflict erupted between Elon Musk, the world’s richest entrepreneur, and Brazil’s Supreme Court, raising profound questions about corporate power versus state sovereignty. The clash began when Musk’s platform X (formerly Twitter) refused to comply with court orders requiring the restriction of accounts spreading harmful content or the appointment of a legal representative in Brazil. Justice Alexandre de Moraes branded Musk an “outlaw” after repeated defiance, leading the Court to impose a nationwide ban on X, freeze Starlink’s assets, and levy heavy fines. Musk escalated matters by encouraging Brazilians to bypass the ban and leveraging Starlink to resist enforcement, an unprecedented instance of an individual challenging a major democracy’s highest legal authority. Though Musk ultimately backed down and complied after weeks of confrontation, the standoff underscored the potential for private actors to openly resist core political institutions of sovereign countries. This case not only highlighted the fragility of enforcing national laws in the digital age but also raised broader concerns: whether Musk’s defiance signals a broader trend of individuals placing themselves above state authority, and what risks such behaviour might pose for democratic governance and global order.

Inspired by these questions, Ivan Khomyanin and I construct a series of game-theoretic models in a new TILEC Discussion Paper titled “From Economic Power to Political Power”. We analyze how co-investment in public infrastructure by private firms can create vulnerabilities for states, especially in digital markets where services can be withdrawn at short notice. The models reveal that a firm’s economic power can translate into political power when governments become dependent on private investment, and that repeated interactions increase the risk of such confrontations. Empirical illustrations, including cases involving Google, Meta, and OpenAI, underscore the growing potential for corporate actors to exert political influence, sometimes even at the expense of profit. The paper concludes with policy recommendations to mitigate the risks of private actors undermining democratic governance. It is also available as a CCP Working Paper, including a 2-page policy briefing.

Workshop: “Economic Governance of Social Media”

TILEC will be organizing a policy-oriented, academic workshop on “Economic Governance of Social Media” on 25-26 September 2025. Details and submission guidelines are here and on the workshop’s website. The submission deadline is 15 June. The background of the workshop follows below.

In 2023, there were 4.76 billion social media users worldwide, comprising 60% of the world population and over 90% of internet users. While social media have created a range of well-documented benefits, both for businesses and for individual users, the list of negative effects for democracy, individuals, and society at large is growing: the spread of misinformation and hate speech, the manipulation of individuals’ beliefs and behavior through news selection, the sheer amount of time spent on social media platforms at the expense of productive activity, and negative trends in mental health, especially among children and young adults, all cast a long, dark shadow over social media’s net effects.

Political scientist Lars Rensmann, asked about the reasons for the recent upswing of radical right parties, summarized: “Not populists like Trump but social media are the biggest threat.” The documented growing discontent of many voters with the political establishment (and, increasingly, with the system of liberal democracy, its rule of law, and its political checks and balances on those in power) has thrived on social media. Over the past decade, we have seen how social media has contributed to an extremely polarized society. Young people in particular rely heavily on TikTok for their news diet, where radical right-wingers successfully proclaim their message and facts matter little. Adults, too, often consume a large part of their news via social media or news aggregator sites. As Rensmann puts it: “Democracies work only when politics is based on facts. As long as that is not the case and people are shaped by propaganda, democracies are doomed.” The Financial Times’ editorial board proclaimed: “Europe’s democratic values are so fundamental that its leaders should not shy from enforcing rules designed to protect them — even if that risks clashing with the X or Meta bosses, or the returning US president.”

For Rensmann, the example of Elon Musk is a point of crystallization: “He has 203 million followers, he owns X, he spreads hate and disinformation, and he influences elections. Someone needs to stand up and not just say that social media is worrisome, but actually do something about the power of big business.”

This is the starting point for this workshop on “Economic Governance of Social Media.” While we understand that a service used by 60% of the world’s population must deliver some benefits, we take it as a working hypothesis that social media’s negative effects must be contained. The question is how, and by whom?

Such questions are the premise of the field of economic governance, which studies the structure and functioning of the legal and social institutions that support economic activity and economic transactions by protecting property rights, enforcing contracts, and taking collective action to provide physical and organizational infrastructure (Dixit, 2009, p. 5). Economic governance is a broad concept that encompasses public-ordering institutions (governance by state authorities), private-ordering institutions (governance by formal or informal non-state actors), and hybrid forms. It tries to identify the optimal institutional setup, i.e., the optimal allocation of control rights over the design, adjudication, and enforcement of rules in any given socioeconomic environment.

The Tilburg Law and Economics Center (TILEC) has organized six economic governance workshops, which focused on the role of competition (in 2010), organizations (in 2013), social preferences (in 2015), data-driven markets (in 2017), the governance of big data and AI (in 2019), and political legitimacy (in 2022), respectively. Now, we strive to stimulate the debate on how the negative effects of social media could be contained conceptually, practically, legally, and technically. During a multidisciplinary, discussion-intensive, deeply theoretical and policy-oriented two-day workshop in September 2025, we aim to learn from theoretical, empirical, experimental, and conceptual papers addressing the main question from various angles.

Specific topics of interest, organizational details, and submission guidelines are in the call for papers and on the workshop’s website.

“How important are user-generated data for search engine quality? Experimental results” published in Journal of Law & Economics

Online search engines are used by billions of users every day. They provide the basic infrastructure for many other industries and are therefore of great economic, political, and social importance. Over the past few years, an intense policy debate has formed around the question: do some search engines produce better search results because their algorithm is better, or because they have access to more data from past searches?

In the former case, it may be best to refrain from interventions in the market in order not to stifle the innovation incentives of successful entrepreneurs (and their potential contestants). In the latter case, mandatory sharing of user-generated data, a policy that is currently under discussion and already contained in the EU’s Digital Markets Act, could trigger innovation and would benefit all users of search engines.

Together with Tobias Klein, Madina Kurmangaliyeva, and Patricia Prüfer, and at the request of the German Finance Ministry, I embarked on a journey to produce relevant data and inform the policy-making process. The resulting paper, “How important are user-generated data for search engine quality? Experimental results”, has now been accepted for publication by the Journal of Law & Economics.

In this paper, we report results from a collaboration with a small search engine, Cliqz. They provided us with non-personalized search results for a random set of queries and conducted an experiment on our behalf. This allows for within-search-engine comparisons. We complemented the Cliqz data with non-personalized search results from Google and Bing on the same queries in the same period and country and asked external assessors to rate the quality of the search results on a 7-point Likert scale (without mentioning the origin of the results). This allows for between-search-engine comparisons.

We find robust evidence that differences in the quality of search results are explained by searches for less popular search terms, for which a search algorithm can rely on less data. These insights are complemented by results from an experiment in which we keep the algorithm of the search engine fixed and vary the amount of data it uses as an input. This offers causal evidence that more user data on rare queries enables search engines to produce better-quality results. Notably, 74% of the traffic in our data comes from rare queries. Hence, rare queries are the relevant dimension of competition on which a search engine must perform in order to attract users.
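To illustrate the logic of the between-engine comparison, here is a minimal, purely illustrative sketch of how quality ratings could be compared across popular and rare queries. It is not the paper’s actual code; the data, column names, and the popularity threshold are hypothetical and only show the type of comparison involved.

# Illustrative sketch only: compare mean quality ratings (7-point Likert scale)
# for rare vs. popular queries across two search engines. All values are hypothetical.
import pandas as pd

# One row per (query, engine) pair with an assessor's quality rating.
ratings = pd.DataFrame({
    "query": ["q1", "q1", "q2", "q2", "q3", "q3", "q4", "q4"],
    "engine": ["Incumbent", "Entrant", "Incumbent", "Entrant",
               "Incumbent", "Entrant", "Incumbent", "Entrant"],
    "monthly_searches": [50000, 50000, 120, 120, 80, 80, 90000, 90000],
    "quality": [6, 6, 6, 3, 7, 4, 7, 6],
})

# Label queries as rare below an arbitrary popularity threshold.
ratings["rare_query"] = ratings["monthly_searches"] < 1000

# If data access drives quality, the gap between incumbent and entrant
# should be larger for rare queries than for popular ones.
print(ratings.groupby(["rare_query", "engine"])["quality"].mean().unstack())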

Our results show that the mandatory sharing of user data may be an appropriate remedy in the sense that it would likely allow entrants such as Cliqz to compete successfully with the incumbent (Google) by enabling them to provide search results that are of high quality also for rare queries. Unlike in other contexts, this remedy does not directly harm the incumbent, as it makes use of the non-rivalry of information: the incumbent will still be able to use the same data. Only the exclusivity of data access would be reduced. Consequently, users would benefit.

Economic Governance and Institutional Design

What are institutions, and how should they be designed to achieve compliance with behavioral rules, including laws, social norms, religious rules, or cultural traditions? In a new conceptual paper, “Economic Governance and Institutional Design” (TILEC Discussion Paper 24-10), I introduce a typology of economic governance institutions and explain how it can be used by policy makers, administrators, and researchers in law and economics to improve rule compliance. The paper explains how effective and efficient institutions can be identified for a given economic governance problem. The concepts are applied to two cases: how to create trust in cloud computing technologies, and how to implement the sharing of user-generated data on data-driven markets.

Explaining the economic governance methodology bottom-up and comprehensively, and making it properly citable, is a novelty and the key contribution of this paper. The nice part is that it evolved both out of earlier research and out of 12 years of teaching. I am therefore intellectually indebted both to several co-authors, especially Scott Masten, and to the many graduate students who gave feedback.

A 2-page policy brief is here.

“Using Data Science in Competition Enforcement and Platform Regulation” forthcoming in European Competition Journal

Competition authorities in Europe and beyond have started to rely on data science to monitor markets and to check compliance with the applicable rules. The Digital Markets Act and the Digital Services Act have made the use of data science among regulators even more relevant, considering the scale and complexity of the monitoring that is required.

In work done for the expert group to the EU Observatory on the Online Platform Economy, “Charting a Way Forward for the Use of Data Science in Competition Enforcement and Platform Regulation” (joint with Inge Graef and Ulrich Laitenberger), we study the range of data science tools that is already available to competition authorities and other regulators. We reflect on the promises and challenges of the future uptake of data science tools and discuss how data science expertise can be integrated into regulatory agencies. The existing use of data science shows that regulators are capable of adjusting their organizations and processes to reap the potential of employing technology tools in their activities. At the same time, the future uptake of data science comes with challenges relating to the reliability of data and the privacy of individuals. Beyond involvement in specific investigations, data science also carries the potential of reforming the work of regulators by moving towards a more proactive form of enforcement. Exchanging data science expertise and tools across regulators deserves to be further facilitated to increase collaboration and share resources.

The paper will be published soon in the European Competition Journal.

Postdoctoral Researcher (Political Economy, AI, Autocracy) at Tilburg University

The Tilburg Law and Economics Center at Tilburg University, the Netherlands, is looking for a Postdoctoral Researcher to work, in a team, on “Artificial Intelligence in Autocratic Countries.” The goal of the project is to produce both fundamental research on the political economy of AI-development in autocratic countries (esp. China and Russia) and applicable advice for European policy makers.

Candidates should have a PhD in a social science, good empirical skills, and experience in working with data (econometrics, statistics, and/or data science). The specific discipline of the PhD does not matter, but political scientists, economists, economic geographers, or area specialists of China and/or Russia (or a related autocratic country and its governance institutions) may have an advantage. Applicants should also have knowledge of and interest in (i) politics, international relations, or political economy and (ii) digital industries, technology innovation, or AI development. Specific knowledge of the countries, institutions, and languages of China or Russia is a plus but not mandatory.

Application deadline is 6 October 2024.

All details of the job and how to apply are here.

EU Horizon project AI4POL granted

The European Commission has just granted a large research project on “Using Artificial Intelligence to Support Regulators and Policy Makers” (AI4POL). The project is coordinated at Tilburg University (led by TILEC). The other consortium members are TU Munich, the University of East Anglia, Visionary Analytics (Lithuania), Centerdata (NL), and Sapienza University of Rome. Here is the plan for the project, which will start in early 2025:

AI innovation in Europe lags behind the US and China. To catch up and ensure pro-European outcomes, the EU relies on AI regulation as its main channel to shape the future of AI development globally. This project will support European regulators and policy makers with knowledge and tools to adequately address the challenges and opportunities of trustworthy and ethical AI and to develop and enforce effective regulation of AI based on human rights, European values, and citizens’ needs. AI4POL will explore how regulators can use data science and AI-driven tools to improve the monitoring and enforcement of regulations such as the Data Act, the Digital Markets Act, the AI Act, and consumer law. To this end, we will focus on a key innovation area, financial services, and on how to increase AI-enhanced understanding and citizen feedback for the informed regulation of digital services, developing a large language model for translating legal jargon and a browser plugin for user feedback on their understanding of laws. We will also analyze how to create and regulate trustworthy AI for financial services, for example in robo-advice or credit scoring. Taking a long-term, geopolitical perspective, we will develop an early-warning system for high-risk AI in autocratic states, including an AI Threat Index and dashboard, piloted with the cases of China and Russia.

AI4POL pursues these objectives with a multidisciplinary, diverse research team, combining substantive expertise in AI/data science, ethics, law, economics, and political science with project-management resilience, quality assurance and timeline monitoring, and risk-based intervention plans. The consortium has extensive experience in advising policy makers and has reliable contacts with various stakeholders, as evidenced by AI4POL’s Advisory Board, which comprises EU and national regulators and policy makers, consumer protection agencies, civil society organizations, and AI firms.