A Special Series within the ‘Theories of Regulation and Governance’ Webinar Program, hosted by Dr. Nir Kosti and Professor David Levi-Faur. Visit our Webinars page and subscribe to our YouTube channel.

View our special Series:


Regulatory Stringency and Sectoral Platform Regulation in EU Cities

Eliška Drápalová and Kai Wegrich, Regulatory Stringency and Sectoral Platform Regulation in EU Cities, Tuesday, January 13th 2026, 14.00 CET; 13.00 London Time; 15.00 Jerusalem Time.

This lecture investigates the evolving responses of local regulators to platform companies across 108 European cities between 2012 and 2022. Platform companies like Uber and Airbnb are depicted as agile policy entrepreneurs who can navigate the boundaries of regulatory frameworks and manipulate regulations to their advantage; however, recent empirical studies suggest that their capacity to influence policy depends on the particular political and institutional context. Using a regulatory stringency index, the presentation will assess how city governments have adjusted their regulations to counteract the growing influence of these platforms. The research indicates a notable trend toward stricter regulatory measures, reflecting an increased determination among local authorities to enforce rules on platforms. The webinar will identify various patterns of regulatory change shaped by sector-specific dynamics and differing national and local contexts. By examining regulatory dynamics over time, this presentation contributes to the understanding of the relationship between platform power and local governance.
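
To make the idea of a stringency index concrete, here is a minimal sketch in Python. The provisions and equal weighting below are illustrative assumptions, not the actual items used in Drápalová and Wegrich's index:

```python
# Hypothetical restrictive provisions a city might impose on platforms;
# the real index's items and weights are not specified here.
PROVISIONS = [
    "license_required",   # platform or host must obtain a license
    "night_cap",          # caps on rental nights or ride hours
    "registration",       # mandatory registration of listings or drivers
    "data_sharing",       # platform must share data with the city
    "zone_limits",        # activity restricted to certain zones
]

def stringency_index(city_rules):
    """Share of restrictive provisions a city has adopted, in [0, 1]."""
    return sum(bool(city_rules.get(p, False)) for p in PROVISIONS) / len(PROVISIONS)
```

A city adopting two of the five provisions would score 0.4; tracking such scores yearly yields the kind of over-time stringency comparison the lecture describes.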

Eliška Drápalová is a postdoctoral research fellow at the WZB Berlin Social Science Center within the Politics of Digitalisation research group. She co-directs a DFG-funded project on regulating platform companies in EU cities and regions.

Kai Wegrich is Professor of Public Administration and Public Policy at the Hertie School. His research focuses on executive politics, regulation, public sector reform, and innovations in policy-making.


LEGDAT: A Global Infrastructure for the Study of Legislative Process and Lawmaking

Mihály Fazekas and Cyril Benoît, LEGDAT: A Global Infrastructure for the Study of Legislative Process and Lawmaking, Friday, November 21st 2025, 13.00 CET; 12.00 London Time; 14.00 Jerusalem Time.

Legislatures are central to policymaking, yet comparative research has long lacked systematic data on how legislative processes unfold across countries. This paper introduces LEGDAT, a new dataset that captures the advancement of bills and laws in 16 countries, comprising over 350,000 bills and 60,000 laws. LEGDAT standardizes information on bill initiation, committee referral, legislative stages, final votes, passage or rejection, and post-enactment modifications. The dataset enables analysis of both the procedural dynamics of lawmaking and the stability of legislation over time. We validate the data against existing projects and demonstrate its breadth through descriptive statistics and cross-country comparisons. By combining temporal depth, geographic coverage, and process-level detail, LEGDAT establishes a new foundation for empirical research on legislative organization, political dynamics, and democratic responsiveness. Regular updates will ensure its continued relevance.

Mihály Fazekas is Professor in the Department of Public Policy at Central European University and Scientific Director of the Government Transparency Institute.

Cyril Benoît is a CNRS Researcher at Sciences Po, affiliated with the Centre for European Studies and Comparative Politics. His research examines how political institutions shape economic outcomes, with a focus on social policy and legislative processes.


The Policy Content of Statutes: The Case of U.S. State Legislatures, 1880-2020

Elliott Ash is Associate Professor of Law, Economics, and Data Science at ETH Zurich’s Center for Law & Economics. His research applies computational methods to analyze legal and policy texts.

Elliott Ash, The Policy Content of Statutes: The Case of U.S. State Legislatures, 1880-2020, Tuesday, April 22nd 2025, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

This webinar presents a novel methodology for analyzing the adoption and diffusion of legislation enacted by U.S. states and the federal government from 1880 to 2020. Using large language models, Elliott Ash and his team extract and structure the policy content from 2.5 million state statutes and 80,000 federal acts. The presentation will demonstrate how AI-powered policy extraction, combined with unsupervised clustering algorithms applied to vector representations of policy content, enables systematic measurement of policy similarity, diffusion, and innovation across states over a 140-year period. The webinar will explore key questions about the determinants of policy adoption: How much do geography, economic similarity, and political partisanship influence which policies states adopt? How has federal legislation shaped state policymaking over time? The findings reveal that states with greater geographic, economic, and political similarity implement more similar policies, and while polarization has risen significantly since the 1990s, current levels are comparable to those observed in the pre-WWII period. The presentation will also address the methodological challenges and validation strategies involved in using AI for large-scale policy analysis, demonstrating both the opportunities and limitations of these emerging computational methods for studying legislative texts.
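
One step in such a pipeline, measuring policy similarity between states, can be sketched as a cosine comparison of policy-content vectors. The vectors below are made up for illustration; in the actual project they would come from LLM-based extraction and embedding of statute content:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def policy_similarity(state_vectors):
    """Pairwise similarity between all states' policy vectors."""
    states = list(state_vectors)
    return {
        (a, b): cosine(state_vectors[a], state_vectors[b])
        for i, a in enumerate(states)
        for b in states[i + 1:]
    }
```

Applied to real embeddings, high pairwise scores between geographically or politically similar states would correspond to the diffusion patterns the webinar reports.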


Sentiment and Uncertainty about Regulation

Zhoudan (Zoey) Xie is a senior policy analyst at the GW Regulatory Studies Center and a Ph.D. candidate in economics at George Washington University. Her research focuses on the economic effects of regulation using natural language processing techniques and text as data.

Zhoudan (Zoey) Xie, Sentiment and Uncertainty about Regulation, Thursday, February 27th 2025, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

We present text-based measures of sentiment and uncertainty about the U.S. regulatory environment from 1985 to 2021. Using natural language processing techniques, we construct an original news corpus from seven leading U.S. newspapers and quantify the average sentiment and the degree of uncertainty expressed about regulation. Our analysis produces monthly indexes of aggregate regulatory sentiment and regulatory uncertainty, as well as categorical indexes for 14 regulatory policy areas. These measures reflect public perceptions of regulation and can be used to examine their potential impact on economic activity. The presentation is based on joint work with Tara M. Sinclair.
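
The general shape of a dictionary-based monthly index can be sketched as follows. The tiny word lists and scoring rules are placeholders, not the lexicons actually used by Xie and Sinclair:

```python
from collections import defaultdict

# Hypothetical tone dictionaries for illustration only.
POSITIVE = {"benefit", "improve", "support"}
NEGATIVE = {"burden", "restrict", "penalty"}
UNCERTAIN = {"may", "uncertain", "unclear", "possible"}

def article_scores(text):
    """Score one article: net tone and share of uncertainty terms."""
    tokens = text.lower().split()
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    unc = sum(t in UNCERTAIN for t in tokens)
    n = max(len(tokens), 1)
    return (pos - neg) / n, unc / n

def monthly_indexes(articles):
    """Average article-level scores within each (year, month) bucket."""
    buckets = defaultdict(list)
    for (year, month), text in articles:
        buckets[(year, month)].append(article_scores(text))
    return {
        ym: (
            sum(s for s, _ in scores) / len(scores),   # sentiment index
            sum(u for _, u in scores) / len(scores),   # uncertainty index
        )
        for ym, scores in buckets.items()
    }
```

Aggregating article scores to months, as here, is what turns a news corpus into the time-series indexes the abstract describes.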


Policy Complexity: Measurement, Origins and Consequences

Steffen Hurka is Chair of European Politics at Zeppelin University, where he leads the ERC-funded DEMOLAW project (2024–2029) and has been directing the Emmy Noether Junior Research Group EUPLEX since 2019. His research focuses on EU legislative processes, the European Parliament’s organization, and democratic legislation.

Steffen Hurka, Policy Complexity: Measurement, Origins and Consequences, Thursday, January 30th 2025. 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time.

What makes a law complex, how can we measure this complexity and why does complexity vary so much across individual laws, over time, and between and within policy areas? How does the complexity of laws affect political institutions and to what extent is implementation affected by increasing complexity? Based on data from his research project EUPLEX, Steffen Hurka will shed light on these and other questions from an empirical perspective, focusing on the political system of the European Union (EU). Furthermore, Steffen Hurka will also briefly introduce his new ERC project DEMOLAW (The design, creation and survival of democratic laws), in which he will assess legislative designs in different policy domains in a cross-temporal and cross-national perspective using computational methods of text analysis.


The Regulators’ Budget: Tracking Regulation Through Government Outlays and Employment

Susan E. Dudley and Sarah Hay, The Regulators’ Budget: Tracking Regulation Through Government Outlays and Employment, Monday, December 17th 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

The “Regulators’ Budget” provides two of the proxy measures often used to get a sense of the growth and size of U.S. regulation – the fiscal outlays and the personnel devoted to administering federal regulatory agencies. Analyzing the federal personnel and expenditures necessary to develop and enforce regulations provides a way to track regulatory trends. These data offer a proxy for the scope of regulatory activity and costs, and provide insights into the composition and evolution of regulation over time. By including resources devoted to both developing new regulations and supporting and enforcing existing regulations, the Regulators’ Budget attempts to be a comprehensive account of the U.S. government resources involved in maintaining the regulatory state.

Susan E. Dudley is the founder of the George Washington University Regulatory Studies Center, which works to improve regulatory policy through research, education, and outreach. She is a senior scholar with the Center and a distinguished professor of practice in the Trachtenberg School of Public Policy and Public Administration.

Sarah Hay is a policy analyst in the George Washington University Regulatory Studies Center. She studies the Congressional Review Act (CRA) and public participation in the regulatory process.


Measuring Regulatory Complexity

Jean-Edouard Colliard, Measuring Regulatory Complexity, Tuesday, December 10th 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

We propose a framework to study regulatory complexity, based on concepts from computer science. We distinguish different dimensions of complexity, classify existing measures, develop new ones, and compute them on three examples—Basel I, the Dodd-Frank Act, and the European Banking Authority’s reporting rules—and test them using experiments and a survey on compliance costs. We highlight two measures that capture complexity beyond the length of a regulation. We offer a quantitative approach to the policy trade-off between regulatory complexity and precision. Our toolkit is freely available and allows researchers to work on other texts and test alternative measures.
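
To illustrate the idea of measures that go beyond a regulation's length, here is a rough sketch. The term lists are invented for illustration; the measures in Colliard and Georg's toolkit are considerably more elaborate:

```python
import re

# Illustrative patterns: conditional language and cross-references are two
# features often associated with complexity beyond raw length.
CONDITIONALS = r"\b(if|unless|except|provided that|subject to)\b"
REFERENCES = r"\b(article|section|paragraph)\s+\d+"

def complexity_measures(text):
    """Return simple length-normalized complexity measures for a legal text."""
    words = len(text.split())
    cond = len(re.findall(CONDITIONALS, text, flags=re.IGNORECASE))
    refs = len(re.findall(REFERENCES, text, flags=re.IGNORECASE))
    scale = 1000 / max(words, 1)
    return {
        "length_words": words,
        "conditionals_per_1000": cond * scale,
        "references_per_1000": refs * scale,
    }
```

Normalizing by length, as above, is what lets two regulations of very different sizes be compared on structure rather than volume.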

Jean-Edouard Colliard is an Associate Professor of Finance at HEC Paris. He obtained a PhD in Economics in 2012 from the Paris School of Economics.

Co-Pierre Georg is the Director of the Frankfurt School Blockchain Center and a Professor of Practice in Financial Technology at the Frankfurt School of Finance & Management.


Generative Regulatory Measurement: The Case of U.S. Housing Regulation

Arpit Gupta is Assistant Professor of Finance at New York University Stern School of Business since September 2016. Professor Gupta’s research interests focus on using large datasets to understand default dynamics in household finance, real estate and corporate finance.

Arpit Gupta, Generative Regulatory Measurement: The Case of U.S. Housing Regulation, Monday, December 2nd 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

We present a novel method called “generative regulatory measurement” that uses Large Language Models (LLMs) to interpret statutes and administrative documents. Our paper demonstrates the tool’s effectiveness in analyzing U.S. municipal zoning codes, achieving 96% accuracy in binary classification tasks and a 0.92 correlation in predicting minimum lot sizes. The results establish five key findings about American zoning:

  1. Housing production occurs disproportionately in unincorporated areas without municipal zoning codes
  2. Density, in the form of multifamily apartments and small-lot single-family homes, is broadly restricted
  3. Zoning follows a monocentric pattern with regional variations, with particularly strict suburban regulations in the Northeast
  4. Housing regulations cluster into two main principal components:
    • The first corresponds to housing complexity and can be interpreted as value extraction in high-demand environments
    • The second associates with exclusionary zoning

Collecting Public Data to Study the U.S. Regulatory Process

Alex Acs is an Associate Professor of Political Science at The Ohio State University, where he conducts research on executive branch policymaking, regulatory politics, and American political institutions.

Alex Acs, Collecting Public Data to Study the U.S. Regulatory Process, Thursday, November 21st 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

The purpose of this presentation is to provide an overview of the U.S. regulatory process and the public data it generates. The presentation will cover: the publication of proposed and final rules in the Federal Register; the publication of regulatory plans in the Unified Agenda of Federal Regulatory and Deregulatory Actions; the publication of rules reviewed by the Office of Information and Regulatory Affairs (OIRA); and the publication of public comments submitted in response to proposed rules. We will discuss technical aspects of data collection and processing, as well as how these data can inform research in political science and public policy.
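
As a small taste of the data collection the talk covers, the Federal Register exposes a public API. The endpoint below is real (documented at federalregister.gov/developers), but treat the exact condition field names as assumptions to be checked against the API documentation:

```python
from urllib.parse import urlencode

# Base endpoint of the Federal Register's documents search API (v1).
BASE = "https://www.federalregister.gov/api/v1/documents.json"

def build_query(term, doc_type="RULE", per_page=100):
    """Build a documents.json search URL for a search term and document type."""
    params = {
        "conditions[term]": term,      # full-text search term
        "conditions[type][]": doc_type,  # e.g. RULE, PRORULE, NOTICE
        "per_page": per_page,
    }
    return BASE + "?" + urlencode(params)
```

Fetching such URLs page by page and parsing the JSON responses is the typical starting point for building a corpus of proposed and final rules.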


Toward a Regulatory Grammar

Saba Siddiki and Christopher Frantz, Toward a Regulatory Grammar, Tuesday, November 19th 2024, 15.00 CET; 16.00 London Time; 18.00 Jerusalem Time; 11.00 Eastern Time.

The Institutional Grammar (IG), originally devised by Crawford and Ostrom, is an increasingly developed and adopted paradigm for institutional studies. The IG relies on the identification and analysis of institutional statements (strategies, norms, rules) that compose institutional arrangements. While broad in applicability, covering both formal and informal institutions, one of its most rapidly progressing applications is in regulatory analysis. Drawing on the most recent advances in the Institutional Grammar (IG 2.0), we argue for the benefits of this “grammar” as an analytical paradigm for regulatory studies based on: a) its ability to provide a unified unit of analysis, b) its capacity to analyze regulation at different scales, and c) its ability to integrate with case-specific theories of interest, among other analytical affordances.

This opens a range of novel applications in the burgeoning field of computational institutional science, including the quantification of institutional complexity and novel forms of network analytics. Following the presentation of the intellectual and conceptual foundations, we highlight features of distinctive value for regulatory studies before turning to recent developments in computational approaches within the wider IG community, focusing on the automated extraction and analysis of regulatory information at scale. We discuss the merits of such approaches and explore how they contribute to the increasingly rich portfolio of IG-centric techniques for studying regulation.
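
A minimal rendering of Crawford and Ostrom's original ADICO syntax can make the "unified unit of analysis" concrete. This sketch uses the classic components (Attributes, Deontic, aIm, Conditions, Or else); IG 2.0 refines these further, so take it as a simplified illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InstitutionalStatement:
    attributes: str          # to whom the statement applies
    deontic: Optional[str]   # "must", "may", "must not"; absent in strategies
    aim: str                 # the action or outcome being regulated
    conditions: str          # when/where it applies ("at all times" if unstated)
    or_else: Optional[str]   # sanction; present only in rules

    def statement_type(self):
        """Classify per the grammar: rules carry a sanction, norms a deontic."""
        if self.or_else:
            return "rule"
        if self.deontic:
            return "norm"
        return "shared strategy"
```

Encoding each statement of a regulation this way is what enables the statement-level counting and network analytics mentioned above.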

Saba Siddiki is Director of the Center for Policy Design and Governance and also the Chapple Family Professor of Citizenship and Democracy in the Maxwell School of Citizenship and Public Affairs, Syracuse University.

Christopher Frantz is Associate Professor at the Department of Computer Science at the Norwegian University of Science and Technology.


Introducing the RegData Project

Dr. Patrick A. McLaughlin is the Director of Policy Analytics and a Senior Research Fellow at the Mercatus Center at George Mason University.

Patrick A. McLaughlin, Introducing the RegData Project, Monday, November 11th 2024, 15.00 CET; 16.00 London Time; 18.00 Jerusalem Time; 11.00 Eastern Time.

The RegData Project, initiated by Patrick McLaughlin in 2012, emerged as a solution to the longstanding challenge of conducting empirical analysis on regulatory effects due to data limitations. As both a methodology and database, RegData employs sophisticated text analysis and machine-learning algorithms to quantify multiple dimensions of regulation, including volume, restrictiveness, complexity, and sectoral relevance. The project’s flagship component, RegData U.S., maps federal regulations to affected economic sectors using the North American Industry Classification System (NAICS), enabling unprecedented research into regulatory causes and effects. Its methodology identifies regulatory restrictions through specific word and phrase counting, while also tracking metrics such as word count, complexity, and agency attribution within the Code of Federal Regulations. Built on the open-source QuantGov policy analytics platform, RegData’s framework has expanded beyond U.S. federal regulations to encompass state-level analysis (State RegData) and international jurisdictions, including Australia and Canada. This comprehensive approach to regulatory quantification has transformed our ability to analyze and understand the regulatory landscape, with all data publicly accessible through https://www.reghub.ai/data.
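
RegData's core restriction count is based on occurrences of binding terms; its canonical list is "shall", "must", "may not", "prohibited", and "required". A minimal sketch of that counting step:

```python
import re

# The five restriction terms used in RegData's restriction counts.
RESTRICTION_TERMS = ["shall", "must", "may not", "prohibited", "required"]

def count_restrictions(text):
    """Count occurrences of RegData-style restriction terms in a text.

    Whole-phrase matching with word boundaries, so "may" alone does not
    count but "may not" does.
    """
    lowered = text.lower()
    return sum(
        len(re.findall(r"\b" + re.escape(term) + r"\b", lowered))
        for term in RESTRICTION_TERMS
    )
```

The full RegData methodology layers sector mapping and other metrics on top of counts like this, but the term-counting idea is the foundation.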


Counting Regulations and Measuring Regulatory Impact: A Call for Nuance

Stuart Shapiro is Dean of the Edward J. Bloustein School of Planning and Public Policy at Rutgers University.

Stuart Shapiro, Counting Regulations and Measuring Regulatory Impact: A Call for Nuance, Monday, November 4th 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

The effect of regulation on virtually every aspect of the lives of US citizens has led to an understandable impulse to measure this total impact. That impulse has produced various attempts to count the total number of regulations and regulatory requirements, and to total the costs and benefits of regulation. These counting mechanisms have played prominent roles in discussions over statutory changes designed to reform the process by which we write regulations. But counting regulations in a meaningful way and measuring their cumulative economic impact is an astonishingly difficult task. Various methods have been employed by scholars and advocates in this effort. This article attempts to catalog the most prominent methods of counting regulations and measuring regulatory impact in the United States, describe their strengths and weaknesses, and suggest alternative approaches to this important question. We suggest both using large language models and detailed analysis of Paperwork Reduction Act data and, at the opposite extreme, doing more qualitative work on the consequences of regulation on individuals, firms, and industries.


The Visualization of Policy Portfolios: Challenges, Opportunities and Applications

Xavier Fernández-i-Marín is a “Ramón y Cajal” fellow at the Universitat de Barcelona.

Xavier Fernández-i-Marín, The Visualization of Policy Portfolios: Challenges, Opportunities and Applications, Thursday, October 9th 2024, 13.00 CET; 12.00 London Time; 14.00 Jerusalem Time; 7.00 Eastern Time.

This webinar explores the possibilities of visualizing “policy portfolios” for political science in general and comparative public policy in particular. A policy portfolio can describe different types of ideas: policy accumulation, complexity/diversity, convergence, and/or style. All of these can be studied comparatively across countries, over time, and at different policy levels (EU, national, subnational, and local). The webinar covers examples ranging from social and environmental policies to climate change and even Artificial Intelligence policies, showing the potential of the tool. It also provides a gentle introduction to the R package “PolicyPortfolios”.