A Special Series within the ‘Theories of Regulation and Governance’ Webinar Program, hosted by Dr. Nir Kosti and Professor David Levi-Faur. Visit our Webinars page and subscribe to our YouTube channel.

View our special Series:


The Regulators’ Budget: Tracking Regulation Through Government Outlays and Employment

Susan E. Dudley and Sarah Hay, The Regulators’ Budget: Tracking Regulation Through Government Outlays and Employment, Monday, December 17th 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

The “Regulators’ Budget” provides two of the proxy measures often used to get a sense of the growth and size of U.S. regulation – the fiscal outlays and the personnel devoted to administering federal regulatory agencies. Analyzing the federal personnel and expenditures necessary to develop and enforce regulations provides a way to track regulatory trends. These data offer a proxy for the scope of regulatory activity and costs, and provide insights into the composition and evolution of regulation over time. By including resources devoted to both developing new regulations and supporting and enforcing existing regulations, the Regulators’ Budget attempts to be a comprehensive account of the U.S. government resources involved in maintaining the regulatory state.

Susan E. Dudley is the founder of the George Washington University Regulatory Studies Center, which works to improve regulatory policy through research, education, and outreach. She is a senior scholar with the Center and a distinguished professor of practice in the Trachtenberg School of Public Policy and Public Administration.

Sarah Hay is a policy analyst in the George Washington University Regulatory Studies Center. She studies the Congressional Review Act (CRA) and public participation in the regulatory process.


Measuring Regulatory Complexity

Jean-Edouard Colliard, Measuring Regulatory Complexity, Tuesday, December 10th 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

We propose a framework to study regulatory complexity, based on concepts from computer science. We distinguish different dimensions of complexity, classify existing measures, develop new ones, and compute them on three examples—Basel I, the Dodd-Frank Act, and the European Banking Authority’s reporting rules—and test them using experiments and a survey on compliance costs. We highlight two measures that capture complexity beyond the length of a regulation. We offer a quantitative approach to the policy trade-off between regulatory complexity and precision. Our toolkit is freely available and allows researchers to work on other texts and test alternative measures.
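As an illustration of the idea of complexity beyond length, the sketch below is not the authors' actual toolkit but a minimal analogue of operator-counting metrics from software complexity analysis: it counts logical and conditional connectives (a hypothetical term list) alongside word count, so that two texts of equal length can differ in measured complexity.

```python
import re

# Hypothetical term list standing in for "operators" in a regulatory text,
# by analogy with operator counts in software-complexity metrics.
CONNECTIVES = ["if", "unless", "except", "provided that", "notwithstanding"]

def complexity_profile(text: str) -> dict:
    """Return length, connective ('operator') count, and operator density."""
    lowered = text.lower()
    words = re.findall(r"[a-z']+", lowered)
    length = len(words)
    ops = sum(
        len(re.findall(r"\b" + re.escape(c) + r"\b", lowered))
        for c in CONNECTIVES
    )
    return {
        "length": length,
        "operators": ops,
        "operator_density": ops / length if length else 0.0,
    }
```

A short clause-heavy sentence can thus score higher on operator density than a much longer but purely declarative one, which is the kind of distinction a pure length measure misses.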

Jean-Edouard Colliard is an Associate Professor of Finance at HEC Paris. He obtained a PhD in Economics in 2012 from the Paris School of Economics.

Co-Pierre Georg is the Director of the Frankfurt School Blockchain Center and a Professor of Practice in Financial Technology at the Frankfurt School of Finance & Management.


Generative Regulatory Measurement: The Case of U.S. Housing Regulation

Arpit Gupta has been an Assistant Professor of Finance at the New York University Stern School of Business since September 2016. Professor Gupta’s research interests focus on using large datasets to understand default dynamics in household finance, real estate, and corporate finance.

Arpit Gupta, Generative Regulatory Measurement: The Case of U.S. Housing Regulation, Monday, December 2nd 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

We present a novel method called “generative regulatory measurement” that uses Large Language Models (LLMs) to interpret statutes and administrative documents. Our paper demonstrates the tool’s effectiveness in analyzing U.S. municipal zoning codes, achieving 96% accuracy in binary classification tasks and a 0.92 correlation in predicting minimum lot sizes. The results establish several key findings about American zoning:

  1. Housing production occurs disproportionately in unincorporated areas without municipal zoning codes
  2. Density, in the form of multifamily apartments and small-lot single-family homes, is broadly restricted
  3. Zoning follows a monocentric pattern with regional variations, with particularly strict suburban regulations in the Northeast
  4. Housing regulations cluster into two main principal components:
    • The first corresponds to housing complexity and can be interpreted as value extraction in high-demand environments
    • The second associates with exclusionary zoning
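The core mechanic behind the binary classification results above can be sketched as follows. This is not the paper's implementation: `build_prompt` and `parse_binary` are hypothetical helpers, and the call to an actual LLM API is left out, since any chat-completion endpoint could fill that role.

```python
def build_prompt(question: str, excerpt: str) -> str:
    """Frame a yes/no question about a zoning-code excerpt for an LLM."""
    return (
        "You are analyzing a municipal zoning code.\n"
        f"Excerpt:\n{excerpt}\n\n"
        f"Question: {question}\n"
        "Answer strictly 'Yes' or 'No'."
    )

def parse_binary(answer: str) -> int:
    """Map the model's free-text reply to a 0/1 label."""
    return 1 if answer.strip().lower().startswith("yes") else 0
```

In a full pipeline, the 0/1 labels produced this way would be compared against hand-coded ground truth to obtain accuracy figures like those reported above.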

Collecting Public Data to Study the U.S. Regulatory Process

Alex Acs is an Associate Professor of Political Science at The Ohio State University, where he conducts research on executive branch policymaking, regulatory politics, and American political institutions.

Alex Acs, Collecting Public Data to Study the U.S. Regulatory Process, Thursday, November 21st 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

The purpose of this presentation is to provide an overview of the U.S. regulatory process and the public data it generates. The presentation will cover: the publication of proposed and final rules in the Federal Register; the publication of regulatory plans in the Unified Agenda of Federal Regulatory and Deregulatory Actions; the publication of rules reviewed by the Office of Information and Regulatory Affairs (OIRA); and the publication of public comments submitted in response to proposed rules. We will discuss technical aspects of data collection and processing, as well as how these data can inform research in political science and public policy.


Toward a Regulatory Grammar

Saba Siddiki and Christopher Frantz, Toward a Regulatory Grammar, Tuesday, November 19th 2024, 15.00 CET; 16.00 London Time; 18.00 Jerusalem Time; 11.00 Eastern Time.

The Institutional Grammar (IG), originally devised by Crawford and Ostrom, is an increasingly developed and adopted paradigm for institutional studies. The IG relies on the identification and analysis of institutional statements (strategies, norms, rules) that compose institutional arrangements. While broad in applicability, covering both formal and informal institutions, one of its most rapidly progressing applications is in regulatory analysis. Drawing on the most recent advances in the Institutional Grammar (IG 2.0), we argue for the benefits of this “grammar” as an analytical paradigm for regulatory studies based on: a) its ability to provide a unified unit of analysis, b) its capacity to analyze regulation at different scales, and c) its ability to integrate with case-specific theories of interest, among other analytical affordances.

This opens a range of novel applications in the burgeoning field of computational institutional science, including the quantification of institutional complexity and novel forms of network analytics. Following the presentation of the intellectual and conceptual foundations, we highlight features of distinctive value for regulatory studies before turning to recent developments in computational approaches within the wider IG community, focusing on the automated extraction and analysis of regulatory information at scale. We discuss the merits of such approaches and explore how they contribute to the increasingly rich portfolio of IG-centric techniques for studying regulation.

Saba Siddiki is Director of the Center for Policy Design and Governance and also the Chapple Family Professor of Citizenship and Democracy in the Maxwell School of Citizenship and Public Affairs, Syracuse University.

Christopher Frantz is Associate Professor at the Department of Computer Science at the Norwegian University of Science and Technology.


Introducing the RegData Project

Dr. Patrick A. McLaughlin is the Director of Policy Analytics and a Senior Research Fellow at the Mercatus Center at George Mason University.

Patrick A. McLaughlin, Introducing the RegData Project, Monday, November 11th 2024, 15.00 CET; 16.00 London Time; 18.00 Jerusalem Time; 11.00 Eastern Time.

The RegData Project, initiated by Patrick McLaughlin in 2012, emerged as a solution to the longstanding challenge of conducting empirical analysis on regulatory effects due to data limitations. As both a methodology and database, RegData employs sophisticated text analysis and machine-learning algorithms to quantify multiple dimensions of regulation, including volume, restrictiveness, complexity, and sectoral relevance. The project’s flagship component, RegData U.S., maps federal regulations to affected economic sectors using the North American Industry Classification System (NAICS), enabling unprecedented research into regulatory causes and effects. Its methodology identifies regulatory restrictions through specific word and phrase counting, while also tracking metrics such as word count, complexity, and agency attribution within the Code of Federal Regulations. Built on the open-source QuantGov policy analytics platform, RegData’s framework has expanded beyond U.S. federal regulations to encompass state-level analysis (State RegData) and international jurisdictions, including Australia and Canada. This comprehensive approach to regulatory quantification has transformed our ability to analyze and understand the regulatory landscape, with all data publicly accessible through https://www.reghub.ai/data.
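The restriction-counting idea described above can be sketched in a few lines. This is a simplified illustration, not the RegData pipeline itself: the term list ("shall", "must", "may not", "prohibited", "required") follows the published RegData methodology, but the tokenization here is a bare-bones assumption, and the real project adds machine-learning steps such as NAICS industry mapping.

```python
import re

# Binding terms counted by RegData-style restriction analysis.
RESTRICTION_TERMS = ["shall", "must", "may not", "prohibited", "required"]

def count_restrictions(text: str) -> dict:
    """Count whole-word occurrences of each restriction term in a text."""
    lowered = text.lower()
    counts = {
        term: len(re.findall(r"\b" + re.escape(term) + r"\b", lowered))
        for term in RESTRICTION_TERMS
    }
    counts["total"] = sum(counts[t] for t in RESTRICTION_TERMS)
    return counts
```

Run over each part of the Code of Federal Regulations, per-term counts like these aggregate into the restriction totals that RegData reports by agency and by sector.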


Counting Regulations and Measuring Regulatory Impact: A Call for Nuance

Stuart Shapiro is Dean of the Edward J. Bloustein School of Planning and Public Policy at Rutgers University.

Stuart Shapiro, Counting Regulations and Measuring Regulatory Impact: A Call for Nuance, Monday, November 4th 2024, 15.00 CET; 14.00 London Time; 16.00 Jerusalem Time; 9.00 Eastern Time.

The effect of regulation on virtually every aspect of the lives of US citizens has led to an understandable impulse to measure this total impact. It has led to various attempts to count the total number of regulations and regulatory requirements, and to total the costs and benefits of regulation. These counting mechanisms have played prominent roles in discussions over statutory changes designed to reform the process by which we write regulations. But counting regulations in a meaningful way and measuring their cumulative economic impact is an astonishingly difficult task. Various methods have been employed by scholars and advocates in this effort. This article is an attempt to catalog the most prominent methods of counting regulations and measuring regulatory impact in the United States, describe their strengths and weaknesses, and suggest alternative approaches to this important question. We suggest both using large language models and detailed analysis of Paperwork Reduction Act data and, at the opposite extreme, doing more qualitative work on the consequences of regulation on individuals, firms, and industries.


The Visualization of Policy Portfolios: Challenges, Opportunities and Applications

Xavier Fernández-i-Marín is a “Ramón y Cajal” fellow at the Universitat de Barcelona.

Xavier Fernández-i-Marín, The Visualization of Policy Portfolios: Challenges, Opportunities and Applications, Thursday, October 9th 2024, 13.00 CET; 12.00 London Time; 14.00 Jerusalem Time; 7.00 Eastern Time.

This webinar explores the possibilities of visualizing “policy portfolios” for political science in general and comparative public policy in particular. A policy portfolio can describe different types of ideas: policy accumulation, complexity/diversity, convergence, and/or style. All of these can be studied comparatively across countries, over time, and at different policy levels (EU, national, subnational, and local). The webinar covers examples ranging from social and environmental policies to climate change and Artificial Intelligence policies, showing the potential of the tool. It also provides a gentle introduction to the R package “PolicyPortfolios”.