Published Date: 11 March 2024

“Modelling Approaches at Central Banks” - Opening Remarks by Mr Edward S. Robinson, Deputy Managing Director (Economic Policy) & Chief Economist, Monetary Authority of Singapore, at the 2024 Advanced Workshop for Central Bankers organised by National University of Singapore on 11 March 2024


1   A warm welcome to the 2024 Advanced Workshop for Central Bankers. This is the first edition of the workshop held outside Northwestern University since its inception in 2004, and I am grateful to our colleagues from the Center for International Macroeconomics, as well as the National University of Singapore (NUS) Risk Management Institute (RMI), for making this event possible. The NUS RMI was established in 2006 with initial support from the Monetary Authority of Singapore (MAS), to serve as a centre of expertise on financial risk for researchers, policymakers and educators. Since then, it has made significant contributions to our understanding of credit risk in the region and beyond. For example, NUS RMI has helped to develop innovative tools for monitoring credit risk and financial vulnerability at the firm level that have been valuable resources for academics and policymakers, including at the MAS. I am confident that the NUS RMI will go from strength to strength under Professor Chen’s leadership.

2   The principal objective of this workshop is to update participants on models and techniques that form the frontier of research in monetary economics, and yet are realistic and tractable for policy analysis by central bankers. Few are more qualified to speak on the subject than the primary instructors for this course—Professors Martin Eichenbaum and Sergio Rebelo—and it is our absolute pleasure and privilege to have them with us in Singapore this week. Their pioneering work on the New Keynesian (NK) framework would be familiar to all of us, given that its basic structure underpins many of the macroeconomic models in use across central banks and international financial institutions today.

The pervasive influence of the New Keynesian framework

3   The broad and enduring relevance of the NK framework in part reflects its success at bringing together the two major traditions of macroeconomic modelling. It contains the key features of Keynesian economics, which emphasises that market clearing is not necessarily assured in the presence of nominal rigidities and imperfect information. At the same time, it incorporates the most useful aspects of the “new-classical” or Lucasian approach, including its more stringent standards for constructing theories of economic behaviour and its treatment of expectations as a fundamental determinant of the stability of macroeconomic relationships. As one commentator on macroeconomic thought put it, researchers have found it a “good strategy to bend the Real Business Cycle (RBC) methodology towards Keynesian themes” (De Vroey, 2016). The NK framework’s integration of these ideas in a tractable way makes these models particularly well-suited to analysing monetary policy and business cycles. Importantly for economic policymakers, models based on the NK framework provide quantitative guidance on how policy interventions can improve economic welfare in the presence of market imperfections.

4   But perhaps the quality that has contributed most to the enduring relevance of the NK framework in academic and policy settings is its flexibility—the basic model can be readily extended in many ways. Indeed, NK models have come a long way since the seminal paper of Christiano, Eichenbaum and Evans (2005), which demonstrated the ability of NK models to generate plausible output and inflation responses to monetary policy shocks in the presence of nominal rigidities. Subsequent research has incorporated modifications that enhance the framework’s realism in a wide variety of policy applications. After the Global Financial Crisis (GFC), much attention was paid to incorporating macro-financial linkages into the NK framework, and the resulting body of work by academics and policy institutions has significantly deepened our understanding of how financial frictions and crises emerge, and what policymakers can do to prevent them (see, e.g., Christiano, Eichenbaum and Trabandt, 2015, and Gertler and Kiyotaki, 2015).

5   The prompt integration of lessons learned from the GFC into the workhorse NK model is a good example of forward progress in macroeconomic modelling. To recall another oft-cited truism: “All models are wrong, but some are useful”. Even the most cogent model for its time will require updating as the world around it changes. As such, NK models have been further developed in many dimensions to include features such as unconventional policy instruments at the zero lower bound (see, e.g., Aruoba, Cuba-Borda and Schorfheide, 2018), epidemic dynamics (Eichenbaum, Rebelo and Trabandt, 2020), and heterogeneous consumers and firms that can be used to study the distributional implications of monetary policy (Christiano, Eichenbaum and Trabandt, 2018, and Galí, 2018, provide reviews of this research). At the same time, the framework has managed to retain its best qualities—the theoretical rigour that links it to the micro foundations of macroeconomics, along with its tractability.

AI at central banks: a new paradigm?

6   Over the past year, central banks have had to answer difficult questions about our collective failure to foresee the persistence of inflation after the pandemic, which has in turn called into question the usefulness of our models. Consequently, we may ask whether economists should be paying more attention to recent advances in data analytics and artificial intelligence (AI) technologies to improve our forecasts and models. Nobel Laureate Michael Spence, for example, has suggested that AI can understand complex supply chain systems that would be difficult for humans to fully comprehend, and that with this understanding, the disruptions seen during the pandemic might have been tempered (Spence, 2024).

7   To be sure, “traditional” big data and machine learning techniques are already widely used in economics and finance. The Billion Prices Project, started at the Massachusetts Institute of Technology (MIT) more than a decade ago, was an early adopter, pioneering the daily collection of online product prices from a large number of retailers in order to improve the measurement of aggregate price data (Cavallo and Rigobon, 2016). Since then, rising computing power and data availability have led to a surge in research on the economic applications of AI and machine learning (AI/ML). Central banks have also contributed in this regard, adopting AI/ML in areas ranging from financial supervision to macroeconomic surveillance. To cite a few examples: the Central Bank of Ecuador has explored the use of artificial neural networks to detect unusual patterns in interbank payments using intraday transactions data (Rubio et al., 2021); systems built on large language models, such as the Athena platform at the European Central Bank (ECB) and the natural language tool LEX at the Federal Reserve System, help supervisors sift through large volumes of text data submitted by financial institutions to identify vulnerable areas (Araujo et al., 2024; ECB, 2023; Beerman, Prenio and Zamil, 2021); and social media posts have been used to generate dynamic measures of inflation expectations (Araujo et al., 2024; Denes, Lestrade and Richardet, 2022).

8   A key strength of AI/ML modelling approaches in predictive tasks is their ability to let the data flexibly determine the functional form of the model (Athey, 2018, reviews the applications of machine learning techniques in economics). This potentially allows AI/ML models to capture non-linearities in economic dynamics in a way that mimics expert (human) judgement. Recent advances in generative AI (GenAI) take this even further. State-of-the-art large language models (LLMs) trained on vast amounts of data can generate alternative scenarios and specify and simulate basic economic models: Korinek (2023) shows that LLMs are capable of setting up model equations commonly used in pedagogical settings, though their ability to derive equations, such as first-order conditions in optimisation problems, remains limited. LLMs can even beat experts at forecasting inflation: Faria e Castro and Leibovici (2024) find that in-sample conditional inflation forecasts produced using Google’s PaLM have a lower mean-squared error than those from the Survey of Professional Forecasters for most of their sample and at almost all projection horizons.
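The point about functional form can be made concrete with a toy sketch. The code below (entirely illustrative; the data and variable names are invented and not drawn from any of the studies cited) fits the same synthetic non-linear series twice: once with a fixed linear specification, and once with a flexible polynomial basis that lets the data choose the shape, standing in for the adaptive fits of AI/ML methods.

```python
import numpy as np

# Synthetic data with a built-in non-linearity (hypothetical example).
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=500)
y = np.exp(-x) + rng.normal(scale=0.1, size=500)  # "true" curve is convex

# Linear fit: imposes the functional form y = a + b*x in advance.
X = np.column_stack([np.ones(500), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
mse_linear = np.mean((X @ beta - y) ** 2)

# Flexible fit: a degree-5 polynomial basis lets the data pick the shape,
# a simple stand-in for the data-driven functional forms of ML methods.
P = np.column_stack([x ** k for k in range(6)])
gamma, *_ = np.linalg.lstsq(P, y, rcond=None)
mse_flex = np.mean((P @ gamma - y) ** 2)

print(mse_linear, mse_flex)
```

In this toy setting the flexible fit achieves a lower in-sample error precisely because it is not committed to a straight line in advance; the same freedom is what makes such fits harder to interpret, a point taken up below.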

9   However, the flexibility of this class of models is also a drawback: AI/ML models can be “fragile” in that their output is often highly sensitive to the choice of model parameters or prompts provided. Together with their opacity, this flaw makes it difficult to parse the underlying drivers of the process being modelled. Unlike with traditional “glass box” models, explanations for the predictions of deep learning models can vary between models with subtle differences in settings, even where the differences are due to arbitrary factors like random seeds (Kumar et al., 2023). Moreover, despite their impressive capabilities, current LLMs struggle with logic puzzles and mathematical operations: Korinek (2023) documents how LLMs routinely make mistakes in mathematical derivations, and Perez-Cruz and Shin (2024) show that GPT-4 fails a well-known logic puzzle when its original wording is changed. This suggests that such models are not yet capable of providing credible explanations for their own predictions.
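The sensitivity to arbitrary factors such as random seeds can be demonstrated with a small experiment, sketched below under invented assumptions (synthetic data and a deliberately tiny network; this is not the setup of Kumar et al.): two networks with identical architecture, trained on identical data, can still disagree once asked to extrapolate, simply because they started from different random initialisations.

```python
import numpy as np

def train_net(seed, x, y, hidden=8, steps=2000, lr=0.05):
    """Fit a one-hidden-layer tanh network by full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(size=(hidden, 1)); b2 = 0.0
    for _ in range(steps):
        h = np.tanh(x @ W1 + b1)          # hidden activations
        err = (h @ W2 + b2) - y           # prediction error
        # Backpropagate the squared-error loss.
        gW2 = h.T @ err / len(x); gb2 = err.mean()
        gh = err @ W2.T * (1 - h ** 2)
        gW1 = x.T @ gh / len(x); gb1 = gh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda q: np.tanh(q @ W1 + b1) @ W2 + b2

# Identical synthetic training data for both networks.
rng = np.random.default_rng(123)
x = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * x) + rng.normal(scale=0.05, size=(200, 1))

net_a = train_net(seed=0, x=x, y=y)
net_b = train_net(seed=1, x=x, y=y)

# Probe outside the training range: the two fits can disagree even though
# both were trained on exactly the same observations.
probe = np.array([[1.5]])
print(net_a(probe).item(), net_b(probe).item())
```

Nothing in the training data adjudicates between the two fits, so any "explanation" read off either network partly reflects its random seed, which is one way to see why such models struggle to explain their own predictions.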

Future directions

10   On the whole, AI models currently lack the “clarity of structure” that makes structural models useful to policymakers (Blanchard, 2016). To quote former Fed Governor Laurence Meyer (1998), model-based forecasting needs to “start with a paradigm and end with a story”. Without the ability to articulate a vision of how the economy works, or to discriminate between competing narratives, AI models cannot yet replace structural models at central banks.

11   AI and GenAI are not yet the general purpose technology, or GPT, envisaged by their enthusiastic proponents. For now, perhaps the best way to incorporate AI techniques into central bank modelling toolkits is to use them in satellite models that complement core structural models. Beyond using AI techniques independently for forecasting tasks, this could extend to “semi-structural” approaches connecting AI methods to economic theory. Promising applications in recent research include the use of deep learning models to estimate economic relationships, such as the Phillips curve, that underpin standard macroeconomic models. The neural Phillips curve of Buckmann, Potjagailo and Schnattinger (2023), which uses a neural network to decompose UK services inflation into its underlying drivers, is one example of augmenting traditional macroeconomic analysis with AI. Coletti (2023) likewise highlights the potential for “data-rich” techniques to directly improve the estimation of standard macroeconomic models, for example through the joint estimation of parameters exploiting variation in a large number of variables. Our current models have been built up by rigorously incorporating the most relevant new developments while retaining their core theoretical foundations. As we improve our understanding of the mechanics underlying AI techniques, we could begin to bring them into our workhorse models in a similar way.
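As a reference point for the kind of relationship such work augments, the sketch below estimates a textbook linear Phillips curve by ordinary least squares on synthetic data (all series and coefficients are invented for illustration; this is not the Buckmann et al. model, whose neural variant replaces the fixed linear map with a learned non-linear one).

```python
import numpy as np

# Hypothetical data-generating process: pi_t = a + b*gap_t + c*E[pi]_t + noise,
# with "true" coefficients a = 0.5, b = -0.3, c = 0.8 chosen arbitrarily.
rng = np.random.default_rng(7)
n = 300
gap = rng.normal(size=n)                  # invented unemployment-gap series
exp_inf = rng.normal(loc=2.0, size=n)     # invented inflation expectations
pi = 0.5 - 0.3 * gap + 0.8 * exp_inf + rng.normal(scale=0.2, size=n)

# OLS estimation of the linear Phillips curve.
X = np.column_stack([np.ones(n), gap, exp_inf])
coef, *_ = np.linalg.lstsq(X, pi, rcond=None)
print(coef.round(2))  # estimates should be close to (0.5, -0.3, 0.8)
```

A semi-structural neural approach keeps the same inputs and the same interpretable decomposition into drivers, but lets the mapping from slack and expectations to inflation be non-linear where the data demand it.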

12   We do need to prepare for the day when GenAI evolves into a GPT. Here, it is useful to recall Paul David’s influential hypothesis about the impact of electric machinery in the late 1800s. As Robert Gordon (2016) interprets the work, David was in effect saying “just wait”: almost four decades elapsed between Edison’s opening of the Pearl Street power plant in Lower Manhattan in 1882 and the upsurge in productivity growth in the early 1920s associated with the electrification of manufacturing. Significant investment is needed for the new technology to become a Pareto-superior reality, and as we have been warned, this may require some enlightened intervention upstream to shape its transitional path. In the area of economic research, as more broadly, we need to take deliberate steps to ensure that GenAI is an impetus for Harrod-neutral technological progress.

13   Indeed, the path ahead for economic modelling is an exciting one. Ongoing shifts in the global economy are throwing up new questions for our models, and the techniques we can bring to bear to answer them grow ever richer. As central bankers and researchers, many of us will be experiencing this evolution firsthand. This workshop represents a great opportunity for us to grapple with common issues among like-minded peers.


1.   Araujo, Douglas Kiarelly Godoy, Doerr, Sebastian, Gambacorta, Leonardo and Tissot, Bruno (2024), “Artificial Intelligence in Central Banking”, BIS Bulletin No. 84.
2.   Aruoba, S. Boragan, Cuba-Borda, Pablo and Schorfheide, Frank (2018), “Macroeconomic Dynamics Near the ZLB: A Tale of Two Countries”, The Review of Economic Studies, Vol. 85(1), pp. 87–118.
3.   Athey, Susan (2018), “The Impact of Machine Learning on Economics” in Agrawal, Ajay, Gans, Joshua and Goldfarb, Avi (eds.), The Economics of Artificial Intelligence: An Agenda, University of Chicago Press, pp. 507–547.
4.   Beerman, Kenton, Prenio, Jermy and Zamil, Raihan (2021), “Suptech tools for prudential supervision and their use during the pandemic”, FSI Insights No. 37.
5.   Blanchard, Olivier (2016), “Do DSGE Models Have a Future?”, Peterson Institute for International Economics Policy Briefs PB16-11.
6.   Buckmann, Marcus, Potjagailo, Galina and Schnattinger, Philip (2023), “Dissecting UK service inflation via a neural network Phillips curve”, Bank Underground, 10 July 2023. https://bankunderground.co.uk/2023/07/10/dissecting-uk-service-inflation-via-a-neural-network-phillips-curve/
7.   Cavallo, Alberto and Rigobon, Roberto (2016), “The Billion Prices Project: Using Online Prices for Measurement and Research”, Journal of Economic Perspectives, Vol. 30(2), pp. 151–178.
8.   Christiano, Lawrence J., Eichenbaum, Martin and Evans, Charles L. (2005), “Nominal Rigidities and the Dynamic Effects of a Shock to Monetary Policy”, Journal of Political Economy, Vol. 113(1), pp. 1–45.
9.   Christiano, Lawrence J., Eichenbaum, Martin and Trabandt, Mathias (2018), “On DSGE Models”, Journal of Economic Perspectives, Vol. 32(3), pp. 87–112.
10.   Christiano, Lawrence J., Eichenbaum, Martin and Trabandt, Mathias (2015), “Understanding the Great Recession”, American Economic Journal: Macroeconomics, Vol. 7(1), pp. 110–167.
11.   Coletti, Don (2023), “A Blueprint for the Fourth Generation of Bank of Canada Projection and Policy Analysis Models”, Bank of Canada Staff Discussion Paper 2023-23.
12.   Denes, Julien, Lestrade, Ariane and Richardet, Lou (2022), “Using Twitter data to measure inflation perception”, IFC Bulletin No. 57.
13.   De Vroey, Michel (2016), A History of Macroeconomics: From Keynes to Lucas and Beyond, Cambridge University Press.
14.   ECB (2023), “Suptech: thriving in the digital age”, Supervision Newsletter, 15 November 2023.
15.   Eichenbaum, Martin, Rebelo, Sergio and Trabandt, Mathias (2020), “Epidemics in the New Keynesian Model”, NBER Working Paper No. 27430.
16.   Faria e Castro, Miguel and Leibovici, Fernando (2024), “Artificial Intelligence and Inflation Forecasts”, Federal Reserve Bank of St. Louis Working Paper 2023-015.
17.   Galí, Jordi (2018), “The State of New Keynesian Economics: A Partial Assessment”, Journal of Economic Perspectives, Vol. 32(3), pp. 113–140.
18.   Gertler, Mark and Kiyotaki, Nobuhiro (2015), “Banking, Liquidity and Bank Runs in an Infinite Horizon Economy”, American Economic Review, Vol. 105(7), pp. 2011–2043.
19.   Korinek, Anton (2023), “Generative AI for Economic Research: Use Cases and Implications for Economists”, Journal of Economic Literature, Vol. 61(4), pp. 1281–1317.
20.   Kumar, Rishabh, Koshiyama, Adriano, da Costa, Kleyton, Kingsman, Nigel, Tewarrie, Marvin, Kazim, Emre, Roy, Arunita, Treleaven, Philip and Lovell, Zac (2023), “Deep learning model fragility and implications for financial stability and regulation”, Bank of England Staff Working Paper No. 1038.
21.   Meyer, Laurence H. (1998), “Start with a Paradigm, End with a Story”, remarks presented to Downtown Economics Club, 50th Anniversary Dinner, New York, 3 June 1998.
22.   Perez-Cruz, Fernando and Shin, Hyun Song (2024), “Testing the cognitive limits of large language models”, BIS Bulletin No. 83.
23.   Gordon, Robert J. (2016), The Rise and Fall of American Growth, Princeton University Press.
24.   Rubio, Jeniffer, Barucca, Paolo, Gage, Luis Gerado, Arroyo, John and Morales-Resendiz, Raúl (2021), “Classifying payment patterns with artificial neural networks: an autoencoder approach”, IFC Bulletin No. 57.
25.   Spence, Michael (2024), 84th Kale Memorial Lecture, delivered at the Gokhale Institute, 16 February 2024.